Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

a 5700 XT is around 9-12% faster than the non-XT

Around 15% generally, but close enough.

As an example, here is DF's analysis of Death Stranding.

In Death Stranding, yes. The discussion was around A Plague Tale; numbers will be different for other games, especially ports. I have mentioned before that there's a reason reviewers generally go by multiplatform titles rather than ports: to keep things more accurate.

The 5700 there has 75% of the PS5's performance, which is close to the same GPU having 78% of the PS5's performance in A Plague Tale Requiem,

while the 5700 XT has 83% of the PS5's performance,

so I am expecting the same performance gap in this title as well.

The PS5 is performing somewhat below a 6600 XT (or slightly below a 2070 Super), which puts it around 10% above an RX 5700 XT. That is in line with spec, as the 5700 XT is a 9.7 TF GPU on a slightly older architecture.
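For reference, those ratios line up with paper TFLOPS, which fall straight out of shader count and boost clock. A quick sketch using the public spec-sheet figures (sustained in-game clocks will differ):

```cpp
#include <cstdio>

// Theoretical FP32 throughput: ALUs x 2 ops per clock (FMA) x clock (GHz).
static double tflops(int shaders, double clock_ghz) {
    return shaders * 2.0 * clock_ghz / 1000.0;
}

int main() {
    // Public spec-sheet boost clocks; real sustained clocks vary.
    std::printf("RX 5700   : %.2f TF\n", tflops(2304, 1.725)); // ~7.95
    std::printf("RX 5700 XT: %.2f TF\n", tflops(2560, 1.905)); // ~9.75
    std::printf("PS5       : %.2f TF\n", tflops(2304, 2.230)); // ~10.28
}
```

Paper TFLOPS ignore architecture and real clock behaviour, which is exactly why the RDNA1 cards and the RDNA2-based PS5 don't scale perfectly with these numbers.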

As Alex mentioned in the video, AMD GPUs don't perform that well in A Plague Tale Requiem compared to Nvidia: the 2070 Super slightly outperforms the PS5 in this title while underperforming it in Death Stranding, despite the PS5 holding the same performance advantage over its RDNA1 brethren in both. Driver, perhaps?

It seems that's true for the high-end AMD/NV range, to some small extent: the 3080 should be faster, maybe not that much faster, but it's close enough to what you'd expect. Lower down the range, AMD and NV GPUs seem to be performing closer to what they should, i.e. the consoles' playing field (10 TF GPUs).
Death Stranding is a port, not a multiplatform game. Games natively designed for a specific platform that later get ported usually favor the native platform's (in this case the consoles') design. The 2070 Super is indeed underperforming there. It isn't in this benchmark, though, and neither are the PS5, 5700 XT, etc.
 
I think that one was really useful, and some games ended up shipping with it, but since AMD did not support it, a number of devs never ended up adding it to their engines to boost DX11 CPU performance. I know particularly of one title from days of yore where a dev told me they had deferred contexts set up, but since AMD did not support them, they did not ship, to the detriment of NV users' CPU performance: "to keep things even".
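For anyone unfamiliar, the feature in question is D3D11's deferred contexts: worker threads record command lists that the immediate context replays later. A minimal sketch of the pattern (real engines use one deferred context per worker thread; error handling elided):

```cpp
#include <d3d11.h>

// Record rendering work on a deferred context, then replay it on the
// immediate context from the render thread.
void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ... record state changes and draw calls on `deferred` here ...

    ID3D11CommandList* list = nullptr;
    // FALSE: don't restore the deferred context's state after recording.
    deferred->FinishCommandList(FALSE, &list);

    // FALSE: don't restore the immediate context's state after playback.
    immediate->ExecuteCommandList(list, FALSE);

    list->Release();
    deferred->Release();
}
```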

Trying to recall the presentations from way back, I remember different devs coming to a similar conclusion, even on NV hardware. Could be down to implementation, of course.
 
A question for Alex here. Since A Plague Tale is heavily GPU-limited here, is it correct to say that an implementation of dynamic resolution (say, 1440p to 1080p) would be most ideal for a situation like this, or a straight resolution drop instead?
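Not Alex, but for illustration, a dynamic-res implementation is basically a feedback loop on GPU frame time. A toy sketch, assuming a 60 fps budget and the 1440p-to-1080p window mentioned above (names and constants are made up for the example, not from any shipping engine):

```cpp
// Toy dynamic-resolution heuristic: nudge a resolution scale toward
// whatever keeps GPU frame time under budget. Constants are illustrative.
struct DynResController {
    double targetMs = 16.6;   // 60 fps budget
    double scale    = 1.0;    // 1.0 = 1440p; 0.75 = 1080p (vertical)
    double minScale = 0.75;
    double maxScale = 1.0;

    void Update(double gpuFrameMs) {
        // Proportional step: over budget -> drop resolution,
        // headroom -> claw it back gradually.
        double error = (targetMs - gpuFrameMs) / targetMs;
        scale += 0.5 * error * scale;
        if (scale < minScale) scale = minScale;
        if (scale > maxScale) scale = maxScale;
    }
};
```

The advantage over a straight resolution drop is that you only pay the lower resolution in the scenes that actually need it.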
 
First experiences with Miles Morales PC are coming out, and perhaps not surprisingly it's just as, if not more, CPU-limited than Spider-Man.

I suspect the added ray-traced shadows are putting further demand on the CPU, and I'm sure a Zen 4/Raptor Lake i9 would fare better, but the 4090 is often at ~50% utilization here at 4K maxed, no DLSS. It can be in the 40 fps range when swinging through the city. On the upside, it's not like you'll be losing much with Ampere. :)



Edit: Bang4Buck video with a 13900K at 5.8 GHz. Much better GPU utilization, although swinging through the city still drops into the 60 fps range; generally 70-80 fps with 95%+ GPU. So this game is a beast maxed on both the GPU and CPU - at least, on certain threads.


Regardless Alex is finishing up his video so we'll have a more accurate picture of where the bottlenecks are relatively soon.

There will also be an NXGamer video.

 
Another walking advertisement for Frame Generation and how it "brute forces" its way through CPU bottlenecks (which somehow became a much more tedious problem in very, very recent times, indeed; looking at Gotham Knights, I wouldn't be surprised if a DLSS 3 patch is on its way for that game too).
 

Gotham Knights is hardly an advertisement for frame generation; it's a fundamentally broken game. It needs a ton of engine improvements long before DLSS 3, which only a tiny fraction of gamers could even take advantage of at the moment.
 
Hopefully AMD's solution is universal, although I wonder how that could possibly work without AI. Is faking frames really possible otherwise?
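In principle, yes: motion-compensated interpolation long predates neural networks (TVs have done it for years). A toy sketch of the core idea, pushing each pixel of the last frame halfway along its motion vector, with all the hard parts (occlusion, hole filling, confidence masks) waved away:

```cpp
#include <cstdint>
#include <vector>

struct Vec2 { float x, y; };

// Naive motion-compensated half-frame: splat each pixel of the previous
// frame halfway along its motion vector. Real interpolators also handle
// occlusion, fill holes, and fall back where motion is unreliable.
void ExtrapolateHalfFrame(const std::vector<uint32_t>& prev,
                          const std::vector<Vec2>& motion, // pixels/frame
                          std::vector<uint32_t>& out,
                          int width, int height)
{
    out = prev; // fallback: reuse old pixels where nothing lands
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const Vec2& mv = motion[y * width + x];
            int dx = x + static_cast<int>(mv.x * 0.5f);
            int dy = y + static_cast<int>(mv.y * 0.5f);
            if (dx >= 0 && dx < width && dy >= 0 && dy < height)
                out[dy * width + dx] = prev[y * width + x];
        }
    }
}
```

The open question is quality: without dedicated optical-flow hardware and/or ML cleanup, the failure cases (disocclusion, transparencies, UI) are much harder to hide.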
 

But people told me this port was amazing!! A 4090 + 5800X3D dropping into the 40s on an enhanced PS4 game with RT is lolworthy. Terrible port, performance-wise.
 
We need to see performance at PS5 settings for a more useful CPU comparison. But yeah, PC CPU performance has been trending rather poorly relative to the consoles. Is the DXR API or bad coding to blame?


Some downgrades compared to PS5. Seems like this one came in pretty hot to make the holiday season. Those shadows seem to be murdering performance: at 4K DLSS Performance, a 3080 is under 40 fps in several scenes.
 
Saw this video for Uncharted.

At 1080p the RTX 3070 is obviously bottlenecked by the 3700X, but at 4K? This is where the picture gets clearer.

Then I remembered that when Drake falls in the water at 29:15 of the video, performance on the PS5 drops into the 30s, which gives a straight comparison: the RTX 3070 stays in the 40s (20-25% faster; e.g. 45 fps against 36 fps is 45/36 ≈ 1.25).


 

Looking at Nvidia's own benchmark for this game:

the gap between the RTX 3090 Ti and RTX 3080 is 40% (the largest I've seen between the two), while the gap between the RTX 3060 and RTX 3070 is only 28%. That points to VRAM as the bottleneck: 8 GB (3070) and 10 GB (3080) aren't enough for max settings in this title, due to the added RT shadows I believe, while the 12 GB 3060 and 24 GB 3090 Ti don't hit that wall.
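One way to sanity-check a VRAM theory like this on your own machine is to poll the DXGI memory budget while playing; a minimal sketch:

```cpp
#include <dxgi1_4.h>
#include <cstdio>

// Poll how much dedicated video memory the process uses versus what the
// OS has budgeted for it. Sustained usage at or above the budget means
// resources are spilling to system memory, i.e. a VRAM bottleneck.
void PrintVramUsage(IDXGIAdapter3* adapter)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (SUCCEEDED(adapter->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
        std::printf("VRAM: %.2f GB used of %.2f GB budget\n",
                    info.CurrentUsage / 1073741824.0,
                    info.Budget / 1073741824.0);
    }
}
```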



 
This game also has ugly-ass reflections on PC, like the other game does.

They look dreadful and if they're genuinely that poor on PC I'm not using RT.
 

The PS5 in RT Performance mode drops below 60 in this game when swinging.
It uses settings set below Very Low to help CPU performance.
The port is fine.
On a 9900K, it was almost impossible to lock to 60 fps even at low settings, no matter what one did. It would always have enormous performance lurches.


Here we have it at Very High, 1440p DLSS Quality (which is a 960p base resolution), getting utterly ruined and dropping to the 50s and even high 40s on a 3080/9900K setup. High RT fares better but still fails to lock to 60 fps. And even without RT at all, it stays at or above 60 fps by a razor-thin margin. On top of that, until recently a myriad of users (including me) were reporting better performance with HT/SMT off, to the point that some were recommending disabling it.

It uses settings set below Very Low to help CPU performance.
The port is fine.
The 4090 is several times more powerful than the PS5's GPU, and the 5800X3D is far above the PS5's CPU. The fact that the 4090 drops to the 40s is not even comparable. Furthermore, what setting below Very Low are you referring to?
(screenshot: Spider-Man RT settings)

I assume this is in reference to Miles Morales and not the original game.

That game with or without RT eats even beefy CPUs for breakfast even at PS5 settings whereas the console chugs along just fine.
 
Crowd density on PS5 is below Very Low on PC - that original video had the crowd density setting on PC not functioning.

A Ryzen 5 3600 with RT on can get 60 FPS and above in Times Square with my optimised settings. If your 9900K gets worse than that, I would be shocked.
 