Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]


PS5:
Fidelity: 2160p at 30fps
Performance: 1440p at 60fps
RT Performance: Dynamic 1440p at 60fps
92.76 GB

Series X:
Fidelity: 2160p at 30fps
Performance: 1440p at 60fps
RT Performance: Dynamic 1440p at 60fps
87.80 GB

Series S:
Fidelity: 1440p at 30fps
Performance: 1080p at 60fps
87.80 GB

PC:
105.14 GB

- The PC version does not have the next-gen patch. The version used is from 2015 with all settings at Ultra.
- The PS5 version improves the startup load time by removing part of the intro cinematic.
- All versions have the same textures.
- PS5 has a clear performance advantage over Xbox Series S/X. The Xbox consoles have an uneven framerate in demanding moments.
- The Xbox Series X and PS5 versions have better shadows and ambient occlusion in some cases than the PC version using Nvidia PCSS. These shadows appear to be generated by ray tracing, but I'm not certain that's the technique actually being used.
- Xbox Series S has some shading cutbacks and no RT Performance mode.
- The PC version still has a greater draw distance for assets, but a shorter one for shadows (even with the advanced Long Shadows option).
- The console versions also add some extra lighting fixes and effects.
- PS5 may show some shadow rendering bugs when changing display modes.
- The Performance and RT Performance modes appear to use some form of temporal reconstruction.
- Fidelity mode has better shadows, reflections, and depth of field than the other two modes.
- Fidelity mode reflections on consoles are slightly higher quality than on the PC version.
- The PC version has a higher density of vegetation.
- Overall, the result is very similar to the PC version at Ultra (still quite demanding today).
- The improvements in this next-gen version are comparable to those of Ghost of Tsushima or Death Stranding, and far smaller than the jump the PS4/One/PC versions made over PS3/360.

There is quite a big gap here: up to 15-20 fps higher on PS5 during car explosions, according to his video. Oddly, he omitted from his summary the loading comparison, which is about twice as quick on PS5 (30 s vs 1 min).

That makes three open-world games (allegedly optimized for next-gen) in a row where PS5 has an obvious edge (Cyberpunk, Elden Ring, and now GTA V).

And the XSS is, as expected, very disappointing hardware. No RT mode (again), and in demanding areas its performance is even worse than the XSX's despite running at an old-gen resolution.
 
Why’s the PS5 performing better in this one?



The patch on PC is enabling Ultra settings, generally.
As usual, we can only speculate. It's getting more obvious that it's not only about tools but also about hardware differences, and even ignoring I/O there are HW differences in the graphics domain.

While XSX leads in compute and bandwidth, the lead is actually minimal (~18% for compute, as we now know the PS5 rarely ever downclocks), and the bandwidth lead (25%) is hampered by specific memory architecture constraints that should shrink that difference when the CPU is stressed. The more the CPU is used (as in an open-world game!), the more memory contention XSX will suffer, leading to a bigger loss of bandwidth than on PS5. We know this from a PC GPU with a similar memory architecture. Some calculations showed that in a CPU-heavy game (like an open-world game), both machines should end up with a similar bandwidth / TFLOPS ratio.
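
As a rough back-of-the-envelope check on that argument (the spec figures are the publicly stated ones, but the "slow fraction" contention model is purely an assumed illustration, not a measurement):

```python
# Back-of-the-envelope numbers for the argument above. Spec figures are the
# publicly stated ones; the "slow fraction" contention penalty is an assumed
# illustrative value, not a measurement.

ps5_tflops = 10.28          # 36 CUs at up to 2.23 GHz
xsx_tflops = 12.15          # 52 CUs at 1.825 GHz

ps5_bw = 448.0              # GB/s, uniform 16 GB GDDR6 pool
xsx_bw_fast = 560.0         # GB/s, 10 GB "GPU-optimal" pool
xsx_bw_slow = 336.0         # GB/s, remaining 6 GB pool

print(f"XSX compute lead:        {xsx_tflops / ps5_tflops - 1:.0%}")   # ~18%
print(f"XSX peak bandwidth lead: {xsx_bw_fast / ps5_bw - 1:.0%}")      # 25%

# Assumption: the busier the CPU, the larger the share of traffic effectively
# served at the slow-pool rate on XSX, dragging its average bandwidth down.
for slow_fraction in (0.0, 0.15, 0.30):
    xsx_eff = (1 - slow_fraction) * xsx_bw_fast + slow_fraction * xsx_bw_slow
    print(f"slow fraction {slow_fraction:.0%}: "
          f"XSX {xsx_eff / xsx_tflops:4.1f} GB/s per TFLOP, "
          f"PS5 {ps5_bw / ps5_tflops:4.1f} GB/s per TFLOP")
```

With no contention the ratios are ~46 vs ~44 GB/s per TFLOP; under the assumed contention they converge and then flip in PS5's favour, which is the point being made above.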

But on the other hand, the PS5 HW advantages are more significant. The ROPs are clocked 22% higher, but most importantly, while both have the same number of color ROPs, PS5 has twice the number of depth/stencil ROPs (2x the units at a ~22% higher clock is roughly 2.4x the throughput, i.e. ~140% faster depth/stencil). And finally, I suspect the L1 cache advantage (and to a lesser extent L2) on PS5 is also significant. I think both machines have the same total amount of L1 cache, which, given PS5's fewer CUs, would mean PS5 also has a substantial per-CU advantage there.
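
Taking the post's figures at face value (the 2x depth/stencil unit count is the claim above, not a confirmed spec), the arithmetic looks like this:

```python
# Relative throughput from unit count x clock, using the figures claimed in
# the post above (the 2x depth/stencil unit count is the poster's claim, not
# a confirmed spec).

ps5_clock = 2.23    # GHz, PS5 GPU max clock
xsx_clock = 1.825   # GHz, XSX GPU clock
clock_ratio = ps5_clock / xsx_clock               # ~1.22

color_ratio = 1.0 * clock_ratio                   # same unit count, clock only
depth_ratio = 2.0 * clock_ratio                   # claimed 2x units, plus clock

print(f"PS5 clock advantage:          {clock_ratio - 1:.0%}")           # ~22%
print(f"PS5 color ROP throughput:     {color_ratio - 1:.0%} higher")
print(f"PS5 depth/stencil throughput: {depth_ratio - 1:.0%} higher")    # ~144%
```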
 
Why’s the PS5 performing better in this one?



The patch on PC is enabling Ultra settings, generally.
If they didn't do much work to redo the pipeline to get the most out of the Series consoles, the PS5 will be better utilized. This is one of the advantages of having a GPU that is balanced between compute and fixed-function hardware.
The frame rates look fairly consistent, so this could be the case. Stuttering FPS issues are often indicative of API issues that could be ironed out with a patch, but a frame rate that smoothly falls and rises usually points to a bottleneck in the GPU.

We'll need to wait for a deeper dive to know the culprit; DF may be able to offer additional insight.
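
A minimal sketch of that stutter-versus-smooth-dip distinction, assuming you have captured per-frame times in milliseconds (hypothetical data, arbitrary thresholds):

```python
# Crude classifier for a frame-time trace: isolated spikes suggest stutter
# (API/streaming hitches), while a sustained rise suggests the GPU is simply
# out of headroom. The thresholds are arbitrary, for illustration only.

from statistics import median

def classify(frame_times_ms, spike_factor=2.0, sustained_window=30):
    base = median(frame_times_ms)
    spikes = sum(1 for t in frame_times_ms if t > spike_factor * base)
    # Longest run of consecutive frames noticeably slower than the median.
    longest_run = run = 0
    for t in frame_times_ms:
        run = run + 1 if t > 1.2 * base else 0
        longest_run = max(longest_run, run)
    if longest_run >= sustained_window:
        return "sustained slowdown (likely GPU-bound)"
    if spikes:
        return "isolated spikes (stutter / hitching)"
    return "consistent frame times"

# Hypothetical trace: a 60 fps baseline with a long stretch of ~22 ms frames.
trace = [16.7] * 120 + [22.0] * 60 + [16.7] * 120
print(classify(trace))   # -> sustained slowdown (likely GPU-bound)
```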
 
But on the other hand, the PS5 HW advantages are more significant. The ROPs are clocked 22% higher, but most importantly, while both have the same number of color ROPs, PS5 has twice the number of depth/stencil ROPs (2x the units at a ~22% higher clock is roughly 2.4x the throughput, i.e. ~140% faster depth/stencil). And finally, I suspect the L1 cache advantage (and to a lesser extent L2) on PS5 is also significant. I think both machines have the same total amount of L1 cache, which, given PS5's fewer CUs, would mean PS5 also has a substantial per-CU advantage there.
L1 cache is read-only; L2 is where the compute units write back to. For the Series consoles, L2 is increased to align with the increase in both memory and compute units.
There should be no caching advantage here for PS5 except for clock speed.

And I recall Locuza and other silicon investigators have indicated that, in order for the PS5 to stay 'cheaper' than its GPU counterparts, downclocking is certainly happening.
 
L1 cache is read-only; L2 is where the compute units write back to. For the Series consoles, L2 is increased to align with the increase in both memory and compute units.
There should be no caching advantage here for PS5 except for clock speed.
Not enough. XSX has only 25% more L2 cache than PS5 while having 44% more CUs (to feed). The gap might be even bigger for L1.
 
Not enough. XSX has only 25% more L2 cache than PS5 while having 44% more CUs (to feed). The gap might be even bigger for L1.
The L1 and L2 cache setups for PS5 are the same as the Radeon 6000 series cards, which don't have disabled CUs; they are a full 10 DCUs per shader array. It's only PS5 that has 9 DCUs, to improve chip yield.
So comparing XSX and PS5 in this way is inaccurate, unless you are comfortable arguing that a PS5 will outperform a 6700 XT because of larger L1 and L2 cache-to-CU ratios while being 4 CUs down.

XSX has 5 MB of L2 cache for 13 DCUs, which aligns very well with 4 MB of L2 for 10 DCUs. It has 30% more DCUs accompanied by 25% more L2 cache to match; these are about as aligned as they can be.
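
For concreteness, here is the cache-per-CU arithmetic both sides are pointing at, using the full-chip figures cited in this thread (the 6700 XT line uses the commonly reported desktop numbers and is included only for reference):

```python
# L2-per-CU arithmetic using the full-chip figures cited in the thread,
# plus the commonly reported 6700 XT numbers for reference only.

chips = {
    "PS5":     {"l2_mb": 4, "cus": 36},
    "XSX":     {"l2_mb": 5, "cus": 52},
    "6700 XT": {"l2_mb": 3, "cus": 40},  # also has 96 MB Infinity Cache, which neither console has
}

for name, c in chips.items():
    print(f"{name:8s}: {c['l2_mb'] * 1024 / c['cus']:6.1f} KB of L2 per CU")

# The deltas quoted above: 25% more L2 on XSX vs 44% more CUs to feed.
print(f"XSX L2 advantage: {5 / 4 - 1:.0%}, XSX CU advantage: {52 / 36 - 1:.0%}")
```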

As for ROPs: ROPs still require bandwidth, so the ROP advantage for PS5 is muted, because in any situation where the hardware is memory-bound, XSX will outperform PS5. The only exception to that rule is depth/stencil work, where XSX has half the units and will be processing-bound rather than bandwidth-bound.

Combining these elements paints a picture of why the consoles often trade blows in performance, at least for the first wave of games.
 
OK, some interesting takes. The PS5 is doing quite well nevertheless; if it's going to deliver slightly better performance in the first 2 or 3 years, and after that be about equal to or somewhat below the XSX (if that happens), that means the XSX and PS5 are basically a match over the whole generation. Which is a good thing, I think (evenly matched consoles instead of huge differences like the 6th gen).
 
OK, some interesting takes. The PS5 is doing quite well nevertheless; if it's going to deliver slightly better performance in the first 2 or 3 years, and after that be about equal to or somewhat below the XSX (if that happens), that means the XSX and PS5 are basically a match over the whole generation. Which is a good thing, I think (evenly matched consoles instead of huge differences like the 6th gen).
Without knowing the per-frame resolutions when the frame rate dips, it's hard to know what's happening with Xbox here. Once you hit a single bottleneck, the entire performance can collapse, and it can collapse hard. I would like to point to a particular issue for the Series editions, but we have so little information.

It's ultimately up to developers to decide what they want to do here. PS5 is the lead console, so they'll likely continue to optimize the game for PS5's hardware, which in turn is the same as PC architecture-wise. It's only the Series consoles that are the odd ones out.
 
I want to see the DF deep-dive tech video on this, also taking the PC version into account at Ultra settings and maybe even mods. With the PC version, Alex could do GPU comparisons; maybe we can learn something from that. Anyway, it's still a great game even though it's from the PS360 generation.
 
Without knowing the per-frame resolutions when the frame rate dips, it's hard to know what's happening with Xbox here. Once you hit a single bottleneck, the entire performance can collapse, and it can collapse hard. I would like to point to a particular issue for the Series editions, but we have so little information.

It's ultimately up to developers to decide what they want to do here. PS5 is the lead console, so they'll likely continue to optimize the game for PS5's hardware, which in turn is the same as PC architecture-wise. It's only the Series consoles that are the odd ones out.


Martin Fuller (I think) from the Microsoft Advanced Technology Group talked about this issue in his presentation about VRS. He said they observed that in titles where the XSX drops frame rate, the GPU is heavily underutilised.
It would be nice to know the reason for that, where they are hitting bottlenecks, and what they are doing to mitigate it (if that's even possible).
 
Martin Fuller (I think) from the Microsoft Advanced Technology Group talked about this issue in his presentation about VRS. He said they observed that in titles where the XSX drops frame rate, the GPU is heavily underutilised.
It would be nice to know the reason for that, where they are hitting bottlenecks, and what they are doing to mitigate it (if that's even possible).
Do you have a link to that video by any chance? He's made several VRS videos now, so I'm not sure which one it is. I would be interested in reading that transcript.
 
On modern games it's interesting to debate ROPs vs bandwidth vs compute or whatever -- but on old games that are ported up, PS5's much faster clock speed is an incredibly obvious advantage. If the game isn't utilizing your GPU very well (I doubt a fundamentally 10-year-old game is, for either console), clock speed ought to smooth things over quite a bit.
 
Lol, this game. I was celebrating last month that it didn't make it to the charts.
GTA V is basically as old today as the North American release of Super Mario Bros. was when Tekken for the PlayStation launched in that region. It's crazy to think that there are kids who grew into adults during the lifetime of this game, and it hasn't stopped yet.

What loading? First launch from the dashboard? That's just a logo-skipping feature built into the PS5 version. It loses on nearly every other loading test.

This is another game that loads more slowly on Series S than it does on Series X, though, which doesn't make sense if it's loading lower-fidelity assets, or fewer assets, which it should be doing because it has less RAM.
 
30-second loads on PS5 (shorter in-game) do not mean that GTA V isn't using the I/O of the current generation of consoles. As discovered last year, there are things GTA V does that are not related to I/O or drive speed which result in slow loading.

Or even CPU power, or just old, badly optimised code. People need to understand that nearly no game has perfect code; there is always room for optimisation. Some games are better than others, but here it is not only a problem on current-gen consoles but on last-gen consoles and PC too. Maybe in the future they can improve the code with better multithreading and load even faster in GTA 6. Don't forget GTA V had to run on the PS3's PPU...
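
For illustration: last year's widely reported GTA Online load-time finding was a CPU-side parsing cost that grew far faster than the data it was reading, which no SSD can hide. A toy sketch of that failure mode (purely illustrative, not the game's actual code):

```python
# Toy illustration (not the actual game code): a loader that rescans the whole
# remaining buffer for every token does quadratic work, so it stays slow no
# matter how fast the drive is once the data is already in memory.

import time

def parse_quadratic(blob):
    pos, items = 0, 0
    while True:
        _ = len(blob[pos:])          # needless O(n) rescan of the tail per token
        end = blob.find(",", pos)
        if end == -1:
            return items
        items += 1
        pos = end + 1

def parse_linear(blob):
    return blob.count(",")           # a single pass over the data

blob = "item0000," * 30_000          # ~270 KB of comma-separated entries

for fn in (parse_linear, parse_quadratic):
    t0 = time.perf_counter()
    n = fn(blob)
    print(f"{fn.__name__}: {n} items in {time.perf_counter() - t0:.3f} s")
```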

Similarly, Lance McDonald found an easy way to fix the frame-pacing issue in the Souls games, and he said a few days ago that the Elden Ring PS4 version solves the issue; the PS5 version does better than DX12/Xbox but is not as good as the PS4 version, because the PS5 API is new and a bit different from the PS4's. Basically, From Software has difficulty with the new API...

EDIT: But the PS5 API is easier to deal with than the Xbox one, with fewer changes. IMO there is no reason they can't run Elden Ring at a solid 60 fps on current-gen consoles, and even less reason for the performance problems on PC.

A much more interesting benchmark is The Matrix Awakens demo, where PS5 and Xbox Series X are at the same level with very modern rendering technology: Nanite, RT GI, RT reflections, and virtual shadow maps.
 