Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

I may be oversimplifying it so I'm happy for you to explain why this is wrong, but as I see it, a game running on PS5 in BC mode is not taking advantage of either the specific architectural strengths and weaknesses of the RDNA architecture or the GPU's more modern feature set. However, it is still able to apply its full compute and rasterization resources to running the game - just in a less efficient way than for a game built for that specific GPU.
This is my understanding as well. The bandwidth, all CUs, and clock speed in a BC PS5 title are fully utilized - I've seen some people assuming that BC means the GPU is downclocked or that only half of the CUs are being used; that's not true.

The biggest hit to performance is probably that the GPU has to basically run in a "GCN" mode to avoid shader recompiles being necessary, effectively making it a super-clocked PS4 Pro GPU. That could be significant depending upon the game.
 
Is that due to being a 'native' PS5 title, or engine improvements? We don't know the extent of either. Guess we'll see when it hits PC.

The Touryst reaches 8K 60 fps on PS5, and Shin'en said it's at least partially because it's a native PS5 title, and because of the API too.
 
I may be oversimplifying it so I'm happy for you to explain why this is wrong, but as I see it, a game running on PS5 in BC mode is not taking advantage of either the specific architectural strengths and weaknesses of the RDNA architecture or the GPU's more modern feature set. However, it is still able to apply its full compute and rasterization resources to running the game - just in a less efficient way than for a game built for that specific GPU.

And that's exactly how a modern PC GPU would run a port of a PS4 game. NXG even went into some depth in his video around that and the reasons for using DX11 and having to work around the UMA, etc. Had the game been built for modern DX12U-capable PCs with discrete memory pools to leverage all their modern features and capabilities, and tuned for RDNA2 and Turing/Ampere architectures from the ground up, then it would obviously perform better there too.
The main point is that even a native PS5 game can't fully utilize its GPU arch (that's a very broad statement - optimization can always be better), but with BC it's just quite a big performance hit. I'd recommend watching DF's Ghost of Tsushima Director's Cut coverage: the native app pushes 1.44x more pixels and, in photo mode (artificial, but still a benchmark), holds a fully stable 60 while BC mode can drop to 40.
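
To put rough numbers on that (purely back-of-envelope, using the figures quoted above and treating the 40 fps drop as a worst case rather than an average):

```python
# Back-of-envelope comparison of native vs BC throughput in the Ghost of
# Tsushima DC photo-mode example; figures are the ones quoted above, and
# 40 fps is a worst-case drop, not an average.
native_pixel_ratio = 1.44   # native app renders ~1.44x more pixels per frame
native_fps = 60             # native app holds a stable 60
bc_fps_worst = 40           # BC mode can drop to 40 in the same scenario

# Effective pixels-per-second advantage of the native build in that moment
throughput_ratio = native_pixel_ratio * (native_fps / bc_fps_worst)
print(f"Native pushes roughly {throughput_ratio:.2f}x the pixels per second")  # ~2.16x
```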
 
The main point is that even a native PS5 game can't fully utilize its GPU arch (that's a very broad statement - optimization can always be better), but with BC it's just quite a big performance hit. I'd recommend watching DF's Ghost of Tsushima Director's Cut coverage: the native app pushes 1.44x more pixels and, in photo mode (artificial, but still a benchmark), holds a fully stable 60 while BC mode can drop to 40.

But the jump in performance between BC mode and native (which no one is denying) doesn't mean PCs wouldn't also benefit from more targeted code designed around modern architectures and features while accounting for PC memory architecture. So I still don't see the false equivalence.
 
Ignoring DLSS, while useful for the raw performance comparison, does unfairly penalise the Nvidia GPU, because it spends silicon and die space on those tensor cores which could otherwise have been spent on more CUs, for example. So putting all of that silicon to work - as supported by the game - by turning on DLSS seems like a fairer way to compare the two. DLSS Balanced looked to have similar or slightly better image quality than the PS5 solution in those comparisons, so how would the experience compare using that, for example? Could the 2070 hold 60fps at higher than PS5 settings perhaps?
In his DLSS chapter as well he spends an inordinate amount of time showing and talking about DLSS Ultra Performance mode; I mean, what exactly is the point? Of course it looks bad - no one regards that mode as anything but a joke to promote "8K" on the 3090 release. He also states that due to 'motion blur' any differences in reconstruction methods will not really be relevant in fast action scenes, but the example he shows when Kratos is chopping the tree clearly shows that the 'motion blur' on the PS5 is actually at least in part due to the blurriness of the checkerboard resolve vs DLSS.
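
For reference on what those modes actually render internally, here's a quick sketch using the standard DLSS 2.x scale factors at a 4K output (the exact output resolution in this game may differ, so treat the numbers as illustrative):

```python
# Internal render resolutions for the standard DLSS 2.x quality modes at a
# 3840x2160 output - illustrative reference for the modes discussed above.
DLSS_SCALE = {
    "Quality": 2 / 3,           # ~2560x1440
    "Balanced": 0.58,           # ~2227x1253
    "Performance": 0.50,        # 1920x1080
    "Ultra Performance": 1 / 3, # 1280x720, hence how soft it looks
}

out_w, out_h = 3840, 2160
for mode, scale in DLSS_SCALE.items():
    print(f"{mode:>17}: {round(out_w * scale)}x{round(out_h * scale)}")
```

Checkerboarding also only shades roughly half the pixels of its output target each frame, which is why Balanced or Performance is the sensible bracket for an image-quality comparison rather than Ultra Performance.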

One aspect I didn't see mentioned was what he set DLSS sharpening to. That slider was added in a patch a week ago. His DLSS shots look a little blurrier than other comparisons I've seen, especially compared to DF's video, which was done before the built-in DLSS sharpening could be modified by the end user. I'm wondering if he has DLSS sharpening set to 0, which would explain why he thinks FSR Quality looks comparable to DLSS Balanced, as FSR relies heavily on sharpening to maintain detail.

I will agree that in actual gameplay, for most people playing this 6+ ft away on a TV, the differences in most reconstruction methods are probably going to be pretty minor, and as I've said before I like checkerboarding overall and think it does a very good job in many titles. The not-so-subtle dig at DF with the "zooming in 300%" comments, though, is not necessary; if those minor defects are not that relevant at normal viewing distances then you might as well benchmark the game with DLSS Performance mode vs the PS5 to get an 'accurate' gauge of how the platforms compare.
 
But the jump in performance between BC mode and native (which no one is denying) doesn't mean PCs wouldn't also benefit from more targeted code designed around modern architectures and features while accounting for PC memory architecture. So I still don't see the false equivalence.
Same as PS5 native games; it's not that all cross-gen native games 100% utilize the PS5 GPU arch, especially when the conversion is done by smaller teams. So I don't buy that this is the same as BC mode.
 
Same as PS5 native games; it's not that all cross-gen native games 100% utilize the PS5 GPU arch, especially when the conversion is done by smaller teams. So I don't buy that this is the same as BC mode.

I take your point and yes, it's not entirely the same, but nonetheless, if this were a game built from the ground up with PCs and modern architectures in mind then it would also gain performance-wise. Perhaps not as much as a native PS5 game vs BC though (but this is highly speculative).

One point that does occur to me though is why didn't he compare the PC performance with vsync off to that one scene on the PS5 where he mentioned seeing drops to 55fps? Surely that would have been a far more suitable method of comparison? I understand it was due to spoilers, but why not just test it and tell us the result?
 
Is that due to being a 'native' PS5 title, or engine improvements? We don't know the extent of either. Guess we'll see when it hits PC.
The profiles done on Death Stranding on PC are very interesting. Notably, the game is mainly pushing the L1 cache (which GCN lacks). My guess is they must have optimized the game on PS5 in order to properly use the new RDNA L1 cache. And we know this is an area where the PS5 shines.

I also think this could be the main reason why the Matrix Awakens demo on PS5 is slightly outperforming the XSX version, even though that engine is compute heavy and the XSX version has been optimized by Microsoft developers. We'll see more of the same thing when more games start to correctly use RDNA features on consoles (or if they simply use UE5, the only next-gen engine so far), and the L1 cache is actually the most important new feature, which can bring big performance improvements if you use it correctly.
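
As a purely generic illustration of the kind of optimisation that benefits from a larger or faster cache level (nothing to do with Decima's or any engine's actual code), the classic approach is to tile the work so each chunk of data is loaded once and reused while it's still cache-resident:

```python
# Generic cache-blocking sketch (illustrative only, not engine code): the
# tiled version reads the array in row blocks that fit in cache and reuses
# each block for every column sum before moving on, instead of making long
# strided passes that evict data before it can be reused.
import numpy as np

def column_sums_naive(a):
    # one full strided pass over the array per column: poor data reuse
    return np.array([a[:, j].sum() for j in range(a.shape[1])])

def column_sums_tiled(a, tile_rows=256):
    # each row block is read once and contributes to all column sums
    sums = np.zeros(a.shape[1])
    for i in range(0, a.shape[0], tile_rows):
        sums += a[i:i + tile_rows].sum(axis=0)
    return sums
```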
 
I take your point and yes, it's not entirely the same, but nonetheless, if this were a game built from the ground up with PCs and modern architectures in mind then it would also gain performance-wise. Perhaps not as much as a native PS5 game vs BC though (but this is highly speculative).

One point that does occur to me though is why didn't he compare the PC performance with vsync off to that one scene on the PS5 where he mentioned seeing drops to 55fps? Surely that would have been a far more suitable method of comparison? I understand it was due to spoilers, but why not just test it and tell us the result?
On console vsync is on, so that's probably why. He did one nice comparison once (I think in Death Stranding) of vsync's influence on performance; it's quite big.
 
On console vsync is on, so that's probably why. He did one nice comparison once (I think in Death Stranding) of vsync's influence on performance; it's quite big.

Consoles usually use adaptive vsync and tear when they drop below the refresh, don't they? In any case, as Alex pointed out above, massive performance losses with vsync enabled are not a normal thing and certainly wouldn't be tolerated in a console title, so I'd say it's faulty to assume the PS5 would be suffering from the same vsync performance losses he seems to demonstrate here on the 2070.

But putting all that aside, even a simple vsync-on test in that scene would have been illuminating. He points out at one point that "with unlocked frame rates the difference would be even bigger" (in the PS5's favour naturally, and of course ignoring his earlier section where the 2070 was comfortably above 60 fps with unlocked frame rates), but why assume when you actually have a scene in the game where the PS5 is effectively running unlocked and a direct comparison can be made?
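
For anyone wondering why plain vsync can cost so much in the first place, this is a rough model of the classic double-buffered worst case (the thing adaptive vsync and triple buffering exist to avoid): a frame that misses the refresh window waits for the next one, so frame time snaps up to a multiple of the refresh interval.

```python
# Rough model of hard double-buffered vsync at 60 Hz (no VRR, no triple
# buffering): a frame that misses the 16.7 ms window waits for the next
# refresh, so effective frame time rounds up to the next refresh interval.
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at 60 Hz

def vsynced_fps(render_ms):
    displayed_ms = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
    return 1000 / displayed_ms

for render_ms in (15.0, 17.0, 20.0, 34.0):
    print(f"{render_ms:4.1f} ms frame -> {1000 / render_ms:5.1f} fps unlocked, "
          f"{vsynced_fps(render_ms):5.1f} fps with hard vsync")
```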
 
Using the seemingly broken in-game vsync as a point of comparison is absolutely flawed, especially when enabling it through the control panel fixes the issue.

I don't agree that the difference between BC mode and a natively coded game is at all similar to the PC port not using DX12. DX12 is typically slower than DX11 for one, and secondly no PC games are ever coded specifically around the capabilities of a GPU. Even in the absolute best cases where we can compare, DX12 is maybe 5% faster under GPU-limited scenarios. In games where CB is used on the console, I think it's fair to use a DLSS mode that offers comparable IQ while targeting the same output resolution, in the absence of said CB mode being available on PC. I do not think DLSS should ever be the default comparison point against native rendering though.
 
Consoles usually use adaptive vsync and tear when they drop below the refresh, don't they? In any case, as Alex pointed out above, massive performance losses with vsync enabled are not a normal thing and certainly wouldn't be tolerated in a console title, so I'd say it's faulty to assume the PS5 would be suffering from the same vsync performance losses he seems to demonstrate here on the 2070.

But putting all that aside, even a simple vsync-on test in that scene would have been illuminating. He points out at one point that "with unlocked frame rates the difference would be even bigger" (in the PS5's favour naturally, and of course ignoring his earlier section where the 2070 was comfortably above 60 fps with unlocked frame rates), but why assume when you actually have a scene in the game where the PS5 is effectively running unlocked and a direct comparison can be made?
I don't understand - how could he benchmark unlocked frames on PS5 vs unlocked on PC?
 
I don't agree that the difference between BC mode and a natively coded game is at all similar to the PC port not using DX12. DX12 is typically slower than DX11 for one, and secondly no PC games are ever coded specifically around the capabilities of a GPU. Even in the absolute best cases where we can compare, DX12 is maybe 5% faster under GPU-limited scenarios.

This isn't what I was saying. I'm saying that if a remake of God of War had specifically been developed for the PS5 and PC as a multiplatform title from the outset, with RDNA as its baseline (but the same graphics as the current version), that PC version would perform better than the current PC version on modern PC hardware. Essentially the PC version also suffers because it's a port of a game developed for a GCN1.1 hUMA architecture. If it were a game developed to be UMA-agnostic with RDNA as its baseline then it would be more performant than the current version on the latest PC GPUs.

Isn't this the same argument console gamers make all the time? I.e. that the PC will struggle more with ports of past PS4 exclusives because they were tailored specifically for that system? I'm agreeing, but then the converse must also be true: rebuild that game from the outset to account for NUMA and tailor it to modern DX12U-class GPUs (not necessarily using DX12 itself) and it will perform better.
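
To illustrate one cost a hUMA-assuming design imports onto PC (numbers are purely illustrative, not measurements from this port): data the console could treat as immediately GPU-visible has to be staged and copied over PCIe on a discrete-GPU system, and that copy eats into the frame budget.

```python
# Illustrative only - not measured from the GoW port. Rough cost of copying
# CPU-prepared data to a discrete GPU over PCIe each frame, versus a hUMA
# console where the GPU can read the same memory with no copy at all.
PCIE3_X16_GBPS = 16.0        # rough practical ceiling for PCIe 3.0 x16
FRAME_BUDGET_MS = 1000 / 60  # 60 fps frame budget

def copy_cost_ms(mb_per_frame, bus_gbps=PCIE3_X16_GBPS):
    return (mb_per_frame / 1024) / bus_gbps * 1000

for mb in (32, 128, 512):
    ms = copy_cost_ms(mb)
    print(f"{mb:4d} MB/frame over PCIe -> {ms:5.2f} ms "
          f"({ms / FRAME_BUDGET_MS * 100:4.1f}% of a 60 fps frame budget)")
```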
 
This isn't what I was saying. I'm saying that if a remake of God of War had specifically been developed for the PS5 and PC as a multiplatform title from the outset, with RDNA as its baseline (but the same graphics as the current version), that PC version would perform better than the current PC version on modern PC hardware. Essentially the PC version also suffers because it's a port of a game developed for a GCN1.1 hUMA architecture. If it were a game developed to be UMA-agnostic with RDNA as its baseline then it would be more performant than the current version on the latest PC GPUs.

Isn't this the same argument console gamers make all the time? I.e. that the PC will struggle more with ports of past PS4 exclusives because they were tailored specifically for that system? I'm agreeing, but then the converse must also be true: rebuild that game from the outset to account for NUMA and tailor it to modern DX12U-class GPUs (not necessarily using DX12 itself) and it will perform better.
Yes, PC will fare better when a title is not originally designed for the PS and later ported. I don't see this as very comparable to a game being run in BC mode on PS5, however. The GOW port running under DX11 is certainly extracting a fair bit more of the performance potential of PC GPUs than a BC title is of a PS5.
 
I don't understand - how could he benchmark unlocked frames on PS5 vs unlocked on PC?

Because in those moments where the PS5 falls below the vsync limit, it is not constrained by that limit and so is effectively running unlocked.

This is a point where it could be directly measured against an unlocked 2070. It's not really a new concept; sites including DF have been doing this for years. I'm pretty sure NXG himself has done this in the past, so why not here?

His intention isn't to provide a benchmark showing exact performances, just a brief look at any differences between a PS5 and the PC parts he has available.

If you're going to do a performance comparison at least do it properly. Especially if you're going to make seemingly unfounded references to 30-40% performance advantages.
 