BTW, not sure if it was ever brought up in other threads, but NXGamer recently did a comparison between the PS5 release of Death Stranding: Director's Cut and the original on PC (
https://www.youtube.com/watch?v=GpfXfB4WdM0).
It's illustrative on some points and no doubt reveals details that early analyses of the PC version with DLSS vs. the PS4 Pro version may have missed - for example, showing that some ambient occlusion/shadowing areas are darker on the Pro. Hell, I went out and bought DS on PS4 Pro for the cheap upgrade to the PS5 version despite already owning it on PC, because the PC version still has unresolved frametime issues on my machine - which he also goes into somewhat (specifically the broken vsync, but more on that below). It's far superior (at least for me) on the PS5 in many respects. It's probably the #1 game right now for showing the PS5's rasterization power in the best light: you basically need a 2080 Ti to match its performance at 4K, especially with the latest patches, which have further improved resolution-mode performance on the PS5 since this video was probably made.
But... there are some issues I have with the comparison at times, and some of the points he brings up are pretty eye-popping in their reasoning, veering close to comical.
First off - two different versions. Director's Cut is far more than the PS4 Pro version with the frame rate unlocked. Yes, he does mention it's no longer using checkerboarding, but then points to greater texture detail being exposed compared even to DLSS Quality mode. The problem with this, as I noted on ResetEra (
https://www.resetera.com/threads/de...ital-foundry-tech-review.490447/post-74036659), is that DC on the PS5 has different contrast levels and also makes changes to the post-processing - it exposes significantly more fine detail than native 4K on the PC. That isn't DLSS falling short; it's a change in art direction, TAA implementation and/or added sharpening.
Secondly, his PC screenshots clearly exhibit a lack of anisotropic filtering. I believe DS on the PC is broken in this regard; you have to force 16x aniso through the driver control panel. Worth mentioning as the usual extra hassle of the PC, no doubt, but it still should have been immediately obvious.
He also goes into the problems of DLSS, which, sure! They exist - except the main complaint he has (ghosting on objects without motion vectors) can be fixed by dropping in a more recent DLL. Again, worth bringing up that you have to do this at all, but it's not exactly esoteric information at this point to anyone covering PC gaming. He chalks this ghosting up to DLSS being a "heavy TAA solution", but that's not entirely the cause, especially in the most egregious examples. Overall, he feels DLSS and checkerboarding are give-and-take in terms of quality, which I don't necessarily consider an outlandish statement - I've often felt CB is disregarded too readily by the PC community (especially on the PS5, where 60fps can improve the final resolve) - but he also doesn't note the significant increase in pixel-popping you get with distant detail. Alex covered this in his video on DS DLSS vs. checkerboarding and it's pretty blatant: DLSS can blur things more in motion, but that comes with the advantage of significantly more stability. Still, it's largely a PS5 vs. PC video, so it's not a huge oversight imo.
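For anyone who hasn't done that DLL swap: it's about as low-effort as PC tweaks get. A rough sketch of what it amounts to, in Python - the install path and the source of the newer DLL are placeholders I made up, not anything from the video:

```python
# Hypothetical sketch of the DLSS DLL swap - paths are placeholders, adjust to your setup.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Program Files (x86)\Steam\steamapps\common\DEATH STRANDING")  # assumed install location
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # a newer DLSS DLL grabbed elsewhere (e.g. from another game)
old_dll = game_dir / "nvngx_dlss.dll"

shutil.copy2(old_dll, old_dll.with_name(old_dll.name + ".bak"))  # back up the shipped DLL first
shutil.copy2(new_dll, old_dll)                                   # overwrite with the newer version
print("Swapped DLSS DLL; restore the .bak copy if anything misbehaves.")
```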
Vsync: he makes a good chunk of the performance comparisons with vsync enabled on the PC, which is perfectly valid. And to his credit, he does notice the GPU utilization % dropping on the PC with vsync enabled, which has always been the case with DS's messed-up vsync implementation - it's absolute garbage.
However, once again it's his reasoning for why he's seeing what he's seeing where the analysis goes off the rails. Instead of DS simply having a borked frame buffer implementation - which it does - he states that it's due to DX12 driver overhead, the cost of vsync on the command queue. Like, what? First off, he specifically mentions that he's choosing the cutscene because it's GPU-limited and that at 'no part is the CPU the bottleneck here' - so what does 'driver overhead' have to do with this, then? Again, it's a broken vsync implementation and it deserves to be called out (there are plenty of threads on Reddit and the Steam forums complaining about the microstuttering, which occurs even with a solid 60fps lock, and which 505 Games has done nothing about). This is, I feel, another example of NXGamer beating his favourite drum about "PC overhead", which surely does exist - but he often tries to shoehorn it in when it doesn't necessarily apply.
It's not the only game that can exhibit poor vsync behaviour (A Plague's Tale, for example - which you can see in his own comparison video on that game, where he failed to pick up on it despite RT showing him the low GPU usage). The problem in particular is that since it's DX12, you can't fix it the way you can in other games by forcing fast sync - but when I do that in games where it works, I'm not magically freeing up 'driver overhead', I'm just overriding the developer's poor frame buffer management.
Since he believes vsync, by its very nature, introduces this 'overhead', he then goes on to speculate that the PS5 itself is of course suffering from this vsync overhead as well, to the tune of 10-20%, and could be performing even better than his comparison shows because of it.
When you see your GPU utilization % jump around between 80% and 95% without a corresponding CPU bottleneck, that has nothing to do with your 'command queue' being stalled by 'driver overhead' - it's the game having a broken triple-buffered vsync implementation. If this were truly an issue of the driver being strained, you wouldn't be able to fix it on the PC by forcing vsync through an external utility or the control panel.
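To illustrate what I mean (this is my own back-of-the-envelope model, not anything from the video, and the frame times are made up): if the game's present call blocks until the next vblank instead of handing off to a free back buffer, the GPU idles between finishing each frame and the vblank firing, and utilization sits below 100% with zero CPU involvement.

```python
# Toy model: GPU utilization under a blocking vsync/present vs. a free back buffer.
# All numbers are illustrative, not measurements.
VSYNC_INTERVAL_MS = 1000 / 60                         # 16.67 ms per refresh at 60 Hz
gpu_frame_times_ms = [13.5, 14.2, 15.8, 13.9, 15.1]   # GPU-bound work per frame, all under 16.67 ms

# Blocking present: each frame, the GPU finishes its work and then waits for the vblank,
# so over N refreshes it is only busy for the sum of its frame times.
busy_ms = sum(gpu_frame_times_ms)
elapsed_ms = VSYNC_INTERVAL_MS * len(gpu_frame_times_ms)
print(f"Utilization with a blocking present: {100 * busy_ms / elapsed_ms:.0f}%")  # ~87%

# With a properly buffered swap chain the GPU starts the next frame immediately,
# so it never waits on the vblank and utilization approaches 100%.
print("Utilization with a free back buffer: ~100%")
```

Which is exactly the kind of 80-95% bouncing he's observing - no 'driver overhead' required.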
While still on the topic of vsync, he further compounds the misunderstanding by going on to say that the performance uplift you see with vsync off is what VRR is designed to fix - rather than fluctuating framerates that fall below the vsync cap. Sorry, but this is absolute nonsense. You should enable vsync with VRR displays. I realize this has been a long-debated topic, but it's been well covered for a while now: VRR will not always eliminate tearing, only vsync can, and vsync does not add latency while the framerate stays inside the VRR window and VRR is actually driving the refresh.
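A quick worked example of why (my own numbers, assuming a typical VRR range topping out at 120 Hz):

```python
# Why VRR alone doesn't prevent tearing - illustrative numbers, not from the video.
vrr_max_hz = 120
min_scanout_ms = 1000 / vrr_max_hz   # 8.33 ms: the display can't refresh faster than this
frame_render_ms = 6.0                # a frame that finishes in 6 ms (~166 fps), above the VRR ceiling

if frame_render_ms < min_scanout_ms:
    # Above the VRR ceiling the display is still mid-scanout when the frame is ready:
    # with vsync OFF the new frame is flipped immediately and can tear;
    # with vsync ON the flip waits for the scanout to finish, so no tear - and inside
    # the VRR window vsync never actually has to block, so it adds no latency there.
    print("Frame outpaces the display: vsync off can tear, vsync on cannot.")
```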
The rest of the video is him congratulating himself on how he takes the time to do deep analyses and get things right instead of rushing out of the gate to be first. Um... ok.