Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

I've not played Minecraft for a good while and never on console.

How was the RT? Were there any obvious reasons why they might have removed it?
Think of Quake 2 RTX, but Minecraft. Every block has material qualities, and the game is path traced. The performance penalty is large.
 
In Death Stranding we can see the PS5 slightly outperforming the 3060 Ti by about 10%, comparing one exact cutscene in the elanalistadebits video: 3060 Ti starting here, PS5 scene starting here. Also, the fps counter on the PC side is not working properly and misses dropped frames. For instance, here it says 57fps when I actually counted 11 missed frames, making it 49fps, while the fps counter seems to work properly on PS5 in the same location. Anyway, the PS5 is outperforming the 3060 Ti in that scene, and that puts it about on par with a 3070.
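If anyone wants to check this kind of count themselves rather than trust an overlay, here's a rough sketch of the idea in Python (assuming OpenCV and NumPy are available; the clip filename is just a placeholder):

```python
# Rough sketch of counting dropped frames in a 60fps capture (needs OpenCV
# and NumPy; the clip name below is just a placeholder). In a 60fps capture,
# every frame the game fails to deliver shows up as a repeat of the previous
# frame, so unique frames / total frames gives the effective framerate.
import cv2
import numpy as np

cap = cv2.VideoCapture("ds_cutscene_capture.mp4")  # placeholder filename
capture_fps = cap.get(cv2.CAP_PROP_FPS) or 60.0

ok, prev = cap.read()
unique_frames, total_frames = 1, 1
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    total_frames += 1
    # Count the frame as new only if it differs noticeably from the last one;
    # the 1.0 threshold may need tuning for noisy/compressed captures.
    if np.mean(cv2.absdiff(frame, prev)) > 1.0:
        unique_frames += 1
    prev = frame
cap.release()

print(f"effective fps: {capture_fps * unique_frames / total_frames:.1f}")
# e.g. 11 repeated frames out of 60 captured -> 60 * 49/60 = 49fps
```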

First identical image comparison: [images: WCcnnjV.png, gR9Q6Fy.png]

Second identical image comparison: [images: OsuzAM6.png, 0CFqU3d.png]
 

TechPowerUp has the 6700 XT performing the same as a 3060 Ti in Death Stranding and 13% slower than a 3070.

So the PS5 is even outperforming faster RDNA2-based GPUs in the game.
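Just to make the arithmetic behind that chain explicit - taking the cited numbers at face value, this is only the implied math, not a measurement:

```python
# Implied relative positions if the cited numbers are taken at face value:
# PS5 ~10% ahead of a 3060 Ti in that cutscene, 6700 XT roughly tied with
# the 3060 Ti, and the 3060 Ti ~13% slower than a 3070 (per TechPowerUp).
rtx_3060ti = 1.00               # baseline
rx_6700xt = rtx_3060ti          # roughly tied in this game
rtx_3070 = rtx_3060ti / 0.87    # "13% slower than a 3070"
ps5 = rtx_3060ti * 1.10         # the ~10% gap in this one scene

print(f"6700 XT: {rx_6700xt:.2f}x  3070: {rtx_3070:.2f}x  PS5: {ps5:.2f}x  (vs 3060 Ti)")
# -> 3070: 1.15x, PS5: 1.10x, i.e. this single scene puts the PS5 just under a 3070.
```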

I wouldn't put it down to clock speed or the PS5 maintaining its boost properly; I'd put it down to an old engine that benefits from console-level optimization more than your typical game.
 
In Death Stranding we can see the PS5 slightly outperforming the 3060 Ti by about 10%, comparing one exact cutscene in the elanalistadebits video: 3060 Ti starting here, PS5 scene starting here. Also, the fps counter on the PC side is not working properly and misses dropped frames. For instance, here it says 57fps when I actually counted 11 missed frames, making it 49fps, while the fps counter seems to work properly on PS5 in the same location. Anyway, the PS5 is outperforming the 3060 Ti in that scene, and that puts it about on par with a 3070.

TechPowerUp has the 6700 XT performing the same as a 3060 Ti in Death Stranding and 13% slower than a 3070.

So the PS5 is even outperforming faster RDNA2-based GPUs in the game.

I wouldn't put it down to clock speed or the PS5 maintaining its boost properly; I'd put it down to an old engine that benefits from console-level optimization more than your typical game.

:rolleyes: The settings aren't matched.

- PS5 and PC share most settings to the maximum. However, the draw distance is longer on PC.

The games "Default" menu setting which is an exact match for the PS5 are not the same as the games "Very High" / Maximum setting which your comparison video seems to be using.

We've already seen from the much more detailed Digital Foundry article that the PS5 is performing very marginally above a standard 2080, and is thus slower than a 2080S or 3060 Ti. There was also a direct comparison to the 2080 Ti (equivalent to a 3070), which was clearly and comfortably faster than the PS5.
 
[image: nmFvAmUr_o.jpg]
We've already seen from the much more detailed Digital Foundry article that the PS5 is performing very marginally above a standard 2080, and is thus slower than a 2080S or 3060 Ti. There was also a direct comparison to the 2080 Ti (equivalent to a 3070), which was clearly and comfortably faster than the PS5.

And that's in Death Stranding. The game is more optimized towards AMD (for obvious reasons); it's close enough to a 6600 XT when running the same settings. In general, though, the PS5 is around RTX 2070 level when not considering current-generation features like ray tracing, DLSS and newer rendering methods.
These claims that the PS5 is performing like a 3070, or higher (I have seen 3080 and even 3090), are so far from reality it's just funny. It seems some want the forums to sink to that level.
 
Game performance can vary significantly between titles, yes, but citing Gamers Nexus' DMC 5 comparisons is hardly informative. They fucked it up, which was revealed about 10 minutes after they posted it. They believed DMC 5's 120fps non-RT mode on the PS5 was 1080p - it's reconstructed 4K.

If we're going to ding other youtubers/posters for misleading conclusions, maybe don't pick an example where GN does the same thing. They're an excellent site for what they focus on, which is PC DIY cooling and hardware, but their console coverage has demonstrated that this area is not their forte.
 
And that's in Death Stranding. The game is more optimized towards AMD (for obvious reasons); it's close enough to a 6600 XT when running the same settings. In general, though, the PS5 is around RTX 2070 level when not considering current-generation features like ray tracing, DLSS and newer rendering methods.
These claims that the PS5 is performing like a 3070, or higher (I have seen 3080 and even 3090), are so far from reality it's just funny. It seems some want the forums to sink to that level.
Gamers Nexus' tests are beyond useless because they tell us nothing about the PS5, only that it performs this way in DMC 5. There are a multitude of reasons why it could perform in that manner, and many of those reasons may not be hardware-related. Using that as the opening to your baseless assertions is a great way to invalidate your argument.

If people were really interested in finding the PC equivalents of console GPUs, a suite of tests focused on testing just the GPU would need to be run. Unfortunately, the only way to do that would be to have access to the console devkits. All the tests done by DF and GN are beyond useless and belong in the garbage. When you test a PC game using console settings, you're not testing the GPU, you're testing the whole system. Trying to draw conclusions about a GPU from such a test is foolish, as the test methodology is flawed. All of this is to say that the only people who know the PC equivalents of these console GPUs are AMD, Microsoft/Sony, and developers who have access to these devkits. Pretty much everyone who falls into those categories is NDA'd out the wazoo, so all we're left with is people making uneducated guesses based on fundamentally flawed test methodologies.
 
If we're going to ding other youtubers/posters for misleading conclusions

If we start seeing claims that a PS5 is performing equal to a 3070 (due to a single scene, in an AMD-favoured game, with different settings), nothing is out of reach anymore. If some decide to sink the discussions that much, that's OK.

Gamers Nexus' tests are beyond useless because they tell us nothing about the PS5, only that it performs this way in DMC 5.

I know; that video is in NXG's range of usefulness. We all remember that GN video - it was to demonstrate how useless some claims can be.

If people were really interested in finding the PC equivalents of console GPUs, a suite of tests focused on testing just the GPU would need to be run. Unfortunately, the only way to do that would be to have access to the console devkits. All the tests done by DF and GN are beyond useless and belong in the garbage. When you test a PC game using console settings, you're not testing the GPU, you're testing the whole system. Trying to draw conclusions about a GPU from such a test is foolish, as the test methodology is flawed. All of this is to say that the only people who know the PC equivalents of these console GPUs are AMD, Microsoft/Sony, and developers who have access to these devkits. Pretty much everyone who falls into those categories is NDA'd out the wazoo, so all we're left with is people making uneducated guesses based on fundamentally flawed test methodologies.

I agree with all that.
 
If we start seeing claims that a PS5 is performing equal to a 3070 (due to a single scene, in an AMD-favoured game, with different settings), nothing is out of reach anymore. If some decide to sink the discussions that much, that's OK.
Death Stranding's Default vs. Very High settings make an extremely minor difference in actual performance. You are not going to be able to jump down a GPU model (wrt what you need to match a PS5) just by playing on Default, which only tamps down a couple of settings. You would still need a 3060 Ti to match the PS5's performance even on Default at native res, and even then it likely wouldn't quite match it.

Now, that's not saying anything about which is the better platform for this particular game, as that's pretty obvious - it's the PC, due to DLSS and 120+ fps modes, no question. And yes, one game in one scene is not indicative of much of anything for comparing relative raster performance between the two platforms, as we have many examples of other games showing less scaling than DS, which is one of the best on PS5 relative to the raw performance it delivers.

However, to then say "anything goes" because someone may have 'handicapped' the PC by ~5% by comparing very slightly lower detail settings on the PS5 vs the PC, and to illustrate that with GN comparing a game running at half the resolution on the PC versus the PS5, is ridiculous. Degree matters.
 
Death Stranding's Default vs. Very High settings make an extremely minor difference in actual performance. You are not going to be able to jump down a GPU model (wrt what you need to match a PS5) just by playing on Default, which only tamps down a couple of settings. You would still need a 3060 Ti to match the PS5's performance even on Default at native res, and even then it likely wouldn't quite match it.

Now, that's not saying anything about which is the better platform for this particular game, as that's pretty obvious - it's the PC, due to DLSS and 120+ fps modes, no question. And yes, one game in one scene is not indicative of much of anything for comparing relative raster performance between the two platforms, as we have many examples of other games showing less scaling than DS, which is one of the best on PS5 relative to the raw performance it delivers.

However, to then say "anything goes" because someone may have 'handicapped' the PC by ~5% by comparing very slightly lower detail settings on the PS5 vs the PC, and to illustrate that with GN comparing a game running at half the resolution on the PC versus the PS5, is ridiculous. Degree matters.

I agree with you, especially on the latter half of your post. Seeing claims that the PS5 is performing like a 3070 because of a select scene is just something we're used to from the poster, but still, it's not really indicative that you'd need a 3070 to match the PS5 even in this particular game. Hand-picking scenes can be done in any comparison.
 
All the tests done by DF and GN are beyond useless and belong in the garbage. When you test a PC game using console settings, you're not testing the GPU, you're testing the whole system.

I disagree with this. You are of course testing the whole system, but if you test and confirm that the GPU is the bottleneck in that system before taking your measurements, then you are effectively directly comparing GPU performance. Alex did that pretty clearly in his video. This game is nowhere near CPU-limited, either on PC or PS5.
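For reference, the usual sanity check for "am I GPU-bound?" is resolution scaling: if fps falls roughly in line with pixel count when you raise the resolution, the GPU is the limiter. A rough sketch of that rule of thumb (the numbers below are placeholders, not measurements):

```python
# Rule-of-thumb check for a GPU-bound scenario: when you raise the resolution,
# fps should fall roughly in proportion to pixel count if the GPU is the
# bottleneck. If fps barely moves, something else (CPU, engine cap, vsync)
# is limiting and the comparison says little about the GPU.
def gpu_bound_score(fps_low_res, fps_high_res, pixels_low, pixels_high):
    observed = fps_low_res / fps_high_res  # how much the lower res gains
    ideal = pixels_high / pixels_low       # perfect GPU-bound scaling
    return observed / ideal                # ~1.0 => GPU-bound, much lower => not

# Placeholder numbers for a 1440p vs 4K run:
score = gpu_bound_score(fps_low_res=90, fps_high_res=48,
                        pixels_low=2560 * 1440, pixels_high=3840 * 2160)
print(f"GPU-bound score: {score:.2f}")  # ~0.83 here; close to 1.0 => GPU-bound
```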

Death Stranding's Default vs. Very High settings make an extremely minor difference in actual performance. You are not going to be able to jump down a GPU model (wrt what you need to match a PS5) just by playing on Default, which only tamps down a couple of settings. You would still need a 3060 Ti to match the PS5's performance even on Default at native res, and even then it likely wouldn't quite match it.

The original game's differences may have been extremely minor, but do we have any metrics for the Director's Cut? Alex already showed clearly that (scene-dependent) the 2080 ranges from 96.6% as performant as the PS5 in this game to 12% faster. And note that the scene where the 2080 was 12% faster is extremely similar (it may even have been the same, I'm struggling to tell) to the scene in the video posted by @Globalisateur showing the PS5 outperforming the 3060 Ti, which is a comfortably faster GPU than the 2080. We know the settings are not matched in that video, so it stands to reason that's the culprit for this seeming mismatch.

So going by Alex's metrics, which were properly settings-matched and accounted for the game's weird vsync behaviour, the 2080 is, at worst, 96.6% as performant as the PS5. Meanwhile, according to this, the 2080 is only 92% as performant as the 3060 Ti on average. I don't think there's any evidence here to suggest that in this game, which seems to be one of the more favourable examples of its performance, the PS5 is able to match, let alone exceed, the 3060 Ti's performance.
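Chaining those two quoted ratios together makes the point explicit (this is just the arithmetic on the figures above, nothing more):

```python
# DF's worst case: the 2080 is 96.6% as performant as the PS5.
# TechPowerUp average: the 2080 is ~92% as performant as the 3060 Ti.
rtx_2080 = 1.00
ps5 = rtx_2080 / 0.966         # PS5 ~1.035x a 2080 in its best showing
rtx_3060ti = rtx_2080 / 0.92   # 3060 Ti ~1.087x a 2080 on average

print(f"PS5: {ps5:.3f}x  3060 Ti: {rtx_3060ti:.3f}x  (relative to a 2080)")
# -> even in the PS5's best case, the 3060 Ti still sits above it on these numbers.
```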
 
I disagree with this. You are of course testing the whole system, but if you test and confirm that the GPU is the bottleneck in that system before taking your measurements, then you are effectively directly comparing GPU performance. Alex did that pretty clearly in his video. This game is nowhere near CPU-limited, either on PC or PS5.
Sorry, but you can never directly compare GPU performance using the testing methodologies used by DF or GN. Using their test methodology, you can never isolate the GPU to test its performance. If all other components of the test cannot be kept static, the test is beyond useless. Both systems use different APIs, different memory subsystems, different CPUs, etc. For example, the PS5's API might be more performant, inflating the GPU's apparent capabilities, or vice versa. All they're doing is testing a PS5 against a PC of their choice. They're not testing GPUs at all.
 
Of course it'll never be 100% accurate, but the purpose of those tests initially was not to fuel platform wars; it's a way to see how different systems with similar specs on paper behave in comparison to each other, in an informative way.

We could also argue that devs could spend years optimizing their game and trying to milk every bit of power they can out of each system, but that's a never-ending story.
 
People in the comments of these videos always seem to miss the point. The comparison in DF's Death Stranding video is just to show the maximum utilization of the PS5 GPU vs a PC equivalent.

The people saying the comparison wasn't fair because DLSS wasn't used seem not to understand what the video was intended to be about. It's not as if Alex pulled out the performance mode on PS5 either, and there is a specific reason for that.
 
Unfortunately, some people here are always about having their favorite plastic box "win" instead of looking at the comparisons made by sites like DF for what they are. Hint: they are not an awards show, and your e-peen will stay the same size regardless of the results.
 
The original game's differences may have been extremely minor, but do we have any metrics for the Director's Cut? Alex already showed clearly that (scene-dependent) the 2080 ranges from 96.6% as performant as the PS5 in this game to 12% faster. And note that the scene where the 2080 was 12% faster is extremely similar (it may even have been the same, I'm struggling to tell) to the scene in the video posted by @Globalisateur showing the PS5 outperforming the 3060 Ti, which is a comfortably faster GPU than the 2080. We know the settings are not matched in that video, so it stands to reason that's the culprit for this seeming mismatch.
The difference between Default and Very High is one higher notch of model detail and more memory caching for streaming (still below 5GB needed at 4K max). That's it. The DC does not change this. The original only showed scaling when going down to Low as well.

I've just tried it with my GTX 1660; the difference was 1% higher GPU utilization, which, considering the differences (basically invisible), is not surprising. Perhaps I could find more stressful scenes that would show a difference, but considering the DC still has the same microstuttering problems on my PC that the original did - which led me to finally give up and get it on my PS5 - it doesn't look like the engine has changed much at all (there's still no in-game anisotropic filtering, it has to be forced from the control panel, and the triple buffering/GPU utilization issue has not been remedied).
 
Sorry, but you can never directly compare GPU performance using the testing methodologies used by DF or GN. Using their test methodology, you can never isolate the GPU to test its performance. If all other components of the test cannot be kept static, the test is beyond useless. Both systems use different APIs, different memory subsystems, different CPUs, etc. For example, the PS5's API might be more performant, inflating the GPU's apparent capabilities, or vice versa. All they're doing is testing a PS5 against a PC of their choice. They're not testing GPUs at all.

I understand what you're trying to say here, but I still disagree. If you literally want to compare how powerful the GPUs are against each other in total isolation from all other factors, then you can simply look at their paper specs vs RDNA2 and then compare RDNA2 to NV GPUs on the PC side. But that's not relevant to the real world, where arguably the memory subsystem and the graphics API can be considered part of the overall GPU performance profile. I'm not really interested in how fast the PS5 GPU might be if you coupled it with 16GB of dedicated GDDR5 and used DirectX 12 and Windows 11 with it. I want to know how it performs with the PS5's APIs and memory, i.e. what GPU you need in the PC space to get an equivalent experience. Removing the CPU from that equation is pretty trivial in a game like Death Stranding, which is demonstrably heavily GPU-limited.

And the fact that in Digital Foundry's case at least (i.e. where settings and scenarios are properly matched) the PS5 generally slots right where you would expect it to based on its specs is testament to these comparisons being far from "beyond useless" and actually pretty damn accurate.
 
The difference between Default and Very High is one higher notch of model detail and more memory caching for streaming (still below 5GB needed at 4K max). That's it. The DC does not change this. The original only showed scaling when going down to Low as well.

I've just tried it with my GTX 1660; the difference was 1% higher GPU utilization, which, considering the differences (basically invisible), is not surprising. Perhaps I could find more stressful scenes that would show a difference, but considering the DC still has the same microstuttering problems on my PC that the original did - which led me to finally give up and get it on my PS5 - it doesn't look like the engine has changed much at all (there's still no in-game anisotropic filtering, it has to be forced from the control panel, and the triple buffering/GPU utilization issue has not been remedied).

Out of curiosity: you bought Death Stranding on PC (running a GTX 1660), then bought it again on PS5 because of microstuttering, and then bought the Director's Cut on PC again? Why?

In any case, we have a meticulously settings-matched comparison from Digital Foundry showing the PS5 ranging from barely faster in most cases to slower in some cases than a 2080. And then we have another, non-settings-matched video showing the PS5 performing comfortably above a 2080S. How would you account for the difference in results, if not the difference in settings?
 