> Nah, a 3600 is in the 40s often - not at all like the PS5.
That actually looks fairly reasonable to me. A 3600X seems to be coming in at around 60 fps, which is what you would expect vs. the PS5.
GPU performance, on the other hand...
> Nah, I waited 2 hours and the percentage had long since reached 100. I would never start a game before shaders are done.
The game is probably compiling shaders in the background as you play, despite your having waited for the compilation to finish at the beginning.
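For what it's worth, that pattern is easy to illustrate: a precompile pass can only cover the pipeline permutations the game knows about up front, and anything it missed gets handed to a background worker the first time it's requested during gameplay. A minimal sketch of the idea, purely illustrative (the names `precompile`, `request_pipeline`, and the fake `compile_pipeline` step are assumptions, not any real engine's API):

```python
import threading, queue, time

compiled = {}         # pipeline key -> "compiled" object
work = queue.Queue()  # keys still waiting on the background compiler

def compile_pipeline(key):
    time.sleep(0.01)  # stand-in for an expensive driver compile
    compiled[key] = f"pso:{key}"

def precompile(known_keys):
    # The loading-screen pass: only covers permutations enumerated up front.
    for key in known_keys:
        compile_pipeline(key)

def background_worker():
    while True:
        compile_pipeline(work.get())
        work.task_done()

def request_pipeline(key):
    # Called during gameplay; misses are queued and a fallback is used,
    # so compilation keeps happening long after the bar hit 100%.
    if key not in compiled:
        work.put(key)
        return "fallback"
    return compiled[key]

threading.Thread(target=background_worker, daemon=True).start()
precompile(["opaque", "skinned"])
print(request_pipeline("opaque"))        # hit: was precompiled
print(request_pipeline("foliage_wind"))  # miss: compiles in the background
```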
In this comparison, the resolution setting of the PS5 version of The Last of Us doesn't look like real 4K:
Not in the actual gameplay part, at least.
As a reminder, PS5 CPU performance is around a 1700X (or 2700): it's an underclocked Zen 2 mobile variant with much less cache than the desktop Zen 2 CPUs. That's worth keeping in mind when benchmarks use mid- or high-end CPUs and the results are then used to compare the GPUs.
These benchmarks should always be used to compare the combination of CPU + GPU (even RAM). So, say, a 3600X + 3600 performs XX% above the PS5's 1700X + OCed 5700. On PC we can directly compare GPUs because we can easily use the exact same motherboard + RAM + CPU combination, so comparisons between GPUs are fair; that isn't possible when comparing against consoles.
(although we'll see whether that Oodle DLL replacement is actually beneficial, at least in terms of precompile time)
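To make that "XX% above" framing concrete, here's a trivial worked example of how such a combo comparison could be expressed. The frame rates are made-up placeholder numbers, not measurements from any benchmark:

```python
def percent_faster(fps_a, fps_b):
    """How much faster (in %) config A is than config B."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical averages for the same scene (placeholder values).
pc_combo_fps = 72.0  # e.g. 3600X + mid-range GPU
ps5_fps = 60.0       # 1700X-class CPU + ~5700-class GPU

print(f"PC combo is {percent_faster(pc_combo_fps, ps5_fps):.0f}% above PS5")
# -> PC combo is 20% above PS5
```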
When it comes to MP titles, performance isn't compromised on PC; it's compromised on console.
IIRC Death Stranding also performs very close to a 3060 Ti. It seems like PlayStation-centric games simply like the hardware; maybe they exploit a specific strength of the PS5 that shifts the balance. Just a guess, IMO.
I have no qualms with the PS5 matching my 3070 or 3060 Ti.
> If performance creep has happened that much already, it's worrying for the future. A 3060 Ti is 33% faster than a 5700 XT, and the generation has barely even started.
IME with my 3060 and PS5, without DLSS and no RT, in pure rasterized performance you will often need a 3060 Ti to keep up with or surpass the PS5; my 3060 is more of a toss-up. Even in a highly optimized PC title like Doom Eternal, I cannot maintain the same performance as the PS5 without using DLSS (not a loss, mind you - it looks better). So it's not that uncommon.
Death Stranding with DLSS + forced aniso looks significantly better than the PS5 version, btw, but goddamn that intermittent camera-stutter bug.
Yeah, there's definitely some spectrum involved here, and you're right to look at my choice of words under a lens. I'm sure they're not as optimized 'as they could be' - in terms of a first-party studio like ND with an exceedingly long development time, budget, and crunch, sure. I'm just not sure a game lacking quite that level of hyper-focus should be considered 'compromised', though.
> Series X performing like a 2080 is spot on, though. Around launch, the PS5 only performed like a 2080 in Valhalla, which runs quite slowly on Nvidia GPUs. In most other games it ran like a 2070/5700 XT.
Even before launch, DF was noting that the SX was basically performing like an RTX 2080. What I'm saying is that it's really not that much creep; it can vary significantly per title, of course, but they've always been in that ballpark.
> In this game it's almost a 100% jump.
If XSX is ~2080, then PS5 being ~2070/Super is about right in a typical game, but in some games, like AC: Valhalla and Sony exclusives, I expect the PS5 to see a decent jump above that ~2070/Super baseline.
> And the CPU is doing more stuff than the CPU on the PS5. They use Oodle Kraken and decompress on the CPU. Other stuff like 3D audio is done on the Tempest Engine - where is it done on PC, the CPU or the GPU? This is probably one of the reasons the 3600X doesn't perform as well as the PS5 here, beyond better optimization on PS5 and a thinner API.
So you are saying some custom hardware on the PS5, like the custom I/O and audio units, is actually reducing pressure on its CPU and greatly improving the game's performance (compared to hardware lacking both of those custom silicon blocks)? That's a bold claim.
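The decompression point is the concrete part of that exchange: on PS5, Kraken decompression is offloaded to the dedicated I/O block, while a PC port has to spend general-purpose CPU cycles on it. Oodle itself is proprietary, so as a rough stand-in here's a zlib-based sketch of how streaming decompression on worker threads takes cores the game would otherwise use; the thread split and sizes are illustrative assumptions, not the port's actual architecture:

```python
import zlib, os
from concurrent.futures import ThreadPoolExecutor

# Fake "asset chunks". On PS5, inflating these would land on the
# dedicated I/O block instead of general-purpose cores.
chunks = [zlib.compress(os.urandom(1 << 20)) for _ in range(16)]

def stream_in(chunk):
    return zlib.decompress(chunk)  # CPU-side cost the console doesn't pay

# Illustrative split: decompression workers occupy cores the game's
# job system would otherwise use for simulation / render submission.
with ThreadPoolExecutor(max_workers=2) as io_workers:
    assets = list(io_workers.map(stream_in, chunks))

print(f"decompressed {sum(len(a) for a in assets) >> 20} MiB on the CPU")
```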
Maybe part of the VRAM usage comes from the fact that they need to tailor a streaming solution that works on HDDs too.
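That would follow from simple streaming math: the slower the drive, the more data has to already be resident to hide the shortfall. A back-of-the-envelope sketch, where the demand rate and lookahead window are assumed placeholder numbers rather than anything measured from the game:

```python
def resident_budget_mb(stream_rate_mbs, lookahead_s, demand_mbs):
    """Extra memory needed to cover demand the drive can't serve in time."""
    shortfall = max(0.0, demand_mbs - stream_rate_mbs)
    return shortfall * lookahead_s

demand = 600.0     # MB/s of assets the scene wants (placeholder)
lookahead = 5.0    # seconds of traversal the streamer must cover (placeholder)

for name, rate in [("HDD", 120.0), ("SATA SSD", 500.0), ("NVMe", 3500.0)]:
    extra = resident_budget_mb(rate, lookahead, demand)
    print(f"{name:>8}: ~{extra:.0f} MB extra resident")
# The HDD case forces by far the largest resident set, which would
# show up as higher memory usage even on fast-storage systems.
```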
> PS5 loses in every theoretical metric to a 2080 outside of fill rate, IIRC. Turing also has better scheduling of FP/INT math.
But this game is a hot mess and an extreme outlier in performance terms.
From a raw throughput perspective, it really shouldn't be surprising at all that the PS5 is often playing in the 2080's league even before considering console optimisations (so a little slower than the 3060 Ti but a little faster than the 3060). The biggest surprise for me this generation so far has been how infrequently that seems to have been the case. And more still, how badly the Series X has underperformed, which on paper should be playing up there pretty close to 2080 Ti levels.
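For reference, the paper math behind those claims: peak FP32 throughput is roughly 2 ops (FMA) x shader cores x clock. A quick sketch using approximate public specs; boost clocks vary by card, and Ampere parts like the 3060/3060 Ti aren't directly comparable this way because of their dual-FP32 design:

```python
def tflops(shader_cores, clock_ghz):
    # 2 ops per core per clock (fused multiply-add), GFLOPS -> TFLOPS
    return 2 * shader_cores * clock_ghz / 1000.0

gpus = {
    "PS5":         (2304, 2.23),   # 36 CUs x 64 lanes, up to 2.23 GHz
    "Series X":    (3328, 1.825),  # 52 CUs x 64 lanes, fixed 1.825 GHz
    "RTX 2080":    (2944, 1.71),   # reference boost clock
    "RTX 2080 Ti": (4352, 1.545),  # reference boost clock
}

for name, (cores, clock) in gpus.items():
    print(f"{name:>12}: ~{tflops(cores, clock):.1f} TFLOPS FP32")
# PS5 (~10.3) lands right next to the 2080 (~10.1), and Series X
# (~12.2) sits much closer to the 2080 Ti (~13.4) than to the 2080.
```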
> In this game it's almost a 100% jump.
If the PS5 is using a combo of Medium and High settings, then it's not a jump at all.