Uncharted: Legacy of Thieves Collection [PS5, PC]

After upgrading my lowly Core i7 3770K to a shiny Ryzen 7800X3D, I retested the game on my 2080Ti. At both native 4K and 1440p, High settings (same as PS5), I found the game to consistently perform 10 to 20fps higher than the PS5, depending on the situation of course. That puts the PS5 at the level of a 2080/2080S in this game.
 
Realistically, Naughty Dog should never have even entertained the idea of doing service games. Sony should have smartly kept them entirely focused on multiple AAA blockbuster SP games, and had a completely separate studio take a chance with some of their IP in the GaaS space.

Naughty Dog should have been a well-oiled machine at this point, cranking out amazing games.

Easier said than done, I know... but Sony should never have let them take their eye off the ball.
 
After upgrading my lowly Core i7 3770K to a shiny Ryzen 7800X3D, I retested the game on my 2080Ti. At both native 4K and 1440p, High settings (same as PS5), I found the game to consistently perform 10 to 20fps higher than the PS5, depending on the situation of course. That puts the PS5 at the level of a 2080/2080S in this game.

That low? I thought this game was more of an outlier than that in performance terms, i.e. performing very near or even above 2080Ti level.
 
After upgrading my lowly Core i7 3770K to a shiny Ryzen 7800X3D, I retested the game on my 2080Ti. At both native 4K and 1440p, High settings (same as PS5), I found the game to consistently perform 10 to 20fps higher than the PS5, depending on the situation of course. That puts the PS5 at the level of a 2080/2080S in this game.
Did you do side-by-side comparisons? Because the game consistently runs at 80+fps on PS5.
 
That low? I thought this game was more of an outlier than that in performance terms, i.e. performing very near or even above 2080Ti level.

Here's a 2080 at 1440p, Ultra settings, getting around 70fps. Ultra does have an impact over High as well (not huge; on my rig it's 4-5fps below High).

Overall, last I checked it was far closer to a 2080Ti in this game, but the 2080Ti still outperformed it a little. The game also runs exceptionally well on Radeon; there, a 6700 XT is basically a match for the 2080Ti (!), just losing slightly at 4K. So it's a hog, yeah, but at least in terms of Radeon architecture it's not really that exorbitant an outlier, considering the PS5's GPU is basically a non-XT 6700. It's a little more so on the Nvidia side.
 
After upgrading my lowly Core i7 3770K to a shiny Ryzen 7800X3D, I retested the game on my 2080Ti. At both native 4K and 1440p, High settings (same as PS5), I found the game to consistently perform 10 to 20fps higher than the PS5, depending on the situation of course. That puts the PS5 at the level of a 2080/2080S in this game.
Wait... why are you comparing GPUs with non-equal CPUs?
As you say, on the other CPU you did not get these results. So why compare now? A real comparison would need an equivalent CPU.
Who guarantees that if the PS5 had a CPU equivalent to a 7800X3D, its GPU would not give much better results?
When we talk about system performance, we must remember that it depends on the combination of all parts. GPU scaling with the CPU has been a known reality for ages, so you cannot just upgrade the CPU in your PC and compare the resulting GPU performance with the PS5. If you want to know which GPU is equivalent to the PS5's, you need to put an equivalent CPU in your PC.
 
After upgrading my lowly Core i7 3770K to a shiny Ryzen 7800X3D, I retested the game on my 2080Ti. At both native 4K and 1440p, High settings (same as PS5), I found the game to consistently perform 10 to 20fps higher than the PS5, depending on the situation of course. That puts the PS5 at the level of a 2080/2080S in this game.
So a Ryzen 7800X3D + 2080/2080S = 4700S + PS5 GPU? Crazy how the PS5 CPU is punching above its weight! Where do you think it comes from?
 
Did you do side-by-side comparisons? Because the game consistently runs at 80+fps on PS5.
Yeah, I used data from DigitalFoundry and GamingNexus, and compared like for like scenarios (cut scenes and fixed camera scenes).

That low? I thought this game was more of an outlier than that in performance terms, i.e. performing very near or even above 2080Ti level.
Overall, last I checked it was far closer to a 2080Ti in this game, but the 2080Ti still outperformed it a little
The game improved after multiple patches. It's slower than a 2080Ti for sure: in Lost Legacy, the PS5 averaged 40fps at 4K in the woods scene, while the 2080Ti averaged 49fps.


Wait... why are you comparing GPUs with non-equal CPUs?
You can ignore the 1440p results, as they could be CPU limited on both machines. However, the 4K results are valid, as they are GPU limited on both the PS5 (unlocked fps in VRR mode) and the 2080Ti.

So a Ryzen 7800X3D + 2080/2080S = 4700S + PS5 GPU? Crazy how the PS5 CPU is punching above its weight! Where do you think it comes from?
Bad DX12 optimization, probably, specifically on the CPU side; it's consistent across games on the same engine.
 
Wait... why are you comparing GPUs with non-equal CPUs?
As you say, on the other CPU you did not get these results. So why compare now? A real comparison would need an equivalent CPU.
Who guarantees that if the PS5 had a CPU equivalent to a 7800X3D, its GPU would not give much better results?
When we talk about system performance, we must remember that it depends on the combination of all parts. GPU scaling with the CPU has been a known reality for ages, so you cannot just upgrade the CPU in your PC and compare the resulting GPU performance with the PS5. If you want to know which GPU is equivalent to the PS5's, you need to put an equivalent CPU in your PC.

Comparing with non-equal CPUs in fully GPU-bound scenarios like the PS5's 4K mode is entirely valid. We see from the DF analysis that in the 1440p 120Hz mode the PS5 can generally hit between 90-110fps, while in the much higher resolution 4K fidelity mode the framerates are around 45-50fps. So clearly the GPU is the bottleneck there by a significant degree.

It's not overly applicable to this game, but generally I would say console games are more GPU limited than CPU limited, particularly where DRS is used, which is specifically designed to max out the GPU in any given situation. So as a general rule of thumb, where it's not possible to get an exact CPU match (i.e. most of the time), isolating GPU performance with a more powerful CPU is the correct way to go - assuming you're trying to compare GPU performance, of course.
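The resolution-scaling argument can be sketched as a tiny Python check. The fps figures are the DF numbers quoted above; the `noise` threshold is an arbitrary assumption, not a measured constant:

```python
def likely_gpu_bound(fps_low_res, fps_high_res, noise=0.10):
    """Dropping only the resolution barely changes CPU load, so an fps
    jump well beyond measurement noise means the high-res mode was GPU bound."""
    return (fps_low_res - fps_high_res) / fps_high_res > noise

# DF figures: ~90-110fps in the 1440p 120Hz mode, ~45-50fps in 4K fidelity mode
print(likely_gpu_bound(100, 47))  # True: fidelity mode is GPU limited
```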

So a Ryzen 7800X3D + 2080/2080s = 4700S + PS5 GPU? Crazy how the PS5 CPU is punching above its weight! Where do you think it comes from?

That's not the conclusion his data was demonstrating. The 7800X3D is simply removing the CPU bottleneck caused by his much older 4-core CPU, which would indeed be slower than the PS5's. @DavidGraham noted that his 2080Ti outperforms the PS5 on that CPU, which suggests that CPU is not equivalent to the PS5's. However, a cursory search of the internet provides far more conclusive evidence that a 7800X3D is in fact enormous overkill if the goal is simply to match PS5 CPU performance. For example, in a completely CPU-bound scenario a 5800X can hit over 130fps in the game, which is around 30% higher than the PS5 achieves in its 120Hz mode based on the DF analysis of it hitting around 100fps on average (so around 3700X level). A 7800X3D is a much faster CPU than the 5800X.


 
When we talk about system performance, we must remember that it depends on the combination of all parts.
No, this is a GPU test and this has been done for ages. You isolate the component you want to test by making it the slowest in the system.

I could throw a Ryzen 3600 with a 4090 in BG3, and I bet you it'd barely outperform the PS5.
 
Comparing with non-equal CPUs in fully GPU-bound scenarios like the PS5's 4K mode is entirely valid. We see from the DF analysis that in the 1440p 120Hz mode the PS5 can generally hit between 90-110fps, while in the much higher resolution 4K fidelity mode the framerates are around 45-50fps. So clearly the GPU is the bottleneck there by a significant degree.

It's not overly applicable to this game, but generally I would say console games are more GPU limited than CPU limited, particularly where DRS is used, which is specifically designed to max out the GPU in any given situation. So as a general rule of thumb, where it's not possible to get an exact CPU match (i.e. most of the time), isolating GPU performance with a more powerful CPU is the correct way to go - assuming you're trying to compare GPU performance, of course.
Sorry if this is my ignorance, but how can you tell if the PS5 is GPU limited? Can you assure that?

No, this is a GPU test and this has been done for ages. You isolate the component you want to test by making it the slowest in the system.

I could throw a Ryzen 3600 with a 4090 in BG3, and I bet you it'd barely outperform the PS5.
Not doubting that!

But as you say, a slower CPU will have problems outperforming the PS5.
But if this is true, and a better CPU improves performance, why assume the PS5 is not also CPU limited?
Has someone profiled the CPU usage on the PS5 to guarantee the game is not being bottlenecked by the CPU?

That is my doubt... The same way I can play on a certain CPU with a certain performance but get better performance if I upgrade the CPU - who is to say the same is not happening on the PS5?
 
But if this is true, and a better CPU improves performance, why assume the PS5 is not also CPU limited?
Because it has the same settings at 4K and its performance drops by half. Resolution is practically a non-factor in CPU scaling, so clearly, the PS5 dropping to 49fps in 4K when it runs at 70fps in the same spot in 1440p has nothing to do with the CPU.
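Putting those same-spot numbers side by side makes the point concrete. A quick Python sketch (pixel counts assume native 1440p and 2160p, which may not hold exactly if DRS is active):

```python
# Same-spot figures from the post above: 70fps at 1440p vs 49fps at 4K
fps_1440, fps_4k = 70, 49
px_1440, px_4k = 2560 * 1440, 3840 * 2160

fps_gain = fps_1440 / fps_4k   # ~1.43x faster at the lower resolution
px_ratio = px_4k / px_1440     # 2.25x fewer pixels to shade at 1440p

# A CPU-bound game would show fps_gain near 1.0 regardless of px_ratio;
# a large gain from resolution alone means the 4K mode is GPU limited.
print(f"{fps_gain:.2f}x fps gain from a {px_ratio:.2f}x pixel reduction")
```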
 
So a Ryzen 7800X3D + 2080/2080S = 4700S + PS5 GPU? Crazy how the PS5 CPU is punching above its weight! Where do you think it comes from?
For all the talk about console GPUs "punching above their weight," people often forget about CPU performance, the area where consoles truly shine.
Yeah, I used data from DigitalFoundry and GamingNexus, and compared like for like scenarios (cut scenes and fixed camera scenes).

Interesting. I tested the game not too long ago on a 2080 Ti and the performance didn't appear to have changed. This was about 3-4 months ago though so maybe it has improved.
 
Sorry if this is my ignorance, but how can you tell if the PS5 is GPU limited? Can you assure that?

I'm assuming this is your question? It's incorrectly quoted as me in the post above. As noted by @Below2D we know it's GPU limited in fidelity mode because we know that as resolution (which impacts GPU and not CPU performance) is decreased, the framerate increases significantly. If it were CPU bound at 4k in fidelity mode then decreasing resolution for the various performance modes would have no impact on the framerate.
 
Because it has the same settings at 4K and its performance drops by half. Resolution is practically a non-factor in CPU scaling, so clearly, the PS5 dropping to 49fps in 4K when it runs at 70fps in the same spot in 1440p has nothing to do with the CPU.
Sure. This seems absolutely correct
 
Sure. Absolutely correct... But this does not show that the performance at both resolutions could not improve with a better CPU.

It literally does. If the performance increases massively at a lower resolution, then you are not CPU limited at the higher resolution, especially at 4k.

So, although you are correct, what I'm failing to see is how this shows that the PS5 with a better CPU would not give better results.

It could potentially get better results in the 1080p 120fps Performance+ mode at certain spots, perhaps - we don't have a lower-resolution mode to test this with, though (or the ability to disable vsync to go beyond 120).

However, the contention here is the Fidelity mode performance relative to a 2080Ti, and to a lesser extent the uncapped 1440p performance mode. A faster CPU would definitely not affect the Fidelity mode results, as you're almost always below 60fps there - while the same scenes in the 1080p Performance+ mode are pretty much always over 100fps, and it's capped at 120 to boot. That mode in particular may indeed be CPU limited on the PS5, as it can even drop below 120 in spots, but that's not what's being compared here. As the Performance+ mode is pretty much universally faster than the 1440p Performance mode, we can conclude that the 1440p mode is also not CPU limited, just like the Fidelity mode.
 
I'm assuming this is your question? It's incorrectly quoted as me in the post above. As noted by @Below2D we know it's GPU limited in fidelity mode because we know that as resolution (which impacts GPU and not CPU performance) is decreased, the framerate increases significantly. If it were CPU bound at 4k in fidelity mode then decreasing resolution for the various performance modes would have no impact on the framerate.
I got what you are saying:

The CPU manages the frame rate, so increasing the framerate has an impact on the CPU.

What you are saying is basically that if we were CPU bound, then in fidelity mode we could not increase fps, even by reducing resolution.

Because more fps requires more CPU, and if we were bound there would be nothing to spare!

OK, that's logical! I see the point...

But regardless of that, look at these benchmarks:

[attached benchmark chart: CP-Ultra-1080p.png]

Looking at the 5600 and 5700 XT, we see we are GPU bound.

But in the other cases we are CPU bound.

Taking the 3090 as an example, all CPUs above the 2600X have the performance for 60fps at 1080p. Just like the PS5 CPU has the performance for 60fps at 1440p!
But does that mean we are seeing the full power of the GPU?
No - if we improve the CPU, we get better results.

So, although the logic is sound, it does not answer my question: how do we know what the GPU equivalent of the PS5 is without using an equivalent CPU? Who guarantees us that with a better CPU the PS5 GPU would not fare better?

Because of this, I believe the only way to find a GPU equivalent to the one in the PS5 is to put a similar CPU in a PC and swap GPUs until we find one that, in equal circumstances, gives the same results.

Look at this example: say the PS5 were a Ryzen 5 1600X with a 3090.
And you test a PC with a 3070 and a Ryzen 5 5600X.
You would get better results... and then claim the PS5 GPU is below the 3070, when in reality, in this example, it is a 3090.

See my point? And all because of the CPU.
 
Taking the 3090 as an example, all CPUs above the 2600X have the performance for 60fps at 1080p. Just like the PS5 CPU has the performance for 60fps at 1440p!
But does that mean we are seeing the full power of the GPU?
No - if we improve the CPU, we get better results.

You're making this way more complex than it has to be.

If you only have one system to test, you can't determine whether you're GPU or CPU bound with a test at a single resolution or graphics setting. In the case of the Techspot graph you showed, the other GPUs do the part of changing the GPU load. If you only had a 5700 XT to test, you could still determine you were CPU limited on that system by running the same test at a lower resolution. If you ran it at 720p but only achieved a marginal fps increase, well below the actual GPU load decrease, you would know you were still hampered by your CPU.
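That one-system test could be sketched roughly like this in Python. The `margin` cutoff is an arbitrary assumption for illustration, not a measured constant:

```python
def still_cpu_limited(fps_before, fps_after, gpu_load_drop, margin=0.5):
    """If cutting GPU load by `gpu_load_drop`x (e.g. 4.0 for 1440p -> 720p,
    a 4x pixel reduction) yields an fps gain far below that factor,
    the CPU is still the bottleneck."""
    observed_gain = fps_after / fps_before
    return observed_gain < 1 + margin * (gpu_load_drop - 1)

# 1440p -> 720p is a 4x pixel drop; only 10% more fps => still CPU limited
print(still_cpu_limited(60, 66, gpu_load_drop=4.0))   # True
# Near-linear scaling with the load drop => GPU limited instead
print(still_cpu_limited(60, 210, gpu_load_drop=4.0))  # False
```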

GPU/CPU limited just means "what is the bottleneck to achieving a certain level of performance for the desired visual output". You're always limited by both to some degree, it's just what you're trying to achieve in terms of visuals vs. performance.

In the case of the PS5, we can determine at the resolutions of 1440p and 2160p, it is very likely GPU limited. We know this, because the 1080p mode exists, and produces far better framerates! A hypothetical Zen5 would do virtually nothing to increase the framerate in Fidelity mode, because it is entirely held back by the GPU. Fidelity mode is what is being compared to the 2080ti. It is GPU limited on the PS5.

If the equivalent PC GPU were being compared to results from the 1080p Performance+ mode only, then yes - that would be an assumption, as that mode on the PS5 could be held back by the CPU or the 120fps cap. But it's not; the comparison is being done against Fidelity mode, and the addition of two other modes allows us to reduce/increase the GPU load, and correspondingly the CPU load, to determine where the bottleneck lies at 2160p.
 
@Metal_Spirit You can tell when something is GPU limited by finding drops in performance, either under a target 30/60fps or with an unlocked framerate, and seeing if the drops happen in lower resolution modes. CPU load isn't affected by resolution (typically). That's why a console like the PS4 Pro can run games at higher resolution with the same CPU as the PS4.

If you can drop the resolution and performance increases, you know you were GPU bound at the higher resolution. We can find a "PS5 equivalent" GPU by measuring a section in which we know the CPU isn't hindering performance.
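That matching step might look something like this sketch. Only the PS5 (40fps) and 2080Ti (49fps) figures come from this thread (the Lost Legacy woods scene at 4K); the other entries are made-up stand-ins, not real benchmarks:

```python
# fps in a known GPU-bound scene; 2080 Ti figure is from the thread,
# the 2080 and 2080 Super numbers are hypothetical placeholders
bench_fps = {
    "RTX 2080": 41,
    "RTX 2080 Super": 43,
    "RTX 2080 Ti": 49,
}
ps5_fps = 40  # PS5 average in the same scene, per the thread

# Pick the GPU whose fps is nearest the PS5's in this GPU-bound section
equivalent = min(bench_fps, key=lambda gpu: abs(bench_fps[gpu] - ps5_fps))
print(equivalent)  # RTX 2080
```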
 