The Last of Us, Part 1 Remaster [PS5, PC]

Looks like it performs just as badly as Uncharted 4 if these specs are accurate. Perhaps something about the ND engine just requires too much effort to be worth changing to ensure good PC performance. Nixxes did Returnal so I'm assuming this was done by someone else?
Nixxes didn't do Returnal, Climax Studios did.

TLOU P1 is seemingly done completely by Naughty Dog internally while the Uncharted: Legacy of Thieves Collection was done by Iron Galaxy alongside Naughty Dog.
 
I still can't believe PC gamers take these 'requirements' seriously. I mean, I am genuinely bewildered at how every single time people take them as gospel and then every single time they are not actually accurate, nearly without exception.

That is simply not true.

God of War's initial chart was accurate, it was just that Ultra shadows were very demanding. Uncharted's was also accurate. Horizon's were actually undersold, it required more GPU grunt than the initial chart indicated.

Like, why is there a higher CPU requirement to go from 1080p/60fps to 1440p/60fps with the same settings? Come on now, folks, these are always just rough guesses by the developers. They don't have the time or desire to actually buy a giant benchmarking setup to test all this stuff out.

Nobody is even noting the CPU requirements though, who cares. The few posts have been focusing on the GPU requirements.

EDIT: These requirements also suggest you NEED an SSD to run it at all. Y'all really think that'll pan out?

Why not? It's a PS5 only title. Sure you may be able to install it on a HDD, but you may also get a buttload of stutter if you try to run it from there. Might as well just list an SSD as a requirement, hardly outlandish in 2023.
 
The TV series is enjoyable. I know there's stuff they altered, but it's still great.

By the way, the RE4 demo is dropping tonight. So if you enjoy both franchises, we're all winners here. :)
 
For those complaining that the PC requirements are too high to match PS5, I want to inform you that every PS5 has a particle of Cerny's DNA to boost its power. That's why there were production shortages. Too many units and Mark would have vanished. Until Sony managed to increase its Cerny DNA output via stem cells growing in petri dishes and solved the supply issues.
 
I still can't believe PC gamers take these 'requirements' seriously. I mean, I am genuinely bewildered at how every single time people take them as gospel and then every single time they are not actually accurate, nearly without exception.

Like, why is there a higher CPU requirement to go from 1080p/60fps to 1440p/60fps with the same settings? Come on now, folks, these are always just rough guesses by the developers. They don't have the time or desire to actually buy a giant benchmarking setup to test all this stuff out.

EDIT: These requirements also suggest you NEED an SSD to run it at all. Y'all really think that'll pan out?

EDIT2: Ha, they even suggest you need a 5800XT to run it at 1080/60fps/High. Except ya know, no such GPU even exists. If you really need more proof they are just kind of winging this stuff...
5800xt was a typo of course. The Uncharted 4 specs in terms of GPU turned out to be dead on. That game requires substantially more GPU power to match/exceed PS4 and PS5.
 
What is important here is what settings the PS5 uses. If devs do not give a preset called "original" directly, we cannot know what settings they use unless DF and others analyze it side by side. Even then, it's still guesswork, and there could be settings that are not even present on PC (similar to The Witcher 3 and its ray tracing settings on consoles). I really wish more devs opted for a "console preset" so people like me can set and forget.
 
5800xt was a typo of course. The Uncharted 4 specs in terms of GPU turned out to be dead on. That game requires substantially more GPU power to match/exceed PS4 and PS5.

It wasn't that huge in the PS5's case.

NXG's head-to-head at matched settings showed the 6800 performing about 20-30% faster in the same area, if I recall correctly, which isn't as big a lead as one would expect, but it's not hugely out of line with expectations.
 
It wasn't that huge in the PS5's case.

NXG's head-to-head at matched settings showed the 6800 performing about 20-30% faster in the same area, if I recall correctly, which isn't as big a lead as one would expect, but it's not hugely out of line with expectations.
6800 should be performing 50-80% faster though.
 
6800 should be performing 50-80% faster though.

Not really. The 6800 is rated 45% faster on average than the 6600xt according to TPU. I rewatched the NXG video and he said the 6800 was about 24% faster than PS5 at Ultra settings. PS5 doesn't run at Ultra settings, so at matched settings it's likely more like 30-35% faster vs a 45% target. Not ideal, but not wildly out of whack.
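
To put some rough numbers on where that would leave the PS5 relative to a 6600xt, here's a quick back-of-the-envelope sketch. The inputs are just the approximate figures above, and the matched-settings ratio is my guess, not a measurement:

```python
# Back-of-the-envelope: where does the PS5 land relative to a 6600 XT if
# the ratios quoted above hold? Inputs are rough figures, not benchmarks.

r_6800_vs_6600xt = 1.45        # TPU average: 6800 ~45% faster than 6600 XT
r_6800_vs_ps5_ultra = 1.24     # NXG: 6800 ~24% faster than PS5 at Ultra
r_6800_vs_ps5_matched = 1.325  # assumed ~30-35% at PS5-matched settings

for label, r in [("Ultra", r_6800_vs_ps5_ultra),
                 ("matched settings (assumed)", r_6800_vs_ps5_matched)]:
    ps5_vs_6600xt = r_6800_vs_6600xt / r
    print(f"{label}: PS5 roughly {(ps5_vs_6600xt - 1) * 100:+.0f}% vs 6600 XT")

# Ultra: PS5 roughly +17% vs 6600 XT
# matched settings (assumed): PS5 roughly +9% vs 6600 XT
```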
 
That is simply not true.

God of War's initial chart was accurate, it was just that Ultra shadows were very demanding. Uncharted's was also accurate. Horizon's were actually undersold, it required more GPU grunt than the initial chart indicated.



Nobody is even noting the CPU requirements though, who cares. The few posts have been focusing on the GPU requirements.



Why not? It's a PS5 only title. Sure you may be able to install it on a HDD, but you may also get a buttload of stutter if you try to run it from there. Might as well just list an SSD as a requirement, hardly outlandish in 2023.
They weren't dead on, they were just in the vague region of accurate, which isn't hard to achieve with some basic guesswork. Heck, I could look at a game and probably make some roughly educated guesses as to what will do what. And who cares about CPU requirements? The point is simply to prove that these specs are not actually being thoroughly tested and shouldn't be taken so seriously like everybody seems to always do. Also using the example of Horizon not being accurate to support the idea that they actually are accurate is a weird strategy.

As for this game being PS5-only, it is, but it's also clearly not exactly 'built from the ground up for next gen' either. It mainly takes the original up to the standards of TLOU2, and maybe just a bit more, but not much. Of course, whether it actually requires an SSD or not could still very much depend on how the whole memory management side is handled regardless, but I'd still be surprised if this port is one of the first AAA games that truly demands an SSD even for 720p/Low settings. And of course SSD minimum requirements like this will come, I've been adamant about that more than most, but we'll see on this one.
 
Not really. The 6800 is rated 45% faster on average than the 6600xt according to TPU. I rewatched the NXG video and he said the 6800 was about 24% faster than PS5 at Ultra settings. PS5 doesn't run at Ultra settings, so at matched settings it's likely more like 30-35% faster vs a 45% target. Not ideal, but not wildly out of whack.
I just checked TPU and the 6800 is 83% faster at 4K and 61% faster at 1440p.
 
I just checked TPU and the 6800 is 83% faster at 4K and 61% faster at 1440p.
Fair enough. I used the GPU database relative performance table which gives a very different result and makes me wonder about the accuracy of that table in general now.

EDIT: that said, it's unlikely the 6600XT is a match for the PS5 at the memory-bandwidth-intensive 4K, as it has much less main memory bandwidth, and given its Infinity Cache is much smaller than the 6800's, the 6800 should fare much better at that resolution comparatively. At 1440p, which puts less pressure on bandwidth, the individual card reviews I'm seeing at TPU are coming in around the 50% faster margin.

EDIT 2: Wait, were you using their actual Uncharted 4 benchmarks? If so there is no 6800 in there. I'm not sure what you were comparing to but in the actual Uncharted 4 test below, the 6800XT is only 90% faster at 4k, 59% faster at 1440p, and 40.5% faster at 1080p. Obviously the 6800 will be a fair bit slower than that, and as we can see by those figures, the performance advantage opens up hugely with higher resolutions. From the GPU performance scaling we're seeing even at 1080p, we can't put that all on CPU limitations and so the obvious culprit will be memory bandwidth/infinity cache size which when stressed is likely going to be the area the 6600XT falls behind the PS5.
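
For what it's worth, here's the 'gap opens up with resolution' point made concrete, using nothing but the percentages quoted above from the TPU Uncharted 4 test (treat it as illustrative, not a benchmark):

```python
# How much the 6800 XT's lead over the 6600 XT grows with resolution in
# TPU's Uncharted 4 test. Inputs are the percentages quoted above.

lead = {"1080p": 1.405, "1440p": 1.59, "4K": 1.905}

base = lead["1080p"]
for res, ratio in lead.items():
    growth = (ratio / base - 1) * 100
    print(f"{res}: 6800 XT is {ratio:.2f}x the 6600 XT "
          f"({growth:+.0f}% vs its own 1080p lead)")

# 1080p: 6800 XT is 1.40x the 6600 XT (+0% vs its own 1080p lead)
# 1440p: 6800 XT is 1.59x the 6600 XT (+13% vs its own 1080p lead)
# 4K: 6800 XT is 1.90x the 6600 XT (+36% vs its own 1080p lead)
```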

 
6600xt is definitely not a match for PS5 in bandwidth-limited operations. I can even say the gap is quite big. PS5 is able to get native 4K at a 40+ FPS average with ray tracing in Spider-Man. My friend with his 6600xt tried the exact same preset DF shared and he got a horrendous 25-30 fps experience. The GPU kind of "came to its senses" at 1440p and only shined at 1080p, where it was comfortably above 50 FPS.

Its box literally says it's a 1080p card. PS5 has the grunt and bandwidth to deal with 4K/upscaling. Even 4K with upscaling requires beefy bandwidth. The 6600xt does not have any of it. Infinity Cache is only really helpful at lower resolutions / non-ray-tracing situations.

You can literally see this behaviour if you compare the 6600xt's 1080p/1440p/4K benchmarks against the likes of the 3060/3060 Ti/5700xt (yes). In games where it beats the 5700xt at 1080p, it often loses to the very same GPU at 4K.
 
They weren't dead on,

Never said they were (that was Techuse), whatever that means regardless. No recommended specs can ever be 'dead on'. I'm saying with respect to GPU power, these recommended spec charts are more accurate than not when we look at the performance of the games once released vs their earlier recommended chart. Can you point to one of these recommended spec releases where the GPU power recommended for fps/res was wildly overestimated?

they were just in the vague region of accurate, which isn't hard to achieve with some basic guesswork.

So then what's the problem? You're the one expressing bewilderment that people here are actually taking these released specs as representative of how GPU hungry the game will be. From past history, especially when comparing the only other ported game built on a Naughty Dog engine, there's more reason to believe these will be closer to the actual GPU demand of the game than not.

And who cares about CPU requirements? The point is simply to prove that these specs are not actually being thoroughly tested and shouldn't be taken so seriously like everybody seems to always do.

"Like everybody" - but you're posting this in a thread where nobody here is expressing shock/surprise at CPU requirements. Everyone here is talking about the GPU.

Also using the example of Horizon not being accurate to support the idea that they actually are accurate is a weird strategy.

There isn't a 'strategy' here, there's just recalling actual history. You're coming into a thread where some have expressed a little surprise/concern about the GPU requirements being relatively 'high', and effectively told everyone, hey- calm tf down, these specs are usually nonsense and there's no reason for concern.

Your initial post doesn't make any sense in that context if you're actually arguing that the release performance could be more demanding than these recommended specs indicate, so it's pretty clear your angle here is to assuage the (tepid) concern by dismissing every aspect of the recommended specs as Sony just pulling them out of their ass. So Horizon (and I would argue, Spider-Man to an extent on the lower-end recommended specs) coming in more demanding at actual release is perfectly applicable to my argument; these recommended spec charts are not exorbitantly overshooting the required hardware. With respect to GPU power, on the whole, they have a history of being pretty accurate.
 
Fair enough. I used the GPU database relative performance table which gives a very different result and makes me wonder about the accuracy of that table in general now.

EDIT: that said, it's unlikely the 6600XT is a match for the PS5 at the memory-bandwidth-intensive 4K, as it has much less main memory bandwidth, and given its Infinity Cache is much smaller than the 6800's, the 6800 should fare much better at that resolution comparatively. At 1440p, which puts less pressure on bandwidth, the individual card reviews I'm seeing at TPU are coming in around the 50% faster margin.

EDIT 2: Wait, were you using their actual Uncharted 4 benchmarks? If so there is no 6800 in there. I'm not sure what you were comparing to but in the actual Uncharted 4 test below, the 6800XT is only 90% faster at 4k, 59% faster at 1440p, and 40.5% faster at 1080p. Obviously the 6800 will be a fair bit slower than that, and as we can see by those figures, the performance advantage opens up hugely with higher resolutions. From the GPU performance scaling we're seeing even at 1080p, we can't put that all on CPU limitations and so the obvious culprit will be memory bandwidth/infinity cache size which when stressed is likely going to be the area the 6600XT falls behind the PS5.

I used the performance summary from the most recent GPU review they published.
 
6600xt is definitely not a match for PS5 in bandwidth-limited operations. I can even say the gap is quite big. PS5 is able to get native 4K at a 40+ FPS average with ray tracing in Spider-Man. My friend with his 6600xt tried the exact same preset DF shared and he got a horrendous 25-30 fps experience. The GPU kind of "came to its senses" at 1440p and only shined at 1080p, where it was comfortably above 50 FPS.

Its box literally says it's a 1080p card. PS5 has the grunt and bandwidth to deal with 4K/upscaling. Even 4K with upscaling requires beefy bandwidth. The 6600xt does not have any of it. Infinity Cache is only really helpful at lower resolutions / non-ray-tracing situations.

You can literally see this behaviour if you compare the 6600xt's 1080p/1440p/4K benchmarks against the likes of the 3060/3060 Ti/5700xt (yes). In games where it beats the 5700xt at 1080p, it often loses to the very same GPU at 4K.
So basically what you're saying is, GPU theoretical power alone, as expected, doesn't tell the whole story about performance. Should be obvious to anyone, really.

It's not that the PS5 GPU is stronger than the 6600xt; the rest of the system is allowing the PS5 GPU to use much more of its peak theoretical power due to having more RAM (12 vs 8) and more bandwidth. The 6600xt only gets close to the PS5's bandwidth via its embedded cache; the PS5 has that level of bandwidth just by default.
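
If anyone wants the raw spec-sheet numbers behind that, here's a quick sketch (bandwidth is just bus width times memory speed; the Infinity Cache column is a size, not bandwidth, and how much it helps depends entirely on hit rate):

```python
# Raw VRAM bandwidth from public spec-sheet numbers. "Effective" bandwidth
# with Infinity Cache depends on hit rate, which drops at higher resolutions.

gpus = {
    # name:         (bus width in bits, memory speed in Gbps, Infinity Cache MB)
    "PS5 (shared)": (256, 14, 0),
    "RX 6600 XT":   (128, 16, 32),
    "RX 6800":      (256, 16, 128),
}

for name, (bus_bits, gbps, cache_mb) in gpus.items():
    bandwidth = bus_bits / 8 * gbps  # GB/s of raw GDDR6 bandwidth
    print(f"{name}: {bandwidth:.0f} GB/s raw, {cache_mb} MB Infinity Cache")

# PS5 (shared): 448 GB/s raw, 0 MB Infinity Cache
# RX 6600 XT: 256 GB/s raw, 32 MB Infinity Cache
# RX 6800: 512 GB/s raw, 128 MB Infinity Cache
```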
 
So basically what you're saying is, GPU theoretical power alone, as expected, doesn't tell the whole story about performance. Should be obvious to anyone, really.

It's not that the PS5 GPU is stronger than the 6600xt; the rest of the system is allowing the PS5 GPU to use much more of its peak theoretical power due to having more RAM (12 vs 8) and more bandwidth. The 6600xt only gets close to the PS5's bandwidth via its embedded cache; the PS5 has that level of bandwidth just by default.
Obvious to some, not so obvious to others. I keep seeing people on various forums insisting the PS5 has special-sauce magic because the 6600xt falters compared to it despite being "teh 10 TFLOPS RDNA2 GPU".

The RX 6500xt is the same story. On the outside, it's a 5.7 TFLOPS RDNA2 monster. On the inside, its bandwidth is so pathetic I'm not sure it can keep up with the 4 TF Series S. I'd like to see if it can do 720p/60 FPS ray tracing in Metro Exodus EE, for example. I bet it won't be able to.
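
A crude way to put that "monster on the outside, starved on the inside" point into numbers, using spec-sheet TFLOPS and bandwidth (the Series S figure uses its 224 GB/s fast pool, and this ratio is obviously only a rough indicator, not a performance prediction):

```python
# Compute per unit of bandwidth as a crude "how starved is this GPU" metric.
# Spec-sheet numbers; Series S uses its 224 GB/s fast memory pool.

parts = {
    # name:       (TFLOPS, GB/s)
    "RX 6500 XT": (5.8, 144),
    "Series S":   (4.0, 224),
    "RX 6600 XT": (10.6, 256),
    "PS5":        (10.3, 448),
}

for name, (tflops, bw) in parts.items():
    print(f"{name}: {tflops} TF over {bw} GB/s "
          f"= {tflops / bw * 1000:.0f} GFLOPS per GB/s")

# RX 6500 XT: 5.8 TF over 144 GB/s = 40 GFLOPS per GB/s
# Series S: 4.0 TF over 224 GB/s = 18 GFLOPS per GB/s
# RX 6600 XT: 10.6 TF over 256 GB/s = 41 GFLOPS per GB/s
# PS5: 10.3 TF over 448 GB/s = 23 GFLOPS per GB/s
```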
 
Obvious to some, not so obvious to others. I keep seeing people on various forums insisting the PS5 has special-sauce magic because the 6600xt falters compared to it despite being "teh 10 TFLOPS RDNA2 GPU".

The RX 6500xt is the same story. On the outside, it's a 5.7 TFLOPS RDNA2 monster. On the inside, its bandwidth is so pathetic I'm not sure it can keep up with the 4 TF Series S. I'd like to see if it can do 720p/60 FPS ray tracing in Metro Exodus EE, for example. I bet it won't be able to.
To be fair, the PS5 is a pretty well designed system 🤔 It won't be the best it can be, though, until it gets onto a smaller process node. That size is just ridiculous 😆

I think, on the contrary, GPU power is wasted for some reason by AMD and Nvidia with their own products, due to weak bandwidth or low memory allocation. I wonder why they do that.
 