The Last of Us, Part 1 Remaster [PS5, PC]

I'm working away at the moment so can't test it myself, but can anyone verify this claim in the new patch please?

  • Improved texture fidelity and resolution on in-game Low and Medium settings

Is there an improvement?

EDIT: I found a video on YouTube, and not only do the Low and Medium textures show a noticeable improvement, the new settings also use up to 1GB less VRAM.

Better textures and lower VRAM use? Seems like a winner.
 

Honestly the only AMD CPU I'd touch is the 5800X3D, especially if someone already has a motherboard that supports it. I have a 5600X and sometimes it's annoyingly limited. I guess it depends on resolution etc.

Oh I agree, but the 5800X3D is double the cost if not more, so for a quick upgrade the 5600 is OK.

Although, as good as the 5800X3D is, I don't think I could swallow dumping £330 on a CPU upgrade for a dead socket when you can move to AM5 for less than £500 and be future-proof.
 
I think it may be time for you to bite the bullet and upgrade; even the 5600 would be a decent upgrade in pure gaming workloads.

Yeah I get where you're coming from, and if I do upgrade it'll be to a 5800X3D. I don't like the idea of upgrading just to overcome a handful of poorly optimised games though. I assume even an X3D will struggle to hit 60fps in Survivor, which is insane.

TLOU-wise it would let me increase my locked frame rate from 50 to 60fps (or more), which would be nice, but I'm not sure it's worth £300. All my other games run at my target frame rates anyway - at least when DLSS 3 is included. Forspoken - based on the demo - would be the same as TLOU, but it's not something I'm ever going to buy/play anyway.

Still, I continue to be highly tempted and may well take the plunge.
 
Yeah I get where you're coming from, and if I do upgrade it'll be to a 5800X3D. I don't like the idea of upgrading just to overcome a handful of poorly optimised games though. I assume even an X3D will struggle to hit 60fps in Survivor, which is insane.

TLOU-wise it would let me increase my locked frame rate from 50 to 60fps (or more), which would be nice, but I'm not sure it's worth £300. All my other games run at my target frame rates anyway - at least when DLSS 3 is included. Forspoken - based on the demo - would be the same as TLOU, but it's not something I'm ever going to buy/play anyway.

Still, I continue to be highly tempted and may well take the plunge.

I've just bought an AM5 set-up; it just made better financial sense to invest in AM5 rather than chuck more money into socket 1700, as it's a dead socket.

  • Ryzen 5 7600
  • ASUS B650 motherboard
  • 32GB DDR5 6000MHz

Bundle was less than £500, and factoring in the sale of my existing Intel 1700 set-up it shouldn't be that bad to make the jump, as I was looking at £230-270 for an i7-12700.

And with AM4 prices going down by the day it might also be more cost effective for you to look at doing the same, especially as the 5800X3D is £280+ by itself.
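
Rough numbers on that comparison as a quick sketch. The resale value for the existing LGA1700 parts is a placeholder assumption, not a real quote; the other figures come from the posts above.

```python
# Back-of-envelope cost comparison for the upgrade paths discussed above.
# The resale figure is an assumption for illustration only.

am5_bundle = 500            # Ryzen 5 7600 + B650 board + 32GB DDR5 6000 (upper bound quoted)
resale_of_old_setup = 250   # ASSUMED resale of the existing Intel 1700 CPU/board/RAM
i7_12700_drop_in = 270      # upper end of the quoted i7-12700 price
r5800x3d_drop_in = 280      # quoted 5800X3D price for someone already on AM4

print(f"Net cost of AM5 jump:   £{am5_bundle - resale_of_old_setup}")
print(f"Drop-in i7-12700:       £{i7_12700_drop_in}")
print(f"Drop-in 5800X3D (AM4):  £{r5800x3d_drop_in}")
```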
 
I went for the 5700X from my 2600. You get the 8 cores and with PBO it's basically a 5800X.
 
I've just bought an AM5 set-up; it just made better financial sense to invest in AM5 rather than chuck more money into socket 1700, as it's a dead socket.

  • Ryzen 5 7600
  • ASUS B650 motherboard
  • 32GB DDR5 6000MHz

Bundle was less than £500, and factoring in the sale of my existing Intel 1700 set-up it shouldn't be that bad to make the jump, as I was looking at £230-270 for an i7-12700.

And with AM4 prices going down by the day it might also be more cost effective for you to look at doing the same, especially as the 5800X3D is £280+ by itself.

Yeah, the thing is though, if I upgrade to something around either 5800X3D or 7600 performance now, I don't see myself upgrading again until the start of the next console generation in 2028, which basically takes me to the end of AM5's lifespan, at which point I may as well wait for AM6.

If you're going to upgrade in the meantime then it's obviously the best route to take, but I try to upgrade my CPU as infrequently as possible.
 
Yeah, the thing is though, if I upgrade to something around either 5800X3D or 7600 performance now, I don't see myself upgrading again until the start of the next console generation in 2028, which basically takes me to the end of AM5's lifespan, at which point I may as well wait for AM6.

If you're going to upgrade in the meantime then it's obviously the best route to take, but I try to upgrade my CPU as infrequently as possible.

But is the 5800X3D going to be enough for the whole generation?

When/if next-generation games start to push the CPUs in the consoles at 30fps, the 5800X3D isn't far enough ahead in terms of IPC to get you to 60fps, as you're only looking at ~20% extra IPC plus the small difference the higher clock speeds add.

That was my concern.
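
A rough back-of-envelope version of that argument. The ~20% IPC figure is the one quoted in the post; the clock uplift and the 30fps starting point are illustrative assumptions, not benchmarks.

```python
# Sketch of the scaling argument above: a ~20% IPC advantage plus a modest
# clock bump doesn't turn a CPU-bound 30 fps console target into 60 fps.
# All figures are illustrative assumptions, not measured results.

console_fps = 30        # game assumed CPU-bound at 30 fps on console
ipc_uplift = 1.20       # ~20% IPC advantage over the console CPU (figure from the post)
clock_uplift = 1.10     # assumed "small difference" from higher clocks

pc_fps = console_fps * ipc_uplift * clock_uplift
print(f"Estimated CPU-bound frame rate: {pc_fps:.0f} fps")  # ~40 fps, well short of 60
```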
 
Yup, the 5800X3D won't cut it to target 60 FPS in the long run, just no freaking way. Look at Jedi, already dropping non-3D Zen 3 CPUs to the 45s in basic locations with Low settings. We might keep calling this stuff exceptions, but it's... becoming something else.
 
Yup, the 5800X3D won't cut it to target 60 FPS in the long run, just no freaking way. Look at Jedi, already dropping non-3D Zen 3 CPUs to the 45s in basic locations with Low settings. We might keep calling this stuff exceptions, but it's... becoming something else.

It's a shame UE5 was delayed as long as it was, since we're seeing the typical UE4 CPU-limited situations where the game is only really using a couple of cores.
 
Yup, the 5800X3D won't cut it to target 60 FPS in the long run, just no freaking way. Look at Jedi, already dropping non-3D Zen 3 CPUs to the 45s in basic locations with Low settings. We might keep calling this stuff exceptions, but it's... becoming something else.

You might be on to something there.

Is it poor game coding, or just the natural increase in CPU requirements as games move away from the Jaguar cores of last gen?
 
You might be on to something there.

Is it poor game coding, or just the natural increase in CPU requirements as games move away from the Jaguar cores of last gen?
It's both brother. It's both.

The PS4 is just a testament, or rather proof, that devs can go to extremes and somehow hit 30 FPS targets on freaking 1.6 GHz Jaguar cores. That's just... insane. You can point to low draw distances and watered-down settings, but... games like TLOU/Spider-Man/RDR2 still look impressive on that machine.

But for me, the peak is this:


Just... how? And look at where we are. It's gotta be both. Devs can easily target 60 FPS with mild optimization effort on the new-gen console CPUs, that I'm sure of. But now the situation is so dreadful that Microsoft is sending out stickers for Redfall boxes to warn people that the game won't be 60 FPS at launch - a game with much, much more mediocre physics, graphics and simulation than the game above. I know DICE is a beast, but there has to be a limit to the jankiness.

I really hoped 60 FPS would be the standard and devs would target it on the new-gen consoles, but sadly they're slowly moving back to 30 FPS targets, which is a shame. Jedi can't hit a locked 60 on the new-gen consoles even with cutbacks; it ranges from 35 to 50 FPS in a clearly CPU-bottlenecked situation. It's sad, really. Targeting a pure locked 60 FPS on those consoles would give EVERYONE a better experience, from console users to PC folks. We wouldn't need crutches like frame gen, even with ray tracing. But now we do. There's no CPU that can reliably hit 100+ FPS in both TLOU and Jedi. It's just a bad trend in my opinion.

These Zen 3 CPUs are really strong. Maybe higher-quality graphics (models etc.?) lead to higher CPU requirements, but I'm not sure. I want to see physics, simulation, ragdoll and similar improvements from games if they're going to require more CPU grunt...
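
For what it's worth, the 30 vs 60 FPS targets come down to CPU frame-time budgets; a quick sketch with nothing but the arithmetic:

```python
# Frame-time budgets behind the 30/60 FPS discussion above: doubling the
# frame rate halves the time the CPU has to prepare each frame.

for target_fps in (30, 40, 60, 120):
    budget_ms = 1000 / target_fps
    print(f"{target_fps:>3} fps -> {budget_ms:.1f} ms of CPU time per frame")

# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms. A game tuned to just fit 33 ms on the
# console CPU needs roughly double the effective CPU throughput (or much
# better threading) to hold 60 fps, which is why frame gen gets leaned on.
```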
 
During the PS3/PS4 era developers spoke a lot about the smart things they were doing to distribute load over many CPU cores, job systems, etc. You would think that experience carries over to this generation. Have there been any similar discussions at GDC/SIGGRAPH on the new consoles? Again, the main problem here is that there's nothing happening in these games to explain the high CPU usage.
 
Yup, the 5800X3D won't cut it to target 60 FPS in the long run, just no freaking way. Look at Jedi, already dropping non-3D Zen 3 CPUs to the 45s in basic locations with Low settings. We might keep calling this stuff exceptions, but it's... becoming something else.
That's a terrible example though: the game runs horribly on anything on PC, with both CPU and GPU utilization very low. It's the usual horrible UE4 limitation, with the developer trying to shove way too much through the couple of threads that the engine supports.
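
That "couple of threads" point is essentially Amdahl's law in action. A small sketch, with the serial fraction purely assumed, showing why extra cores barely help when most of the frame runs on one or two engine threads:

```python
# Amdahl's-law style sketch of why a game that funnels most of its work
# through one or two engine threads barely benefits from extra cores.
# The 70% serial fraction is an assumption for illustration only.

def speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup when only part of the frame's CPU work scales with cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

parallel_fraction = 0.30  # assume only 30% of frame work spreads across cores
for cores in (2, 4, 8, 16):
    print(f"{cores:>2} cores -> {speedup(parallel_fraction, cores):.2f}x speedup")
# 16 cores gives only ~1.4x: the serialised engine threads stay the bottleneck.
```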
 
But is the 5800X3D going to be enough for the whole generation?

When/if next-generation games start to push the CPUs in the consoles at 30fps, the 5800X3D isn't far enough ahead in terms of IPC to get you to 60fps, as you're only looking at ~20% extra IPC plus the small difference the higher clock speeds add.

It should be plenty to play out the generation if we define that as no game being unplayable on it. I agree it's not going to double console framerates in many cases, but then neither is my 4070Ti when we account for the higher graphics settings and RT that will likely be gracing the PC versions of those games. If a game is stuck at 30fps on a console, then I'm happy enough with 40-50fps on PC tbh and just use the spare GPU power for nicer graphics. 60fps is just a nice-to-have for me; before I got the 4070Ti, I basically played all games at sub-60fps.

Yup, the 5800X3D won't cut it to target 60 FPS in the long run, just no freaking way. Look at Jedi, already dropping non-3D Zen 3 CPUs to the 45s in basic locations with Low settings. We might keep calling this stuff exceptions, but it's... becoming something else.

I agree on the 60fps point, but not the Jedi Survivor point. That game really is an exception, and one that will very likely be patched to resolve that particular issue - even the devs have admitted the game is coming in hot, and PC naturally gets the short end of the stick when that's the case, given the greater effort required to optimise for it. We really shouldn't be upgrading our PCs to brute-force past a handful of horribly optimised games, especially when said games are often improved weeks or months after their launch. We should simply not buy those games, at least not until they're fixed.

I'm afraid that if it were to become the norm that you need 5800X3D-level or greater performance simply to match console frame rates because all games are so horribly optimised, then having a future-proof platform would be irrelevant to me, because I would most certainly be exiting the scene.
 
@trinibwoy I think a few things have happened. Games have become more complex, and job systems have become complicated because of dependencies between jobs. Scheduling the jobs, I guess, can be the problem: you end up with a critical path that slows everything down. On top of that, the threading model on PC is different from the one on console. Also, a lot of console games in the PS360 era ran at 30 fps, and getting the CPU code to run fast enough to keep a solid 30 fps is a very different task from meeting modern expectations. I would guess that a lot of these modern games run fine on older PC CPUs if all you expect is to stay close to 60fps, but they don't scale well to new hardware that should run much faster.
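
To illustrate the critical-path point: even with a perfect job scheduler and plenty of cores, a frame can't finish faster than its longest chain of dependent jobs. A toy sketch, with invented job names and costs:

```python
# Toy illustration of the critical-path problem in a job system: no matter
# how many worker cores you add, the frame takes at least as long as the
# longest chain of dependent jobs. Job names and costs are made up.

jobs = {                    # job -> (cost in ms, jobs it depends on)
    "input":      (0.5, []),
    "ai":         (4.0, ["input"]),
    "animation":  (3.0, ["ai"]),
    "physics":    (5.0, ["input"]),
    "culling":    (2.0, ["animation", "physics"]),
    "draw_calls": (6.0, ["culling"]),
}

def critical_path(job: str) -> float:
    """Longest dependency chain ending at this job, in milliseconds."""
    cost, deps = jobs[job]
    return cost + max((critical_path(d) for d in deps), default=0.0)

longest = max(critical_path(j) for j in jobs)
total_work = sum(cost for cost, _ in jobs.values())
print(f"Total CPU work: {total_work:.1f} ms (what many cores could chew through in parallel)")
print(f"Critical path:  {longest:.1f} ms (lower bound on frame time, regardless of core count)")
```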
 
@DavidGraham That's pretty much the most important fix in my eyes. Now there are reasonable ways to scale the game on lower-end cards. It shouldn't be too crazy for anyone to find a playable setting now.
 