The Last of Us, Part 1 Remaster Remaster [PS5, PC]

What a plot twist. This is shaping up to become one of the best PC ports.

It's definitely better, and I like a lot of their approaches and how much of this control they expose to the end user, but admittedly it really had nowhere to go but up. :)

The CPU performance has improved quite a bit, no doubt; my 12400F is no longer preventing me from reaching 60fps 99% of the time, albeit there are still some 1-2fps stutters when entering new areas. But they're not like the hitches you get with other games from, say, shader compilation - they really are just 1-2fps dips and almost hard to catch without looking at a frametime graph.
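If anyone wants to sanity-check their own captures, here's a rough sketch of how those tiny dips show up in frame-time data even though the average fps barely moves (the numbers below are made up, not a real TLOU log):

```python
# Minimal sketch: spotting small frame-time spikes in a capture log.
# Assumes per-frame times in milliseconds exported from a capture tool;
# the list below is made-up data, not a real TLOU capture.
frame_times_ms = [16.7, 16.6, 16.8, 16.7, 21.9, 16.7, 16.6, 16.8]

target_ms = 1000 / 60  # ~16.67 ms per frame for a 60fps target

# A 1-2fps hitch barely moves the average, so look at individual frames.
spikes = [(i, t) for i, t in enumerate(frame_times_ms) if t > target_ms * 1.25]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"average: {avg_fps:.1f} fps, spike frames: {spikes}")
# The average still reads ~57-58 fps, but the frametime graph shows the 21.9 ms frame.
```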

Outside of VRAM-limited situations though, GPU performance has barely improved at all since launch. Mind you, thanks to DLSS, on cards like a 4070 you can probably run 4K high with DLSS Quality at a solid 60fps, so overall you'll likely get a noticeably better experience than the PS5 version. For mid-to-high-tier cards and up it's definitely a solid 'buy' now, I'd say.

However, that's down to DLSS, which works well enough at a 4K output res but struggles at lower resolutions and settings. On my 3060 at 1440p with DLSS Performance, I'm still GPU-limited below a solid 60fps in several areas with just the high preset, and DLSS Performance at that res can produce some pretty awful artifacts in certain lighting conditions. 1080p with no upscaling performs slightly worse. So from a pure rasterization performance perspective, the game is still wildly out of sync - to reach the PS5's unlocked framerate at 1440p without upscaling you still need something on the order of a 3080+ class card.
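For context on why Performance mode struggles at 1440p, the internal render resolution ends up quite low. A quick back-of-the-envelope calculation using the commonly cited DLSS scale factors (treat the exact figures as approximate, not official spec):

```python
# Rough sketch: internal render resolution for the DLSS modes, using the
# commonly cited per-axis scale factors (approximate, not official spec).
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w, out_h, mode):
    s = modes[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Quality"))      # ~2560x1440 input for a 4K output
print(internal_res(2560, 1440, "Performance"))  # ~1280x720 input for a 1440p output
```

So 1440p with Performance mode is reconstructing from roughly a 720p internal image, which goes some way to explaining the artifacts.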

Still, the CPU bottlenecks were the most pressing concern after the VRAM issues, so that's 2 out of 3. If you're CPU-bottlenecked your options are... buy a new CPU/motherboard. If you're GPU-bottlenecked, you can simply lower the res/settings or use DLSS/FSR. I'd still like to see that side improved, but now that 6-core budget CPUs can handle the game, it's viable on a far wider range of systems. You may just have to run at a lower res than you're used to.
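As a rough rule of thumb for telling which one you're hitting (not anything the game itself reports, just the usual monitoring heuristic): if you're below your fps target while the GPU isn't near full utilisation, the CPU is usually the limiter.

```python
# Crude sketch of the usual bottleneck heuristic, using made-up readings
# of the kind you'd see in an overlay like RTSS/Afterburner while playing.
def likely_bottleneck(gpu_util_pct: float, fps: float, target_fps: float) -> str:
    if fps >= target_fps:
        return "hitting target"
    # Below target with the GPU mostly idle usually means the CPU can't keep it fed.
    return "CPU-bound" if gpu_util_pct < 95 else "GPU-bound"

print(likely_bottleneck(gpu_util_pct=70, fps=52, target_fps=60))  # CPU-bound
print(likely_bottleneck(gpu_util_pct=99, fps=48, target_fps=60))  # GPU-bound
```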
 
So from a pure rasterization performance perspective, the game is still wildly out of sync - to reach the PS5's unlocked framerate at 1440p without upscaling you still need something on the order of a 3080+ class card.

It is worth remembering that TLOU is a port of a highly tuned and optimised PS5 exclusive from one of Sony's most talented developers.

So PS5's GPU will be punching above its weight compared to your average third party game.
 
It is worth remembering that TLOU is a port of a highly tuned and optimised PS5 exclusive from one of Sony's most talented developers.

So PS5's GPU will be punching above its weight compared to your average third party game.

But that simply means it's not as well, or even close to as well optimised for the PC, which in turn precludes it from being considered a particularly good port.

It's certainly possible to highly optimise games for the PC despite the higher-level APIs and varied configs, but this feels more like a game that's highly optimised for the PS5, with only minimal work done to convert that console-optimised game code into something more suited to the PC.

I understand from a business perspective why re-engineering the game engine to take better advantage of an entirely different architecture isn't practical for a several-years-old game, but even so, given that hasn't been done, I don't think it can be considered an unambiguously good port (years old or not, if PC gamers are going to spend the same money on the game as console gamers, they have a right to expect a similar level of optimisation and polish). Perhaps it can now be considered a competent port in the context of the time and financial resources the team were likely given to deliver it.
 
But that simply means it's not as well, or even close to as well optimised for the PC, which in turn precludes it from being considered a particularly good port.

Has there ever been a PC port that's had the same level of fine tuning and optimisation that a box containing a fixed set of hardware has?

I think the game is well optimised for PC, as GPUs are performing where they should be.

A 6650XT is where I feel the PS5's GPU comes in on your average game, but in TLOU it's not unreasonable to say it's at 6700 level due to the extra optimisation it got from being a PS5 'exclusive'.

PS5 is native 1440p/60fps with high settings, and if you look at PC GPU benchmarks for GPUs around its level, they're not actually that far off the PS5.

I'm expecting Ratchet & Clank to be no different when that finally hits PC, as there's only so much you can do on PC to optimise.
 
PS5 is native 1440p/60fps with high settings, and if you look at PC GPU benchmarks for GPUs around its level, they're not actually that far off the PS5.

There are some extremely rare dips below 60fps on the PS5, as DF showed with TLOU, but it also has an unlocked mode which shows the majority of the game running well over 60fps. No, it's definitely quite far off from the PS5 in GPU performance. You really need a 3080-class GPU to compete without using upscaling.

[Screenshot from that section of DF's video, highlighting the GPU-limited rendering bottleneck]

A 3060ti at 1080p high can still drop under 60fps. Even if the PS5 were limited to 60fps at 1440p, that would still be a massive performance disparity compared with virtually every other game - except, perhaps not surprisingly, Uncharted. That's getting into 2x a 3060ti's worth of performance here.

I'm expecting Ratchet & Clank to be no different when that finally hits PC, as there's only so much you can do on PC to optimise.

Maybe. It is Nixxes, and while I don't regard them as miracle workers, they have more experience with Insomniac's engine in particular, and the other games on that engine haven't shown the GPU performance discrepancy that TLOU/Uncharted has. It's a very different game with a whole host of other challenges, so we'll see.

I don't expect the 2070 Super to be the de facto comparison card for every game going forward during the entirety of the PS5's lifespan (even outside of VRAM limitations), but if TLOU's GPU performance profile becomes the norm, that would be... not good. It would speak quite poorly of the PC's value proposition if you needed 3080-class cards to compete in most ports, as opposed to them blowing past the PS5 in 99% of games today.
 
First, please forgive the high luminance or brightness look exhibited in the images, as converting HDR images (from jxr to jpg) can be a bit problematic. So, I’ve included a zip file for those wanting to see the original HDR images when using a proper image viewer like Windows Photos or HDR+WCG Viewer.
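For anyone wondering why the conversions come out so bright: a JPG can only hold SDR values, so anything above 1.0 in the HDR capture clips unless you tone-map it first. A minimal sketch of the idea, assuming the .jxr has already been decoded to a linear float array by some external tool (the filenames here are placeholders):

```python
import numpy as np
import imageio.v3 as iio

# Assume `hdr` is a float32 HxWx3 array in linear light, already decoded
# from the .jxr capture by an external tool; "capture_linear.npy" is a placeholder.
hdr = np.load("capture_linear.npy")

tonemapped = hdr / (1.0 + hdr)                      # simple Reinhard tone-map
srgb = np.clip(tonemapped, 0.0, 1.0) ** (1 / 2.2)   # rough gamma encode for SDR

iio.imwrite("capture_sdr.jpg", (srgb * 255).astype(np.uint8))
```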

Anyhow, I wanted to see how much of an image and performance difference there was with the new patch, if any, between the game's default resolution render and Nvidia's DLSS Super Resolution render. As you will see below, the game's default resolution render uses 15.79% more memory on average when compared to Nvidia's DLSS (which rarely goes over 11GB, making it perfect for 12GB graphics card owners). The image quality differences are barely perceptible at times; however, DLSS does have a slight edge when looking at the brick patterns or delineated borders in the set 4 images, or the cleaner image pattern in the overhead fluorescent light modules in the set 6 images. And of course, DLSS has the edge in performance, with a lower GPU utilization to boot.
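Just to show where a figure like '15.79% more on average' comes from: it's simply the mean of the per-sample VRAM readings from each run, compared against each other (the numbers below are placeholders, not my actual captures):

```python
# Hypothetical VRAM samples (GB) from two runs of the same section,
# e.g. read off a monitoring log; placeholder values, not real captures.
native_gb = [12.4, 12.9, 13.1, 12.7]
dlss_gb = [10.6, 11.0, 11.2, 10.9]

native_avg = sum(native_gb) / len(native_gb)
dlss_avg = sum(dlss_gb) / len(dlss_gb)

pct_more = (native_avg - dlss_avg) / dlss_avg * 100
print(f"native render uses {pct_more:.2f}% more VRAM on average than DLSS")
```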

Set 1: Default vs DLSS (image pair)
Set 2: Default vs DLSS (image pair)
Set 3: Default vs DLSS (image pair)
Set 4: Default vs DLSS (image pair)
Set 5: Default vs DLSS (image pair)
Set 6: Default vs DLSS (image pair)
 
There are some extremely rare dips below 60fps on the PS5, as DF showed with TLOU, but it also has an unlocked mode which shows the majority of the game running well over 60fps. No, it's definitely quite far off from the PS5 in GPU performance. You really need a 3080-class GPU to compete without using upscaling.

A 3060ti at 1080p high can still drop under 60fps. Even if the PS5 were limited to 60fps at 1440p, that would still be a massive performance disparity compared with virtually every other game - except, perhaps not surprisingly, Uncharted. That's getting into 2x a 3060ti's worth of performance here.

I don't expect the 2070 Super to be the de facto comparison card for every game going forward during the entirety of the PS5's lifespan (even outside of VRAM limitations), but if TLOU's GPU performance profile becomes the norm, that would be... not good. It would speak quite poorly of the PC's value proposition if you needed 3080-class cards to compete in most ports, as opposed to them blowing past the PS5 in 99% of games today.

I don't see the point in using this two-month-old video, as the data is so out of date now that it doesn't represent the game any more.
 
First, please forgive the high luminance or brightness look exhibited in the images, as converting HDR images (from jxr to jpg) can be a bit problematic. So, I’ve included a zip file for those wanting to see the original HDR images when using a proper image viewer like Windows Photos or HDR+WCG Viewer.

Anyhow, I wanted to see how much of an image and performance difference there was with the new patch, if any, between the game's default resolution render and Nvidia's DLSS Super Resolution render. As you will see below, the game's default resolution render uses 15.79% more memory on average when compared to Nvidia's DLSS (which rarely goes over 11GB, making it perfect for 12GB graphics card owners). The image quality differences are barely perceptible at times; however, DLSS does have a slight edge when looking at the brick patterns or delineated borders in the set 4 images, or the cleaner image pattern in the overhead fluorescent light modules in the set 6 images. And of course, DLSS has the edge in performance, with a lower GPU utilization to boot.

That's quite a low CPU clock by today's standards; what CPU is that?
 
I don't see the point in using this two-month-old video, as the data is so out of date now that it doesn't represent the game any more.

The other link I provided, showing the 3060ti failing to hold 60fps at 1080p, was done with the most recent patch.

Regardless, as I said in the post you replied to, and as others have also confirmed, GPU performance was not affected by this patch, and that aspect of the game's performance is what we're discussing - your comment that PC GPUs are actually 'not far off' from the PS5 in this game is what I was replying to. That's just not the case.

As such, even though it was from the initial launch, the performance discrepancy in GPU-limited scenarios between the PC and the PS5 has not changed to any meaningful degree since Alex's first look, so that video is perfectly valid for what we're discussing. The improvements have been in texture streaming, bugs, and CPU usage. I linked to the specific section of the video where Alex talked about the GPU performance disparity, and showed the screenshot from that section highlighting the rendering bottleneck, for exactly that reason - it's not a CPU-limited scenario. That gulf has not changed; it's a GPU hog just as it was on Day 1, drastically out of sync with other releases in this respect.

The image quality differences are barely perceptible at times; however, DLSS does have a slight edge when looking at the brick patterns or delineated borders in the set 4 images, or the cleaner image pattern in the overhead fluorescent light modules in the set 6 images. And of course, DLSS has the edge in performance, with a lower GPU utilization to boot.

DLSS Quality mode at 1440p is certainly better than DLSS Performance at 1440p, which is what I have to resort to currently, but daytime scenes in still images don't exactly give the most thorough picture of how it behaves in-game - if those were emblematic of how DLSS performs in this game at sub-4K output resolutions, it would be a great implementation.

The problems occur at night; specifically, the post-process effect of the flashlight hotspot can cause some significant artifacting on certain surfaces that really jumps out - and you're using the flashlight quite a lot in this game. As you go up the DLSS modes and output resolution these artifacts diminish, but they're always present to some degree compared with native res.
 
The other link I provided, showing the 3060ti failing to hold 60fps at 1080p, was done with the most recent patch.

Regardless, as I said in the post you replied to, and as others have also confirmed, GPU performance was not affected by this patch, and that aspect of the game's performance is what we're discussing - your comment that PC GPUs are actually 'not far off' from the PS5 in this game is what I was replying to. That's just not the case.

As such, even though it was from the initial launch, the performance discrepancy in GPU-limited scenarios between the PC and the PS5 has not changed to any meaningful degree since Alex's first look, so that video is perfectly valid for what we're discussing. The improvements have been in texture streaming, bugs, and CPU usage. I linked to the specific section of the video where Alex talked about the GPU performance disparity, and showed the screenshot from that section highlighting the rendering bottleneck, for exactly that reason - it's not a CPU-limited scenario. That gulf has not changed; it's a GPU hog just as it was on Day 1, drastically out of sync with other releases in this respect.



DLSS Quality mode at 1440p is certainly better than DLSS Performance at 1440p, which is what I have to resort to currently, but daytime scenes in still images don't exactly give the most thorough picture of how it behaves in-game - if those were emblematic of how DLSS performs in this game at sub-4K output resolutions, it would be a great implementation.

The problems occur at night; specifically, the post-process effect of the flashlight hotspot can cause some significant artifacting on certain surfaces that really jumps out - and you're using the flashlight quite a lot in this game. As you go up the DLSS modes and output resolution these artifacts diminish, but they're always present to some degree compared with native res.

This is a good video - a 70% improvement with optimised settings.

While it's somewhat agreed the PS5 is using 'high' across the board, I've never actually seen a comparison video of the settings.
 
Redownloaded the game, as I'd uninstalled it after finishing it for the second time.

Loads better: always over 100fps at high settings, and 90% of the time now it's locked at 120fps; if it does drop frames, it's CPU-related. The game was still compiling shaders as I couldn't be bothered to wait, so I should gain a few more fps once that's complete and a few more CPU cycles are freed up.

My 4070ti is easily offering double what the PS5 offers, and with a few more patches to improve CPU performance I should be at 120fps all the way through.
 
Rumor
I don't feel like I should be the one to create a new thread about this, but am I the only one who thinks they should save that ammo for the PS6? I think the PS6 would be too early for that, but they could go back and give Jak and Daxter the upgrade that I'm sure people think it deserves. Imagine what a studio like Naughty Dog could do with Jak and Daxter on the PS5. That is just me though.
 
Yeah, I figured that would eventually be coming - it explains the PC delay of the TLOU2 port. No reason not to milk it given how well the TV series did. I suspect it will still release before Season 2, as that's not coming until 2025.
 
What areas are they even going to remaster? They should remake Uncharted 1-3 on the TLOU engine and have Nixxes port them to PC.
 
It's in a lot of games, it just gets more attention because it's TLOU.

Quantum Break has stuff like this.

I also think people don't go and mess around in game worlds like they used to in order to find stuff like this.

Days Gone has some good physics, and the Far Cry games too...
 