The Last of Us, Part 1 Remaster [PS5, PC]

This discussion we're having is making me think twice about going to 4k with my 4070ti.

The 4k monitor I ordered arrived in the state it did (see attached), so maybe that's fate telling me to stick to 1440p.

Seems like this would depend on expectations. As a forward outlook, I don't feel 4k60 at "max settings" is generally possible on anything other than the 4090 without factoring in reconstruction/frame generation. Reviews still include a lot of essentially legacy titles that aren't going to be indicative of 4k60 performance in forward-looking titles. If you only keep the heavier, forward-looking titles, I think only the 4090 actually scores above 60 fps fairly consistently without relying on reconstruction/frame generation, and even then there are still some misses.

My general feeling has always been that what AI upscaling really allows is the flexibility to decouple monitor resolution choice from GPU choice, much like before we moved from CRTs to fixed-pixel displays.

On a side note, I kind of have the opposite issue: I'm thinking of pairing a 4070 with 1080p. My 1440p170 monitor had an expected ship date a month ago but seems to be out of stock everywhere, so I'm not sure it will ever ship. I might need to go with 1080p240 instead.
 
Seems like this would depend on expectations. As a forward outlook, I don't feel 4k60 at "max settings" is generally possible on anything other than the 4090 without factoring in reconstruction/frame generation. Reviews still include a lot of essentially legacy titles that aren't going to be indicative of 4k60 performance in forward-looking titles. If you only keep the heavier, forward-looking titles, I think only the 4090 actually scores above 60 fps fairly consistently without relying on reconstruction/frame generation, and even then there are still some misses.

My general feeling has always been that what AI upscaling really allows is the flexibility to decouple monitor resolution choice from GPU choice, much like before we moved from CRTs to fixed-pixel displays.

On a side note, I kind of have the opposite issue: I'm thinking of pairing a 4070 with 1080p. My 1440p170 monitor had an expected ship date a month ago but seems to be out of stock everywhere, so I'm not sure it will ever ship. I might need to go with 1080p240 instead.

I can't use 1080p, as I don't think it's a high enough resolution to do justice to modern games (and especially future games) that have a lot of high-frequency detail.

As for what GPU is needed for 4k60, my 4070ti will be perfectly OK: it runs current games at native 1440p60 with max settings, and I have no problems with DLSS.
 

Attachments

  • Bottom floor.png (2.6 MB)
  • Top floor.png (2.8 MB)
It looks like they issue draw calls for the UI despite it being invisible.
Not the biggest issue by any means, but more wasted GPU time :p

On some platforms this can waste a significant amount of bandwidth, though. I suspect this is not unique to the PC port; they probably do the same on console too.
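
For illustration, a minimal sketch of the kind of early-out that avoids this; UIElement, CommandList, and drawUIElement are hypothetical stand-ins, not names from the actual engine:

#include <vector>

// Hypothetical stand-ins for the engine's real types.
struct UIElement {
    bool  visible;
    float alpha;   // 0.0f = fully transparent
};

struct CommandList {
    void drawUIElement(const UIElement&) { /* record the draw */ }
};

// Skip recording draw calls for UI that cannot be seen, instead of
// letting the GPU process fully transparent quads every frame.
void submitUI(const std::vector<UIElement>& elements, CommandList& cmd) {
    for (const auto& e : elements) {
        if (!e.visible || e.alpha <= 0.0f)
            continue;              // invisible: no draw call issued
        cmd.drawUIElement(e);
    }
}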
So when devs don't fix this stuff, is it because they don't have time, they don't know about the tools, or they don't care?
 
So when devs don't fix this stuff, is it because they don't have time, they don't know about the tools, or they don't care?
It probably barely makes a difference to the frame time, so they won't bother.

And usually the UI is set up by devs/artists who don't know how to use GPU profiling/debugging tools; I've definitely seen similar things in other games.
 
So I have the game running on my laptop 2060 now and I'm pleasantly surprised. It works far better than I expected.

With DLSS Performance at 1080p and medium settings, 60 FPS seems possible. I was running it at 1440p30 with DLSS Performance and high settings, and it looks truly amazing.

Sadly I ran into VRAM constraints after entering that palace with Tess and Ellie, so I'm now running it at 2240x1207 or something, which reduces VRAM usage quite drastically. That's when I noticed something.

The game actually has a texture level that is a perfect fit for 6 and 8 GB GPUs. If you run high texture quality at 2240x1207 or 1080p, the textures get an ever so slight downgrade compared to 1440p (no, this is not due to the render resolution; the texture quality itself does change). This is evidenced by the massive decrease in the VRAM usage counter. The texture work is still awesome and far, far better than what we get at medium settings, but some textures are not quite as good as high at 1440p.

So basically, when you run high textures at 1080p or anything below 1440p, you get the true middle texture quality the game has to offer.
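
If that is what is happening, one speculative way to implement it is a resolution-gated mip clamp in the texture streamer. Nothing below is from the actual game; it just shows why a single dropped mip level frees so much memory (the base level holds roughly 3/4 of a full mip chain's texels):

// Speculative sketch: pick the highest-resolution mip to stream for a
// texture, dropping one level at render heights below 1440p.
int selectTopMip(int renderHeight, bool highTextures) {
    int topMip = highTextures ? 0 : 1;   // "high" normally streams mip 0
    if (highTextures && renderHeight < 1440)
        topMip += 1;                     // quietly drop one level below 1440p
    return topMip;                       // higher index = lower resolution
}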
 
15GB VRAM at 4k, the game is only 80GB meaning there's 20% of the game in VRAM at any given time 😂

And that's not even counting system RAM, with that it could be 30% of the game in memory at any given time.

But that 80GB you're quoting is the compressed data size on disk, so the data in RAM/VRAM, in native form, will be perhaps twice as large.
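
To put rough numbers on that (the 2:1 compression ratio is just an assumption): 15 GB of native data in VRAM would map back to roughly 7.5 GB on disk, and 7.5 / 80 ≈ 9%, so it's closer to a tenth of the install resident at once than a fifth.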
 
Granted, this isn't a like-for-like scene comparison, and I'm only estimating that the PS5 runs at the equivalent of PC High settings.

I'm not sure why you're saying this; based on the DF video, the PS5 is running ultra in most settings, even going beyond PC ultra with shadow quality. At 1440p, the PS5 is well into the 80fps range, handily outpacing every GPU you listed, including the 3080 and 6800.

And it might not be uncompressed in RAM either.

OK, but this is what you said:

15GB VRAM at 4k, the game is only 80GB meaning there's 20% of the game in VRAM at any given time 😂

So no, the game isn't only 80GB; that's just the size of your install on disk. That is why the CPU is working its tail off to decompress. Data is in native format when it's ready to use in video memory; it has to be. You are overstating the percentage of game data in VRAM.
 
Designed for PS3, ported to PS4, ported again with some of the redesign used in TLOU2 and some PS5 features in the mix, then ported to PC by another dev.
What could go wrong?
 
I'm not sure why you're saying this; based on the DF video, the PS5 is running ultra in most settings, even going beyond PC ultra with shadow quality. At 1440p, the PS5 is well into the 80fps range, handily outpacing every GPU you listed, including the 3080 and 6800.
The DF video did not do a full set of comparisons, so that comment about the PS5 running ultra settings is false.

They even showed a part of the video where shadows on Ultra looked slightly better than on the PS5.
 
The PS5 is using high settings, not ultra. Ultra and high settings don't look different to the human eye.

It's not ignorant, it's factual.

The game is 80GB, and it's entirely possible that it keeps a lot of that data compressed in system RAM, so your point is moot.

I think for the texture problem they need to use a smaller OS + app reservation; that would probably solve it. The game is not using DirectStorage, which means data is uncompressed in VRAM while it is probably only partially compressed in system RAM.
 
The PS5 is using high settings, not ultra. Ultra and high settings don't look different to the human eye.

I think for the texture problem they need to use a smaller OS + app reservation; that would probably solve it. The game is not using DirectStorage, which means data is uncompressed in VRAM while it is probably only partially compressed in system RAM.
They're not using anything for OS + app. It's a fake, irrelevant, purely visual thing they put up there. The game will happily use upwards of 7.2 GB of VRAM on my end if I set my settings to 7.2 GB of game application usage. The problem is already solved on that front.

There's no reservation and there are no rules to it: the game uses all free VRAM. There's no need to strand 8 GB cards at 6.4 GB. I'll repost what I wrote in the DF article thread, but I feel like I'm talking to a bunch of walls; no one takes me seriously, they just want to pile on when it isn't even a problem.

"

It doesn't actually do that; the game will happily use whatever free VRAM you have. The stutters they see happen even within the threshold boundaries, so they were just seeing the regular asset streaming stutters that happen on limited budgets whenever you're close to the card's limits. Their video has hitches here and there even with textures set to medium and within the threshold...

Native 1440p
Game application VRAM meter: 7671 MB
Total: 9277 MB
In-game usage: 7400-7500 MB



Played for 12 hrs with game application VRAM at 7600 MB and only crashed 3 times. The stutters you see in the video happen even less frequently if I'm not recording (due to recording, VRAM usage dropped a bit), and I'd say that's fair considering how unstable the game is in general. They're not really immersion-breaking or as severe as shown in the analysis video, and a frame cap gets rid of them most of the time. The PS5 running 30% faster than equivalent hardware is in effect here, making the 3070 barely reach 55-60 FPS at native 1440p. Not much I can do about that; it sucks. What I question here is that stutters happen even if you stay within the boundaries. But crashes are so rare I could see them happening regardless (as they crashed too).

1440p DLSS Q
Game application VRAM meter 7233 MB
Total: 8839 MB
In-game usage: 7332 MB


Aside from rare stutters in transition scenes, stutters only happen when you open the door to a new scene, which also happens in their video; their video also shows a huge stutter when the car explosion happens, and even that does not happen on my end (neither at 1440p nor at 1440p DLSS Q). I specifically get a repeatable stutter when I move past the first soldier group and when the jeep arrives, and even those are really microstutters. In their video, total VRAM usage is around 7-7.1 GB and dedicated game VRAM usage is around 6.2-6.4 GB (since they set the game to medium textures to fit within the 8 GB boundary, based on the game's claim of 1.6 GB OS usage). Most stutters I experience, they also experience at 6.4 GB usage, so I don't think the stutters I get have anything to do with the threshold.


These transitional stutters are preferable to N64 textures. Still, I'd like to see them give us a better texture quality option.

"

I'm pretty sure some people here will try to nitpick rare hitches here and there to show "ha, you went above the threshold and you get them!!" But no! Even if you strand yourself at 6.4 GB, these hitches still exist; they do not go anywhere. The DF video is there: even with 6.4 GB VRAM usage, the game hitches a lot on their end. As a matter of fact, the game hitches less frequently and less severely on my end DESPITE pushing it to higher VRAM usage. That tells you something.
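
For what it's worth, the VRAM "budget" a PC game gets is something the OS reports at runtime rather than a fixed reservation, which fits the behavior above. A minimal sketch using the DXGI 1.4 interface (QueryVideoMemoryInfo is a real API; error handling omitted, link against dxgi.lib):

#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);   // first (primary) adapter

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    // The OS reports a moving budget, not a fixed "OS + app" carve-out.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    printf("budget: %llu MB, in use: %llu MB\n",
           (unsigned long long)(info.Budget >> 20),
           (unsigned long long)(info.CurrentUsage >> 20));
}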
 
The PS5 is using high settings, not ultra. Ultra and high settings don't look different to the human eye.

So then why assume the PS5 is using high if there's no official confirmation of this and no one can make out any difference between the two presets? We're better off saying the PS5 and the 4090 were running very similar settings in the DF analysis; that avoids muddying up an already muddy situation even further.
 