Uncharted: Legacy of Thieves Collection [PS5, PC]

Oh yeah, I can assure you it is (I've got a 2GB 960 around).

It's roughly GTX 680 level if I recall correctly, but on a more modern architecture that suffers from fewer of the legacy issues that Kepler did. That said, it no longer receives driver support, which should significantly limit its potential. I suppose it's possible they roughly reserve low settings for the PS4, medium for the PS5 and high for PC-exclusive enhancements, but we'll have to wait and see.
 

The 960 is still supported; the GTX 6xx cards got their latest driver this spring. NV's Kepler GPUs didn't age that well due to architectural differences. GCN GPUs aged much better, though: the HD 7850/7870 held up accordingly against the base PS4, mostly outperforming it.
I think the 960 is actually very close to the 670, but as you say it doesn't suffer from the Kepler issue and is still supported. I still have a 670 and a 960 around.

Also worth mentioning: to gauge system performance (PC vs PS4), you shouldn't be looking at ports anyway, as they can favor either platform. Multiplatform titles usually give a much better picture.
 
I think the 960 is actually very close to the 670,

Yeah, now that you mention it I seem to recall that was the case. It's been a long time lol.

Regarding driver support, I'm really talking about specific game performance optimisations rather than official support in the sense of "we guarantee it will work". I think those kinds of optimisations (the kind every console game will receive) ended years ago for Maxwell, and even for Pascal now.
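For what it's worth, you can check which driver your card is actually running straight from the driver's own tooling. A minimal sketch, assuming an NVIDIA card with nvidia-smi on the PATH (it ships with the driver on both Windows and Linux):

    # Print the GPU name and installed driver version via nvidia-smi.
    import subprocess

    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # hypothetical output: "NVIDIA GeForce GTX 960, 536.23"

A Kepler card will top out on its final legacy branch there, while Maxwell and newer still report current Game Ready releases.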
 

NimeZ drivers (custom AMD drivers) bring today's optimizations to GCN1/2 GPUs like the R9 290 and 280 and even the HD 7850/70 series. They double Halo Infinite performance on those.

And while Maxwell GPUs won't be getting the kind of optimized support they once had, they're holding up quite well even today, as long as it's not Kepler.


Edit: found this at NV's site. No idea how far they go with these kinds of optimizations, but at least they promise optimizations as far back as Maxwell, which is a 2014 arch, with even Spider-Man being mentioned for that arch:


Effective October 2021, Game Ready Driver upgrades, including performance enhancements, new features, and bug fixes, will be available for systems utilizing Maxwell, Pascal, Turing, and Ampere-series GPUs. Critical security updates will be available on systems utilizing desktop Kepler-series GPUs through September 2024. A complete list of desktop Kepler-series’
 
Delivered performance in lots of new titles doesn't line up with that marketing speak.
 
Oh, didn't even notice: AMD apparently released a new official driver for legacy GCN products, all the way back to the HD 7700 series and up to the R9 Fury. You don't even have to resort to NimeZ modded drivers:


That's for GPUs from 2012 onwards: a staggering 10 years of official, optimized driver support. In contrast, Kepler just got booted last December with its final Game Ready driver, though that's still almost 10 years.
And yes, an HD 7850/70, for example, does very well even today if you can live with base console settings.
 
And yes, an HD 7850/70, for example, does very well even today if you can live with base console settings.

From what I've seen they don't do very well, due to their 2GB of VRAM causing issues; the PS4 would have 3-4GB of its RAM allocated as VRAM.

The HD 7950/70 fare a lot better, as that extra 1GB makes all the difference.
 
That's a common problem for all 2 GB GPUs from that era. Up until 2017 they were mostly fine, but then we got next-gen games, AC Unity and onwards, which started destroying 2 GB GPUs. Either you had to use extremely blurry, PS2-like textures to get somewhat playable performance, or your performance would tank heavily in most cases.

You can see here that the 960 practically loses more than half its performance when it runs into a huge VRAM bottleneck.
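If you want to see that cliff on your own card, polling VRAM usage while a game runs makes it obvious. A rough sketch, again assuming an NVIDIA GPU with nvidia-smi available (and a single GPU, so the query returns one line):

    # Poll dedicated VRAM usage once per second via nvidia-smi.
    import subprocess
    import time

    QUERY = ["nvidia-smi",
             "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"]

    while True:
        reply = subprocess.run(QUERY, capture_output=True, text=True,
                               check=True).stdout.strip()
        used, total = (int(v) for v in reply.split(", "))
        print(f"VRAM: {used}/{total} MiB ({100 * used / total:.0f}%)")
        time.sleep(1)

Once the used figure pins at the total and stays there, textures are spilling over PCIe to system RAM and the frame rate falls off hard.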

And devs really did not bother scaling their games for 2 GB GPUs. RDR 2, AC Odyssey, Origins, and many more AAA titles look absolutely horrible with low texture settings.

A game that maximizes the PS4's potential usually requires 3.5-4 GB of VRAM, so yeah, the GTX 960 2 GB being "more powerful" than the PS4 does not mean anything when it has an awful amount of VRAM.
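The back-of-the-envelope math on textures alone shows why. A rough sketch with made-up but plausible numbers (BC-compressed textures cost about 1 byte per texel, and a full mip chain adds roughly a third on top of the base level):

    # Rough VRAM footprint of a streamed texture set. Illustrative numbers only.
    MIP_FACTOR = 4 / 3  # a full mip chain adds ~1/3 over the base level

    def texture_mib(width, height, bytes_per_texel):
        return width * height * bytes_per_texel * MIP_FACTOR / 2**20

    per_2k = texture_mib(2048, 2048, 1)  # ~5.3 MiB per compressed 2K texture
    per_4k = texture_mib(4096, 4096, 1)  # ~21 MiB per compressed 4K texture

    # Say a scene keeps 300 2K and 60 4K textures resident (made-up counts):
    scene_gib = (300 * per_2k + 60 * per_4k) / 1024
    print(f"{scene_gib:.1f} GiB for textures alone")  # ~2.8 GiB

And that's before geometry, buffers and render targets, which is exactly how you end up in that 3.5-4 GB range on a game that maxes out the PS4.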

Funny thing is, there were a lot of NVIDIA users who justified 2 GB, saying the 960 would not make use of 4 GB anyway. They said the same thing about the 1060 6 GB, and are now saying it about the 3060 12 GB; it has just become a self-fulfilling prophecy at this point.

Thing is, back in 2015-2016 you had games like The Witcher 3 that looked like a mix of PS3 and PS4 games. They did not stress VRAM quite enough, so people felt safe with 2-3 GB back then. But look at Uncharted 4 next to The Witcher 3: the image quality delta between them is huge. UC4 uses enormously better textures and assets. The Witcher 3 was made with 2 GB cards in mind, and while it still somehow struggled to run on PS4, it most likely did not max out the PS4's memory. UC4, by contrast, runs gracefully on PS4, and it would not even have been possible to release this port to a gaming community full of 2 and 3 GB GPUs without the extra effort of creating a special texture set for lower-end VRAM amounts.

I gather the same thing will happen to 4-6 GB GPUs now, but the Series S's existence also makes me doubtful. I really look forward to seeing what kind of texture quality will be possible on 8 GB GPUs in next-gen games. One thing I'm sure of is that even though it's not as powerful as its brothers, the 3060 will have no trouble with next-gen textures.

See the 1060: 6 GB felt like overprovisioning at its release, but even if you have to drop to medium settings, you can still use ultra textures in almost every game. And honestly, ultra textures + medium settings look a hell of a lot better than medium textures + high settings.
 
From what I've seen they don't do very well, due to their 2GB of VRAM causing issues; the PS4 would have 3-4GB of its RAM allocated as VRAM.

The HD 7950/70 fare a lot better, as that extra 1GB makes all the difference.

Indeed, the tiny 2GB VRAM buffer was already quite small even at the time. Many GPUs shipped with that amount, even though 3 to 6GB GPUs were already out in early 2012. I had a 670 4GB and a 7950 3GB (modded to 7970 speeds); they held up much better.
The 7870 etc. are GCN1, while the R9 270/X for example are GCN2 and usually came with 4GB as well. And these held up much better, actually aligning with the consoles.

You can see here that the 960 practically loses more than half its performance when it runs into a huge VRAM bottleneck.

Can confirm lol, the 960 is quite capable, but its 2GB of VRAM hugely bottlenecks it in almost every modern title.


Thing is, back in 2015-2016 you had games like The Witcher 3 that looked like a mix of PS3 and PS4 games. They did not stress VRAM quite enough, so people felt safe with 2-3 GB back then.

It was crazy. I found 2GB low even at the time... Most GPUs had a 4GB variant, which would have been worth the investment. The 7950/7970 came with 3 and 6GB by default.
 



4K/60 needs a 9900K and a 3080 lol


Yeah, those specs in light of those graphics do not bode well for this port at all. As far as I can tell it looks exactly like the PS4 game. Why on Earth would it need an SSD??

But hey, at least we can look forward to 4K, widescreen, unlocked framerates and controller support. Like they ever considered not implementing some of those things??
 