Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

He found his by now world-renowned 2070 to output an average of 35 frames at 4K/High. Naturally, the PS5 is 35% faster at "same settings". Then there's the vanilla 6800, which he claims is twice as powerful as a 2070 when in reality it's around 50% faster at 4K, and with that card he gets nearly 60 frames on average, this time at Ultra, at 4K. How does a card that's 50% faster than a 2070 output 70% more performance at higher quality settings?
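
A quick sanity check on those numbers (a minimal sketch using only the figures quoted above; the ~50% 4K gap between a 6800 and a 2070 is the typical benchmarked delta, not IGN's claim):

```python
# Sanity check on the reported figures (numbers as quoted above).
rtx2070_fps = 35   # reported 2070 average at 4K/High
rx6800_fps = 60    # reported 6800 average at 4K/Ultra ("near 60")

reported_uplift = rx6800_fps / rtx2070_fps - 1
print(f"Reported uplift: {reported_uplift:.0%}")  # ~71%

# If the 6800 is ~50% faster than a 2070 at 4K, and Ultra costs frames
# relative to High, the expected average should be lower than reported:
expected_fps = rtx2070_fps * 1.5
print(f"Expected 6800 average at matched settings: ~{expected_fps:.0f} fps")
```

Either the 2070 number is artificially low (CPU- or VRAM-limited), or the settings aren't actually comparable.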

I so wish Alex would handle DF's video for this game :D
Alex likely won't even bother testing the VRR mode on the PS5, so…
 
Matching the console CPU to the most common PC CPU

The 2700X is known to bottleneck things, and that has little to do with measuring GPU performance. If you're measuring GPU performance, you don't want it to be limited by the CPU to begin with, which is almost certainly what is happening here. Zen 2, as found in the PS5, is already quite the improvement. I also doubt that the 2700X is the most common PC CPU, but that's not the point of these tests anyway.
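
A rough illustration of the point (a hypothetical sketch, not anyone's actual test methodology): if average fps barely improves when you drop the resolution, the GPU was never the limiter, and the result says nothing about relative GPU performance.

```python
# Hypothetical sketch: flag likely CPU-limited benchmark results.
# If fps barely scales when resolution (i.e. GPU load) drops, the CPU
# is the bottleneck and the run says little about the GPU itself.
def likely_cpu_limited(fps_4k: float, fps_1080p: float,
                       threshold: float = 1.15) -> bool:
    """True if dropping from 4K to 1080p gains less than ~15% fps."""
    return fps_1080p / fps_4k < threshold

# Illustrative numbers only, not measured data:
print(likely_cpu_limited(fps_4k=58, fps_1080p=62))  # True  -> CPU-bound
print(likely_cpu_limited(fps_4k=35, fps_1080p=70))  # False -> GPU-bound
```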

IGN is basically comparing mid-range 2018 hardware to the late-2020 PS5, with a weird mismatch of settings, and claiming that the 2070 OC is basically a 2070S, which is false too. They're putting the PS5 in a much better light against other hardware than it's worth. IGN used to be better than this; what happened to them?
 
via Digital Trends

"Our 4K benchmarks show the video memory limitations of Uncharted Legacy of Thieves. Above the Low preset, the 8GB available in the RTX 3060 Ti became maxed out, so the results for the Medium, High, and Ultra presets are much tighter than they are at 1440p and 1080p"


Considering the textures were barely improved (if at all?) from the PS4 version, these extreme VRAM requirements are certainly not what I would have expected, even though I wasn't hoping for the best based on that early chart. There's no reason for VRAM usage to be this high.

Edit: Ok I have seen some more reports now, and the port summary seems to be:

[attached image: port summary]

Definitely needs a patch or two, and no HDR is bizarre*. But a 3060 Ti seems to get 4K/60 with DLSS, which is what I would have hoped for.

*It has HDR. The option doesn't show up unless you enable Windows HDR before you boot the game.
 
To be honest, I thought that video was higher quality than some of his previous entries. For the most part I didn't have any real problem with it as a 4K/fidelity mode comparison, given he does call out the VRAM limitations on the 2070 impacting performance. Given that, though, I do find it strange that he doesn't spend any time at all comparing the performance/1440p mode on the 2070, which would have cleared up the VRAM issues.

I did pull out a few other issues but I think for the most part they're just nitpicks:

  • Shader compilation - at the beginning of the video he seems to be running the game while the shaders are still compiling, but then he calls out big stutters in the PC version?
  • He noted some settings (AO and reflections were called out specifically) that showed no appreciable difference between High and Medium, yet he tests at all High settings.
  • Similarly, his comparison to the 6800 is at all Ultra settings despite saying it can have a 13% performance impact over High.
  • Apparently 4K DLSS Quality mode offers only a "small quality improvement" over native 1440p with the trade off being some extra ghosting. That's certainly a unique take on the advantages of DLSS 2.
Overall though, in non-RT scenarios we should expect the PS5 to be faster than his 2070, particularly in VRAM-limited scenarios like this. So that performance lead doesn't seem too extreme given the port clearly still has some issues, and the comparison to the 6800 looks ballpark correct considering he's running it at higher settings than the PS5.

It'll be interesting to see a comparison to the less VRAM-constrained 3060 in 4K mode, or the 2070 in 1440p/performance mode, at least after a patch or two to work out the early bugs.
 

I don't understand why IGN is being this sloppy; the backlash on YouTube should be telling them something.

Overall though, in non-RT scenarios we should expect the PS5 to be faster than his 2070, particularly in VRAM-limited scenarios like this. So that performance lead doesn't seem too extreme given the port clearly still has some issues, and the comparison to the 6800 looks ballpark correct considering he's running it at higher settings than the PS5.

It'll be interesting to see a comparison to the less VRAM-constrained 3060 in 4K mode, or the 2070 in 1440p/performance mode, at least after a patch or two to work out the early bugs.

A big-name outlet like IGN shouldn't be testing high-quality AAA games on a 2700X that was released in April 2018; it's a CPU that limits even a 2070 in Spider-Man on PC, and probably does the same in Miles Morales. A 2070 OC is not a 2070 Super either; it takes more than clock increases to make it a Super. A true 2070 Super, which more often than not comes OC'ed by default, is a good match for the PS5 in raw raster at the very least, and probably better in most cases. In RT it's going to be faster everywhere. However, a vanilla 2070 paired with a 2700X is going to be problematic, and on top of that IGN is wrongfully making claims using settings that don't really match between the PC and PS5 versions.

To note, it's a port to begin with, with further patches and optimizations surely coming soon. You can't make general claims about how one GPU compares to another based on the day-one port of a game native to another platform.

IGN is losing its grip because of this sloppy 'analysis'.
 
I still find his videos somewhat messy with how he jumps all around between different sets of hardware, but I don't think it was that bad. The shader stuff at the beginning was him just explaining what happens when you don't allow the cache to be built.

Sucks there are some bugs and issues, like LOD being lower and no HDR... but overall it seems like a pretty good port. Hopefully Iron Galaxy quickly patches the game up, fixes these outstanding issues, and listens to fan feedback.
 
PC performance review by ComputerBase: https://www.computerbase.de/2022-10/uncharted-legacy-of-thieves-benchmark-test/

  • GPU to hit 60fps on average at Ultra settings at 1080p = RTX 3060
  • GPU to hit 60fps on average at Ultra settings at 1440p = RTX 3060 Ti
  • GPU to hit 60fps on average at Ultra settings at 2160p = 6800 XT
  • 10GB of VRAM required for 4K at Ultra settings
  • FSR 2 and DLSS provide better image quality than the game's built-in TAA upscaler at native 4K
  • DLSS Quality mode offers a 30% performance increase and FSR 2 a 27% increase over native
  • DLSS and FSR Performance modes offer around a 50% performance increase while looking slightly better than native
 
This is false as per the above review I posted.

At Ultra settings, native 1080p, an RTX 3060 delivers a 1% percentile of 55.1fps and the 6600 XT does 61.5fps.

At Low settings they're both well over 60fps minimum.
Per the link above, going from Ultra to Low nets you 13% on Nvidia GPUs. That would make it a 1% low of barely over 60, assuming that test sequence is even the worst case in the entire game.
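
Working that out from the ComputerBase figures quoted above (a quick back-of-the-envelope check):

```python
# Apply the ~13% Ultra -> Low gain to the RTX 3060's 1% percentile
# at Ultra / native 1080p from the ComputerBase review quoted above.
ultra_1pct_low = 55.1      # fps, RTX 3060, Ultra, native 1080p
ultra_to_low_gain = 0.13   # ~13% gain on Nvidia GPUs per the review

low_preset_1pct = ultra_1pct_low * (1 + ultra_to_low_gain)
print(f"Estimated 1% low at the Low preset: {low_preset_1pct:.1f} fps")  # ~62.3
```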
 
Per the link above, going from Ultra to Low nets you 13% on Nvidia GPUs. That would make it a 1% low of barely over 60, assuming that test sequence is even the worst case in the entire game.

There are other reviews that show higher gains; Digital Trends showed 18%, for example.

Others have shown less and others more; I'm expecting a decent performance jump when Nvidia and AMD release new drivers.
 
There are other reviews that show higher gains; Digital Trends showed 18%, for example.

Others have shown less and others more; I'm expecting a decent performance jump when Nvidia and AMD release new drivers.
So 15%, let's say. My statement still stands as accurate given the current data.
 