What is so complex about this title that a 3080 barely outperforms a PS5?
Good question. Doesn’t Naughty Dog rely heavily on precomputed lighting? You would expect a modern GPU to demolish their games.
Other than the flashlight, every light in the game is precomputed, if I'm not mistaken. I don't see how the blame for the performance can be put on anything other than poor optimization, with DX12 likely being a major factor in that.
What does this even mean? What advanced rendering feature set is TLOU:RM employing over Returnal?
Returnal has a pretty obviously narrow rendering feature set (to its credit!) -- a small number of PBR shader variants, a few rarely used transparents, the raymarched voxel fog, the special-effects glow stuff, and a few one-off VFX. This is a big part of why it looks and runs so well -- it has a narrow focus that it executes very well.
So taking into account the ~25% perf hit Ultra mode has over High...
Ultra has a host of major GPU-intensive features (it even says so in the menu) that are definitely not available in the PS5 version: many heavy screen-space ambient occlusion/reflection effects, bounce lights from the flashlight, extended view distance for all objects, etc. The impact of Ultra over High is definitely substantial.
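As a rough back-of-the-envelope illustration (using the 3080 Ti and PS5 numbers quoted later in this thread): a ~25% Ultra-over-High cost means the 3080 Ti's 37 fps at Ultra corresponds to roughly 46-49 fps at High (37 x 1.25 ≈ 46, or 37 / 0.75 ≈ 49, depending on how the 25% is counted), against the PS5's ~35 fps at its own settings, so the real gap would be closer to 1.3-1.4x than the near-parity the Ultra numbers suggest.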
Wasn't the case for Returnal.
Returnal may not have been as heavily optimized. It's one thing to make a game run on a PS5; it's an entirely different thing when you are exploiting design paradigms that can only be done on a PS5, making the porting process an incredibly painful experience. It really comes down to how much must be rewritten for PC.
Results are in: the 4090 achieves 83 fps at 4K (4.6x the 3060, which manages just 18 fps).
4090: 83 fps
4080: 62 fps
7900XTX: 55 fps
4070Ti: 46 fps
6900XT: 38 fps
3080Ti: 37 fps
2080Ti: 25 fps
Not too bad to be honest (the PS5 achieves 35 fps at 4K native, probably using settings below PC's Ultra).
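To make the scaling explicit, here is a minimal Python sketch that turns the figures above into ratios against the 3060 and the PS5's ~35 fps; the PS5 number and the settings gap come from this thread, so treat it as illustrative rather than a like-for-like comparison:

# 4K Ultra results quoted above, in fps
results = {"4090": 83, "4080": 62, "7900XTX": 55, "4070Ti": 46,
           "6900XT": 38, "3080Ti": 37, "2080Ti": 25, "3060": 18}
ps5_fps = 35  # native 4K, likely at settings below PC Ultra

for gpu, fps in results.items():
    print(f"{gpu}: {fps} fps = {fps / results['3060']:.1f}x 3060, {fps / ps5_fps:.2f}x PS5")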
I hope this puts into perspective how a multiplatform game ends up with a lot more performance compromises in its optimization than a game optimized for a single platform.
This is a stupid question and I will look stupid asking it, but here goes: I thought the 4090 was 8 times the general GPU power of the PS5, not 4 times? I know flops are not a real measurement and all, but considering how much power that component requires, among other things, I could have sworn it was much stronger than just 4x.
About 3.5-3.6x, depending on which resolutions you draw performance scaling numbers from.
I really don't think this is a case of "closed platform vs open platform optimisation" at all. This is a case of a horrendously rushed port. The chronic crashing would be evidence enough of that without the abhorrent performance. It's not like any previous single-platform game's port has ever performed this badly. This is a huge outlier in that respect, and the timing with the HBO series is a further smoking gun. This isn't just coming in hot, it's a flaming plane wreck crash-landing to Earth.
I think we are looking at two sides of the same coin here. Multiplatform releases ship day and date with the console versions, which means the design paradigm has to work for all of them. When you get a quick and well-running port, that likely means the developers chose a design paradigm wide enough to cover most of PC. And when a PC port is rushed out and flops, I think that's a sign of how much more work needed to be done for the port.
It is much stronger. The limitation isn't hardware, it is software. That's the reason why a 4090 is 10x faster or so with path tracing than a PS5.
And rasterization as well? It's just software holding it back in games, with the GPU power not being utilized to the max?
It seems most of the problem comes from ood...
I see. So the flop number is misleading for "normal" graphics rendering...
Not in rasterization.
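For anyone who wants to sanity-check the flops-versus-scaling point, here is a rough Python sketch. The TFLOPS values are the commonly quoted peak FP32 figures (not from this thread), and the fps values are the ones posted above; the ~3.5x cross-game figure mentioned earlier sits between the two extremes.

# Peak FP32 throughput (commonly quoted figures, in TFLOPS) vs. measured fps in this game
tflops = {"4090": 82.6, "PS5": 10.3}
fps_4k = {"4090": 83, "PS5": 35}  # 4K figures quoted in this thread

paper = tflops["4090"] / tflops["PS5"]      # ~8x on paper
measured = fps_4k["4090"] / fps_4k["PS5"]   # ~2.4x in this particular game at Ultra
print(f"paper FP32 ratio: {paper:.1f}x, measured fps ratio: {measured:.1f}x")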
I agree, all the VRAM should be useable. Windows 11 is very efficient in that regard; it uses only around 200 MB when sitting on the desktop with no programs open.
I've managed to force the game to use 7.5 GB of VRAM by killing explorer.exe and other Windows stuff.
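If anyone wants to check how much VRAM background Windows processes are actually holding before launching the game, a small sketch using the pynvml bindings (the nvidia-ml-py package) will report it; NVIDIA-only and purely illustrative:

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # totals reported in bytes
print(f"total {mem.total / 2**30:.1f} GiB | "
      f"used {mem.used / 2**30:.1f} GiB | "
      f"free {mem.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()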
Man, that performance is absolutely horrendous. It's far and away the worst performance of any console port released this generation. To my knowledge there has never been a game that the 2080 Ti doesn't comfortably outperform all current-gen consoles in.
There has been one. Can you guess which? Uncharted 4. The 2080 Ti is only a smidge faster than the PS5 overall in that game, despite being consistently 40-50% faster in other games.