Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Good question. Doesn’t Naughty Dog rely heavily on precomputed lighting? You would expect a modern GPU to demolish their games.
Other than the flashlight, every light in the game is precomputed, if I'm not mistaken. I don't see how the blame for the performance can be put on anything other than poor optimization, with DX12 likely being a major factor in that.


I fail to see any meaningful improvements over TLOU 2 that disqualify us from considering this a PS4 game.
 
What does this even mean? What advanced rendering feature set is TLOU:RM employing over Returnal?
Returnal has a pretty obviously narrow rendering feature set (to its credit!) -- it has a small number of PBR shader variants, a few rarely used transparents, the raymarched voxel fog, the special-effects glow stuff, and a few one-off VFX. This is a big part of why it looks and runs so well -- it has a narrow focus that it executes very well.

TLOU has a complex skin shader that surely has variants for LODs and possibly different characters/circumstances, eye shader(s? In their last game they had retroreflective variants for animals), baked lighting with probes/etc. for sampling (which means variants for that stuff spread across all the other shaders), lots of translucents, that dynamic flashlight thing, foliage shaders, dynamic water animations, dynamic world-position offsets reading from sometimes-realtime buffers, variants that interact with particles, wetness effects, multiple kinds of destruction/dynamic physics on the GPU (pre-baked with vertex index textures, simulation, etc.), what looks like multiple different decal systems... there's a lot more going on, across a wide variety of content = a lot more shader variants, a lot more work to schedule.

It's not a matter of "advanced" -- both teams are great and made the right choices for their content -- it's a matter of more.
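
To put illustrative numbers on the "more" point, here's a tiny sketch of how variant counts multiply. The feature axes and counts are made up for illustration, not Naughty Dog's actual pipeline:

```cpp
#include <cstdio>

// Illustrative only: hypothetical feature axes for an "uber shader" setup.
// Real engines prune unused combinations, but the multiplicative blow-up is
// why a content-heavy game ends up with far more pipeline states to compile
// and schedule than a game with a deliberately narrow feature set.
int main() {
    const int materialTypes = 6;   // skin, eye, foliage, cloth, hard surface, water
    const int lightingPaths = 3;   // baked lightmap, probe-lit, dynamic flashlight
    const int lodLevels     = 4;   // LOD0..LOD3 variants
    const int miscToggles   = 3;   // wetness, decals, particle interaction (2^3 combos)

    const int variants = materialTypes * lightingPaths * lodLevels * (1 << miscToggles);
    std::printf("Worst-case permutations: %d\n", variants);  // 6*3*4*8 = 576
    return 0;
}
```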
 
So taking into account the ~25% perf hit Ultra mode has over High
Ultra has a host of major GPU-intensive features (it even says so in the menu) that are definitely not available in the PS5 version: heavier screen-space ambient occlusion/reflection effects, bounce lighting from the flashlight, extended view distance for all objects, etc. The impact of Ultra over High is definitely substantial.
 
You cannot compare a PC and a console version without knowing what settings the console version uses. Even slightly better quality can double the processing time. Take Cyberpunk's Ultra SSR setting as an example, which cuts performance in half for slightly better reflection quality.

This is one of the huge problems with rasterization. Increasing image quality comes with a huge loss in efficiency. That is the reason why raytracing gets so much attention. It has higher quality and better efficiency than rasterization.
 
Wasn't the case for Returnal.
Returnal may not have been as heavily optimized. It's one thing to make a game run on a PS5; it's an entirely different thing when you are exploiting design paradigms that can only be done on a PS5, making the porting process an incredibly painful experience. It really comes down to how much must be rewritten for PC.

When it comes to MP titles, performance isn’t compromised on PC, it’s compromised on console.
 
Results are in, the 4090 achieves 83 fps at 4K (4.6X the 3060 which achieves just 18fps).

4090: 83
4080: 62
7900XTX: 55
4070Ti: 46
6900XT: 38
3080Ti: 37
2080Ti: 25

Not too bad to be honest (PS5 achieves 35fps at 4K native probably using less than PC's Ultra settings).
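
For what it's worth, the scaling implied by the figures above, taking them at face value (the PS5 settings are unknown, so the second ratio is not apples to apples):

```cpp
#include <cstdio>

int main() {
    // FPS figures as posted above (4K, Ultra preset on PC; PS5 settings unknown).
    const double fps4090 = 83.0, fps3060 = 18.0, fpsPS5 = 35.0;

    std::printf("4090 vs 3060: %.1fx\n", fps4090 / fps3060);  // ~4.6x
    std::printf("4090 vs PS5 : %.1fx\n", fps4090 / fpsPS5);   // ~2.4x at mismatched settings
    return 0;
}
```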

Man that performance is absolutely horrendous. It's far and away the worst performance of any console port released this generation. To my knowledge, there has never been a game in which the 2080 Ti doesn't comfortably outperform all the current-gen consoles.

I hope this puts into perspective how a multiplatform game will carry a lot more performance compromises in its optimization than a game optimized for a single platform.

I really don't think this is a case of "closed platform vs open platform optimisation" at all. This is a case of horrendously rushed port. The chronic crashing would be evidence enough of that without the abhorrent performance. It's not like any previous single platform games port has ever performed this badly. This is a huge outlier in that respect, and the timing with the HBO series is a further smoking gun. This isn't just coming in hot, it's a flaming plane wreck crash landing to Earth.
 
About 3.5-3.6x depending on which resolutions you draw performance scaling numbers from.
This is a stupid question and I will look stupid asking it, but here goes. I thought the 4090 was 8 times the general GPU power of the PS5, not 4 times? I know flops are not a real measurement and all, but considering how much power that component draws, among other things, I could have sworn it was much stronger than just 4x.
 
I really don't think this is a case of "closed platform vs open platform optimisation" at all. This is a case of horrendously rushed port. The chronic crashing would be evidence enough of that without the abhorrent performance. It's not like any previous single platform games port has ever performed this badly. This is a huge outlier in that respect, and the timing with the HBO series is a further smoking gun. This isn't just coming in hot, it's a flaming plane wreck crash landing to Earth.
I think we are looking at two sides of the same coin here. Multiplatform releases ship day and date with the console versions, which means the design paradigm works for all of them. When you have a quick, well-running port, that likely means you chose a design paradigm wide enough to support most of the PC space. And when you are rushing to get your PC port done and it flops, I think that's a sign of how much more work needed to be done for the port.

Like, if we consider single-threaded code the most compatible but least efficient, I'm sure we can eventually get to the most efficient but least compatible.

Hope that makes sense.
 

Why do medium textures have to be this bad? Why is there no in-between? How hard is it to have middle-ground textures that don't look like they came right out of a PS3?

These textures are what you have to contend with if you have a GPU with <8 GB of VRAM (and 8 GB is barely enough to push stable framerates at 1080p with high textures). There are games that only use 4 GB of VRAM at 1440p to achieve higher quality textures. What gives? Are textures really not scalable at all? Is it the be-all and end-all once a texture is authored for high VRAM usage? Isn't there a better method to save VRAM when reducing the quality of these textures?

I'm just trying to understand here, really.
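
One likely factor (this is generic block-compression math, not anything specific to this game): texture presets that work by dropping whole mip levels move in roughly 4x memory steps, so there isn't much room for a gentle middle ground. A rough sketch:

```cpp
#include <cstdio>

int main() {
    // BC7 (like BC1/BC3) stores 4x4 texel blocks; BC7 is 16 bytes per block,
    // i.e. 1 byte per texel. A full mip chain adds roughly 1/3 on top of mip 0.
    auto mipChainMB = [](int size) {
        double bytes = 0;
        for (int s = size; s >= 4; s /= 2) bytes += double(s) * s;  // 1 byte/texel
        return bytes / (1024.0 * 1024.0);
    };

    std::printf("4096x4096 BC7 chain: ~%.1f MB\n", mipChainMB(4096));  // ~21 MB
    std::printf("2048x2048 BC7 chain: ~%.1f MB\n", mipChainMB(2048));  // ~5 MB
    std::printf("1024x1024 BC7 chain: ~%.1f MB\n", mipChainMB(1024));  // ~1.3 MB
    // Each preset step that drops the top mip saves ~4x memory but also
    // quarters the texel density, which is why "medium" can look so soft.
    return 0;
}
```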
 
This is a stupid question I will look stupid asking it but here goes. I thought 4090 was 8 times the general GPU power of the PS5 and not 4 times? I know flops are not a real measurement and all, but considering how much power that component requires among other things I could have sworn it was much stronger than just 4x
It is much stronger. The limitation isn't the hardware, it is the software. That's the reason why a 4090 is 10x faster or so with path tracing than a PS5.
 
It is much stronger. The limitation isn't the hardware, it is the software. That's the reason why a 4090 is 10x faster or so with path tracing than a PS5.
And rasterization as well? It's just software holding it back in games with the GPU power not being utilized to the max?
 
And rasterization as well? It's just software holding it back in games with the GPU power not being utilized to the max?

Not in rasterization.
 
I see. So the flops number is misleading for "normal" graphics rendering...

There is other stuff, like all the fixed-function hardware: ROPs, texture units, rasterization fillrate, and it depends on the architecture. For example, performance per flop is better on Turing than on Ampere.
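
As a rough illustration of why spec-sheet flops don't translate directly (using commonly quoted clock and ALU figures, so treat the numbers as approximations):

```cpp
#include <cstdio>

int main() {
    // Paper FP32 throughput, as commonly quoted:
    // PS5:      36 CUs * 64 lanes * 2 FLOP/clk * 2.23 GHz  ~= 10.3 TFLOPS
    // RTX 4090: 16384 "cores"     * 2 FLOP/clk * 2.52 GHz  ~= 82.6 TFLOPS
    const double ps5Tflops     = 36.0 * 64 * 2 * 2.23e9 / 1e12;
    const double rtx4090Tflops = 16384.0 * 2 * 2.52e9 / 1e12;

    std::printf("PS5  : %.1f TFLOPS\n", ps5Tflops);
    std::printf("4090 : %.1f TFLOPS (%.1fx PS5 on paper)\n",
                rtx4090Tflops, rtx4090Tflops / ps5Tflops);
    // ~8x on paper, yet ~3.5x in this game's raster benchmark: ROPs, texture
    // units, bandwidth, and how well the software feeds the GPU all matter.
    return 0;
}
```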
 

Okay, I've managed to find a semblance of stability in performance with a cap of 45 FPS. Hitches here and there but overall it is okay. Reduced everything VRAM-related other than textures to low/medium, which seemed to help. Of course, everything in the background is nuked. The video is choppy because I had to record it at 30 FPS, as recording at a higher framerate destroyed GPU-bound performance for some reason and also pushed up memory usage. The video is like a slide show, but in game it was pretty smooth with VRR.

The game refuses to use more than 7.2 GB, similar to RE4, and there's not much I can do about that. This is a problem Microsoft/NV should address very soon. All graphics memory should be usable if the user has it free. The assumption, or the goodwill, of leaving some VRAM unused for background apps is mind-boggling for people like me who only want to focus on the game and turn everything else off.

Targeting 1440p with these settings, with all background apps disabled, should let 8 GB cards play the game smoothly, I'd bet. Sadly, I cannot bring myself to play at 1440p since 4K/DLSS Performance looks so pristine even if it's only rendering at 1080p internally.
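
For anyone curious where a ceiling like ~7.2 GB can come from: D3D12 titles are expected to stay within the OS-reported memory "budget", which sits below the physical VRAM. A minimal sketch of the query (Windows, link against dxgi.lib); this is generic DXGI usage, not this game's actual code:

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // The OS-provided budget for local (on-card) memory; apps that exceed it
    // risk residency demotions and stutter, so engines tend to stop short of it.
    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("Budget       : %.2f GB\n", info.Budget / double(1ull << 30));
    std::printf("Current usage: %.2f GB\n", info.CurrentUsage / double(1ull << 30));
    return 0;
}
```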
 
I agree, all the VRAM should be usable. Windows 11 is very efficient in that regard; it only uses around 200 MB when on the desktop without any programs open.
 
I agree, all the VRAM should be usable. Windows 11 is very efficient in that regard; it only uses around 200 MB when on the desktop without any programs open.
I've managed to force the game to use 7.5 GB VRAM by killing explorer.exe+other Windows stuff.

Stutters are lessened. I cannot, however, record gameplay without getting stutters. Once I start recording with Game Bar or GeForce Experience, VRAM usage drops from 7.5 GB to 6.7 GB and the game enters a stuttery state. Oh well. :runaway:
 
Man that performance is absolutely horrendous. It's far and away the worst performance of any console port released this generation. To my knowledge, there has never been a game in which the 2080 Ti doesn't comfortably outperform all the current-gen consoles.
There has been one. Can you guess which? Uncharted 4. The 2080 Ti is only a smidge faster than the PS5 overall in that game despite being consistently 40-50% faster in other games.
 