Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

And yet it's exactly what's needed on social media, and perhaps even too tame.

I don't think it's unwarranted - hey, the 'feedback' DF gets, especially on Twitter, is often insufferable. There are accounts on Twitter that exist solely to respond to DF videos accusing them of being Sony/Xbox/PC shills, a truly desolate online existence.

From a pragmatic/tactical perspective as a professional at a major outlet though... I wouldn't recommend going any route that could be taken as goading, even if it's morally justified (especially when it's aimed at an unidentified group rather than a single individual being overly hostile, where the context is clearly visible). There's always going to be a gross imbalance between the respect you grant the public and the respect they grant you in kind when you're getting paid to offer your opinion; 'fair' really doesn't come into it. It's just not wise from a public relations angle.

As long as games don't present me with Half-Life 1-level textures just to fit into 6-8 GB of VRAM, I won't have any problem with VRAM.

It depends - there's a difference between not being able to take advantage of custom textures intended for high-end PC GPUs, and not being able to match the texture quality of $400 consoles. For TLOU and RE4, 8GB cards can get worse results - in spots - without even going into the territory of Ultra settings that are only intended for advanced PCs (or just to appease those who want to see the word 'Ultra'). That's where I think people who shelled out for a 3070, and soon a 4060 Ti, may be justifiably a little ticked off.

Ideally all games would handle this like Doom Eternal: if you use higher texture settings that overflow your VRAM, the game just starts running slower, but you don't crash or get huge halting stutters. Of course, ideally all games would be coded like Doom Eternal, no shit. :)
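To make that "degrade, don't die" idea concrete, here's a toy sketch of the kind of thing I mean (purely hypothetical, not Doom Eternal's actual streamer): a texture pool with a hard budget that evicts least-recently-used entries instead of stalling or crashing when a new upload wouldn't fit.

```cpp
// Toy sketch of a budgeted texture pool: when a new upload would overflow the
// budget, the least-recently-used textures are evicted and re-uploaded later.
// All names and behaviour here are hypothetical, for illustration only.
#include <cstdint>
#include <list>
#include <unordered_map>

class TexturePool {
public:
    explicit TexturePool(uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Called before uploading a texture; frees older textures until it fits.
    // Returns false only if a single texture is larger than the whole budget.
    bool MakeRoom(uint64_t bytesNeeded) {
        if (bytesNeeded > budget_) return false;
        while (used_ + bytesNeeded > budget_ && !lru_.empty()) {
            uint32_t victim = lru_.back();          // least recently used
            used_ -= sizeOf_[victim];
            sizeOf_.erase(victim);
            where_.erase(victim);
            lru_.pop_back();
            // Real code would release the GPU allocation here; the cost is a
            // re-upload later (i.e. the game gets slower, not a crash/stall).
        }
        return true;
    }

    void Insert(uint32_t textureId, uint64_t bytes) {
        lru_.push_front(textureId);
        where_[textureId] = lru_.begin();
        sizeOf_[textureId] = bytes;
        used_ += bytes;
    }

    void Touch(uint32_t textureId) {                // mark as recently used
        auto it = where_.find(textureId);
        if (it != where_.end()) lru_.splice(lru_.begin(), lru_, it->second);
    }

private:
    uint64_t budget_;
    uint64_t used_ = 0;
    std::list<uint32_t> lru_;
    std::unordered_map<uint32_t, std::list<uint32_t>::iterator> where_;
    std::unordered_map<uint32_t, uint64_t> sizeOf_;
};
```

The penalty for an eviction is just a later re-upload over PCIe, which is exactly the "it runs slower" outcome rather than a hard failure.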

Yes, this is indeed one of the more obvious ways that this game could have been better tailored to the PC (if time and budget had allowed) rather than shoehorning in the PS5-optimised solution. I wouldn't be surprised if NXG, for example, held up the CPU-based Kraken decompression on PC - which is no doubt contributing to the high CPU requirements versus the hardware-based equivalent on PS5 - as a shining advantage of closed systems. Whereas the optimal solution for PC would obviously have been to compress all the GPU-bound assets with GDeflate and decompress them on the GPU via DirectStorage, which would save CPU power both in general I/O overhead and in decompression.
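For what it's worth, the API side of that path is pretty small. A minimal sketch of what GDeflate + DirectStorage GPU decompression looks like on PC (illustrative only, obviously not this port's actual code; the archive path, offsets, sizes, destination buffer and fence are all placeholders, and error handling is omitted):

```cpp
// Load one GDeflate-compressed asset straight into a GPU buffer via
// DirectStorage 1.1+, so decompression happens on the GPU instead of the CPU.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void LoadAssetViaGpuDecompression(ID3D12Device* device,
                                  ID3D12Resource* destBuffer,  // default-heap buffer (placeholder)
                                  ID3D12Fence* fence, UINT64 fenceValue,
                                  const wchar_t* archivePath,
                                  UINT64 offset, UINT32 compressedSize, UINT32 uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(archivePath, IID_PPV_ARGS(&file));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request{};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE; // GPU decompresses
    request.Source.File.Source        = file.Get();
    request.Source.File.Offset        = offset;
    request.Source.File.Size          = compressedSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;
    request.UncompressedSize            = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue); // wait on the fence to know the data is ready
    queue->Submit();
}
```

With this path the decompression work lands on the GPU rather than tying up CPU cores the way CPU-side Kraken does.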

How I thought this would be handled, before DirectStorage GPU decompression became commonplace, was exactly what you've seen in many Series X versions of games vs. PS5: the SX version can often have a larger install footprint - sometimes significantly so - likely because it's using its own texture compression format that's not as compressed as Kraken, not only due to the proprietary nature of that format (?) but also in order to be more performant on SX's hardware.

Afaik, the PC and PS5 install sizes for TLOU:RM are basically identical at ~90 GB; that, plus the Oodle DLLs of course, indicates the textures are compressed in exactly the same format as on PS5 - a format chosen precisely for its affinity with the PS5's hardware. I would not have expected we'd get a game shipping that exact compression format, highly tailored to a custom hardware decoder, and effectively saying "Welp, hope you have 8 cores, good luck!".

Granted, if there were another format easier on CPU decoding, that may have ultimately just pushed the bottleneck to I/O, or required substantial changes to the streaming system, at which point we're again in cost-prohibitive territory for the budget of a port. Who knows.
 
The problem with VRAM is that most games don't need it, and putting more on a GPU only increases the cost. Why does a game need more than 8 GB at 1080p and 10 GB at 1440p when it runs just fine with 16 GB at 4K? Shouldn't it need at least 32 GB, since 4K has 4x the pixels of 1080p?

AMD and Intel have seemingly managed to equip their GPUs with ample VRAM without blowing out their margins.
 
That it is a last-gen game is irrelevant. Horizon Forbidden West is one of the best-looking games on the market, with incredible assets, yet it runs on a PS4 just fine and still looks good there as well.

For TLOU Part I to justify this kind of performance, it would have to look better than everything else right now, and it simply doesn't. The fact that it has roots on the PS3, and features small, constrained environments with no RT to speak of, makes its performance profile on PC suspect at best.
I get your point, but as DF also showed in their video, while the basic skeleton of HFW was preserved on PS4, with the art holding up well enough, there were very deep cuts to the presentation in order to get it running. It's not like it's literally a PS4 game with higher res and fps on PS5 like, say, God of War (although I think that game is no slouch in the looks department either).

HFW on PS5 looks stunning, with so many details packed in that some people say it's too much. On PS4 it looks good but clearly not better than Zero Dawn, outside of certain things like improved facial animation and water tech (which is also simplified compared to PS5).
 
That it is a last-gen game is irrelevant. Horizon Forbidden West is one of the best-looking games on the market, with incredible assets, yet it runs on a PS4 just fine and still looks good there as well.

For TLOU Part I to justify this kind of performance, it would have to look better than everything else right now, and it simply doesn't. The fact that it has roots on the PS3, and features small, constrained environments with no RT to speak of, makes its performance profile on PC suspect at best.

Yeah, I think part of the outrage here is that there are more similarities than not between the overall aesthetic of TLOU:RM and TLOU2, which runs on a PS4 with similar visual quality. Of course RM is more advanced, but it's not surprising PC players are a bit shocked when a slow-walking, linear adventure game is hitting 80% CPU usage in a small room with one other character. It can look good, but a lot of that is extremely careful cubemaps, baked lighting and art asset quality. It's just not something you would expect to put a 4090 and 13900K through the wringer at first glance.

Just spinning the camera around will often cause framedrops and frametimes to spike.

Yes, I noticed this as well; the variability in CPU/GPU load is bizarre. At first I thought it was motion blur, and disabling it can help a bit, but this game just has so many wild swings in resource usage.

I mean, when I saw CPU spikes in Spider-Man when leaping over the top of a skyscraper to reveal the entire game world in a split second, I got it. I'd prefer it didn't happen, but it's readily understandable. In TLOU, I can get huge spikes walking up to another floor in a house.
 
Afaik, the PC and PS5 install sizes for TLOU:RM are basically identical at ~90 GB; that, plus the Oodle DLLs of course, indicates the textures are compressed in exactly the same format as on PS5 - a format chosen precisely for its affinity with the PS5's hardware. I would not have expected we'd get a game shipping that exact compression format, highly tailored to a custom hardware decoder, and effectively saying "Welp, hope you have 8 cores, good luck!".

Are you saying that Oodle is a specific format for PS5 and less suited to other platforms?
If that is the case, I don't agree with you, mainly because Sony made the hardware decoder to work with RAD's formats and not the other way around.
Also, Oodle is multiplatform; it runs just fine on PC, Xbox and PS4, without much overhead. That's kind of been their selling point all along.
Now, if you are saying that TLOU Part I for PS5 got rewritten to use the PS5 streaming solution (VT?) and that ports badly to PC, fine, but that is not Oodle + Kraken.

 
Are you saying that Oodle is a specific format for PS5 and less suited to other platforms?
If that is the case, I don't agree with you, mainly because Sony made the hardware decoder to work with RAD's formats and not the other way around.
Also, Oodle is multiplatform; it runs just fine on PC, Xbox and PS4, without much overhead. That's kind of been their selling point all along.
Now, if you are saying that TLOU Part I for PS5 got rewritten to use the PS5 streaming solution (VT?) and that ports badly to PC, fine, but that is not Oodle + Kraken.


Oh, I know Oodle has been used on other platforms before - I've seen the splash screen on PC often enough - but yeah, I can't speak from any deep knowledge base here specifically (hence the question mark: "the proprietary nature of that format (?)"). I'm just guessing the heavy-lifting part of this streaming system is the CPU-based texture decompression, but I really have no clue.

I'm just curious though: why do Series X games often have such larger install sizes than their PS5 counterparts, if not due to a less compressed texture format? What would be the point of that if not decompression performance?
 
I'm just curious though: why do Series X games often have such larger install sizes than their PS5 counterparts, if not due to a less compressed texture format? What would be the point of that if not decompression performance?
Probably because MS are dumb and Cerny is a mad genius :p No, just joking, I have no idea.
I seem to remember speculation on here that Xbox packs X and S into the same package, or maybe it was the other way around - no idea.

But if I were to speculate: I do remember Sony first-party studios sharing that they have spent a lot of time optimising storage and streaming to avoid duplication of stored assets, etc.
Like the Spider-Man GDC talk (or was it a post-mortem) that went into detail about how they shifted away from storing everything multiple times, with some things also getting calculated on the fly instead of being stored on disc.
This was an internal revelation for Insomniac that came about with the move from PS3 to PS4 and their different extra/spare resources, so your strategy for what to store or not store changed with that generation change. I assume de-duplication is even more aggressive with PS5.
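To illustrate what that de-duplication boils down to (a purely hypothetical structure, nothing from the actual talk): key assets by a content hash so that ten level chunks referencing the same rock texture only pay for one stored copy.

```cpp
// Hypothetical packed archive that stores each unique payload once and maps
// every asset path to a shared blob slot. Illustration only.
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

struct PackedArchive {
    std::unordered_map<uint64_t, size_t> blobIndexByHash; // content hash -> blob slot
    std::vector<std::vector<uint8_t>> blobs;              // unique payloads, stored once
    std::unordered_map<std::string, size_t> assetToBlob;  // "level3/rock_02.tex" -> slot

    // 64-bit FNV-1a content hash over the asset bytes.
    static uint64_t Hash(const std::vector<uint8_t>& data) {
        uint64_t h = 14695981039346656037ull;
        for (uint8_t b : data) { h ^= b; h *= 1099511628211ull; }
        return h;
    }

    // Adding the same payload under ten different level paths costs one copy.
    void Add(const std::string& assetPath, const std::vector<uint8_t>& data) {
        uint64_t h = Hash(data);
        auto it = blobIndexByHash.find(h);
        size_t slot;
        if (it == blobIndexByHash.end()) {
            slot = blobs.size();           // first time we've seen this payload
            blobs.push_back(data);
            blobIndexByHash.emplace(h, slot);
        } else {
            slot = it->second;             // duplicate payload: reuse existing blob
        }
        assetToBlob[assetPath] = slot;
    }
};
```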


It could be as simple as Sony having had more focus on that bit than MS, especially since Sony only cares about three models (PS4, PS4 Pro and PS5), so there's less to optimise for, while MS has Xbox Series X, Series S, One and the PC, and the PC has a myriad of profiles that probably need to be thought about. So MS might just go for the brute-force approach and apply resources to the issue instead of "elegance".

TLDR I have no idea, somebody on here knows much more about this than me.
 
The problem with VRAM is that most games don't need it, and putting more on a GPU only increases the cost. Why does a game need more than 8 GB at 1080p and 10 GB at 1440p when it runs just fine with 16 GB at 4K? Shouldn't it need at least 32 GB, since 4K has 4x the pixels of 1080p?

All assets have to scale with the resolution. It doesn't make sense to use 4K textures and objects at 1080p.

That would be the case to some extent if all assets actually scaled with resolution. But on PC, graphics settings are typically decoupled from, and largely independent of, resolution.

For example, games typically aren't enforcing that textures at 4K be 4x the resolution of textures at 1080p.
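A quick back-of-the-envelope with made-up but plausible numbers (the ~20 bytes/pixel render-target layout and the 6 GB texture pool below are arbitrary example figures, not measurements of any game) shows why VRAM doesn't scale anywhere near linearly with pixel count:

```cpp
// Rough numbers: the resolution-dependent part of VRAM (render targets) is
// small next to the texture pool, which is set by asset quality, not output
// resolution. All figures here are illustrative assumptions.
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;
    const double bytesPerPixel = 20.0;  // rough G-buffer + HDR + depth layout (assumed)
    const double rt1080 = 1920.0 * 1080.0 * bytesPerPixel / MiB;  // ~40 MiB
    const double rt2160 = 3840.0 * 2160.0 * bytesPerPixel / MiB;  // ~158 MiB
    const double texturePoolMiB = 6.0 * 1024.0;  // same assets at both resolutions (assumed)

    std::printf("render targets @1080p: %.0f MiB, @4K: %.0f MiB\n", rt1080, rt2160);
    std::printf("total @1080p: %.2f GiB, @4K: %.2f GiB\n",
                (rt1080 + texturePoolMiB) / 1024.0,
                (rt2160 + texturePoolMiB) / 1024.0);
    // 4x the pixels adds ~120 MiB of render targets here, nowhere near 4x the total VRAM.
    return 0;
}
```

Going from 1080p to 4K grows the resolution-dependent buffers by roughly 120 MiB in this example, while the texture pool, which dominates the budget, stays the same because the assets don't change.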
 
But that will be necessary if you want to sell your AAA game to more than just the >10 GB GPU owners. The medium texture setting in TLoU doesn't even load last-gen quality textures, and that setting is the only viable one for 6 GB GPUs at 1080p and 8 GB GPUs at 1440p.

This guy uses medium textures on a 3060 Ti with DLSS Quality at 1440p:
And this is native 1440p at medium:

Less than 60 fps on a 3060 Ti, and still this game wants more than 8 GB. For comparison, this was the PS4 Pro version:
 
Sigh. I remember having discussions about the validity of including ray tracing in games when the install base of RT-capable GPUs was low, only for a pure rasterization game to come out and alienate 90% of GPUs in one go with nothing more than high VRAM demands at a basic 1080p resolution - and the game can't even match PS4-level visuals on those GPUs.

It's truly a depressing time for PC gaming.
 
This is no different from what we had in 2013 when the PS4 and Xbone released, as even high-end GPUs back then didn't have as much VRAM as the consoles did.

People have just forgotten.
 
Last gen, Sony and Microsoft went from 512 MB to 8 GB; this gen, from 8 GB to 16 GB. Last gen was 1080p with 8 GB (up from 720p with 512 MB!), now it's 4K with 16 GB. 4 GB was enough for 1080p:
[attached chart: rottr_1920_1080.png]
 
Last gen, Sony and Microsoft went from 512 MB to 8 GB; this gen, from 8 GB to 16 GB. Last gen was 1080p with 8 GB (up from 720p with 512 MB!), now it's 4K with 16 GB. 4 GB was enough for 1080p:
[attached chart: rottr_1920_1080.png]

The fact that you posted a chart that shows GTX 970 SLI as '8GB' tells me everything.

And the current consoles aren't doing 4K with 16 GB, as they have many, many games running well below that resolution.

The only truly next-generation thing we have on them is The Matrix UE5 demo, where they are sub-1080p.
 
That would be the case to some extent if all assets actually scaled with resolution. But on PC, graphics settings are typically decoupled from, and largely independent of, resolution.

For example, games typically aren't enforcing that textures at 4K be 4x the resolution of textures at 1080p.
Texture sizes have no relationship to resolution, however, or even to how sharp they are. That's why enforcement is pointless in the virtual texturing era; textures should always be sharp.

Texture pools should increase with frame buffer, but not to the extent of several GB more.

Which is why I find TLOU an anomaly here
 
I've spent some time with the game now, and I'm not sure what the grand mystery is. The game is VRAM-intensive and that's basically all there is to it.

I'm playing on a 9900K + 2080 Ti at 1440p and it performs perfectly well even on this old bucket. Performance scales fairly linearly down to 720p, where the CPU becomes the bottleneck, while going above 1440p starts flooding the PCIe bus as VRAM capacity becomes an issue. I'm otherwise not seeing anything out of the ordinary, except that it needs more video memory than last-gen games.

It's the whole reason I bought a 2080 Ti to begin with: I didn't want to be saddled with an 8 GB GPU because it was plainly obvious even back in 2018 that this would happen. I just expected it sooner.
 
I've spent some time with the game now, and I'm not sure what the grand mystery is. The game is VRAM-intensive and that's basically all there is to it.

I'm playing on a 9900K + 2080 Ti at 1440p and it performs perfectly well even on this old bucket. Performance scales fairly linearly down to 720p, where the CPU becomes the bottleneck, while going above 1440p starts flooding the PCIe bus as VRAM capacity becomes an issue. I'm otherwise not seeing anything out of the ordinary, except that it needs more video memory than last-gen games.

It's the whole reason I bought a 2080 Ti to begin with: I didn't want to be saddled with an 8 GB GPU because it was plainly obvious even back in 2018 that this would happen. I just expected it sooner.

I've just finished the game with no issues at all - a locked 60 fps on Ultra settings at native 1440p all the way through.
 
I've spent some time with the game now, and I'm not sure what the grand mystery is. The game is VRAM-intensive and that's basically all there is to it.

I'm playing on a 9900K + 2080 Ti at 1440p and it performs perfectly well even on this old bucket. Performance scales fairly linearly down to 720p, where the CPU becomes the bottleneck, while going above 1440p starts flooding the PCIe bus as VRAM capacity becomes an issue. I'm otherwise not seeing anything out of the ordinary, except that it needs more video memory than last-gen games.

It's the whole reason I bought a 2080 Ti to begin with: I didn't want to be saddled with an 8 GB GPU because it was plainly obvious even back in 2018 that this would happen. I just expected it sooner.
Yeah, the 2080 Ti is still an above-average GPU even now.
 