Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

GPUs are now more capable than ever before of smartly using VRAM to increase quality

But does this matter? Because didn't DX12 et al shift the onus of memory management more onto the game itself, and therefore onto developers?

I've posted this sentiment before, but given the realities and practicalities of the PC space, I still feel that shifting the optimization onus onto the games themselves presents, let's just say, interesting complications that the console space doesn't have.
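To make that concrete: under D3D12 it is the application that is expected to watch the VRAM budget the OS hands it and page its own heaps in and out, rather than the driver quietly doing it. A minimal sketch of that responsibility (purely illustrative, not from any shipping engine; the "cold heaps" list stands in for whatever bookkeeping an engine keeps):

Code:
// Minimal sketch: under D3D12 the application watches the OS-provided VRAM
// budget and demotes its own heaps when it goes over. "coldHeaps" is a
// hypothetical list of heaps the engine has decided it can live without.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <vector>

void TrimToBudget(IDXGIAdapter3* adapter, ID3D12Device* device,
                  std::vector<ID3D12Heap*>& coldHeaps)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // While we're over the budget the OS gave this process, evict cold heaps.
    // A real engine would pick victims by last-use time, priority, etc.
    while (info.CurrentUsage > info.Budget && !coldHeaps.empty())
    {
        ID3D12Pageable* victim = coldHeaps.back();
        device->Evict(1, &victim);   // contents may be demoted to system memory
        coldHeaps.pop_back();
        adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
    }
}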

I agree, but I don't remember playing Ryse: Son of Rome or Titanfall on my GTX 660 Ti (3 GB) and experiencing any VRAM problems; texture settings were pretty scalable visually back then. It was never this extreme.

From what I remember with Ryse, VRAM usage in certain segments during the middle portions of the game was higher and would cause stuttering at max settings with my GTX 970. It wasn't even entire levels either, just certain segments within them. If you only played, say, the first third of the game, or, as reviewers likely did, tested even earlier sections, you'd never run into this issue.
 
I think there is a difference compared to back in 2014-2015: the quality of the texture options available now in 2023 is worse. 2 GB and 3 GB GPUs back then still had decent texture quality for that time frame.

The textures in TLOU Pt 1 turn to sub-PS4 quality if you only have 8 GB of VRAM.


If that is the quality of textures they can achieve with 8 GB of VRAM on GPUs that support full DX12 PRT etc., then they need to rethink what they are doing, as it is shockingly wrong and frankly a bit embarrassing. GPUs are now more capable than ever before of smartly using VRAM to increase quality, yet ND did not manage to do that here and have failed rather spectacularly.
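For reference, the "PRT" being referred to is D3D12's reserved (tiled) resources: a texture is created with no physical backing at all, and the engine maps individual 64 KB tiles into a heap only when its streaming system decides they are needed. A rough sketch of mapping one tile (illustrative only, error handling omitted; deciding which tiles are needed is the actual hard part and isn't shown):

Code:
// Rough sketch of D3D12 reserved (tiled) resources, i.e. hardware PRT:
// point one 64 KB tile of a reserved texture at physical memory in a heap.
#include <d3d12.h>

void MapOneTile(ID3D12CommandQueue* queue,
                ID3D12Resource* reservedTexture, // made with CreateReservedResource
                ID3D12Heap* tilePool,            // default heap acting as a tile pool
                UINT heapTileOffset,             // which tile of the pool to use
                UINT x, UINT y, UINT mip)        // which tile of the texture to back
{
    D3D12_TILED_RESOURCE_COORDINATE coord{};
    coord.X = x; coord.Y = y; coord.Z = 0;
    coord.Subresource = mip;

    D3D12_TILE_REGION_SIZE region{};
    region.NumTiles = 1;                               // a single 64 KB tile

    D3D12_TILE_RANGE_FLAGS rangeFlags = D3D12_TILE_RANGE_FLAG_NONE;
    UINT rangeTileCount = 1;

    queue->UpdateTileMappings(reservedTexture,
                              1, &coord, &region,      // one region of the texture
                              tilePool,
                              1, &rangeFlags,          // one range in the heap
                              &heapTileOffset, &rangeTileCount,
                              D3D12_TILE_MAPPING_FLAG_NONE);
}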
But you don't have to use medium textures on an 8 GB budget; I've been playing fine with High textures without a problem at 4K/DLSS Performance, and it's even more workable at 1440p/DLSS Quality. I only reduced visual effects texture quality to Low. The 1.6 GB figure is a bit misleading; the game actually uses 7.6 GB on my end.



Smooth frametimes, no issues, good performance, brilliant textures, and I'm simply enjoying the game (and I'm trying to let others know that they can too)

In-game usage:


I agree that medium textures should be made better for <8 GB cards. But sadly, hasn't this been the case most of the time in most games? Dropping textures to high or medium in most games has caused an immense drop in texture quality for some reason.

Here's a comparison I made in AC Valhalla. Most would think the game would use a lower texture setting on PS4, but from the looks of it all consoles use the "High" texture option, and medium textures look like PS2 assets. In practice, you either have enough VRAM in AC Valhalla for high textures or you get PS2 textures.


Same for RDR2: the game literally has N64-quality textures baked in at the "high" texture setting.


I'm not defending the sad state of the port, but I think that 1.6 GB figure is hugely misleading, causing a lot of 8 GB users to pick medium settings to make sure they're not in the "yellow zone" and then end up frustrated. I understand the frustration too.

Also, I should note that I disabled Steam's hardware acceleration to save a crucial, tipping-point 300 to 500 MB of VRAM. Subtract that from the available budget and suddenly you would see problems (the game tells me it uses 7,622 MB, and I barely have that much free VRAM even with no hardware-accelerated apps in the background; this is a crucial trick most 8 GB users can use to achieve better visuals). Discord also uses VRAM. Discord + Steam + DWM is a deadly trio for 6-8 GB cards; disable hardware acceleration and you will easily save 1 GB. That could be why they warn people that background apps use 1.6 GB even when they don't: they expect most people to run that software, and just Steam being Steam, or Epic being Epic, will use VRAM (sadly you cannot disable hardware acceleration in the Epic launcher).

Of course this is at 4K DLSS Performance. 1440p DLSS Quality or 1440p is easily workable within 7.2-7.3 GB of "game application" memory, and I bet most people would have 7.2 GB free at idle if they don't open Chromium browsers with hardware acceleration while playing the game. Chrome, Edge and the like can also take upwards of 400 MB of VRAM, and even more if a lot of tabs are left open. I've been trying to communicate this hardware acceleration stuff for a year now: 8 GB owners simply have to close their browsers, or if they really want to browse while gaming, either use Steam's browser or turn off hardware acceleration. Same for Discord.
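If anyone wants to see this for themselves before launching the game, a tiny console program can print the card's raw capacity next to the budget the OS is willing to give a fresh process; the gap is roughly what DWM and background apps are holding. Quick sketch, error handling omitted and assuming the discrete GPU is adapter 0:

Code:
// Quick check of how much VRAM headroom the OS is actually offering you,
// versus the card's raw capacity. Error handling omitted for brevity.
#include <windows.h>
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory4* factory = nullptr;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);     // assumes the dGPU is adapter 0

    IDXGIAdapter3* adapter3 = nullptr;
    adapter->QueryInterface(IID_PPV_ARGS(&adapter3));

    DXGI_ADAPTER_DESC desc{};
    adapter3->GetDesc(&desc);

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget = what the OS thinks this process can safely use right now.
    // The gap to DedicatedVideoMemory is roughly DWM + background apps.
    printf("Dedicated VRAM:          %llu MB\n",
           (unsigned long long)(desc.DedicatedVideoMemory / (1024ull * 1024)));
    printf("Budget for this process: %llu MB\n",
           (unsigned long long)(info.Budget / (1024ull * 1024)));
    return 0;
}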


I still advocate for better textures on medium too. But I rarely see a game gracefully scale back to lower VRAM amounts. There are outliers like Forza Horizon 5 and Doom Eternal, but the majority of games suddenly show you PS2-like textures just to save 1 GB worth of VRAM. Maybe you should ask a developer why that is, and why they cannot scale down textures properly for lower VRAM usage.
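The graceful version isn't even conceptually hard, which is what makes it frustrating. Something along these lines, purely illustrative and not how any of these games actually do it, would already be far kinder than one giant high-to-mush preset jump:

Code:
// Illustrative only: instead of one global "medium = mush" switch, drop the top
// mip of the biggest resident textures one at a time until the pool fits the budget.
#include <algorithm>
#include <cstdint>
#include <vector>

struct StreamedTexture {
    uint32_t width = 0, height = 0, bytesPerPixel = 4;
    uint32_t droppedMips = 0;                // top mips we refuse to keep resident

    uint64_t ResidentBytes() const {
        uint64_t total = 0;
        uint32_t w = width >> droppedMips, h = height >> droppedMips;
        while (w >= 1 && h >= 1) {           // sum the remaining mip chain
            total += uint64_t(w) * h * bytesPerPixel;
            if (w == 1 && h == 1) break;
            w = std::max(1u, w >> 1);
            h = std::max(1u, h >> 1);
        }
        return total;
    }
};

void FitTexturePool(std::vector<StreamedTexture>& pool, uint64_t budgetBytes)
{
    auto poolSize = [&] {
        uint64_t s = 0;
        for (const auto& t : pool) s += t.ResidentBytes();
        return s;
    };
    // Greedily shave the largest texture: each dropped mip frees roughly 75% of
    // that one texture's memory, a far smaller visual hit than a preset drop.
    while (poolSize() > budgetBytes) {
        auto biggest = std::max_element(pool.begin(), pool.end(),
            [](const StreamedTexture& a, const StreamedTexture& b) {
                return a.ResidentBytes() < b.ResidentBytes();
            });
        if (biggest == pool.end() || biggest->ResidentBytes() == 0) break;
        biggest->droppedMips++;
    }
}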
 
From what I remember with Ryse VRAM usage in certain segments during the middle portions of the game was higher and would cause stuttering at max settings with my GTX 970.
Yeah, I played the whole game, and I remember those stuttery sections (mainly the soldier camps).
Did you crank Ryse settings to Ultra including textures?
Yeah, I always do.
It's not its VRAM requirements; it's that the low and medium options have really bad textures, thus forcing people to go with a setting their GPU can't handle to get a decent-looking game.
Now we are on the same page, scalability has gone to hell.

It used to be 17 minutes on my 13900K, so if we're down to 13 minutes we shaved 4 minutes lol
Yeah, nice improvement.

However, the GPU shader compilation difference here is HUGE; I've never seen any game do that. How is that even possible? I am now curious about Arc GPUs.
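For context, the expensive part of that step is the driver compiling its own pipeline state, which is why the times vary so much between GPU vendors, and also why D3D12 lets a game cache the result itself via a pipeline library instead of relying on wherever each vendor's driver cache happens to live. A rough sketch of that approach (illustrative, file I/O and error handling omitted):

Code:
// Rough sketch of D3D12 pipeline libraries: load a previously compiled PSO if
// the library already has it, otherwise pay the driver compile once and store it.
#include <windows.h>
#include <d3d12.h>

ID3D12PipelineState* GetOrCompilePSO(ID3D12Device* device,
                                     ID3D12PipelineLibrary* library, // persisted to disk by the engine
                                     const wchar_t* name,
                                     const D3D12_GRAPHICS_PIPELINE_STATE_DESC& desc)
{
    ID3D12PipelineState* pso = nullptr;

    // Fast path: the driver-specific blob was cached on a previous run.
    if (SUCCEEDED(library->LoadGraphicsPipeline(name, &desc, IID_PPV_ARGS(&pso))))
        return pso;

    // Slow path: full driver compile (the part that takes minutes in bulk),
    // then store it so the next launch on this driver/GPU is nearly free.
    device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
    library->StorePipeline(name, pso);
    return pso;
}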
 
Smooth frametimes, no issues, good performance, brilliant textures, and I'm simply enjoying the game (and I'm trying to let others know that they can too)
Very odd how you are not experiencing hitches when you go above the menu threshold, as most other users report and as we have also replicated on our end. My Windows has 400 MB of usage pre game start.

In the video we have coming out, I change the settings to go above the threshold in the menu and am immediately confronted with multiple stutters.

Are you certain you are seeing no hitches at any point at all in your gameplay? Or are you just shrugging them off as "part of the experience"?

edit: your frame-time graph in your screenshot shows perturbations which look like hitches.
 
However, the GPU shader compilation difference here is HUGE; I've never seen any game do that. How is that even possible? I am now curious about Arc GPUs.
AMD's shader cache is probably in another location.
 
Very odd how you are not experiencing hitches when you go above the menu threshold, as most other users report and as we have also replicated on our end.
I'm generally using a frame cap of 50. I don't notice a huge number of hitches that way. Have you tried disabling Steam's hardware-accelerated web views? 7,600 MB is a big ask, but at 7,200 MB I see little to no hitches at all.

I only see hitches when I transition from area to area, but I'm also hugely limited by my 16 GB of RAM.


Maybe frame capping helps. I cannot reliably record a video, because once I try to record with any software, the game's VRAM usage creeps back down to 7-7.1 GB from 7.5-7.6 GB and the game becomes more problematic (it was really hard to record this video, as I had to kill explorer.exe to give some VRAM leeway to the recording software; VRAM is simply at its limits). However, I can produce a video with 7,200 MB "game application"-centric settings at 1440p/DLSS Quality later on.

I'll try recording a video at 1440p!
 
Could it be that, due to all the unique assets, the streaming system requires a throughput of 'x' amount in order to guarantee that the textures and assets that are required are always in VRAM?

And with the varying levels of CPU performance on PC, where achieving a streaming throughput of 'x' is not guaranteed, they go overboard and cache textures they won't need until the later sections of a given location?
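Something like this is the heuristic I'm imagining; every name and number below is made up, purely to illustrate the idea:

Code:
// Made-up illustration: if the measured streaming throughput can't match what the
// level ideally wants, prefetch from further away (and so keep more in VRAM).
#include <algorithm>

// Returns how far ahead of the player (in metres) assets should be prefetched.
float PrefetchRadiusForBandwidth(double measuredMBps, double requiredMBps)
{
    const float baseRadius = 30.0f;   // enough when the IO/CPU path keeps up
    const float maxRadius  = 90.0f;   // worst case: cache most of the location

    double ratio = measuredMBps / requiredMBps;   // < 1.0 means we can't keep up
    if (ratio >= 1.0)
        return baseRadius;

    // The bigger the deficit, the earlier we fetch and the more stays resident,
    // which is exactly the "cache things you may never need" behaviour.
    float radius = baseRadius + float((1.0 - ratio) * (maxRadius - baseRadius));
    return std::min(radius, maxRadius);
}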
 
And there, Alex, is the issue with TLOU on PC.

It's not its VRAM requirements; it's that the low and medium options have really bad textures, thus forcing people to go with a setting their GPU can't handle to get a decent-looking game.

I suppose the question is: why are the textures that bad at lower settings?

As the game was made for PS5, did the textures required to scale decently below what the PS5 offers simply not exist, and is this soupy mess what they thought was OK?

If the game had released with good-looking textures on low and medium, would the outrage at how it performs still have been as bad?

Is this also a sneak peek at your comparison video 👀

Medium textures look like a low-effort way to make this game possible on 6 GB GPUs at 1080p. PS5-only games don't need anything other than textures made for 12 GB. So porting a game to PC means more work or fewer buyers.
 
Here's 1440p DLSS Quality with tuned settings (critical textures at High, "visual effects" set to Low, because otherwise molotovs and fire effects cause a huge frame drop), locked to 60 on a very old CPU.

0:50 settings and start of the run


Some settings are Ultra, some are Medium (the resolution-based settings). They do not affect the VRAM counter, but may have an actual effect in-game.

Ryzen 2700, crap CPU. 16 GB of RAM too (low-end in 2023). Recording is also, sadly, taking the critical last bit of RAM and VRAM resources. Without recording, I get even fewer stutters (they are already very infrequent). It only stutters when transitioning to a new "level", so to speak. There are also "slight" stutters, but I really can't feel them through my VRR screen as they're really just microstutters. If I didn't have the FPS overlay open, I wouldn't be able to tell it stuttered (aside from the level-to-level transition stutters, which I bet would not happen with 32 GB of RAM).

(I also have to leave a note: this and the intro sequence are the hardest parts to run on my CPU. Other locations after this are way lighter on my CPU. Sometimes it dips to the 50s and 55s here and there, but it's a Zen+ 2700 lol)




My idle VRAM usage at 1440p desktop resolution is around 260 to 350 MB. If you close all windows, it will sit at its minimum idle usage, which is what I always make sure to do so I don't run into VRAM trouble. As I said, I've embraced using my PC as if it were a console at this point, because with how limiting 8 GB is, I have no option other than this. I'd rather destroy my multitasking ability than use PS2 textures. Scalability would be nice, but sadly I've hardly seen any game scale gracefully down to lower VRAM amounts.

Oh, by the way, sorry Alex, I thought you were referring to 1440p DLSS Quality. I did the video at 1440p DLSS Quality (7,260 MB of VRAM). I will give native 1440p a try as well, but it is a hard task, as it requires 7,600 MB of VRAM, and once I start recording, the 7,500-7,600 MB of usage drops to 7,000-7,100 and lag starts to occur. I will have to see what I can do.
 
Medium textures look like a low-effort way to make this game possible on 6 GB GPUs at 1080p. PS5-only games don't need anything other than textures made for 12 GB. So porting a game to PC means more work or fewer buyers.
Of course. But the problem is that the gap in quality between medium and high is too large. There has to be a step between "That's grass" and "Is that green mashed potatoes?".
And there Alex is the issue with TLOU on PC.

It's not it's VRAM requirements, it's that the low and medium options have really bad textures thus forcing people to go with a setting their GPU can't handle to get a decent looking game.

I suppose the question is why are the textures that bad at lower settings?

As the game was made for PS5 did the l textures required to scale decently below what PS5 offers simply not exist and this soupy mess is what they thought was OK?

If the game had released with good looking textures on low and medium would the outrage at how it performs still been as bad?

Is this also a sneak peak at your comparison video 👀
This hits the nail on the head. I wish I could like this post twice.
 
Here's the native 1440p test. Again, I didn't notice many big stutters aside from certain cutscenes; gameplay is very smooth on my end. Recording took 250 MB of VRAM; without it the game uses around 7,500-7,600 MB of VRAM and is even smoother.


Settings changed compared to the "High" preset:

Positive
Screen space shadow quality: From high to ultra (moderate impact)
Dynamic screen space shadows: From off to on (moderate impact)

Negative
Ambient shadows: From half res to quarter res
Spotlights shadow resolution: From high to medium (says it has impact on VRAM)
Point lights shadows resolution: From high to medium (says it has impact on VRAM)
Directional shadow resolution: From high to medium (says it has impact on VRAM)
Volumetric effects: From high to low (saves VRAM)
Visual effects texture quality: From high to low (saves VRAM)


Mixed bag
Depth of field: From cinematic+gameplay to cinematic (I don't like DoF in gameplay; it also reduces VRAM a bit, so all the more reason to disable it for gameplay)
Motion blur: From on to off (I don't like motion blur)
Animation quality: From high to low (for my CPU)

Overall, I feel these don't compromise overall image quality much. I'm still not sure whether they actually have an impact on VRAM or not, as they do not change the VRAM bar.

(I think my 3070's rasterization performance is tanking due to VRAM spilling over into regular RAM, compared to your 2070 Super! Very similar to Spider-Man.)
 
Are you certain you are seeing no hitches at any point at all in your gameplay? Or are you just shrugging them off as "part of the experience"?
I'm not necessarily saying I'm getting a flawless "no stutters at all" experience; sorry for the poor choice of words. I'd say 98% of the time it is smooth. There are hitches here and there when a new map is loaded or a new car shows up in the scene, but it is definitely not constant enough to deem it problematic or immersion-breaking. They definitely do not happen on higher-end systems with more memory. From your example it even happens with medium textures, though on my end, with high textures, they happen even more rarely. So it could be that my preventive measures of disabling Steam's hardware acceleration and restarting dwm.exe play a part in getting a somewhat smoother experience.

In my case, a frame cap also eliminates problems in a major way, a 50 FPS cap especially. Giving the GPU some headroom seems to have a positive effect.
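For anyone wondering why a cap helps: it simply gives every frame a fixed time slot, so the GPU (and its memory traffic) never runs flat out. A minimal sketch of a 50 FPS limiter, nothing game- or engine-specific:

Code:
// Minimal sketch of a 50 FPS cap: every frame gets a fixed 20 ms slot,
// leaving the GPU and memory subsystem some idle headroom.
#include <chrono>
#include <thread>

void LimitTo50Fps(std::chrono::steady_clock::time_point& nextDeadline)
{
    using namespace std::chrono;
    constexpr auto frameBudget = milliseconds(20);      // 1000 ms / 50 fps

    nextDeadline += frameBudget;
    const auto now = steady_clock::now();
    if (now < nextDeadline)
        std::this_thread::sleep_until(nextDeadline);    // coarse, but fine for a sketch
    else
        nextDeadline = now;                             // overran; don't try to catch up
}

// Usage (hypothetical loop): initialise the deadline once, call after each present.
//   auto deadline = std::chrono::steady_clock::now();
//   while (running) { renderFrameAndPresent(); LimitTo50Fps(deadline); }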
 
I agree, but I don't remember playing Ryse: Son of Rome or Titanfall on my GTX 660 Ti (3 GB) and experiencing any VRAM problems; texture settings were pretty scalable visually back then. It was never this extreme.
That’s not an open world game though
 
But one is a very tiny space with very limited and discrete levels and enemies etc.; the other is not.

The challenges of today don’t apply to back then.

There are many, many places in TLOU that are in a 'very tiny space', and the largest arenas aren't really that much bigger than what's in Ryse.
 
There are many, many places in TLOU that are in a 'very tiny space', and the largest arenas aren't really that much bigger than what's in Ryse.
Yeah, I don't disagree. That's why I think their VT system is busted. Something is busted in that game.

I don't disagree that things could have been made more PC-architecture friendly.
 
But one is a very tiny space with very limited and discrete levels and enemies etc.; the other is not.

The challenges of today don’t apply to back then.
No, both are. TLOU is also full of tiny spaces. I'd even argue that the first level of Ryse is larger than anything in TLOU Part I.
 