Digital Foundry Article Technical Discussion [2023]

At playable settings, 8GB was unnecessary for a very long time unless the user deliberately did something to a game that made it need 8GB of VRAM.

It certainly took a while before it started to deliver. 2019 / 2020 was probably when it started to show a benefit beyond outliers, and obviously cross gen games started to change the landscape. I know many gamers would have moved on to newer cards by then, but I tend to keep my hardware a long time and donate it when I'm done - my GTX 560 1GB is still in use in a friend's computer!

I can totally see why at launch of the 570 / 580 many people concluded that the 4GB variants were optimal for them. GTX 1060 6GB was a good compromise, and a pretty solid card all round. I'd have been very happy with one.


That's such a specific and small use case that I feel it's irrelevant.

To a lot of people it probably is, but for me the capability is a nice bonus.

VRAM is, and always has been, a balancing act. There have been periods where AMD haven't given their GPUs enough VRAM to complement their GPU core performance (HD 5800 series).

And there have also been instances where Nvidia haven't had enough VRAM for their GPU core performance (GTX 670/680).

And then there are bad ports. Is it fair to use those games to gauge whether a card has enough VRAM, or do you go with what the other 99% are doing?

The number of AAA games in 2023 that require more than 8GB of VRAM at 1080p at playable settings can be counted on your fingers.

8GB is fine for 1080p at playable settings, and it's fine for 1440p in 99% of games at playable settings.

I agree on pretty much all these points. I do think VRAM requirements will continue to drift up over time for matched console settings, and that 1440p will become a bit tight on 8GB, but absolute stinker ports that look like shit with 8GB of VRAM are a developer-side problem.
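
For a rough sense of scale (back-of-the-envelope only; the buffer counts and formats below are illustrative assumptions, not any particular engine), the resolution-dependent render targets themselves are only a small slice of an 8GB budget even at 4K - it's textures, geometry and streaming pools that actually blow past it:

```python
# Rough, illustrative estimate of resolution-dependent render-target memory.
# Assumed (hypothetical): 4 G-buffer targets at 4 bytes/px, 1 HDR target at
# 8 bytes/px, 1 depth/stencil at 4 bytes/px -- numbers chosen only for scale.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BYTES_PER_PIXEL = 4 * 4 + 1 * 8 + 1 * 4  # 28 bytes of render targets per pixel

for name, (w, h) in RESOLUTIONS.items():
    mb = w * h * BYTES_PER_PIXEL / (1024 ** 2)
    print(f"{name}: ~{mb:.0f} MB of render targets ({mb / 8192:.1%} of 8 GB)")
```

Even at 4K that works out to a couple of hundred MB, which is why output resolution alone rarely breaks an 8GB card - it's the settings feeding everything else into VRAM that do.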
 
As the available power increases, we should move to reduce asset duplication; it's worth it. It's very immersion-breaking to fight the same-looking baddies for the entirety of a 30-hour game. It's completely unrealistic and is detrimental to the believability of the game world. It's a compromise we should look to eradicate, and if 8GB of VRAM has to die as a result, let it die. In terms of image quality, developers are already using DLSS and similar technologies as a crutch, so I don't think much will change there.
That's a legitimate opinion. And of course 8GB cards are going to be put out to pasture; that is the way of things. The problem I have now is that I find games at their highest settings only marginally more impressive than ones from just a few years ago, especially when you look simply at texture and asset quality, while image quality is so much worse on the settings just below the top. We are definitely reaching a point of diminishing returns. A few years ago, setting a game to high instead of ultra meant only slight differences in image quality and a fair bump in performance, even when comparing cards with excess VRAM. Now the image quality differences are greater and the performance gains lower.
 
Diablo 4:
I am getting a lot of stuttering at 4K DLSS on my 3070, but no issues at 1080p.

So… sad :(
 
Sounds like an issue, or perhaps you just need to tweak a couple of things. DLSS Quality at 4K is what, 1440p base res, right? A 3070 should have zero issues with that.
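
For reference, DLSS renders internally at a fixed fraction of the output resolution per axis (2/3 for Quality, about 0.58 for Balanced, 1/2 for Performance), so a quick check of what a 4K output actually renders at:

```python
# Internal render resolution for DLSS modes at a 4K output,
# using NVIDIA's published per-axis scale factors.
OUTPUT_W, OUTPUT_H = 3840, 2160
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for mode, s in SCALES.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    print(f"DLSS {mode} at 4K -> {w}x{h} internal")
```

Quality works out to 2560x1440 and Balanced to roughly 2227x1253, so neither should be a problem for a 3070 in this game.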

It's a bad settings decision on Blizzard's part: if I turn on DLSS, I can't set the resolution. I could turn it off and just go for 1440p, however. I tried running DLSS Balanced and it didn't get rid of the periodic stuttering.

For whatever reason I'm getting a lot of hitches; hopefully DF and team will have an answer. 1080p runs smoothly in comparison.
 

Their costuming system uses a lot of detail. For example, if you have a leather skirt with chainmail around it using large rings, you have the base leather skirt and then an individual chainmail object layered on top of it; it's not just a leather skirt with a chainmail texture. This seems an odd choice, as you can't really zoom in far enough to see this outside of the character selection screen.

Because they're attempting to make this a bit more social/MMO-like, they either need to keep a LOT of textures and character outfit objects in memory, or the game has to stream them in whenever a new player comes within a certain distance of your character.

I could certainly see this putting enough memory pressure on 8GB cards that increasing resolution above 1080p might introduce stuttering when another player streams in and world detail spills over into main memory. DLSS on top of that could make the situation even worse for 8GB cards.

Have you tried seeing what it does if you just reduce texture resolution and world geometry detail to the lowest possible, rather than reducing resolution?
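
If you want to see whether it really is memory pressure, something like this (just a rough sketch - it assumes an NVIDIA card with nvidia-smi on the PATH, and the one-second poll interval is arbitrary) will log VRAM usage while you toggle settings, so you can check whether the hitches line up with the card filling up:

```python
import subprocess
import time

# Rough sketch: poll nvidia-smi once a second and log VRAM usage.
# Assumes an NVIDIA GPU with nvidia-smi available on the PATH.
QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

while True:
    # Take the first line in case the system has more than one GPU.
    line = subprocess.check_output(QUERY, text=True).splitlines()[0]
    used, total = (int(x) for x in line.split(","))
    print(f"{time.strftime('%H:%M:%S')}  {used}/{total} MiB "
          f"({used / total:.0%} of VRAM)")
    time.sleep(1)
```

If usage is pinned near the full 8GB right before a hitch, that points to the spill-over theory; if it's sitting well below, the problem is probably elsewhere.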

Regards,
SB
 
I'll try again and see what it takes to stop the hitching.

Edit: I think it's my PC, so I take it back; I think it can run it. For some reason, if I play on just my 4K TV alone, it hitches. But when I extend from my 1080p monitor as a second screen and run it there, no problems.
 
Try 1440p. If that fixes the problem then it might just be a busted DLSS implementation.

It definitely shouldn't be happening though, and I'd say it's very unlikely to be memory-related.
 
Likely lazy dev work
Again, nothing outs somebody as completely ignorant of what they're saying more than when they trot out the 'lazy devs' accusation.

It's completely embarrassing how much I see that sort of YouTube-comment-section-level rhetoric on this forum, of all places.

SF6 is by all accounts a supremely technically proficient game. Yet you're accusing the devs of being lazy? smh
 
I'd never call Capcom lazy. I do think it's fair to ask though whether the Series S is being as fully utilised as the other console systems - and I'm not even talking about muscle deformation.

Let's take a look back at the DF preview for SF6.


PS4 runs at 1080p, Pro at 1440p, Series X and PS5 at 4K with some enhancements to boot. Reasonable enough. What about Series S?

It's 1080p, with last-gen-looking textures, simplified SSR and reduced vegetation compared to the other new-gen systems. And, like last gen, no muscle deformation. But why would the Series S be running at such a low resolution and with cutbacks to quality?

Series S has 8GB of memory for games - roughly halfway between the Pro (1440p) and the PS5 (4K). So it's probably not that, then.

Fillrate? Series X runs at 4K, with 64 (effective, shoot me!) RDNA 2 ROPs. Series S has 32, and with clockspeed differences that's 43% of the pixel fill of Series X. But 1080p is only one quarter of 4K. And Series S has reduced SSR and vegetation to boot which will further save a bit on fill. So even while ROP efficiency drops a bit at lower resolutions with relatively small tris, it's probably not fillrate holding the S to 1080p. Even stock PS4 is holding a solid 60 at 1080p.

Compute? Well, the Pro hits 1440p with 4.2 TF, while the Series S has 4 TF but using RDNA 2, which is supposed to be about 25% more efficient. So it's probably not losing there and it's probably not the reason for running at 56% of the Pro's resolution.

Bandwidth? Series S has essentially the same theoretical BW as the Pro, and 50% of the BW of the PS5. In reality, memory controllers and caching have probably improved since the Pro, so effective throughput shouldn't be worse - it has actually got better in corresponding parts in the PC space. So that's probably not the reason for running at 56% of the Pro and 25% of the PS5.
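
Quick sanity check on the ratios above, from the published specs (64 ROPs at 1.825GHz for Series X, 32 ROPs at 1.565GHz for Series S; 218/224/448 GB/s for Pro/Series S/PS5):

```python
# Back-of-the-envelope check of the ratios quoted above, from published specs.
PIXELS = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

# Pixel fill = ROPs * GPU clock (GHz), in Gpixels/s
series_x_fill = 64 * 1.825   # ~116.8 Gpix/s
series_s_fill = 32 * 1.565   # ~50.1 Gpix/s

print(f"Series S fill vs Series X:  {series_s_fill / series_x_fill:.0%}")      # ~43%
print(f"1080p vs 4K pixel count:    {PIXELS['1080p'] / PIXELS['4K']:.0%}")     # 25%
print(f"1080p vs 1440p pixel count: {PIXELS['1080p'] / PIXELS['1440p']:.0%}")  # ~56%
print(f"Series S BW vs PS5 (224/448 GB/s): {224 / 448:.0%}")                   # 50%
```

So the 43%, 25% and 56% figures above all check out.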

So I don't know why the Series S version of this game is a bit underwhelming for the hardware, but to me it is. My guess is that time and priorities meant that it simply paid to use the Series S this way. It's the least important of all the consoles, and the "new gen" system where people are likely to care least about how well the game utilises their hardware.
 
Last edited:
I think they could have done a bit more than parity with PS4.
It technically has more horsepower than the PS4 Pro, and that's running higher settings.
They probably could have, but not only does Xbox represent a tiny portion of their market share, the Series S is an even smaller one. At that point, they probably didn't want to bother, and since people buying the console for $299 don't care about resolution or even graphics, there wasn't much sense in allocating many resources to optimizing for it.
 
The Series S is an obligation. I don't blame devs for not wanting to do more than the bare minimum with it. It's another SKU that isn't really necessary for them; it's MS's own gambit.
 
I understand the points made; I'm just surprised it would require optimization to get it to 1440p. It seems like there should be enough there.
 
Lazy devs, seriously? More likely lazy MS engineers when they designed the S with that amount of memory.

Nah, the Series S is just too underpowered to bother with.

It's lazy devs.

Function points out the laziness of the devs in not pushing the Series S further. The Series S has a much more powerful Zen 2 8-core/16-thread CPU that runs, what, around 1.5GHz faster than the Pro's and 2GHz faster than the base PS4's.

So yeah, lazy devs.
 