Digital Foundry Article Technical Discussion [2023]

Devs can do better, but IHVs can too, and it begins with a 16 GB 4060 Ti ...

A pointless GPU that will run out of puff well before it runs out of VRAM.

I for one do not want AMD and Nvidia to fix the issue of games being poor at managing VRAM by just chucking more of it onto their GPUs and passing that cost on to the consumer.

It will only make developers give less of a shit as they will have enough VRAM to just be sloppy instead of actually trying to do a good job of managing memory.
 
I think that console users, having tasted standardized 60fps modes, will definitely be unreasonable when devs hit a breaking point. It's an "expectation" now.

99% of console gamers are casuals who couldn't care less about 30fps vs 60fps; they just play the game and enjoy it.

The console gamers on social media screaming about 60fps being expected are the very small minority.
 
A pointless GPU that will run out of puff well before it runs out of VRAM.

I for one do not want AMD and Nvidia to fix the issue of games being poor at managing VRAM by just chucking more of it onto their GPUs.

It will only make developers give less of a shit as they will have enough VRAM to just be sloppy instead of actually trying to do a good job of managing memory.
With more developers moving to UE5 as this generation progresses, the VRAM situation will get far worse, as Nanite and RT are VRAM heavy on top of the already high VRAM requirements for high-quality textures.
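
For a rough sense of scale on the texture side alone (back-of-envelope numbers only, assuming BC7 compression at 1 byte per texel and the usual ~4/3 mip chain overhead):

# Back-of-envelope VRAM cost of one block-compressed texture, mips included.
# Assumes BC7 (1 byte per texel) and the standard ~4/3 mip chain overhead.
def texture_vram_mb(width, height, bytes_per_texel=1.0, mip_factor=4/3):
    return width * height * bytes_per_texel * mip_factor / (1024 ** 2)

# One 4K material layer (e.g. albedo):
print(f"{texture_vram_mb(4096, 4096):.1f} MB")  # ~21.3 MB
# A full PBR set (albedo + normal + roughness/metal/AO) is roughly 3x that,
# before Nanite geometry, RT acceleration structures and render targets
# take their share.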
 
With more developers moving to UE5 as this generation progresses, the VRAM situation will get far worse, as Nanite and RT are VRAM heavy on top of the already high VRAM requirements for high-quality textures.

And yet The Matrix UE5 demo ran perfectly fine on my 8GB 3060 Ti, with said 3060 Ti running out of puff well before it hit VRAM limits.
 
With more developers moving to UE5 as this generation progresses, the VRAM situation will get far worse, as Nanite and RT are VRAM heavy on top of the already high VRAM requirements for high-quality textures.

UE5 uses virtual texturing which is far more forgiving on memory requirements than traditional texturing systems. It should make more efficient use of VRAM, not less.
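
To make that concrete, here's a minimal sketch of the difference (tile size and counts are illustrative numbers of my own; real engines drive residency from GPU feedback, but the principle is the same):

# Minimal sketch of why virtual texturing saves memory: only the tiles the
# camera actually samples stay resident, not the full mip chain of every
# bound texture. Assumes 128x128 tiles at a BC7-like 1 byte per texel.
TILE_BYTES = 128 * 128 * 1

def full_residency_mb(num_textures, size=4096):
    # Traditional path: every bound 4K texture fully resident, mips included.
    return num_textures * size * size * (4 / 3) / 1024**2

def vt_residency_mb(visible_tiles):
    # VT path: a fixed page pool sized by what is visible this frame.
    return visible_tiles * TILE_BYTES / 1024**2

print(f"{full_residency_mb(200):.0f} MB")  # ~4267 MB for 200 resident 4K textures
print(f"{vt_residency_mb(20000):.0f} MB")  # ~312 MB for 20,000 visible tiles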
 
Can't wait to see UE5 being used more widely, to shut up those 8GB VRAM haters.

The criticism of expensive 8GB GPUs is valid. That's a whole different issue, though, from games needing more than 8GB for no good reason. I'm all for games making 8GB obsolete if it means a big jump in IQ. The recent hype unfortunately seems to be less about IQ increases and mostly about lack of optimization.
 
Aren't the UE5 texture and geometry savings just going to be sucked up by lighting, never mind whatever other engines are doing?
 
I would offer that graphical options for console games aren't a wholly new thing. A bunch of OG PlayStation games had options for this, including G-Police, where you have settings for draw distance and frame rate. As current-generation consoles support display output at 1080p, 1440p and 4K, games having options for quality vs performance were an inevitable development.

I don't see this trend reverting.
Almost all, if not all, of the N64 games that supported the Expansion Pak also had user-adjustable graphics modes. World Driver Championship had two graphics modes even though it didn't support the Expansion Pak. Outrun on Saturn and Sega Rally 2 on Dreamcast also had graphics options.
 
As are the vast majority of game reviewers. Just look at how many gave Tears of the Kingdom a 10/10 or 9/10 when it dips into the extremely low fps range (15 fps?) so many times.

Jedi Survivor was the same. Some reviewers explicitly acknowledged that their scores didn't reflect the performance issues, either because they weren't aware (only tested PS5 @ 30fps) or because they didn't think it was a big deal / it would be fixed eventually. There's some merit to that: as a consumer playing some time after launch, you want review scores that accurately reflect the quality of the game as you'll actually find it.
 
The criticism of expensive 8GB GPUs is valid. That's a whole different issue, though, from games needing more than 8GB for no good reason. I'm all for games making 8GB obsolete if it means a big jump in IQ. The recent hype unfortunately seems to be less about IQ increases and mostly about lack of optimization.
Note that going from 8GB to 12GB is no different from going from 4GB to 6GB. It's not like there was a big visual difference between the GTX 980 and 1060 in games. 50% just isn't that big of a difference.

Wii U similarly more than doubled memory over PS360. But I don’t think many would say the games looked very different or that Nintendo didn’t optimize their games for it.
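
The ratio point is easy to sanity-check (the 980 shipped with 4GB, the 1060 with 6GB):

# Both generational steps are the same 1.5x capacity bump.
print(12 / 8)  # 8GB -> 12GB: 1.5
print(6 / 4)   # GTX 980 (4GB) -> GTX 1060 (6GB): 1.5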
 
I find it hilarious that some of us were getting ridiculed for saying the exact same thing just a few weeks back, and the PC diehards were up in arms. Now that Andrew here and Oliver from DF are providing the same message, it seems the value of a contribution in this forum depends much more on the messenger than on the message and its substance.

You don't have the same 'message' as Andrew and DF.

You have one drum to bang, but there are two equally valid subjects to discuss:

- The coming need for more than 8GB of VRAM to reliably manage parity with console on PC (and Nvidia's cynically low VRAM provision)
- The poor, and sometimes shockingly poor, ability of games to scale down to 8GB gracefully, due to poor or incomplete planning and optimisation

It's disingenuous of you to claim that Andrew and DF are validating your one-sided and un-nuanced opinion. And here is your opinion again, from the same post as above.

It's about time you all stop blaming developers and "lazy PC ports" and instead amp up the pressure on these vendors who are selling high compute cards with incohesive memory systems.

Both can be issues, and are. To think that software can't be part of this problem is insanely naive. Flawed software can always cause adequate hardware to under deliver. And I don't mean "adequate to match console" I mean adequate to deliver substantially better than it is doing. In any industry.

And putting pressure on vendors to provide more VRAM is all well and good, but it doesn't change the fact that software launching today and for the next few years will be launching into a market where the biggest target VRAM amount will be 8GB. In addition to legacy cards, the 4060 Ti 8GB and the 7600 XT 8GB will be big sellers.

And I say this as someone who (personally, I'm not saying everyone should agree) won't even buy a second hand 8GB RTX card, and whose next card will probably be 12 or 16 GB (but no less than 12).
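
As an aside, "scaling down gracefully" isn't magic. Here's a hypothetical sketch of the kind of logic I mean (the function name and all thresholds are mine, not from any shipping engine):

# Hypothetical sketch: size the texture streaming pool from detected VRAM
# instead of assuming one fixed target. All numbers are illustrative only.
def texture_pool_budget_mb(total_vram_mb, reserved_mb=2560):
    # Reserve headroom for render targets, geometry, RT structures and the
    # OS, then give the remainder (with a floor and a cap) to streaming.
    usable = max(total_vram_mb - reserved_mb, 1024)  # never starve the pool
    return min(usable, 8192)  # diminishing returns past ~8GB of textures

for vram in (8192, 12288, 16384):
    print(f"{vram} MB card -> {texture_pool_budget_mb(vram)} MB texture pool")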
 
I would offer that graphical options for console games aren't a wholly new thing. A bunch of OG PlayStation games had options for this, including G-Police, where you have settings for draw distance and frame rate. As current-generation consoles support display output at 1080p, 1440p and 4K, games having options for quality vs performance were an inevitable development.

I don't see this trend reverting.
I had no idea G-Police did that. But it is a grand exception for sure. Different graphical options only started becoming a standard feature after the mid-gen upgrades.
 
You enjoy most games much more if they have higher frame rates. This has been settled.
Most of my friends play more than me and don't care... I even think they can tolerate 20fps, since they still play games in 2023 at ultra settings on a 970/980 at 1080p, because it runs fine for them.
 
Most of my friends play more than me and don't care... I even think they can tolerate 20fps, since they still play games in 2023 at ultra settings on a 970/980 at 1080p, because it runs fine for them.
20 fps is not something that "runs fine". If it "runs fine for them" they have pretty low standards, but for most people I'd like to think that 20 fps is totally unacceptable.
 
20 fps is not something that "runs fine". If it "runs fine for them" they have pretty low standards, but for most people I'd like to think that 20 fps is totally unacceptable.

You'd like to think that, but most people probably don't know what they're experiencing. They might notice it, but they don't have the vocabulary or the experience to express it.
 