Digital Foundry Article Technical Discussion [2024]

I feel it's worth being as clear as possible when describing the frustration some gamers may have at the current state of optimization, and Alex does that pretty well here (timestamped):
Yes he does indeed.

Yes, game development is eminently complex and is often a tortuous process - which is why it's so frustrating to see the player yanked out of all that effort poured into artwork and story crafting when the delivery of those assets is so fumbled. I can recognize the immense talent it takes to create these experiences, but also recognize that at some point, priorities were, at the very least, misallocated.

I get that some people might not like how some of us come off when talking about this stuff.. but the time for sugarcoating things is at an end. You absolutely hit the nail on the head with this.

It's like where are the priorities here? Something being more complex does not give it an excuse to not function as should be expected. We're in this mess because of complacency on BOTH sides.
 
Something to keep in mind is that while hardware is better now, not all aspects of hardware have scaled proportionally. The throughput of CPUs and GPUs, for instance, has improved at a rate completely detached from the latency of moving data. That latency has arguably regressed in some cases (as a tradeoff). And latency is a real problem when we are discussing the essentially just-in-time, real-time operations that gaming needs.

There's also another angle to consider in terms of hardware, in that leveraging faster hardware to make game development essentially more accessible on the development side at the expense of "optimization" is a good thing for the industry and player base as a whole. Not every developer needing a team of "John Carmack" equivalents to make a game is a good thing for the industry and gamers as a whole.

I know from the idealism standpoint everyone wants it all, but realistically I'd rather have more developers making more games and pushing the scope and new things at the expense of optimization than a scenario of fewer, more conservative games that are optimized.
 
Something being more complex does not give it an excuse to not function as should be expected.

Exactly. The solution for increased complexity is some combination of better talent, more money, more time or reduced ambition. The solution is not for paying customers to accept a low quality product and make excuses for the seller.
 
Exactly. The solution for increased complexity is some combination of better talent, more money, more time or reduced ambition. The solution is not for paying customers to accept a low quality product and make excuses for the seller.
Yep. I say it time and time again. What other products do people use where if it doesn't work as intended they reason with themselves saying "you know, I bet it's really hard to make and design something like that... I'll let it slide"...

Never.

They aren't releasing games so people can appreciate how complex game development is.... they develop games for the player to have fun. Stutters and other shit reduce that.
 
There's also another angle to consider in terms of hardware, in that leveraging faster hardware to make game development essentially more accessible on the development side at the expense of "optimization" is a good thing for the industry and player base as a whole. Not every developer needing a team of "John Carmack" equivalents to make a game is a good thing for the industry and gamers as a whole.

True, I'm sure there are plenty of games we wouldn't have seen without the emergence of engines like UE4 and purchasable off the shelf materials outlets. I've purchased and enjoyed a lot of games that were made by smaller and emerging studios that ultimately had solid production values that likely wouldn't have been possible otherwise.

However, that just places more of the onus on the experienced engine developer to assist/educate these more neophyte developers in critical areas of performance, especially the ones that will be paramount in every game, regardless of the design. Shipping a game without a shader precompilation stage should have required extra effort on the developer's side, not the other way around! If you design and market your product to be used by those without extensive experience, then it's on you to provide the guardrails, and sound the klaxon when a game is being built with those guardrails removed.

Another example of this I think might be the Unity engine. Its early problem of the 50Hz update tick rate is blatantly obvious to me and most here, and it's hard to imagine games taking years in development with no one thinking "hey, why does my game have a constant judder?" - but plenty of them still shipped as such. On the other hand, does it have that rep for shader stutter? I don't own that many Unity games, but the ones I do - the Ori series, Hollow Knight, Cuphead, Yooka-Laylee - none have shader stutter, and that includes the ones that shipped with that 50Hz issue (Disco Elysium, Dave the Diver). So presumably Unity, for all its faults, handles shader compilation automatically, or at least better than UE4 did.
 
Games get more complex every year. If modern games used the limited material systems of the past, maybe we wouldn't see the same issues.

Don't have much to disagree with in your post (albeit I think "git gud" may be a little bit of an oversimplification of those who are frustrated, perhaps).

On this point in particular though: sure, the complexity of materials and thus shader length has significantly increased - but even in the DX9 days, shader precompilation was still required, and most games did indeed do it! Remember that you had far weaker CPUs at the time as well, but games were precompiling all over the place - it was just stuck in loading screens, if the step wasn't made explicit (like Call of Duty).

Some games didn't, but they were still relatively rare and most gamers didn't know what shader compilation stuttering was regardless so they're not held up as bad examples. One such game series that did have it pretty prominently - Batman! (UE3, hmm). Arkham Asylum had a little bit of it, but Arkham City and Arkham: Origins have a load of it on a fresh install, and if you look back at forum threads on those games at the time, people of course were full of bogus recommendations to 'fix it' - but the cause of a lot of it was indeed shader stuttering, which is why you can get near flawless performance on all of them by using DXVK with asynchronous shader compilation.

One game in the series doesn't have it - Arkham Knight. Plenty of other performance issues of course, but it's the one game in the series which actually precompiles shaders on a new install, funny enough.
 
I'm not a developer but I work closely with teams that have them. I don't often see a reason to blame developers for sloppy work or cutting corners. That's a function of project management and product management and the tiers above that.

Lack of resources, unrealistic deadlines, bad internal communication, business priorities, these are rarely directly under the purview of the actual developers. At the end of the day, if a game ships in an unpolished state, someone either did not bother to verify that it was really ready, or they weighed the pros and cons and decided to push ahead regardless. In most cases that someone is something other than a developer.
 
How often do we see shader compilation addressed *after* the fact these days... still.. only once a shitstorm has been raised..

Just because games have traversal and shader compilation stuttering doesn't mean that it's because things are "too complex" to not have the issue.. I maintain that most of it is a lack of adequate QA process, or an issue with time, or priority. You will never be able to convince me that at least some developers/publishers out there aren't like "F-it, ship it anyway it's good enough".. It undoubtedly happens for a multitude of reasons.

OF COURSE developers are going to tell you it's a complexity issue, game dev is difficult.. etc. etc... of course there's always going to be some excuse. But that doesn't change what the basic expectations of a game are. We can all understand high fidelity games running at lower resolutions, lower framerates... but a game is simply not supposed to hitch and stutter as you move through it. No developer will ever say that's part of the design. We know that... so our expectations should be set as such.
 
I'm not a developer but I work closely with teams that have them. I don't often see a reason to blame developers for sloppy work or cutting corners. That's a function of project management and product management and the tiers above that.

Lack of resources, unrealistic deadlines, bad internal communication, business priorities, these are rarely directly under the purview of the actual developers. At the end of the day, if a game ships in an unpolished state, someone either did not bother to verify that it was really ready, or they weighed the pros and cons and decided to push ahead regardless. In most cases that someone is something other than a developer.
Project managers are developers.

All of those reasons... are perfectly valid reasons for issues in games.... however it's absolutely our right to call out why BASIC functionality is being compromised before anything else.
 
We've always had titles that gamers felt were poorly optimized for the target hardware in the past, yes. But this issue of stuttering is indeed relatively new, at least at the frequency we're seeing in modern titles. This is a distinct performance phenomenon, separate from frame rate drops. Hell, with the proliferation of VRR displays, frame rate drops are probably less impactful than ever before, but stutters - pauses in frame delivery of 50-300ms or more - are actually relatively rare in older titles despite the 'average' benchmark of those games likely being far below modern titles even on middling rigs. It's a new level of immersion disruption.
It's worth noting that this is primarily because older games just... didn't stream much at all. Ultimately a lot of stutter is due to streaming "things"... textures, shaders, meshes, worlds, even game logic. As part of the push for both larger and higher fidelity games, it is no longer possible for these games to just load a level into memory and sit on it until the next load screen. The push to eliminate load screens has also complicated matters, even for relatively linear games. When you go back to the 90s games you do indeed ditch the stutter but you get the load screens back.

None of this is to say that stuttering is acceptable. Nor are - IMO in 2024 - load screens (hi Starfield). But to frame this as "90s games had solved this problem and we somehow forgot how to do it" is misunderstanding the issues; 90s games simply did not have these problems because their scope was much smaller. It's not that they had magical streaming systems that worked great, they simply didn't really have streaming systems at all.

I think we're likely (hopefully?) at a local minimum of a bunch of this though. Folks are transitioning from older style baked, statically loaded stuff to newer much more dynamic systems and streaming, but don't necessarily have a lot of the experience yet necessary to tune (or even in some cases, use) the new systems. Ex. I am flabbergasted every time a new game ships without virtual textures. Texture streaming is basically solved (and no, we don't need sampler feedback ;))... just turn it on. Nanite pretty much solves geometry/mesh streaming, although simpler systems can work fine for that too. PSOs are still a bit of a mess but a workable and understood mess I think. Raytracing is probably the biggest offender as it's just not possible to do efficient fine-grained BLAS/TLAS streaming on PC right now.

That said, I'd wager that the majority of stutter issues that are still happening with games that understand how to use the above features are actually from the gameplay and physics side. In Unreal you can get into a lot of trouble if you just naively use blueprints and spawn and destroy actors all over the place. This is nothing new, and more experienced folks know how to deal with it via more native C++ custom gameplay systems, but newer folks that are heavier on the art department can certainly fall into traps in this area. These systems have seen some improvements at an engine level but fundamentally stuff like blueprints are never going to be appropriate for things with 1000s or 10's of thousands of instances. Systems like MASS (as used in CitySample for the traffic and crowds and so on) are more meant for those kinds of cases, but I would generally recommend and expect AAA folks to roll their own stuff at this level, because gameplay needs and systems are obviously very game dependent.

Anyways the tldr is I think it's completely legit to call it unacceptable to have persistent stutters and other interruptions, although like anything sometimes the hyperbole goes a bit far. But certainly when comparing to past games it's not that previous games had solutions to this stuff, they just did not target the scope required to even encounter the problems. Folks making smaller games with more linear, static content also don't tend to run into these "massive open world" kind of issues to nearly the same extent.

We can all understand high fidelity games running at lower resolutions, lower framerates... but a game is simply not supposed to hitch and stutter as you move through it.
I mean it's not supposed to run and look terrible either, at least as long as your hardware hits the minimum requirements. Stick to your guns here - as a consumer it's always fine to call out stuff that you don't think lives up to your required standards; you no more need to make apologies for framerates than stutter (particularly on consoles).
 
It's worth noting that this is primarily because older games just... didn't stream much at all. Ultimately a lot of stutter is due to streaming "things"... textures, shaders, meshes, worlds, even game logic. As part of the push for both larger and higher fidelity games, it is no longer possible for these games to just load a level into memory and sit on it until the next load screen. The push to eliminate load screens has also complicated matters, even for relatively linear games. When you go back to the 90s games you do indeed ditch the stutter but you get the load screens back.

None of this is to say that stuttering is acceptable. Nor are - IMO in 2024 - load screens (hi Starfield). But to frame this as "90s games had solved this problem and we somehow forgot how to do it" is misunderstanding the issues; 90s games simply did not have these problems because their scope was much smaller. It's not that they had magical streaming systems that worked great, they simply didn't really have streaming systems at all.

I think we're likely (hopefully?) at a local minimum of a bunch of this though. Folks are transitioning from older style baked, statically loaded stuff to newer much more dynamic systems and streaming, but don't necessarily have a lot of the experience yet necessary to tune (or even in some cases, use) the new systems. Ex. I am flabbergasted every time a new game ships without virtual textures. Texture streaming is basically solved (and no, we don't need sampler feedback ;))... just turn it on. Nanite pretty much solves geometry/mesh streaming, although simpler systems can work fine for that too. PSOs are still a bit of a mess but a workable and understood mess I think. Raytracing is probably the biggest offender as it's just not possible to do efficient fine-grained BLAS/TLAS streaming on PC right now.

That said, I'd wager that the majority of stutter issues that are still happening with games that understand how to use the above features are actually from the gameplay and physics side. In Unreal you can get into a lot of trouble if you just naively use blueprints and spawn and destroy actors all over the place. This is nothing new, and more experienced folks know how to deal with it via more native C++ custom gameplay systems, but newer folks that are heavier on the art department can certainly fall into traps in this area. These systems have seen some improvements at an engine level but fundamentally stuff like blueprints are never going to be appropriate for things with 1000s or 10's of thousands of instances. Systems like MASS (as used in CitySample for the traffic and crowds and so on) are more meant for those kinds of cases, but I would generally recommend and expect AAA folks to roll their own stuff at this level, because gameplay needs and systems are obviously very game dependent.

Anyways the tldr is I think it's completely legit to call it unacceptable to have persistent stutters and other interruptions, although like anything sometimes the hyperbole goes a bit far. But certainly when comparing to past games it's not that previous games had solutions to this stuff, they just did not target the scope required to even encounter the problems. Folks making smaller games with more linear, static content also don't tend to run into these "massive open world" kind of issues to nearly the same extent.


I mean it's not supposed to run and look terrible either, at least as long as your hardware hits the minimum requirements. Stick to your guns here - as a consumer it's always fine to call out stuff that you don't think lives up to your required standards; you no more need to make apologies for framerates than stutter (particularly on consoles).
Yea I really do understand that it's not so clear cut.. however my issue is when games clearly don't seem to consider it properly and design the game around it to an extent. Those loading screens of the past were literally THE consideration for it.. lol. Like big traversal stutters right within the center of a zone where you're fighting enemies or something shows no consideration. There's lots of games which will design their levels around those issues so that they don't happen directly when you're actively engaging the game.

Like I say.. I know it comes off brash most of the time. Most gamers realize it's not realistic to expect something to be flawless.. there's always going to be compromises, but we also have to have a basic standard of expectations. Like with shader comp stuttering... a small hitch here or there isn't a massive deal... but having a hitch every time you see something or do something new.. nah. It's the same with traversal hitches. If I'm running between two big areas and the game thoughtfully plans out when that stutter is going to happen.. then I'm cool with it. Yep, it's loading. Yep, it's saving. No problem. But inside of an area where I'm engaging enemies? That's not good. In Dead Space Remake it's damn near unbearable as enemies chase you through loading zones, which are constant.. and it just feels terrible.

I have no doubt in my mind that it doesn't have to be that bad.. it just is.. for reasons.
 
I have no doubt in my mind that it doesn't have to be that bad.. it just is.. for reasons.
Oh yeah to be clear I don't think anyone would claim these are unsolvable problems or anything like that. Even in the context of big open worlds it's mostly a matter of careful feature planning and time for proper polish. I'm merely providing the context that the severity of these problems has increased mostly because of the scope increase, not because of people somehow forgetting what to do. Perhaps obviously, it requires significantly more time and resources to plan for and solve general streaming problems in a big open world game than it did to have some load screens in a linear one in the 90s.
 
If we're talking about addressing it from the design and scope perspective, the solution, at least for the time being, could be to reduce scope (essentially) as you propose. But is that the preferred solution for everyone? I feel consumer choice should apply here as well: some people can pick more conservative, highly polished games while others can pick more esoteric (basically) less polished games, and by polish I don't just mean performance either.

I think it's worth remembering that there's been plenty of past favorite games that people have loved that were far from polished. Since Stalker is kind of in the discussion now, do some of you remember how those games played (without mod fixes as well) for the hardware of the time?

Since the streaming issue was brought up, it's worth circling back to how much of a problem this is due to non-uniform advances in hardware. Memory, for example, has not scaled remotely close to compute. A 16x scaling in memory to 128GB this generation could likely alleviate technical challenges on the software side and maintain design scope, but what the industry has to work with is 16GB.
 
I think it's worth remembering that there's been plenty of past favorite games that people have loved that were far from polished. Since Stalker is kind of in the discussion now, do some of you remember how those games played (without mod fixes as well) for the hardware of the time?
Yep, I think it's fine that both exist. I'll cite that jank that is basically every Bethesda game as an example of stuff that many people look past the technical issues because the gameplay is worth it for them. This may be sacrilegious in a DF thread but I'll also note that a lot of my friends played through Jedi Survivor and loved it and just didn't notice any of the issues that have become a bit of a meme lately. I played through it too and while Koboh was a bit rough in places the majority of the game was honestly fine (on my admittedly high end rig). 🤷‍♂️ And do we really want to be a community that tells people they are wrong for enjoying something? I certainly don't.

Conversely there's a large number of nice polished AAA games that I just don't care for the gameplay of, of course. So technical polish is something I generally value a lot, but it's not the only thing that matters. That said, like any other aspect of a game it's completely legit to give feedback on it.
 
Oh yeah to be clear I don't think anyone would claim these are unsolvable problems or anything like that. Even in the context of big open worlds it's mostly a matter of careful feature planning and time for proper polish. I'm merely providing the context that the severity of these problems has increased mostly because of the scope increase, not because of people somehow forgetting what to do. Perhaps obviously, it requires significantly more time and resources to plan for and solve general streaming problems in a big open world game than it did to have some load screens in a linear one in the 90s.
I get you. That's basically why I give the response I do when people bring up the complexity excuse. Like, I could plan to build a house, mess up..go over budget, and then cut corners to finish it too... This is the issue. You best make sure your foundation is solid before you go expanding on it.

I get that people want to push boundaries and that it comes with compromises.. but you have to make the right choices if you have to choose.
 
That latency has arguably regressed in some cases (as a tradeoff). And latency is a real problem when we are discussing the essentially just-in-time, real-time operations that gaming needs.
Can you elaborate on this further? What do you mean by latency here? Memory latency?
 
Yep, I think it's fine that both exist. I'll cite that jank that is basically every Bethesda game as an example of stuff that many people look past the technical issues because the gameplay is worth it for them. This may be sacrilegious in a DF thread but I'll also note that a lot of my friends played through Jedi Survivor and loved it and just didn't notice any of the issues that have become a bit of a meme lately. I played through it too and while Koboh was a bit rough in places the majority of the game was honestly fine (on my admittedly high end rig). 🤷‍♂️ And do we really want to be a community that tells people they are wrong for enjoying something? I certainly don't.

Conversely there's a large number of nice polished AAA games that I just don't care for the gameplay of, of course. So technical polish is something I generally value a lot, but it's not the only thing that matters. That said, like any other aspect of a game it's completely legit to give feedback on it.

Play Ratchet and Clank: Rift Apart if you haven't! It's the exact same genre as Survivor, and while I had a perfectly decent time with Survivor, Rift Apart packed the same amount of gameplay, story, and variety into 8 hours instead of 30, and it's an absolute blast start to finish. I felt like I reached the final boss before I knew what was happening, and well before I even considered wanting it to end.

That being said, I'd say Bethesda, topic at hand, actually did overreach what their decade+ of underinvestment in tech could support with Starfield. The game has a myriad of problems stemming from the lead design down, but even if a better lead were there, some of the fundamental tech problems - like being entirely unable to convey a spaceship actually landing on a planet - would've held it back regardless. "Fast travel menu: the game" is not the concept people have in mind when "explore a space opera universe" is what's sold to them. And while The Elder Scrolls 6 won't encounter such problems and could be fun either way, I do hope for the devs' sake at Bethesda that they've switched to UE5. At this point their "Creation Engine" is a relative liability from a usability standpoint when trying to get a studio of 300+ people to turn out a "AAA" open world game compared to better tools.
 
Yep, I think it's fine that both exist. I'll cite that jank that is basically every Bethesda game as an example of stuff that many people look past the technical issues because the gameplay is worth it for them. This may be sacrilegious in a DF thread but I'll also note that a lot of my friends played through Jedi Survivor and loved it and just didn't notice any of the issues that have become a bit of a meme lately. I played through it too and while Koboh was a bit rough in places the majority of the game was honestly fine (on my admittedly high end rig). 🤷‍♂️ And do we really want to be a community that tells people they are wrong for enjoying something? I certainly don't.

Conversely there's a large number of nice polished AAA games that I just don't care for the gameplay of, of course. So technical polish is something I generally value a lot, but it's not the only thing that matters. That said, like any other aspect of a game it's completely legit to give feedback on it.

One of the all-time most popular games on Steam had its peak when it was janky and buggy because it was fun. Of course I am talking about PUBG. Still heavily played (not so much in North America) I think six years later?

I can tell you I’m way more sensitive to stuttering than the group I play with. I’m always optimizing, trying to get the smoothest frame times and they just play. One guy has a nearly identical pc to me and will say a game is running well, but I’m always checking performance and can see when it’s not.

Personally I will wait on games that I know have poor performance or issues that will bother me. Even for me PUBG was an exception. We have a lot of inside jokes about our time on that game.

I hope there’s a shift in the industry to handling common issues. Maybe DF will get us there. I do also remember tons of games with terrible performance and problems since I started gaming on pc in the early 90s. I think people are overstating the general quality of older games. I do think the complexity of modern games is a reason and not an excuse.
 
I do hope for the devs' sake at Bethesda that they've switched to UE5. At this point their "Creation Engine" is a relative liability from a usability standpoint when trying to get a studio of 300+ people to turn out a "AAA" open world game compared to better tools.

I've difficulty accepting that statement when UE4 RPGs don't exactly blow Creation Engine's skirt off. Not sure we have a UE5 example yet, with its better support for open world titles?
 
Can you elaborate on this further? what do you mean by latency here? memory latency?

Likely referring to memory latency. Cache hierarchies have built up in CPUs because memory latency hasn't kept pace with CPU frequencies. The relative cost of a cache miss is going up over time in terms of clock cycles lost for doing actual work. It could be in Jason Gregory's Game Engine Architecture book where there's a chart demonstrating how they diverge over time. So it's this weird problem where game data sets are getting bigger and bigger, and transforms are getting more and more complex, but the penalty for waiting on a read from memory is also going up and up.

Not that it's an overall explanation for all kinds of stutters from shader compilation or level streaming.
 