Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
Things have moved on a lot though. If you were to design a system for the purpose of playing games, a PC doesn't miss much despite its legacy. It could be more efficient, but then perhaps so could the consoles, which have made hardware sacrifices in favour of supporting easier software development. The end result is that the consoles are using hardware that's basically a match for PCs, showing it's only the software that's making the difference.
Ehh, I disagree with this. At this point, with the increasing complexity of games and higher expectations, it's far less efficient for anybody -- devs and users alike -- for console publishers to have weird console hardware like in the past. And I doubt it would even be possible. What kind of bespoke parts could even compete with the kind of power off-the-shelf GPU and CPU components have nowadays?

MS's entry into the market with the OG Xbox showed how capable PC hardware in a box could be, and there was no turning back. Nintendo wasn't keeping pace with the cutting edge of technology to begin with, and Sony, despite having impressive first-party games after a few years, learned how much standardized semi-custom parts mattered -- not just to developers but to end users -- with virtually every third-party game running like crap by comparison to the 360 version. Users could feel that in how devs prioritized it.

Dropping Ken Kutaragi-style hardware and going x86 was the most efficient and smartest decision for Sony. Devs hated that hardware, users hated it, and on top of that there was literally no point anymore when only a small percentage of devs on the platform would utilize the modest benefits of such tech.

So that's why with PS4 and PS5 they prioritized hardware differences in more standardized areas that could be used by all devs to some degree: investment in a decent GPU for PS4, SSD tech for PS5, etc. Even if only first parties will use the SSD fully in their engines, it's still something every dev can use without breaking their back, and gamers can see what the SSD can do regardless.
 
Ehh, I disagree with this. At this point, with the increasing complexity of games and higher expectations, it's far less efficient for anybody -- devs and users alike -- for console publishers to have weird console hardware like in the past. And I doubt it would even be possible. What kind of bespoke parts could even compete with the kind of power off-the-shelf GPU and CPU components have nowadays?
I don't understand what you are disagreeing with. Your post is all about how bespoke hardware is out and standard PC parts are better. The quote you are disagreeing with is saying the consoles are PC hardware. The only bit that makes sense to disagree with is the idea that gaming hardware could be better if bespoke, which I explicitly couched in a 'perhaps', as that's not the meat of the argument nor a debate worth having IMO. IF more bespoke gaming hardware is possible, it's eschewed anyway in favour of developer convenience, such as Cerny's theoretical 1 TB/s scratch-pad RAM design.
 
I don't understand what you are disagreeing with. Your post is all about how bespoke hardware is out and standard PC parts are better. The quote you are disagreeing with is saying the consoles are PC hardware. The only bit that makes sense to disagree with is the idea that gaming hardware could be better if bespoke, which I explicitly couched in a 'perhaps', as that's not the meat of the argument nor a debate worth having IMO. IF more bespoke gaming hardware is possible, it's eschewed anyway in favour of developer convenience, such as Cerny's theoretical 1 TB/s scratch-pad RAM design.
Yeah. I just disagree with the idea that bespoke hw makes sense at all as games evolve. Exact opposite of efficiency
 
Yes, of course the modularity serves a whole host of purposes, but that doesn't mean that one of those purposes isn't gaming. Due to its modular nature you can indeed build a PC that's wholly unsuited to gaming. However, you can also build a PC that's extremely well suited to gaming -- not merely because it has a big, powerful GPU in there, but because it has a gaming-oriented CPU, motherboard, SSD, memory, etc. Just because the system can be customised for one task does not mean it's unsuited to another task when configured differently. That's the whole point of modular systems.
Like I've said in this thread, I have a high-end gaming PC and play games on it. I like PC games! I like that people play them, and that people make games for them. But I think it's indisputable that the platform overall -- the whole intersecting set of closed or open standards, OS developers, drivers, API design, and, yeah, the modularity -- produces a platform that undermines games in a lot of ways. Some of that comes from the modularity.

If you could buy a PC with a handful of the features I called out -- a game-console-style OS, a single hardware configuration (or really, just a small double-digit number of them) for tens of millions of users, a set API and driver -- I would feel very differently.

But PCs -- the platform created and maintained by all of the interlocking set of stakeholders who make pc hardware and software -- don't and can't have those features.


Low-end PCs using APUs absolutely do. And yet they are clearly less well suited to gaming than high-end systems. Of course I understand that unified memory makes the dev's life simpler -- which is great -- and it's easier to screw up performance with discrete memory pools. But there are also clear performance advantages to discrete memory pools in the form of higher overall bandwidth and capacity, with no contention between the CPU and GPU.

In addition, PS3 and some earlier consoles did not have unified memory, so are we to say that those games consoles weren't intended for gaming?

Being part of a modular platform (and being mostly consigned to the low end) (and running on general-purpose operating systems) (and so on) means that nobody is really taking advantage of unified memory, when available on PCs, the way they are on consoles. And, yeah, the PS3 only failed at one of the things I rattled off and (for this reason among others) completely failed to deliver reliable performance or a stable dev environment. I still think it was designed for gaming -- I would think PCs were designed for gaming if everyone agreed to make them hit most of those features, but not all -- but the PS3's legacy is not that of a super well designed gaming machine.


It's a fact of life that devs will always need to cater for multiple target platforms with different feature sets. PCs expand that somewhat (depending on how widely the net is cast), but that doesn't make the platform less fit or unintended for gaming - it's simply a side effect of the upgradable nature of the platform.
PCs expand it a lot. You guys downplay this often, so I think you just don't realize how bad it is! I have two assumptions here:

1- Maintaining multiple ways to do the same thing in a game engine is very costly -- not just in terms of hard work, but in terms of stability, QA coverage, just the amount of information and context switching developers need to switch between code that does the same thing radically differently, and even things like player trust -- how many games with optional dx12 toggles improved people's perception of dx12?
2- Very few of your purchasers for any pc game have the latest hardware generation. This doesn't mean they're all low end users -- lots of people bought a 1080 and then decided not to buy a 2060 or whatever later on.

The combination of these two factors means that any intrusive feature -- anything that requires rewriting large chunks of your engine or renderer -- is just not practical to roll out until ~75% or so of users have access to it. This means features people on here complain about a lot -- mesh shaders, sampler feedback, etc -- are not going to roll out any time soon. On consoles, the number is always 100%. That's a big deal for the industry as a whole!



Why does this mean they aren't intended for gaming? If you want to run a bank database on your gaming PC while playing a game (lol) then go right ahead. Your performance will be lower, perhaps even unplayable, but that's a user choice, not something forced upon you because the system itself is unsuitable for gaming. That's like saying a Lamborghini is unsuitable for racing because in theory I can choose to fit a tow bar and tow a caravan with it during a race, and it would therefore perform worse than a car without a tow bar.

I mean, it might? It's very low to the ground, it's very lightweight, etc. It was designed with a different task in mind. You could imagine the opposite too: is a regular commuter hatchback designed for racing? I guess you could put a super fast engine and whatever else in it (I don't know much about cars, sorry) -- a skilled mechanic and driver could make it very fast. However, it's working against its purpose: design elements would be in the way, it might be less stable, etc.


That's the whole reason for scalability in games. You seem to be operating from the position of users *should* be locked down to a specific hardware configuration and specific developer selected settings, and only that makes for a genuine/fit gaming system.
It isn't about the settings -- it's about the API features, see my point above about waiting for the market to have access to mesh shaders.
Essentially, you're saying here that a platform which has more development challenges on account of its open and scalable nature is less well suited for and not intended for gaming, despite its capability to provide both a better, more flexible and more customizable gaming experience for the end user which is exactly what a large segment of gamers want. I'd have to disagree. You're absolutely correct that consoles are preferable to PC's as a target platform from a developer point of view.
It feels like you guys are talking past me with some of this. I have played more games on my pc in the last week than I have any of my consoles. I'm not a console warrior or a consoles only player. PCs are a very important part of the ecosystem. But their design as a platform undermines games in many ways.

I also like playing games on mobile, and mobile is also a giant market, but if I said "you know what, there are great gaming experiences on iphone, but mobile phones aren't really designed, maintained, or intended by their stakeholders to be for gaming" I feel like nobody here would raise an eyebrow.
 
People - if you use RivaTuner Statistics Server to cap your framerate, it can reduce or eliminate the stuttering.

It works so well that it constantly has me questioning why driver-level vsync or in-game vsync doesn't work anywhere near as well.

We're talking about shader compilation stutter. Capping the framerate does absolutely nothing for this. These stutters are well beyond 16.6ms.

PC titles in general can have annoying vsync issues on fixed-refresh-rate displays, no doubt. As someone who games on a TV, I probably use Nvidia's shitty control panel more than most, going in there solely to force vsync/fast sync to fix this, and I constantly have RivaTuner or other fps cap methods at the ready. But those have absolutely no impact on shader stutter.
 
The argument that shader compilation stutter is simply inherent to the PC as a platform would be more convincing if the vast, vast majority of this problem didn't manifest with one particular graphics engine.

The very nature of how UE4 handles materials can make this exponentially worse, yes. But it's clear that the step of PSO gathering was not communicated as urgently as it needed to be for developers, either through documentation or through the editor itself.
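For anyone unfamiliar with what "PSO gathering" buys you: the documented idea is to record the pipeline state objects used during test playthroughs, ship that list, and compile it during loading so draw-time cache misses (the stutters) never happen. Here's a toy sketch of the caching idea only -- every name below is invented; this is not Epic's editor workflow or any real graphics API:

```python
# Toy model of PSO caching (all names hypothetical). A "compile" stands in
# for an expensive driver-side pipeline compile that can take tens to
# hundreds of milliseconds -- far more than one 16.6 ms frame.

class PipelineCache:
    def __init__(self):
        self._cache = {}              # state description -> "compiled" pipeline
        self.compiles_during_play = 0 # each one of these is a visible hitch

    def _compile(self, state):
        # Placeholder for the expensive compile step.
        return f"pipeline<{state}>"

    def warm_up(self, recorded_states):
        # Load-screen pass over states recorded in earlier play sessions
        # (this is the part "PSO gathering" makes possible).
        for state in recorded_states:
            self._cache[state] = self._compile(state)

    def get(self, state):
        # Called at draw time; a miss here is the mid-gameplay stutter.
        if state not in self._cache:
            self.compiles_during_play += 1
            self._cache[state] = self._compile(state)
        return self._cache[state]

cache = PipelineCache()
cache.warm_up(["opaque_lit", "skinned_lit", "translucent"])
for s in ["opaque_lit", "skinned_lit", "opaque_lit", "translucent"]:
    cache.get(s)
print(cache.compiles_during_play)  # 0: every draw hit the warmed cache
```

The compile cost is paid once per state either way; gathering just moves it from mid-gameplay to a load screen, which is the whole ask in this thread.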
 
The argument that shader compilation stutter is simply inherent to the PC as a platform would be more convincing if the vast, vast majority of this problem didn't manifest with one particular graphics engine.

The very nature of how UE4 handles materials can make this exponentially worse, yes. But it's clear that the step of PSO gathering was not communicated as urgently as it needed to be for developers, either through documentation or through the editor itself.
It's equally common with Unity games, for exactly the same reason -- a user-friendly way to write shaders that all depend on an (already very large) set of built-in variants which many indie users don't understand.

Really though there’s a selection bias thing here — any dev outside of Ubisoft/ea who makes their own engine tends to have some combination of these three factors:

1- Their game is probably smaller scope because of the work spent on the engine
2- Their staff is very experienced, or else they wouldn't manage to finish a game with a custom engine
3- Their engine is probably much closer to the technical cutting edge than anybody else — if they weren’t investing in modern approaches and features they would just be using ue4.

Unity, of course, has a similar system that dumps a list of shaders that were compiled during a play through. That’s how I’m confident that kind of approach is not a magic bullet for all games.
 
It's equally common with Unity games, for exactly the same reason -- a user-friendly way to write shaders that all depend on an (already very large) set of built-in variants which many indie users don't understand.

Is it, though? I don't own many games made with Unity, but I do have a few of the most prominent - Ori and the Blind Forest, Ori: Will of the Wisps, Hollow Knight and Cuphead. None of them suffer from this. In fact you can see both Ori games perform their shader pre-compilation after the initial install or after a driver update, the CPU spikes to 100% and the initial load takes 1-2 minutes.

What prominent Unity game has shader compilation stutter?

Unity, of course, has a similar system that dumps a list of shaders that were compiled during a play through. That’s how I’m confident that kind of approach is not a magic bullet for all games.

Which no one has argued. They're asking for the built in function of UE4 to be used to mitigate the problem for the current generation of games as best as it can rather than be outright ignored. Like I said before, the Shader thread makes it clear the complexity of this issue as a whole, we're focusing on UE4 games in this thread.

The Ascent devs by their own admission were very inexperienced. They added this step and fixed the problem. I think we both agree that this step is clearly not as straightforward as it should be, or this problem wouldn't exist for UE4 games as commonly as it does, but you're also speaking like gathering PSOs in UE4 requires having a Tiago Sousa clone on staff.
 
Is it, though? I don't own many games made with Unity, but I do have a few of the most prominent - Ori and the Blind Forest, Ori: Will of the Wisps, Hollow Knight and Cuphead. None of them suffer from this. In fact you can see both Ori games perform their shader pre-compilation after the initial install or after a driver update, the CPU spikes to 100% and the initial load takes 1-2 minutes.

These are 2D games that shouldn't even need to use Unity's built-in PBR shaders -- this is an extreme case of a game being just right for a small number of shader variants. You could support like, <10 shaders only for those games without inconveniencing your art team or altering your process. Even without any care the variant count is probably, like, a few tens of thousands at most
 
These are 2D games that shouldn't even need to use Unity's built-in PBR shaders -- this is an extreme case of a game being just right for a small number of shader variants.

They clearly do require a fair number of shaders, hence the lengthy initial load for the Ori titles. It pegs the CPU longer than Psychonauts 2 does on first load after it was patched for shader precompile for example.

You could support like, <10 shaders only for those games without inconveniencing your art team or altering your process. Even without any care the variant count is probably, like, a few tens of thousands at most

You made the claim this problem is 'equally common with Unity games'. So I ask again: What unity games have this issue? Could you point me to some?

First, can someone list prominent Unity titles? I only recall smaller indie style titles.
I guess this depends on how you define 'prominent' - I'm speaking in terms of popular. Cuphead and Hollow Knight are extremely popular, though they would likely still be considered 'indie'. Fall Guys may be the most popular Unity game; I've never seen shader stutter as a complaint for it, but I don't own it so I can't confirm.

I'm just asking primarily to see what Unity games actually have shader compilation stutter, I mean we're not up in arms about Unreal Engine 4 because some indie title that sold ~1k has shader stutter, it's because big-budget ports from major studios have it too. No game should have it of course, but UE4 wouldn't have this rep among PC users if this was just a sporadic event with tiny indie titles.
 
You made the claim this problem is 'equally common with Unity games'. So I ask again: What unity games have this issue? Could you point me to some?
Unity has this fundamental problem by design. It provides very similar tools to what UE4 does, which is what you're seeing preloaded by Ori. Confusingly, Unity has decided to call the shader compilation process "shader loading" -- here's the relevant doc page: https://docs.unity3d.com/Manual/shader-loading.html
 
Cuphead on the NSW. The devs even talk about addressing it.


“We ran into a slight performance hitch when first instantiating certain enemies in our Run 'n Gun levels. After some digging, we discovered that the shader loading was the culprit. Thankfully, Unity provides the ability to preload shaders using Shader Variant Collections, so although the problem was tricky to identify, it was easy to fix!”
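The fix the Cuphead devs describe -- record which shader variants a playthrough actually uses, then preload that collection at startup -- maps onto Unity's real ShaderVariantCollection mechanism and its WarmUp() call. A minimal abstract sketch of that record-then-preload flow (the function names below are invented for illustration, not Unity's API):

```python
# Hypothetical sketch of the record-then-preload workflow. In real Unity
# the "collection" is a ShaderVariantCollection asset built from variants
# logged during a QA playthrough, and preloading is WarmUp().

def record_playthrough(variants_used):
    # QA pass: note every shader variant the game actually requested.
    return sorted(set(variants_used))

def preload(collection):
    # Startup: compile everything in the recorded collection up front,
    # so the first real use of each variant finds it ready.
    return {variant: True for variant in collection}

draws = ["enemy_flash", "sprite_lit", "enemy_flash", "water_distort"]
collection = record_playthrough(draws)
compiled = preload(collection)
# First in-game use of "enemy_flash" is already compiled: no hitch.
print(all(v in compiled for v in draws))  # True
```

As the quote says, the hard part is identifying which variants to record, not the preload step itself.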
 
Unity has this fundamental problem by design. It provides very similar tools to what UE4 does, which is what you're seeing preloaded by ori.

🙄

Let's recap:

The argument that shader compilation stutter is simply inherent to the PC as a platform would be more convincing if the vast, vast majority of this problem didn't manifest with one particular graphics engine.

You responded with:

It’s equally common with unity games for exactly the same reason

If it was 'equally common', you would have given examples of Unity games with shader stutter by now. You haven't provided one.

Confusingly, unity has decided to call the shader compilation process "Shader loading" -- here's the relevant doc page. https://docs.unity3d.com/Manual/shader-loading.html.

Yes, just as Unreal Engine 4 docs have outlined steps on how to gather PSO's for compiling. We all know this. The frustration here is that some Unreal Engine 4 projects are not utilizing this capability at all, whether it's 100% effective or not is irrelevant.

You were the one advancing the argument that we can't expect small indie developers to deal with this problem in UE4, so you cite Unity as another example where this occurs. Yet, no examples where it does. Instead, you then cite Unity's docs which show the step - which you believe is beyond the capabilities of UE4 Indie devs - that Unity devs are actually using to not have this issue!

I mean really, wtf? Do you even know what you're arguing for at this point?

Cuphead on the NSW. The devs even talk about addressing it.


Cuphead developers said:
“We ran into a slight performance hitch when first instantiating certain enemies in our Run 'n Gun levels. After some digging, we discovered that the shader loading was the culprit. Thankfully, Unity provides the ability to preload shaders using Shader Variant Collections, so although the problem was tricky to identify, it was easy to fix!”

Not entirely clear what point you're making with this, but this is the argument that PC users are also making for UE4 titles, which is "I would like to see the tools the engine provides used to help mitigate the problem".
 
The argument that shader compilation stutter is simply inherent to the PC as a platform would be more convincing if the vast, vast majority of this problem didn't manifest with one particular graphics engine.

The very nature of how UE4 handles materials can make this exponentially worse, yes. But it's clear that the step of PSO gathering was not communicated as urgently as it needed to be for developers, either through documentation or through the editor itself.

And even then developers who treat the PC as a real gaming platform have no problems with something like that. Gears 5 precompiles shaders, for example, and it's using UE4.

Basically you can easily track what games will or won't have shader stutter based on how seriously they treat the PC as a platform. Usually problems arise when a 3rd party is contracted to do a port or a 1st party has very little experience with PC and/or doesn't devote similar dev time to the PC version as they do to the console version.

Hell, even Capcom can do it with precompilation of shaders on game startup with MH: Rise being a recent example. Granted they've been taking the PC development process far more seriously than most Japanese developers, but Japanese devs have historically been pretty bad at ensuring that their PC ports run well.

First, can someone list prominent Unity titles? I only recall smaller indie style titles.

Remnant: From the Ashes, Beat Saber, Escape from Tarkov (THE best shooter period, IMO), Subnautica, Cities: Skylines, Risk of Rain 2, Rust, Pillars of Eternity, Gunfire Reborn, Pokemon Go, etc...

And of course, there's a plethora of 2D games made using Unity. The Ori games, Hollow Knight, Hearthstone, Disco Elysium, Rimworld, etc...

Granted most of them are probably more prominent on PC or only exist on PC and many are made by indie developers, but I'd argue many are also better than most AAA games. :p

Regards,
SB
 
First, can someone list prominent Unity titles? I only recall smaller indie style titles.
Is this Steam curator list accurate? https://store.steampowered.com/curator/31285130-Unity-Engine-Games/

I see some potentially interesting 3D titles like GTFO, Overload, Firewatch, Yooka-Laylee and NASCAR Heat 5. "Isometric" or fairly static overhead 3D (mostly RPGs) like Cities: Skylines, Battletech, Pillars of Eternity 1 & 2, Pathfinder: Kingmaker and Tyranny probably don't push things enough. The VR titles (including Overload) would probably make shader stutter obvious.
 
The argument that shader compilation stutter is simply inherent to the PC as a platform would be more convincing if the vast, vast majority of this problem didn't manifest with one particular graphics engine.

Graphics engines on one particular platform. You know why it's easier to avoid on console: the graphics hardware in the consoles has no variability, whereas on PC it does. It's that simple. On PC, not only does the dev/publisher not know what graphics card you have when you download a game, they don't know what graphics card you may have tomorrow when you next run the game.
 
Graphics engines on one particular platform. You know why it's easier to avoid on console: the graphics hardware in the consoles has no variability, whereas on PC it does.

This is shaders 101, known since the concept was introduced decades ago; it's not relevant to this thread or to my point, which is that this is predominantly occurring with one particular engine on the PC.

So it's not 'that simple', or this would be occurring on nearly every game. Shader compiling is a known quantity on the PC as it always has been, which is why it's been tackled in a myriad of ways. The discussion is about why this requirement seems overlooked when it comes to a certain engine, overwhelmingly.
 
If it was 'equally common', you would have given examples of Unity games with shader stutter by now. You haven't provided one.
Ok - I'm not interested in speculating about specific developers' games that I haven't looked at in a profiler. Many 3D Unity games display all kinds of stutters and hangs on PC -- these could be for a number of reasons, I guess, but given what I know from my professional experience with Unity, I can say with certainty that shader stutter is an unsolved problem on Unity, and prewarming techniques are not as easy as you imagine and not a guarantee of a significant improvement in isolation on every game. If you would like to download some large-scale Unity games -- things like Subnautica, Tarkov, whatever -- and see if there are any mysterious stutters that send you on a witchhunt rage whenever a developer ships a Unity game -- be my guest.

Or, you make or get a job on a reasonably complex game in unity, build it, see how many shader variants are generated, profile it for shader compilation stalls, and take a crack at getting robust and accurate shader variant collections you can pre-compile and avoid any stutters with.
 
and see if there are any mysterious stutters that send you on a witchhunt rage whenever a developer ships a unity game -- be my guest.

I asked for a single example of a game that has this to back up your own claim that this happens all the time in Unity. You couldn't provide one, so now I'm on a 'witchhunt rage'.

When Subnautica was mentioned earlier, I started installing it. Just played 5 minutes of it, obviously with a clean shader cache, as the last time I played it was years ago on a different system. In 5 minutes of performing a suite of actions -- jumping into the water, exploring environments -- I saw one, maybe two frame spikes in RivaTuner, and that's the only way I would have noticed them, as they were so brief. I would not classify that game as having 'shader stutter' due to those ridiculously minute spikes, and I doubt Digital Foundry or anyone else would either. To me, that was a 'locked' 60fps experience.

There is a (well, literal) gulf between that and something like pre-patch Sackboy or Psychonauts 2, where entering a new room or performing an action causes a several-frame stutter countless times in the first 10 minutes. I'm not interested in looking at these games through a profiler either - I'm far more interested in whether their potential compilation issues are actually visible to the player to any significant degree.

You said it was "equally common" in Unity games. That doesn't mean "Unity requires shader precompiles that won't get everything", because no shit. It doesn't mean that Unity requires some consideration from developers on how to keep this issue from affecting games, because again - no shit. It doesn't mean some hypothetical Unity game might exhibit this if only it were complex enough.

What saying "This is equally common on Unity games" means is that games made with Unity - that actually exist, now, in this reality - demonstrate shader stutter just as much as Unreal Engine 4 games. Yet, you can't provide a single example.

We're identifying specific games, made with a specific engine that has this documented problem to a unique degree as well btw, it's literally the opposite of a 'witchhunt' ffs.

Or, you make or get a job on a reasonably complex game in unity, build it, see how many shader variants are generated, profile it for shader compilation stalls, and take a crack at getting robust and accurate shader variant collections you can pre-compile and avoid any stutters with.

Like has been expressed time and time again, which I think you know, no one is expecting 100% coverage. We want there to be an attempt to mitigate the problem. You just keep moving the goalposts. You're wed to this framing of PC users as entitled brats because they don't want the opening 10 minutes of a game to have massive frame rate drops with every new action, and you just keep diverting the argument whenever the actual issues and potential methods to reduce the incidence of this are discussed. You've got your framing and, facts be damned, you're sticking to it.

The goal is for major titles not to act like Switch games running through early dev versions of Yuzu, not the completely unwavering 300fps+ of Doom Eternal. It's not, as you said earlier, that we 'just started noticing' the very concept of shader stutters now and randomly decided to start piling on poor smol indie devs for the hell of it. They're being talked about now because the actual source of the issue has been given attention by a major media outlet, but also because with certain games, usually made with a certain engine, the stutters are incredibly egregious in frequency and in the degree of frametime interruption.

Or, you make or get a job on a reasonably complex game in unity, build it, see how many shader variants are generated, profile it for shader compilation stalls, and take a crack at getting robust and accurate shader variant collections you can pre-compile and avoid any stutters with.

Why? The Unity game devs themselves seem to actually be doing a good job of it. I completely agree I could fuck up a Unity game, really no argument here!

Of course, your argument wasn't that Unity devs have to consider this the same way UE4 devs do (which backs up my point regardless, as this requires work that both engines have tools to help with, and Unity devs in particular seem to have been using them). It was that this problem is inherent to popular mainstream engines, and as such it manifests all the time in Unity games too... yet they actually don't exhibit it to anywhere near the same degree, at least with any examples you can point to.

Oh ok, so you can't name one - but uh, that's only because I haven't played them all 100% and taken a frametime graph of the entire game, and even so it's extra work to avoid them! This is pretty pathetic my dude.
 