Digital Foundry Article Technical Discussion [2022]

Like seriously, the PC gaming situation is in such a bad state. Even if I buy the best GPU today, what's the point? Yeah, sad... just sad.
The thing is.. 90% of all of this stuff is avoidable. It literally all comes down to Quality Assurance. Nobody expects perfection. We don't need a game to be perfect. No console game is perfect either... We need it to be much better than it is.

It just seems like a complete lack of due diligence... to ensure that the basic functionality of your game works correctly.

It's important to note that when I speak about QA, I don't mean just the people who test. There may be a "QA department"... but that doesn't mean QA doesn't apply to every single person working at these studios. QA goes all the way up to management, who are ultimately responsible. So when I say "How did QA not catch this"... I mean, how did nobody up the entire chain stop this from releasing in that state?

So this completely speaks to a much broader issue with PC gaming. Consoles have certification requirements which need to be passed, and while those may not directly cover performance... if there IS a technical issue with a game, it literally costs them money to patch it... so they have more incentive to put out a quality product the first time around. On PC, this doesn't exist. They can release the game in any state and patch to their hearts' content after the fact...

But mostly, this speaks to an apparent lack of care. I don't know how else to say it. It seems like they just don't care... and you hear that sentiment from console fanboys who love hating on the PC platform for all its issues... but sadly, you almost have to admit they're right. Developers don't put nearly the amount of care into these games on PC as they do on consoles. There's a variety of reasons for that... and you know, PC gamers have largely accepted that... however, we're now at the point where even the most BASIC assurance of quality isn't being met... and that's why it's such a big issue now.

It simply has to change.
 
When Alex says he can't even remember an Unreal Engine title that came out without problems: Tiny Tina :D Man, that game is so smoooooth. It compiles its shit at startup and then it's buttery smooth. I was reveling in its smoothness when I played it. It felt so good. And it's so annoying that this was a thing to be happy about, instead of being normal.
 
Same reason games sponsored by Nvidia will not have FSR. I think all these companies need to get over themselves and allow uniformity of technology across all PC hardware.

Withholding others' tech is not some kind of marketing gimmick that shows yours is better. These days it just makes you look worse.
NFS was "sponsored" by Nvidia but shipped with FSR 2.2 and DLSS 3.0, I'm sure they don't disallow use of other tech tbh. FSR2 also benefits older Nvidia GPUs so in a way it's a positive for Nvidia as well (as long as DLSS remains the best option for RTX GPUs). Same with Spider-Man, shipped with both DLSS 3.0 and FSR2 (and XeSS!).
 
When Alex says he can't even remember an Unreal Engine title that came out without problems: Tiny Tina :D Man, that game is so smoooooth. It compiles its shit at startup and then it's buttery smooth. I was reveling in its smoothness when I played it. It felt so good. And it's so annoying that this was a thing to be happy about, instead of being normal.
Exactly!

I can remember another one though. Tales of Arise was another UE4 game which performed silky smooth for me. The only stutter that would happen in that game was a little half-frame loading stutter when approaching the edge of the current map to go to another area, probably just a checkpoint save. There was no compilation stuttering at all in that game, and it was glorious.

That is how it should be, and I'll wait as long as I have to before I get into the game, to have that.
 
Good to at least own up to it. Certainly better than Capcom's usual radio silence for months when a patch is needed.


Early reports from the Steam forums are that the patch released last night didn't do much. Still doesn't look like it's doing any precompile stage.😢

Edit: Or maybe it does now?

Neowin said:
"Thanks for your patience. A PC patch is now available to improve gameplay stuttering issues due to shader compilation," Striking Distance said in a tweet. "After updating, you may see temporary stuttering in the game menu the first time you launch the app."


As for why this fix wasn't included at launch, Glen Schofield, director of The Callisto Protocol and creator of Dead Space, has alluded to this being the fault of a wrong file accidentally being included with the release build due to "someone rushing."

The quick update seems to have at least resolved a majority of shader compilation-induced stutters going by user reports. As for performance, the studio added that further optimization-focused updates will be rolling out in the near future. The Callisto Protocol's Steam user reviews are slowly improving already too, with it now touting a "Mixed" rating, up from the launch day's "Mostly Negative" reception.
 
Exactly!

I can remember another one though. Tales of Arise was another UE4 game which performed silky smooth for me. The only stutter that would happen in that game was a little half-frame loading stutter when approaching the edge of the current map to go to another area, probably just a checkpoint save. There was no compilation stuttering at all in that game, and it was glorious.

That is how it should be, and I'll wait as long as I have to before I get into the game, to have that.

Days Gone is another one I think; there were stuttering issues in that game early on, but they weren't necessarily related to shader compilation. One part of it was due to their shitty vsync implementation; forcing vsync through Nvidia's control panel irons it out 99% for me. You can see it compiling shaders at the loading screen with RivaTuner by watching the CPU usage when you update drivers.

More often than not, when I've seen a UE4 game actually implement pre-compiling in a later patch, we're talking about adding ~1-2 minutes to the load time, or just high CPU usage for a bit at the start screen. Sackboy, Psychonauts 2, and The Ascent behave like this. I understand every engine is different and shader requirements, especially with UE4, can vary drastically. I just haven't seen a UE4 game that has implemented this step where it's an arduous 20-30 minute process due to Unreal Engine's unprecedented level of shaders; more often it's "Oh, my startup screen is choppy for a bit after I updated my drivers", and gameplay is smoothed out 99% as a result.

Maybe Forza is the worst example, since it does it even if you just change graphics settings, but the longest compile stages I've seen are in games that actually aren't on UE4. The most egregious example I can think of is Detroit: Become Human, which has a ~10 minute unskippable compile stage, but it's not UE4.

I understand that the way UE4 does it means it's almost impossible to capture every shader permutation, but there is a hell of a difference between 0% and 95%. If Callisto compiled 95% of its shaders in a pre-compile stage, the difference would be a few Steam forum threads asking "Hey, anyone get these occasional odd stutters?", instead of the tsunami of "Unplayable" reviews. As you said, no one expects perfection; hell, it's extremely rare to get perfect framerate consistency over an entire play session on any platform.
 
NFS was "sponsored" by Nvidia but shipped with FSR 2.2 and DLSS 3.0, I'm sure they don't disallow use of other tech tbh. FSR2 also benefits older Nvidia GPUs so in a way it's a positive for Nvidia as well (as long as DLSS remains the best option for RTX GPUs). Same with Spider-Man, shipped with both DLSS 3.0 and FSR2 (and XeSS!).
Should be mandated for all games imo. Just would be better for everyone
 
Days Gone is another one I think; there were stuttering issues in that game early on, but they weren't necessarily related to shader compilation. One part of it was due to their shitty vsync implementation; forcing vsync through Nvidia's control panel irons it out 99% for me. You can see it compiling shaders at the loading screen with RivaTuner by watching the CPU usage when you update drivers.

More often than not, when I've seen a UE4 game actually implement pre-compiling in a later patch, we're talking about adding ~1-2 minutes to the load time, or just high CPU usage for a bit at the start screen. Sackboy, Psychonauts 2, and The Ascent behave like this. I understand every engine is different and shader requirements, especially with UE4, can vary drastically. I just haven't seen a UE4 game that has implemented this step where it's an arduous 20-30 minute process due to Unreal Engine's unprecedented level of shaders; more often it's "Oh, my startup screen is choppy for a bit after I updated my drivers", and gameplay is smoothed out 99% as a result.

Maybe Forza is the worst example, since it does it even if you just change graphics settings, but the longest compile stages I've seen are in games that actually aren't on UE4. The most egregious example I can think of is Detroit: Become Human, which has a ~10 minute unskippable compile stage, but it's not UE4.

I understand that the way UE4 does it means it's almost impossible to capture every shader permutation, but there is a hell of a difference between 0% and 95%. If Callisto compiled 95% of its shaders in a pre-compile stage, the difference would be a few Steam forum threads asking "Hey, anyone get these occasional odd stutters?", instead of the tsunami of "Unplayable" reviews. As you said, no one expects perfection; hell, it's extremely rare to get perfect framerate consistency over an entire play session on any platform.
100%

Yeah, it basically just makes the initial startup take a bit longer... like Gears 5, with high CPU utilization while in the menus. Sackboy and Psychonauts 2 are the same, yep. Just a quick few seconds where the thing is choppy as the intro videos play, and then it's good to go pretty much right after it gets to the main menu. That's how it is on my 12-core CPU anyway. Other CPUs may not be quite that quick.
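For anyone wondering what that startup pass boils down to, here's a minimal sketch of the idea (placeholder types and a stand-in compile call, not any engine's real API): walk a known list of shader/PSO permutations across worker threads while the menu is up, so gameplay never hits a first-draw compile.

```cpp
// Sketch only: compile_pso() stands in for the real pipeline-creation call
// (e.g. ID3D12Device::CreateGraphicsPipelineState / vkCreateGraphicsPipelines).
#include <atomic>
#include <chrono>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

struct PsoDesc { std::string name; };  // hypothetical permutation key

static void compile_pso(const PsoDesc& d) {
    std::this_thread::sleep_for(std::chrono::milliseconds(5));  // pretend this is the slow part
    std::printf("compiled %s\n", d.name.c_str());
}

int main() {
    std::vector<PsoDesc> permutations;
    for (int i = 0; i < 200; ++i)
        permutations.push_back({"material_variant_" + std::to_string(i)});

    std::atomic<size_t> next{0};
    unsigned hw = std::thread::hardware_concurrency();
    unsigned workers = hw > 1 ? hw - 1 : 1;  // leave a core free so the menu stays responsive

    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t)
        pool.emplace_back([&] {
            for (size_t i = next++; i < permutations.size(); i = next++)
                compile_pso(permutations[i]);
        });
    for (auto& t : pool) t.join();  // a real menu would poll progress instead of blocking
    std::puts("warm-up finished");
}
```

On a 12-core part this finishes quickly because the work parallelizes almost perfectly, which matches the "high CPU in the menus for a bit, then done" behaviour described above.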

Detroit's lengthy shader compilation is due to how they authored the shaders in the first place for console... obviously without a port in mind at the time. The game uses a forward renderer and the shaders contained all the lighting code... and on top of that they have tens of thousands of materials, which all have unique shader variants, and they didn't have time to reauthor them to share as many materials as possible and keep the count down.

And yep, that's exactly what I'm saying. Not getting absolutely everything is understandable... the odd stutter here and there that happens once is nothing... and the thing is, developers can find out which shaders still cause the hitching, build a little environment that loads up at the beginning while it's compiling all the other shaders, and draw them there too... That's the "hack" that many developers talk about on Twitter for getting those pesky shaders that UE can't pre-compile until draw.
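A rough sketch of that warm-up-scene trick, with made-up material names and a placeholder draw call rather than any real UE4 API: issue one off-screen draw per straggler shader during the loading screen, so the driver builds those pipelines before real gameplay ever needs them.

```cpp
#include <cstdio>
#include <string>
#include <vector>

struct Material { std::string shader; };  // hypothetical: one entry per problem shader

// Stand-in for submitting a single draw into a tiny off-screen render target.
static void draw_offscreen(const Material& m) {
    std::printf("draw-time compile triggered early for %s\n", m.shader.c_str());
}

int main() {
    // The handful of shaders QA/telemetry found still hitching at draw time.
    std::vector<Material> stragglers = {
        {"decal_blend"}, {"skinned_translucent"}, {"gpu_particle_lit"}};

    // Run this during the loading screen, rendering into a 1x1 target nobody sees.
    for (const Material& m : stragglers)
        draw_offscreen(m);

    std::puts("straggler pipelines are now warm");
}
```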

It all comes down to how much the developers care to put in the effort to get it to that level of quality. Sadly... we have a long way to go... but there definitely are SOME developers out there who do go above and beyond to ensure the experience is as smooth as possible. We can get it to be a lot better than it is.
 
Detroit's lengthy shader compilation is due to how they authored the shaders in the first place for console... obviously without a port in mind at the time. The game uses a forward renderer and the shaders contained all the lighting code... and on top of that they have tens of thousands of materials, which all have unique shader variants, and they didn't have time to reauthor them to share as many materials as possible and keep the count down.

Oh yeah, to be clear - that's a good thing! They thought about it from the start, which is why it shipped with that precompile process; I was just using it as an example of the 'worst case' with respect to compile time, and it's still an overall positive. David Cage's writing is far more detrimental to the game.

And yep, that's exactly what I'm saying. Not getting absolutely everything is understandable... the odd stutter here and there that happens once is nothing... and the thing is, developers can find out which shaders still cause the hitching, build a little environment that loads up at the beginning while it's compiling all the other shaders, and draw them there too... That's the "hack" that many developers talk about on Twitter for getting those pesky shaders that UE can't pre-compile until draw.

That's similar to how Horizon Zero Dawn tackles it. When you first load it up after a driver update you can jump right into the game, but when you continue, your initial load will be 2-3x longer as it generates the shaders for the area around where you last saved. Then, while you're playing in that area, it's busy compiling the shaders for the whole map on lower-priority threads so they don't affect your gameplay. This approach becomes even more applicable when you consider that most games can barely take advantage of anything beyond 6 cores; especially with Raptor Lake and even budget CPUs now having 10+ cores, this is work you should be able to pass off to the background without really affecting the main worker threads.
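A tiny sketch of that split (nothing to do with Guerrilla's actual code; compile() is a stand-in): block the initial load only on the shaders for the area around your save, then finish the rest of the map on a background thread. In a real engine you'd also drop that thread's OS priority (e.g. SetThreadPriority on Windows) so it never steals time from the game threads.

```cpp
#include <chrono>
#include <cstdio>
#include <string>
#include <thread>
#include <vector>

static void compile(const std::string& shader) {
    std::this_thread::sleep_for(std::chrono::milliseconds(2));  // stand-in for real compilation
    std::printf("compiled %s\n", shader.c_str());
}

int main() {
    std::vector<std::string> near_save   = {"terrain_a", "foliage_a", "player_outfit"};
    std::vector<std::string> rest_of_map = {"terrain_b", "water", "snow", "machines"};

    // These make the "continue" load 2-3x longer, but gameplay starts clean.
    for (const auto& s : near_save) compile(s);
    std::puts("load complete, player is in the world");

    // Everything else trickles in behind gameplay; a real engine would run this
    // on low-priority threads so the main/worker threads are unaffected.
    std::thread background([rest_of_map] {
        for (const auto& s : rest_of_map) compile(s);
        std::puts("whole-map shader set ready");
    });

    // ... gameplay happens here ...
    background.join();
}
```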

That's the ideal way to handle it, but at this point precompiling just at the start of the game for the entire world would be a huge improvement already.

That does nothing. There are some solid UE4 games out there that deserve consumer support. Instead, one should avoid the games which don't get the support to fix what's wrong (regardless of the engine).

Yeah, there's little point to even mention this stuff if we just want to write them off from the outset. Sackboy devs heard the outcry and now that it's fixed, it's one of the better PC ports in recent memory.
 
That motion capture execution is superb! Amazing detail in the virtual performances. Confused by the scope of differences between PS5 and XBS though. Like, put it in UE4, press build, it runs. What are the optimisations a dev can do that mean a game running on Zen 2 & RDNA 2+ can perform so differently? It's not like there's a hardware advantage on PS5 that is being leaned on here, like faster storage. Save for faster CPU clocks. :-?

Unfortunately, it doesn't quite work like that except for the simplest projects, and even then it often doesn't. While you can have things run decently in small projects with default build settings in UE4, that's generally not what happens once a project really starts being developed. Hence why even small indie projects on UE4 that aren't visually demanding can often run like poo.

At some point, most developers that want their project to run well in real time, with users controlling the action, are going to have to optimize their code, assets, and art. Optimizations are generally targeted at a specific hardware platform (for example, an indie developer's personal machine). This means that while the game will run well on the platform it was optimized for, it may not run optimally on other platforms.

BTW - this would also apply to Unity and any other engine out there.

On PC you can see this where most titles are optimized for NV hardware and thus run less than optimally on AMD hardware. Conversely, there are a small number of titles that are better optimized for AMD hardware and run less than optimally on NV hardware. I always get a chuckle out of people who are outraged that a title that runs well on AMD hardware gets benchmarked because it was "optimized" for AMD hardware, yet turn a blind eye to titles that run well on NV hardware being used in benchmarks because they were optimized for NV hardware. :D

Anyway, this is why the primary development platform often has an influence on how well a game runs on any given set of hardware. The primary development platform generally has had the most development time spent on it and has the most optimizations. Not many developers have the manpower, expertise, or time to optimize equally for all platforms.

Regards,
SB
 
NFS was "sponsored" by Nvidia but shipped with FSR 2.2 and DLSS 3.0, I'm sure they don't disallow use of other tech tbh. FSR2 also benefits older Nvidia GPUs so in a way it's a positive for Nvidia as well (as long as DLSS remains the best option for RTX GPUs). Same with Spider-Man, shipped with both DLSS 3.0 and FSR2 (and XeSS!).
Most DLSS games have received FSR1 and/or FSR2 with no problems; it's mostly the AMD-sponsored titles that forbid the use of DLSS (games like Godfall, RE Village, Far Cry 6, AC Valhalla, RE2, RE3, Sniper Elite 5, etc.).
Avoid UE4 based titles.....
Sadly this can't be done; by the numbers, most titles year after year are UE4 titles. The better option would be to avoid DX12 in UE4 and stick to DX11 unless you are doing ray tracing, as the compilation problem is way, way worse in DX12 compared to DX11 (under DX11 the driver compiles and caches shaders itself, while DX12 leaves pipeline state creation entirely up to the game).

That's the ideal way to handle it, but at this point precompiling just at the start of the game for the entire world would be a huge improvement already.
I say, give me a third option as well: don't compile shit, just ship the game with a generic version of the shaders (one for NV and one for AMD) and run the game without compiling anything at all, reduced performance and everything... If the user isn't satisfied with the level of performance he's getting, he can activate the shader warm-up process (pre-compiling), at which point the game should take as long as it needs to thoroughly compile all of the shader permutations. The user won't feel bad, since he's doing it by choice, for the sake of the ultimate fps.
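One way to read that suggestion, as a small sketch with hypothetical names (no engine does exactly this out of the box): every draw falls back to a precompiled generic shader whenever the specialized variant isn't in the cache yet, and the full permutation grind only happens if the user flips a "pre-compile shaders" toggle.

```cpp
#include <cstdio>
#include <string>
#include <unordered_set>
#include <vector>

struct ShaderCache {
    std::unordered_set<std::string> compiled;
    bool has(const std::string& v) const { return compiled.count(v) != 0; }
    void build(const std::string& v) { compiled.insert(v); }  // stand-in for the slow compile
};

static void draw(const std::string& variant, const ShaderCache& cache) {
    if (cache.has(variant))
        std::printf("draw %s with its specialized shader\n", variant.c_str());
    else
        std::printf("draw %s with the generic fallback (slower, but no hitch)\n", variant.c_str());
}

int main() {
    ShaderCache cache;
    std::vector<std::string> all_variants = {"rock_wet", "metal_rt", "skin_sss"};

    draw("rock_wet", cache);  // out of the box: everything uses the generic fallback

    bool user_opted_in = true;  // the "shader warm-up" toggle in the options menu
    if (user_opted_in)
        for (const auto& v : all_variants) cache.build(v);  // take as long as it takes

    draw("rock_wet", cache);  // from here on, the specialized variant is used
}
```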
 
Most DLSS games have received FSR1 and/or FSR2 with no problems; it's mostly the AMD-sponsored titles that forbid the use of DLSS (games like Godfall, RE Village, Far Cry 6, AC Valhalla, RE2, RE3, Sniper Elite 5, etc.).

Yup, these may be timed-exclusivity deals (which is still bad, as most dev teams move on to other projects), but whatever agreement AMD has with these studios seems far more restrictive about adding DLSS than Nvidia-sponsored titles are about adding FSR.

Even just RE Village moving from its current FSR1 to FSR2 would be a huge improvement, but you know - Capcom.
 
Yup, these may be timed-exclusivity deals (which is still bad, as most dev teams move on to other projects), but whatever agreement AMD has with these studios seems far more restrictive about adding DLSS than Nvidia-sponsored titles are about adding FSR.

Even just RE Village moving from its current FSR1 to FSR2 would be a huge improvement, but you know - Capcom.

Alternatively, and this would apply especially to Japanese dev studios, they'll implement whichever version works on the widest variety of hardware possible. So, if they only want to budget for one implementation, it's generally going to be FSR, as anyone with a recent graphics card will be able to enable it. You don't need to implement DLSS in order for as many people as possible to be able to enable upscaling if they want to.

OTOH - if you have DLSS and want as many people as possible to be able to use some form of upscaling, then you also have to implement FSR.

To put it another way.
  • Dev. A is required to implement DLSS due to an agreement with an IHV.
    • To allow other people to be able to use quality upscaling, they have to implement FSR or, alternatively, XeSS.
  • Dev. B is required to implement FSR due to an agreement with an IHV.
    • To allow other people to be able to use quality upscaling they have to do ... nothing.
For dev. A, implementing FSR or XeSS is required if they want more people to have quality upscaling available; it's not really an option to skip both. For dev. B, they don't need to do anything to give as many people as possible access to quality upscaling; DLSS is an option for them which some might budget for and some might not.
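To make the asymmetry concrete, here's a toy selection routine (the capability checks are hypothetical stand-ins, not real SDK queries): DLSS only lights up on RTX-class hardware and XeSS is at its best on Arc, while FSR 2 is the one option that covers everyone.

```cpp
#include <cstdio>

enum class Upscaler { DLSS, XeSS, FSR2 };

struct GpuInfo {
    bool has_tensor_cores;  // RTX-class NVIDIA hardware
    bool is_intel_arc;      // XMX path for XeSS
};

// Hypothetical policy: prefer the vendor-specific option, fall back to FSR 2,
// which any recent GPU can run.
static Upscaler pick_upscaler(const GpuInfo& gpu) {
    if (gpu.has_tensor_cores) return Upscaler::DLSS;
    if (gpu.is_intel_arc)     return Upscaler::XeSS;
    return Upscaler::FSR2;
}

static const char* name(Upscaler u) {
    switch (u) {
        case Upscaler::DLSS: return "DLSS";
        case Upscaler::XeSS: return "XeSS";
        default:             return "FSR 2";
    }
}

int main() {
    GpuInfo gtx1070{false, false};  // pre-RTX card: only FSR 2 is on the table
    GpuInfo rtx4080{true, false};   // gets DLSS, and can still offer FSR 2 as well

    std::printf("GTX 1070 -> %s, RTX 4080 -> %s\n",
                name(pick_upscaler(gtx1070)), name(pick_upscaler(rtx4080)));
}
```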

I haven't paid too much attention to how well XeSS runs on lower-end non-Intel hardware compared to how well FSR runs on lower-end hardware, so it's possible that might be an alternative for developers who want to have upscaling available to as many people as possible.

Regards,
SB
 
Sadly this can't be done; by the numbers, most titles year after year are UE4 titles. The better option would be to avoid DX12 in UE4 and stick to DX11 unless you are doing ray tracing, as the compilation problem is way, way worse in DX12 compared to DX11.
Nah, DX11 is bad. I literally tried The Callisto Protocol with it, and the stutters are still there... performance can even be way worse due to bottlenecks with that API.

DX12 is almost universally better than DX11 for me these days. When DX12 is implemented properly, it simply provides a better experience...

Of course, this likely comes down to developers putting more optimization effort into the newer DX12 code path than into DX11 when a game happens to ship with both.

I'm sure I'll get some hate for this post, but it's been my experience for a while now.
 