Digital Foundry Article Technical Discussion [2022]

Remnant: From the Ashes, Beat Saber, Escape from Tarkov (THE best shooter period, IMO), Subnautica, Cities: Skylines, Risk of Rain 2, Rust, Pillars of Eternity, Gunfire Reborn, Pokemon Go, etc...

And of course, there's a plethora of 2D games made using Unity. The Ori games, Hollow Knight, Hearthstone, Disco Elysium, RimWorld, etc...

Granted, most of them are probably more prominent on PC or only exist on PC, and many are made by indie developers, but I'd argue many are also better than most AAA games. :p

Regards,
SB

I have some of those but not most - do they exhibit shader stutter in your experience? Really at this point I'm curious to find one Unity game that shows this.

Disco Elysium is a stutter fest - but that's because the devs left Unity's physics and character update rate at its default 50Hz, so switching your screen to 50Hz solves it.

I mean, it's not critical to this type of game specifically, and Unity defaulting to that odd frequency is, well, bizarre - but here's an example of devs who spent years crafting a unique, heartfelt experience and somehow didn't notice that their game was running at 50Hz that entire time, or just didn't care. Not every issue is attributable to unworkable platform complexity, people; devs can occasionally miss some pretty obvious stuff.
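
(For reference, 0.02s - i.e. 50Hz - genuinely is Unity's default fixed timestep. Here's a minimal sketch of the pattern, generic C++ rather than Unity's actual code, showing why a 50Hz simulation judders on a 60Hz display:)

Code:
#include <chrono>

// Classic fixed-timestep loop, the same pattern Unity's FixedUpdate uses.
// With dt = 0.02s (50Hz) on a 60Hz display, some rendered frames run two
// simulation steps and some run none, which reads as judder unless the
// renderer interpolates between simulation states.
int main() {
    using clock = std::chrono::steady_clock;
    const double fixedDt = 0.02;   // Unity's default Fixed Timestep (50Hz)
    double accumulator = 0.0;
    auto previous = clock::now();
    bool running = true;

    while (running) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // At 60Hz this inner loop executes 0, 1 or 2 times per frame:
        while (accumulator >= fixedDt) {
            // stepSimulation(fixedDt);  // physics / character movement
            accumulator -= fixedDt;
        }
        // render();  // without interpolation, motion visibly stutters
    }
}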
 
It's been too long since I played Remnant: From the Ashes to say whether there were shader stutters or not, but I don't recall any.

I played Risk of Rain 2 and Gunfire Reborn a LOT, both in beta and after release, and those didn't have shader stutter that I can recall. That's important, since both rely heavily on FAST mobility combined with quick aiming and the ability to dodge enemy attacks.

Escape from Tarkov is basically still early access (BETA). In alpha/early BETA the engine was a bit rough but I didn't play enough back then to attribute that to any particular source other than early game code. It runs pretty well now, albeit the engine is still pretty demanding. I love the game, but I haven't played it as much as I'd like due to its PvPvE game focus. I'm getting too old for competitive PvP.

Regards,
SB
 
Point is... Unreal Engine shader stutter seems to come down to who puts in a bit of effort to remember to build a PSO cache... and then actually remember to integrate it into the final build.

That's the piss off about it. Probably 95%+ of these stutters are avoidable. The engine has the capability... We've seen it time and time again where developers have fixed stuttering, or avoided it completely using the engine.

And at the end of the day... again... they are putting out a product. I don't care what it costs them... In the case of Sackboy.. how could they not have known it ran like absolute garbage? It's the first thing you experience when booting it up. So either they didn't test the game... or they just didn't care. And regardless of which one it is... it's not acceptable and has to change.
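
For what it's worth, UE4 ships the machinery for this: record PSOs during playtests, cook the stable cache into the build (r.ShaderPipelineCache.Enabled=1), then hold players on a loading screen until precompilation finishes. Roughly like this - FShaderPipelineCache is the real UE4 API, but the loading-screen class and FinishLoading() are made up for illustration:

Code:
#include "ShaderPipelineCache.h"

// Hypothetical loading-screen tick: burn through the shipped PSO cache
// behind a load screen instead of compiling at first draw.
void FMyLoadingScreen::Tick(float DeltaTime)
{
    // Compile as fast as possible while the screen hides the hitches.
    FShaderPipelineCache::SetBatchMode(FShaderPipelineCache::BatchMode::Fast);

    if (FShaderPipelineCache::NumPrecompilesRemaining() == 0)
    {
        // Drop back to background batching for any late-streamed PSOs.
        FShaderPipelineCache::SetBatchMode(FShaderPipelineCache::BatchMode::Background);
        FinishLoading();  // hypothetical: dismiss the loading screen
    }
}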
 
In the case of Sackboy.. how could they not have known it ran like absolute garbage? It's the first thing you experience when booting it up. So either they didn't test the game... or they just didn't care. And regardless of which one it is... it's not acceptable and has to change.

Playing devil's advocate, it's possible that they did test the game, but that nobody thought to test it on a fresh install of the game. Amateurish, sure, as that's one of the first things a quality QA department or company should do, but I can see something like that slipping through if the QA was done entirely in house with people who don't do QA for a living. I could even see it slipping through Sony PlayStation QA (if they had a hand in testing it at any point) just because they are still relatively new to PC.

Regards,
SB
 
Playing devil's advocate, it's possible that they did test the game, but that nobody thought to test it on a fresh install of the game. Amateurish, sure, as that's one of the first things a quality QA department or company should do, but I can see something like that slipping through if the QA was done entirely in house with people who don't do QA for a living. I could even see it slipping through Sony PlayStation QA (if they had a hand in testing it at any point) just because they are still relatively new to PC.

Regards,
SB
The results are in, and Shaderbot has determined that's highly unlikely 🤖

Shaderbot says it would have become immediately apparent once the game launched. They would have known, because he mentioned multiple times in the forums that it would have stutter... even before the game launched. A patch released... and there was no fix or word about it. It was only after Alex made that video... and put them side by side, which made the PC version look absolutely terrible in comparison... that anything was done about it.
 
Like I've said in this thread, I have a high end gaming pc and play games on it. I like pc games! I like that people play them, and that people make games for them. I think it's indisputable that the platform overall -- the whole intersecting set of closed or open standards, OS developers, drivers, api design, and, yeah, the modularity -- all produce a platform that undermines games in a lot of ways. Some of that comes from the modularity.

If you could buy a pc with a handful of the features I called out -- a game console style OS, a single hardware configuration (or, really, just a small double-digit number of them) for tens of millions of users, a set api and driver -- I would feel very differently.

But PCs -- the platform created and maintained by all of the interlocking set of stakeholders who make pc hardware and software -- don't and can't have those features.




Being part of a modular platform (and being mostly consigned to the low end) (and running on general purpose operating systems) (and so on) means that nobody is really taking advantage of unified memory, where it's available on pcs, the way developers do on consoles. And, yeah, the ps3 only failed at one of the things I rattled off and (for this reason among others) completely failed to deliver reliable performance or a stable dev environment. I still think it was designed for gaming -- I would think PCs were designed for gaming if everyone agreed to make them hit most of those features, but not all -- but the ps3's legacy is not that of a super well designed gaming machine.



PCs expand it a lot. You guys downplay this often, so I think you just don't realize how bad it is! I have two assumptions here:

1- Maintaining multiple ways to do the same thing in a game engine is very costly -- not just in terms of hard work, but in terms of stability, QA coverage, just the amount of information and context switching developers need to switch between code that does the same thing radically differently, and even things like player trust -- how many games with optional dx12 toggles improved people's perception of dx12?
2- Very few of your purchasers for any pc game have the latest hardware generation. This doesn't mean they're all low end users -- lots of people bought a 1080 and then decided not to buy a 2060 or whatever later on.

The combination of these two factors means that any intrusive feature -- anything that requires rewriting large chunks of your engine or renderer -- is just not practical to roll out until ~75% or so of users have access to it. This means features people on here complain about a lot -- mesh shaders, sampler feedback, etc -- are not going to roll out any time soon. On consoles, the number is always 100%. That's a big deal for the industry as a whole!
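
To make that concrete, this is roughly what the PC side looks like - a hedged sketch using the real D3D12 feature-check API (needs a recent Windows SDK; the function name is mine):

Code:
#include <d3d12.h>

// On PC you query per-machine and keep a fallback renderer path alive.
// On console the answer is the same for 100% of the install base.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
    {
        return false;  // old runtime/driver: fall back to the vertex pipeline
    }
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}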





I mean, it might? It's very low to the ground, it's very lightweight, etc etc. It was designed with a different task in mind. You could imagine the opposite too: is a regular commuter hatchback designed for racing? I guess you could put a super fast engine and whatever else in it (I don't know much about cars, sorry) -- a skilled mechanic and driver could make it very fast. However, it's working against its purpose: design elements would be in the way, it might be less stable, etc.



It isn't about the settings -- it's about the API features, see my point above about waiting for the market to have access to mesh shaders.

It feels like you guys are talking past me with some of this. I have played more games on my pc in the last week than on any of my consoles. I'm not a console warrior or a consoles-only player. PCs are a very important part of the ecosystem. But their design as a platform undermines games in many ways.

I also like playing games on mobile, and mobile is also a giant market, but if I said "you know what, there are great gaming experiences on iphone, but mobile phones aren't really designed, maintained, or intended by their stakeholders to be for gaming" I feel like nobody here would raise an eyebrow.

To avoid this developing into an ever growing point for point back and forth, I'll try to boil it down a bit.

You are arguing that consoles are more optimised for gaming than PCs on account of their locked-down configurations and streamlined OS and API software. On this we agree. Consoles are more optimised for gaming, which results in them being somewhat more efficient in some respects than PCs.

You're also arguing that PCs are less fit for gaming on account of the above, but this is where we disagree, because it depends on how you define "fit for gaming". You clearly define it as how easy they are to develop for, and related to that, how easily suboptimal gaming results (like shader compilation stutter) can manifest from the additional required developer effort and/or expertise. Based on that definition alone, I guess I don't disagree. However, from an end user perspective I define "fit for gaming" quite differently. I define it (based on personal preference) as offering the highest levels of gaming performance, the pace at which that performance progresses, the ability to upgrade my gaming performance at whichever point I choose, the ability to customise my gaming experience to my own personal preferences, and yes, even the ability to switch between gaming and non-gaming apps seamlessly.

So based on my definition (and I absolutely understand and accept other end users will have different priorities), the PC is the "most fit for gaming" platform, and if that makes the lives of developers a little more difficult then so be it. And yes, I'll still demand the same levels of quality that the console releases get, because I'm paying for the game just like they are. I'll also expect games to be customised to at least some degree to my platform's unique qualities, just like every other platform user would. Just because the lives of developers would be easier without PCs as a gaming target in no way makes them a less fit platform for gaming in my eyes. Obviously if developers and publishers decided that the extra effort wasn't worth it and stopped releasing games for the PC - which they absolutely could - I'd be inclined to change my mind. But since the literal opposite is happening, with PC now enjoying the widest selection of games across all platforms, I really don't see the developer effort issue as having any impact on my perception of how fit the platform is for gaming.

Here's the crux of it... how easy it is to develop a game for a specific platform does not equate to how fit that platform is for gaming.

Finally, you argued that PCs are not intended for gaming, which again I disagree with. I agree they are not intended specifically for gaming only, and as a result they can be less optimised than consoles for gaming, particularly the software stack. But that doesn't mean they are not intended to be used as gaming machines at all. If that were the case, then why would Direct3D or DirectStorage exist in the first place? Why would products like the 5800X3D or overclocking-friendly motherboards exist, along with a plethora of other non-GPU gaming-focussed hardware? Why would Windows even support game pads? PCs (in certain configurations) are very clearly intended and even tailored for gaming, even before you consider the GPUs. The fact that they can also do other things, are more open and customisable, and can be upgraded on demand makes them less efficient than consoles as pure gaming devices and more difficult for developers to support, but in no way means that they aren't actually intended to be used as gaming machines at all.

I'd suggest if none of the points above resonate with you then we just agree to disagree here.
 
We're talking about shader compilation stutter. Capping the framerate does absolutely nothing for this. These stutters are well beyond 16.6ms.

PC titles in general can have annoying vsync issues on fixed refresh rate displays, no doubt. As someone who games on a TV, I probably use Nvidia's shitty control panel more than most, going in there solely to force vsync/fast sync to fix this, and I constantly have Rivatuner or other fps cap methods at the ready. But those have absolutely no impact on shader stutter.

And yet games that stutter like Stray have noticeably less stutter to me (And others on Twitter) when capping using Rivatuner.

Maybe try it before jumping to the 'You're wrong' mentality.
 
I have some of those but not most - do they exhibit shader stutter in your experience? Really at this point I'm curious to find one Unity game that shows this.

How many DirectX 12 based Unity titles are there? I doubt there are many, if any, especially since Unity only took the DX12 backend out of its experimental state at the end of June. Unless Unity game devs have no problem with such a designation for use in the production of their commercial products.
 
How many DirectX 12 based Unity titles are there? I doubt there are many, if any, especially since Unity only took the DX12 backend out of its experimental state at the end of June. Unless Unity game devs have no problem with such a designation for use in the production of their commercial products.

Since when is shader stutter DX12-exclusive? Regardless, I fail to see the point: the claim was that Unity games exhibit this problem just like UE4 games, not that they might in the future.

And yet games that stutter like Stray have noticeably less stutter to me (And others on Twitter) when capping using Rivatuner.

Maybe try it before jumping to the 'You're wrong' mentality.

I use Rivatuner constantly, along with Special K, various vsync methods, dxvk, etc - I'm very sensitive to frametime consistency and have used every trick in the book to improve it. I mean here is what my Rivatuner profile listing looks like:

[screenshot: Rivatuner profile list]

The very nature of shader stutters, in terms of the millisecond impact they impart, precludes a simple framerate cap from remedying this. Your one-game, anecdotal example notwithstanding.

Rivatuner and other framerate cap methods can help with games that have poor frame pacing (and you're right - it shouldn't be as necessary as it is; so many games have shit framerate caps), and they can potentially reduce stutters caused by other sources - not every stutter you see is due to shader compilation. But when a realtime, non-asynchronous shader needs 80ms to compile, a 16ms cap is going to do jack-shit.

Stray, btw, is far from the worst offender wrt shader stuttering anyway; it's always been pretty minor, and I wouldn't tell a PC user to avoid the game over it.
 
Since when is shader stutter DX12-exclusive? Regardless, I fail to see the point: the claim was that Unity games exhibit this problem just like UE4 games, not that they might in the future.

Shader stuttering is not exclusive to DX12, but it's more problematic there, and that API represents some of the worst offenders. DX12 seems to trade heavier runtime compilation for more optimal shaders, while in DX11 the heavy lifting is done offline, which leads to slower shaders.

DX12 vs DX11

Unity games do not seem to exhibit the issue enough to be perceived as problematic the way it is with Unreal-based games (I don't know why anyone would assert such a claim), but that fact alone doesn't establish that the Unity engine is better at avoiding it. It's not like Unity is readily used in AAA, high-end, shader-heavy games.

DX12 just became the default API for Unity on PC, so we will see how Unity games fare, but hopefully this is an Unreal-specific issue, or devs in general come up with a solution that mitigates it in the future.
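
For anyone who hasn't touched the API, the reason this lands on DX12 developers is visible in one call. A minimal sketch (the function name and the rootSig/vsBlob/psBlob parameters are placeholders; a real desc needs more fields filled in):

Code:
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// In D3D12 the whole pipeline state is baked into one immutable object.
// CreateGraphicsPipelineState() is where the driver compiles shaders to
// native GPU code; it can take tens of milliseconds per PSO. Call it
// during a load screen and nobody notices; call it at first draw and
// you get the hitch this thread is about.
ComPtr<ID3D12PipelineState> CreatePso(ID3D12Device* device,
                                      ID3D12RootSignature* rootSig,
                                      ID3DBlob* vsBlob, ID3DBlob* psBlob)
{
    D3D12_GRAPHICS_PIPELINE_STATE_DESC desc = {};
    desc.pRootSignature = rootSig;
    desc.VS = { vsBlob->GetBufferPointer(), vsBlob->GetBufferSize() };
    desc.PS = { psBlob->GetBufferPointer(), psBlob->GetBufferSize() };
    // ... rasterizer, blend, depth/stencil and render-target formats are
    // all fixed up front, which is what lets the driver fully optimize.

    ComPtr<ID3D12PipelineState> pso;
    device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
    return pso;
}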
 
New patch published by Nixxes. It adds improvements to RT. I will be using the DF optimised settings when I start playing it, once I complete the games I still have pending.


Patch Notes
  • Various visual improvements to ray-traced shadows.
  • Improved quality of certain objects in ray-traced reflections.
  • Adjusted lighting in cutscenes to match the original game.
  • Improved cutscene performance.
  • Addressed a bug that could cause image corruption on Intel ARC GPUs when using Dynamic Resolution Scaling.
  • Stability improvements and optimizations.
 
Shader stuttering is not exclusive to DX12, but it's more problematic there, and that API represents some of the worst offenders. DX12 seems to trade heavier runtime compilation for more optimal shaders, while in DX11 the heavy lifting is done offline, which leads to slower shaders.
...
To correct you a bit, DX12 isn't really more problematic. DX12 just leaves more things up to the developer, so they must create a workaround for their problems. If they decide to just use the JIT way ("compile it when it is needed"), then yes, they have the problem. DX12 leaves all the options on the table. Well, DX11 too, but in DX12 there is just more that can and must be optimized by the developer. This is the new freedom. Btw, Vulkan has the exact same problem; this is really not DX12-exclusive.
We "just" need more loading screens to compile the stuff for the system before the game starts. For consoles this is a lot easier, as they can just deliver the result. On PC we would need "results" for almost every configuration, and those wouldn't be future-compatible, so they must be recompiled when needed (or at least when the game is started with new hardware).

New patch published by Nixxes. It adds improvements to RT. I will be using the DF optimised settings when I start playing it, once I complete the games I still have pending.


Patch Notes
  • Various visual improvements to ray-traced shadows.
  • Improved quality of certain objects in ray-traced reflections.
  • Adjusted lighting in cutscenes to match the original game.
  • Improved cutscene performance.
  • Addressed a bug that could cause image corruption on Intel ARC GPUs when using Dynamic Resolution Scaling.
  • Stability improvements and optimizations.
wow, "ARC"-specific bugfixes ... I guess we are still far away from stable APIs (Directx/OpenGL/Vulcan) that the drivers deliver the same result for every possible command.
 
wow, "ARC"-specific bugfixes ... I guess we are still far away from stable APIs (Directx/OpenGL/Vulcan) that the drivers deliver the same result for every possible command.

That's a vendor (Intel) specific implementation issue and not an API issue. If it was because of an unstable API then the bugfix would be API specific rather than vendor hardware specific.

Regards,
SB
 

Mind-blowing. Games from the golden era in today's next-generation graphics; it's perfect. Games today are kinda meh; only the graphics are nice these days, if even that.
Didn't Quake have an RT version that DF featured already? (I haven't watched this new video atm.) I recall there was one game which had two different RT iterations, but for the life of me I don't remember whether it's Quake or not.
 
Didn't Quake have an RT version? I recall there was one game which had two different RT iterations, but for the life of me I can't recall whether it's Quake or not.

Quake 2 had a ray tracing version (fully path traced). The first Quake got one about a month ago, I think; it's DF covering it today in full.
 
Quake 2 had a ray tracing version (fully path traced). The first Quake got one about a month ago, I think; it's DF covering it today in full.
Hmmm, I know about Quake 2 RTX, but there was another RT version of Quake, iirc? I remember in the lava boss fight how all the scenery turned into a lava-colored environment. In this one, that effect is a lot more subtle. That's why I think there are two RT implementations from different authors.

I purchased the recent version of Quake and can't wait to try this. The path-traced Quake 2 RTX managed to surprise and impress me, not because of the graphics, but because of the little details of the lighting that could pass for reality, which you don't expect to see in a game.
 