Digital Foundry Article Technical Discussion [2022]

Nah, DX11 is bad. I literally tried the Callisto Protocol with it, and the stutters are still there... performance can even be way worse due to bottlenecks with that API.

DX12 is almost universally better than DX11 for me these days. When DX12 is implemented properly, it simply provides a better experience...

Of course this likely comes down to developers always putting more optimization effort into recent DX12 code paths than DX11 if the game happens to ship with both.

I'm sure I'll get some hate for this post, but it's been my experience for a while now.
You also have a 4090. For those of us with older Nvidia GPUs, the DX12 experience is almost always subpar.
 
DX12 is almost universally better than DX11 for me these days. When DX12 is implemented properly, it simply provides a better experience...
That's a big when; most UE4 developers don't implement DX12 properly.
Of course this likely comes down to developers always putting more optimization effort into recent DX12 code paths than DX11 if the game happens to ship with both.
That's the point. In some games, the developer does all the work for one API and leaves the other as a compatibility layer. Shadow of the Tomb Raider is one such example: the devs did everything right for the DX12 path and it ended up much faster than the DX11 path. Hitman 2 and Hitman 3 are glorious examples as well, and I think Callisto Protocol is one too. But many games show the opposite: the devs port the game to DX12 with minimal effort, without making it any faster than DX11, and the end result is DX12 being slower than DX11. Even worse, in such cases when you have to use DXR, you pay the penalty of using DX12 first (with no visual improvements), and then you pay the penalty of using DXR on top (with the ray-traced visual improvements).

To list a few examples from UE4: MechWarrior 5, Fortnite, Deliver Us The Moon, Bright Memory, Bright Memory Infinite, Hellblade, The Medium, Observer System Redux, The Ascent, Little Nightmares II, Amid Evil, Loopmancer, Chernobylite, and many others.

Examples from non-UE4 titles: RE2, RE3, RE7, RE8, Saints Row 2022, Battlefield V, Watch Dogs Legion.

On the other end of the spectrum, there are games that do everything right, where DX12 performance is roughly equal to DX11 before DXR is activated. Examples include Ghostwire: Tokyo, Ghostrunner, Metro Exodus, Control, and Dying Light 2. In these games, the devs use multiple heavy ray tracing effects, so they optimize the DX12 path to its fullest potential to squeeze out every last fps once DXR is activated.

Outside of DXR, we have some major wins of DX12 over DX11 in games like The Division 2 and Tiny Tina's Wonderlands, but let's not go down that path too much please, because then I will be talking about the hordes of games where DX11 is spectacularly faster than DX12, and it will get ugly.
 
Indeed, but remember, I said "these days"... These days, if a game has both and you're on relatively modern hardware, DX12 is usually better. I find that the people who believe DX11 is better in any real tangible way are often experiencing placebo. They simply think it's better... because it's not DX12... lol
 
Granted, RE:Village just using FSR 2 over its current FSR 1 would be a huge improvement, but you know - Capcom.
Could be worse. The Street Fighter collection on PC doesn't let me properly remap controls and always makes me play as player 2 in offline matches. Having less-than-ideal reconstruction options is annoying, but not being able to play as player 1 in a fighting game is inexcusable.
 
Indeed, but remember, I said "these days"... These days, if a game has both and you're on relatively modern hardware, DX12 is usually better. I find that the people who believe DX11 is better in any real tangible way are often experiencing placebo. They simply think it's better... because it's not DX12... lol

If this isn't a 'placebo' then, can you name titles where this is the case? I don't doubt some exist across the entire history, as @DavidGraham illustrated there are a few like SOTT (funny enough, even Rise of the Tomb Raider, one of the first DX12 titles, is also one where DX12 is demonstrably superior, especially in CPU usage) - but even today I still find it's a relative rarity for the DX12 path to be more performant than DX11 if the game offers both, in my experience.

You want DX12 of course for enabling RT, but I haven't really seen any firm trend - still! - where a game that offers both APIs shows a distinct advantage to DX12. Not every developer is Nixxes, as we all know too well.

Like even in Sackboy, DX11 can show an advantage. It's very slight, but it's there. Resuming a game and sitting at the world map, no RT enabled on either API of course, with the settings I play at:

DX11: 91 fps
DX12: 85 fps

There's also Psychonauts 2:

DX11: 85 fps
DX12: 75 fps

Sure - very marginal. But these are titles where, outside of the initial shader compilation issues, no one would say these were unoptimized DX12 games - and yet there's an advantage. To be fair though, I haven't played these games that much in DX11 - perhaps they're less consistent; maybe that's your experience? If that's the case then I certainly wouldn't care about a slight performance advantage, as I care far more about consistency.

But overall, I just haven't seen a pattern where DX12 should be the obvious choice if the game offers you DX11 as an option and you don't need RT.
 
I haven't paid too much attention to how well XeSS runs on lower-end non-Intel hardware compared to how well FSR runs on lower-end hardware, so it's possible that might be an alternative for developers that want to have upscaling available to as many people as possible.

Regards,
SB

It's pretty dire. XeSS just really doesn't offer anything for non-Intel hardware at the moment; the performance hit (or lack of uplift) is just too large, especially considering the quality on non-Intel hardware to boot. For Arc GPUs it's solid - outside of those, it's often the worst performer while also having the worst image quality.

As for the calculation that developers have to make when supporting FSR vs. DLSS, that would make sense if there was a considerable effort required to support each, but everything I've read gives the impression much of the work you need to do for a game to enable any form of temporal reconstruction is translatable to both. Hell, DLSS is literally offered as a plug-in for UE4. Nvidia also offers Streamline, a framework which abstracts the calls to the necessary resources that each reconstruction SDK requires:

[Image: diagram of Nvidia's Streamline framework]

Now it's not universal mind you - there are exceptions like Uncharted 4 where FSR 2 is actually superior to DLSS, but far more often than not, when a game offers both, DLSS gives superior image quality and performance for Nvidia hardware - and like it or not, that's by far the majority of the market. So I'm not so sure the decision to have just FSR 2 in a game can be entirely rationalized as developers simply making a cost/benefit calculation. I think you're possibly overestimating the cost of implementation and underestimating the benefit of DLSS as a separate option for Nvidia owners.
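For what it's worth, here's roughly what that shared plumbing looks like from the engine side. This is a minimal, hypothetical C++ sketch - the Upscaler interface and both backend class names are made up for illustration and are not the real DLSS, FSR 2, or Streamline APIs - but the point stands: both SDKs consume essentially the same per-frame inputs.

```cpp
// Hypothetical sketch: a thin engine-side abstraction over temporal upscalers.
// Names (Upscaler, DlssBackend, Fsr2Backend) are illustrative only and are
// not the actual DLSS / FSR 2 / Streamline APIs.
#include <cstdint>
#include <memory>

struct GpuTexture;  // opaque handle to an engine render target

// The per-frame data every temporal reconstruction SDK needs.
struct UpscaleInputs {
    GpuTexture* color;          // jittered, low-resolution scene color
    GpuTexture* depth;          // matching depth buffer
    GpuTexture* motionVectors;  // per-pixel motion vectors
    float jitterX, jitterY;     // sub-pixel camera jitter this frame
    float exposure;             // or an exposure texture
    uint32_t renderWidth, renderHeight;
    uint32_t outputWidth, outputHeight;
    bool resetHistory;          // camera cut / teleport
};

class Upscaler {
public:
    virtual ~Upscaler() = default;
    virtual void Evaluate(const UpscaleInputs& in, GpuTexture* output) = 0;
};

class DlssBackend : public Upscaler {
public:
    void Evaluate(const UpscaleInputs& in, GpuTexture* output) override {
        // ...forward 'in' to the DLSS SDK (or Streamline) here...
        (void)in; (void)output;
    }
};

class Fsr2Backend : public Upscaler {
public:
    void Evaluate(const UpscaleInputs& in, GpuTexture* output) override {
        // ...forward 'in' to the FSR 2 API here...
        (void)in; (void)output;
    }
};

// The renderer gathers the inputs once and doesn't care which backend runs.
std::unique_ptr<Upscaler> MakeUpscaler(bool preferDlss) {
    if (preferDlss) return std::make_unique<DlssBackend>();
    return std::make_unique<Fsr2Backend>();
}
```

Once the renderer already produces jittered color, depth, motion vectors and jitter offsets for one of them, wiring up the other is mostly glue code.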
 
Strange, this isn't on their main page and was uploaded 5 hours ago.

Very impressive. Besides the PC version of Control and Cyberpunk 2077, hardly any Digital Foundry video has made me this keen on the visual presentation of a game. I will definitely play this one. Hopefully the developers will improve a few more technical and gameplay points.
 
If this isn't a 'placebo' then, can you name titles where this is the case? I don't doubt some exist across the entire history, as @DavidGraham illustrated there are a few like SOTT (funny enough, even Rise of the Tomb Raider, one of the first DX12 titles, is also one where DX12 is demonstrably superior, especially in CPU usage) - but even today I still find it's a relative rarity for the DX12 path to be more performant than DX11 if the game offers both, in my experience.

You want DX12 of course for enabling RT, but I haven't really seen any firm trend - still! - where a game that offers both APIs shows a distinct advantage to DX12. Not every developer is Nixxes, as we all know too well.

Like even in Sackboy, DX11 can show an advantage. It's very slight, but it's there. Resuming a game and sitting at the world map, no RT enabled on either API of course, with the settings I play at:

DX11: 91 fps
DX12: 85 fps

There's also Psychonauts 2:

DX11: 85 fps
DX12: 75 fps

To really test this you're going to want to ensure you're in a CPU-limited situation, since that's where the biggest benefits of the low-level API kick in.

E.g. what happens if you drop your CPU down to 2 GHz...
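To make the "CPU-limited" point concrete: the headline DX12 advantage is that command recording can be spread across worker threads and submitted in one batch, whereas DX11's immediate context serializes that work on one thread. A rough sketch, assuming D3D12 on Windows, with device/PSO/allocator setup and error handling omitted and RecordScenePart as a placeholder:

```cpp
// Rough D3D12 sketch of the thing DX11's immediate context can't do:
// record draws for different parts of the frame on several CPU threads
// at once, then submit them together. This is where DX12 earns its keep
// when the game is CPU-limited. Setup and error handling omitted.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Placeholder: record the draw calls for one slice of the scene.
void RecordScenePart(ID3D12GraphicsCommandList* cl, int part) { (void)cl; (void)part; }

void SubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue,
                 std::vector<ComPtr<ID3D12CommandAllocator>>& allocators)
{
    const int kThreads = static_cast<int>(allocators.size());
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    std::vector<std::thread> workers;

    for (int i = 0; i < kThreads; ++i) {
        // One command list + allocator pair per worker thread.
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&, i] {
            RecordScenePart(lists[i].Get(), i);  // recorded in parallel
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One submission for the whole frame.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```

With a fast CPU at typical 1440p/4K settings you're usually GPU-bound, so none of this shows up in the frame rate - which would be consistent with the Sackboy/Psychonauts numbers above.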
 
It's not a question of how much theoretical performance each API offers; I fully accept DX12 is able to take advantage of CPU threads more efficiently. I'm questioning the claim that in the majority of new games this actually manifests in a better experience when the player has the option. Purposefully gimping my CPU to half of its performance really isn't relevant to that claim.
 
Hey, Fortnite has been ported to UE 5.1!

And...oh...oh no


If it has shader compilation stutter this is not good at all. I think the best option for Epic is to develop a shader pre-compilation option. It solves everything and saves a thread on the PC's CPU. It would only have an impact on first install or when the driver changes. And with the arrival of DirectStorage on PC, titles will load faster anyway. A little more install time is better than slower loading across the dozens or maybe hundreds of times someone will launch the game.
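As a sketch of that idea - everything here is placeholder engine code and invented helper names, not Epic's code or any real graphics API - you key an on-disk pipeline cache to the driver and game build, compile everything up-front when the key changes, and skip it otherwise:

```cpp
// Sketch of the "pre-compile once, reuse until something changes" idea.
// All helpers below are placeholders; in practice the cache would be a
// D3D12/Vulkan pipeline cache serialized to disk.
#include <cstdio>
#include <string>
#include <vector>

struct PipelineDesc { std::string name; /* shaders, render state, ... */ };

// Placeholders standing in for real engine/driver queries.
std::string CurrentDriverVersion() { return "531.29"; }
std::string GameBuildId()          { return "23.10";  }
bool CacheExistsFor(const std::string& key) { (void)key; return false; }
std::vector<PipelineDesc> EnumerateAllPipelines() { return {{"opaque"}, {"skinned"}, {"ui"}}; }
void CompileAndStore(const PipelineDesc& d) { std::printf("compiling %s\n", d.name.c_str()); }

void EnsureShaderCache()
{
    // Key the cache on driver + game build, so the cost is only paid on
    // first install, a game patch, or a driver update - exactly the cases
    // the post above mentions.
    const std::string key = CurrentDriverVersion() + "/" + GameBuildId();
    if (CacheExistsFor(key))
        return;  // warm cache: nothing left to compile during gameplay

    for (const PipelineDesc& p : EnumerateAllPipelines())
        CompileAndStore(p);  // slow, but done before the player is in control
}

int main() { EnsureShaderCache(); }
```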
 
Just tested Fortnite and my CPU usage was at 100% for a minute or so on the "connecting" screen when starting a Battle Royale match - compiling shaders maybe, dunno. Some stutters were still there as the game started but they were gone entirely pretty soon.

Even if it were a bit worse than this and lasted for the entire first round, I'd say shader compilation stutter is still a negligible issue in a multiplayer game like Fortnite, as you basically play the same environments over and over again.

Battlefield V has shader compilation stutter too, but after a round or so the stutters are gone, so the vast, vast majority of play time is completely free of it. I've never found it an issue of any real significance in BFV with more than 1,000 hours played.

Single-player games are where this is a real problem.
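What that "connecting" screen behaviour presumably amounts to is something like the following generic pattern - purely illustrative, not Fortnite's actual code: saturate every core compiling the needed pipelines while a loading screen is up, and don't start the match until it's done.

```cpp
// Generic sketch (not Fortnite's actual code) of front-loading pipeline
// compilation behind a loading screen: spread the work across all cores,
// block until it finishes, then let the match start.
#include <algorithm>
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

void CompilePipeline(int id) { (void)id; /* expensive shader/PSO compile */ }

void CompileDuringLoadingScreen(int pipelineCount)
{
    std::atomic<int> next{0};
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w)
        pool.emplace_back([&] {
            // Each worker pulls the next un-compiled pipeline; this is the
            // ~100% CPU usage phase seen on the "connecting" screen.
            for (int i = next.fetch_add(1); i < pipelineCount; i = next.fetch_add(1))
                CompilePipeline(i);
        });

    for (std::thread& t : pool)
        t.join();  // the match doesn't start until every pipeline is ready

    std::printf("%d pipelines compiled before gameplay.\n", pipelineCount);
}
```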
 
Yeah, Fortnite ran pretty badly when I tried it this morning. Stuttering everywhere. I didn't play for too long though; maybe the next round would have been better.
 
Just tested Fortnite and my CPU usage was at 100% for a minute or so on the "connecting" screen when starting a Battle Royale match - compiling shaders maybe, dunno. Some stutters were still there as the game started but they were gone entirely pretty soon.

Even if it were a bit worse than this and lasted for the entire first round, I'd say shader compilation stutter is still a negligible issue in a multiplayer game like Fortnite, as you basically play the same environments over and over again.

Battlefield V has shader compilation stutter too, but after a round or so the stutters are gone, so the vast, vast majority of play time is completely free of it. I've never found it an issue of any real significance in BFV with more than 1,000 hours played.

Single-player games are where this is a real problem.
I understand you.

But no, it's still a problem, 100%, full stop. No game should have any stuttering and expect players to build a cache as they play. Even if it's just the very first match.

Should someone accept buying a car that runs like shit until they drive it enough miles to break it in, while having to go through that again and again each time it gets serviced? They know cars shouldn't run like that... and that the model their friend bought runs perfectly fine right from the start. They know what the experience should be like. After all, the company is selling the car. They're advertising it as the same car... They paid the same price for the car as everyone else... they just unknowingly bought it from a dealership that sells cars which run like shit until they get driven enough.

Is that acceptable?

Nope. Never.
 
But no, it's still a problem, 100%, full stop. No game should have any stuttering and expect players to build a cache as they play. Even if it's just the very first match.
It's not about whether one is OK or not. It's about the impact on the player.
A single-player game where 85% of the time you play it is a stutterfest is a lot different from one or two matches of a multiplayer game that then runs fine.
I'm not excusing it either, just highlighting that all things aren't equal.

Personally, I think DF should just consider them broken, say as much, and not invest time and effort analysing them.
Better things to be doing, and if someone else is prepared to do an analysis on a broken game, then good for them.

I wouldn't do follow-up analysis on patches either, unless it's said to be fixed.
 