Digital Foundry Article Technical Discussion [2024]

It was pretty simple: you couldn't say your Dx7 GPU was a Dx8 one, could you?

You couldn't claim your Dx10 GPU was a Dx11 one, could you?
Direct3D API iterations have always had some semblance of hardware backwards compatibility. The only major D3D API iteration in recent memory that tried to break this trend was D3D10, with very radical changes to state management (no fixed function, and separate shader/state objects), along with groundbreaking new features both for the worse (geometry shaders/stream out) and for the better (compute shaders/dual source blending, later on in D3D10.1) ...
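
To make the "no fixed function, separate state objects" point concrete, here's a minimal sketch (not from the post above, and assuming already-created `dev9`/`dev10` device pointers) contrasting D3D9-style mutable render states with D3D10's immutable state objects:

```cpp
#include <d3d9.h>
#include <d3d10.h>

// D3D9: fixed-function style, mutable render states poked one at a time.
void EnableAlphaBlendD3D9(IDirect3DDevice9* dev9)
{
    dev9->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    dev9->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    dev9->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
}

// D3D10: the whole blend configuration becomes one immutable state object,
// created up front and bound as a unit.
ID3D10BlendState* CreateAndBindAlphaBlendD3D10(ID3D10Device* dev10)
{
    D3D10_BLEND_DESC desc = {};
    desc.BlendEnable[0]           = TRUE;
    desc.SrcBlend                 = D3D10_BLEND_SRC_ALPHA;
    desc.DestBlend                = D3D10_BLEND_INV_SRC_ALPHA;
    desc.BlendOp                  = D3D10_BLEND_OP_ADD;
    desc.SrcBlendAlpha            = D3D10_BLEND_ONE;
    desc.DestBlendAlpha           = D3D10_BLEND_ZERO;
    desc.BlendOpAlpha             = D3D10_BLEND_OP_ADD;
    desc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;

    ID3D10BlendState* state = nullptr;
    dev10->CreateBlendState(&desc, &state);        // immutable once created
    const FLOAT blendFactor[4] = { 1, 1, 1, 1 };
    dev10->OMSetBlendState(state, blendFactor, 0xffffffff);
    return state;
}
```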

As for your last sentence, Microsoft seems to think differently from you ...
Allowing 2012-era Dx11 GPUs to be 'classed' as Dx12 ones is wrong and causes confusion in the consumer space; it leads people to think they don't need to upgrade to a more modern GPU and can/could slow down the adoption rate of newer GPUs.

So yea, there's a big potential for a knock-on effect.
Ultimately, it's up to the developers to define their hardware requirements to drive up new hardware adoption. The gfx API isn't the end game here; the software is ...
Which further adds to the confusion for consumers, as the hardware capabilities of a Dx12 GPU and a Dx12U one are very, very different.

Hardware RT
VRS
Mesh Shaders
Sampler Feedback

That is a lot of hardware feature set additions and certainly could have easily justified being a new Dx.
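
For what it's worth, applications typically gate on these via capability checks rather than the marketing name. Here's a minimal sketch (not from the thread; assumes an already-created ID3D12Device and a recent Windows SDK) of how the four DX12 Ultimate features above map onto D3D12 feature tiers:

```cpp
#include <d3d12.h>

// Rough check for the DX12 Ultimate (feature level 12_2) hardware set:
// DXR 1.1, VRS Tier 2, Mesh Shader Tier 1, Sampler Feedback Tier 0.9.
bool SupportsDX12UltimateFeatures(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};

    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7))))
        return false;

    return opts5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1           // hardware RT
        && opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2  // VRS
        && opts7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1            // mesh shaders
        && opts7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;    // sampler feedback
}
```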
It's not about the 'number' of new features, since there had already been tons of small feature updates ever since the initial release of D3D12 and before 'Ultimate'. An API-defining feature is about pushing the limits of existing programming models, which none of the above does very much ...

RT and mesh shading don't qualitatively impact the way we do graphics programming ... (no callable shaders/function pointers in graphics/compute pipelines, and no stream out with mesh shaders)
None of which are supported on Dx12 GPUs.

Arguably the largest addition to Dx11 was Tessellation, and yet Microsoft didn't just release it as Dx10.2.
Microsoft couldn't release a hypothetical 'D3D10.2' because they abandoned the reference counting behaviour in D3D11, but I assure you that D3D10 and D3D11 are undeniably more similar API designs than you would initially believe, and that's an assessment shared by other graphics programmers as well if you ask them ...
 
None of the DX12U features have been added to DX11. Like the Khronos Group with OpenGL, it seems that Microsoft has decided to leave its high-level API behind.
With how CPUs are making calls to GPUs today, no, I don't think it would fall into that. Work submission, memory management, and synchronization are completely different on DX12.

I suspect it would be painfully slow to have DXR running on DX11. Yes, you can run compute shaders to do it (and let's assume the API and drivers were made to support it), but I'm not seeing the advantage of porting it there considering how thread-locked you are in terms of submission. DX11 is painfully serial, and I'm just not seeing how that is going to be very useful for a technique that is stochastic. No ExecuteIndirect either, I believe, and no work graphs coming for it either.
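
To illustrate the "painfully serial" point: in DX11 essentially all real submission funnels through the single immediate context, whereas DX12 lets each thread record its own command list and only the final submission is serialized. A rough sketch (not from the post; assumes an existing device and queue, and omits fences, PSOs and error handling):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <array>
#include <thread>

using Microsoft::WRL::ComPtr;

// Each worker thread records its own command list in parallel;
// only ExecuteCommandLists on the queue is a single-threaded step.
void RecordAndSubmit(ID3D12Device* device, ID3D12CommandQueue* queue)
{
    constexpr int kThreads = 4;
    std::array<ComPtr<ID3D12CommandAllocator>, kThreads>    allocators;
    std::array<ComPtr<ID3D12GraphicsCommandList>, kThreads> lists;
    std::array<std::thread, kThreads>                       workers;

    for (int i = 0; i < kThreads; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));

        workers[i] = std::thread([&, i] {
            // ... record this thread's slice of the frame's draws/dispatches ...
            lists[i]->Close();   // command list is now ready for submission
        });
    }
    for (auto& w : workers) w.join();

    ID3D12CommandList* submit[kThreads];
    for (int i = 0; i < kThreads; ++i) submit[i] = lists[i].Get();
    queue->ExecuteCommandLists(kThreads, submit);   // the only serialized step
}
```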

You are correct that DX11 is being left behind. The latest features will all require the latest API.
 
Isn't this kind of a moot discussion in the consumer context at this point? Almost all the consumer game requirements I see - and hell, even the UE5 feature requirements for developers - are listed in terms of "this generation or newer of AMD, this generation or newer of NVIDIA, etc.", not in terms of DirectX feature level. When there are (significantly) fewer IHVs than relevant features, as long as IHVs continue to make future generations and driver updates support a superset of features, it's just far easier and clearer to specify IHV hardware generations and driver versions anyways. This of course also relies on IHVs not to fuck up their marketing names to the point where they rename older stuff to newer stuff with fewer features; this has been slightly better in recent years though.

Frankly I don't think consumers need to know or care about things like "DirectX 12 Ultimate" or "Feature Level 12_2" and what it means. They just look at the requirements and see if their GPU is older or newer than the given generation listed.
 
Isn't this kind of a moot discussion in the consumer context at this point? Almost all the consumer game requirements I see - and hell, even the UE5 feature requirements for developers - are listed in terms of "this generation or newer of AMD, this generation or newer of NVIDIA, etc.", not in terms of DirectX feature level. When there are (significantly) fewer IHVs than relevant features, as long as IHVs continue to make future generations and driver updates support a superset of features, it's just far easier and clearer to specify IHV hardware generations and driver versions anyways. This of course also relies on IHVs not to fuck up their marketing names to the point where they rename older stuff to newer stuff with fewer features; this has been slightly better in recent years though.

Frankly I don't think consumers need to know or care about things like "DirectX 12 Ultimate" or "Feature Level 12_2" and what it means. They just look at the requirements and see if their GPU is older or newer than the given generation listed.
I don't know man, I pretty much completely ignore 'game requirements' cuz I don't think a lot of thought or testing really gets put into them in the first place. And it's extremely rare that a game releases with a minimum requirement of an extremely modern GPU. Which also says a lot about how slowly most of these technologies tend to get adopted.

I agree with the point that consumers don't tend to need to know these things specifically, but mainly cuz it's just not practically gonna be an issue. You only need to worry about anything if you've got an older GPU, and even then, you should be looking at actual user impressions about this to judge whether you'll be good or not.

Alan Wake 2 is probably one of the only examples I can think of where any of this would have caught any significant number of people out.
 
You only need to worry about anything if you've got an older GPU, and even then, you should be looking at actual user impressions about this to judge whether you'll be good or not.
Sure, but that's kind of a tangential discussion to the whole issue of DirectX versions. DirectX versions are at best going to give you some indication of whether a game will run *at all*; that's the best and the most they have ever done. Minimum game requirements should do that and potentially give you some idea of whether the experience will be reasonable at all. Certainly I would recommend most folks go further and look into how *well* a game runs in terms of performance and visuals on their hardware if they are concerned that their hardware is on the line, but that's really a separate question from the DirectX-version question of "will it run at all".

Alan Wake 2 is probably one of the only examples I can think of where any of this would have caught any significant number of people out.
Right, and like most games, it stated its system requirements in terms of GPU generations, which is far more useful than if they had tried to describe DirectX versions and feature levels to people, which aren't even something that's advertised on GPUs these days anyways.

My point is I don't think this is a real problem. Stating requirements in terms of GPU model names is both easier and makes more sense than the developer-focused feature matrices. End users shouldn't need to even know a game uses DirectX or Vulkan or whatever, let alone the detailed minutia of how the API is used.
 
Yea, and I think IHVs and Microsoft with Xbox have bungled it up even worse in recent years. By all means, market your hardware to consumers with supported graphical features (RT, DLSS, FSR, FG, etc.), but cut it out with hyping up API tech and features (Mesh shaders, SFS, VRS, DStorage, etc.), which quite frankly just sets up those who are less informed (like myself in some of those cases) to have unrealistic expectations of what will be possible, or even when/if it will ever be utilized. We've learned by now that just because something is supported doesn't mean it will actually be utilized in a meaningful capacity.

Consumers just need to know the GPU model/generation a game supports.
 
Yea, and I think IHVs and Microsoft with Xbox have bungled it up even worse in recent years. By all means, market your hardware to consumers with supported graphical features (RT, DLSS, FSR, FG, etc.), but cut it out with hyping up API tech and features (Mesh shaders, SFS, VRS, DStorage, etc.), which quite frankly just sets up those who are less informed (like myself in some of those cases) to have unrealistic expectations of what will be possible, or even when/if it will ever be utilized. We've learned by now that just because something is supported doesn't mean it will actually be utilized in a meaningful capacity.

Consumers just need to know the GPU model/generation a game supports.
But if the other guys have blast processing and you don’t, you’re screwed!
 
Few people know this, but blast processing was actually a foreign technology used by Sega in the Japanese secret army. The code extracted from the decryption of a crashed UFO's computer was capable of incredible performance... Since it was found to be dangerous, the project was canceled instead. Some Megadrive games used traces of it. The documents are lost...

":)"
 
It's an easy cherry-pick/snipe, but it's worth mentioning that if you watched the whole DF video, the takeaway is pretty clearly that many of those settings combinations - particularly that one - should not exist in the game, especially on console. I'd argue the frame gen modes should not exist at all given the results from the DF video. The way these tweets are worded is pretty misleading, intentionally or otherwise.

I won't hammer my point about testing and reporting "frame rate" numbers with frame gen, but this is a case where it's almost misleading in the opposite direction.
 
These types of opinions will simply be unavoidable, whether right or wrong. Just like anything else, there will be games which do a good job of implementing a technology and others which do a bad job of it, causing these types of responses from gamers and developers.

I honestly think consoles are pushing so far in the direction of providing modes/options that they're beginning to do a disservice to the consumers and developers themselves. Consumers MAY THINK they want all these modes/options.. and in some cases they're nice to have.. but it's getting more and more complicated. Consoles used to be about putting in a game and playing a tightly curated experience that was specifically tuned for the hardware by the developers. That's what always separated them from PC gaming. Yes, of course developers are still tuning their games for consoles, but now there's pressure to include more modes, which takes time/money/focus from the studio, and causes players to have to think about and test which modes they prefer.. which interrupts that smooth "pop the game in and go" experience of console.

Yes I'm aware there's still the "default" mode when it loads and nothing is stopping you from just loading up and playing.. but a LOT of people will fall into the trap of having to figure out and decide what's best for them.. instead of just playing. It's unneeded friction.

I think DF has indirectly facilitated a lot of this as well. It's interesting content to watch for sure, but I think it's going to become so overly complicated when you've got a bunch of TVs with different refresh rates you possibly have to mess around with, changing settings in the console menu to get things to work, HDR modes, frame gen, RT modes, 120Hz modes, 40fps modes, VRR modes, fidelity, performance, locked/unlocked, Vsync.. etc., etc.

The more devs mess around with this stuff the less focus there is on making the tight curated experience consoles were always known for, and it will undoubtedly come at a quality cost in many cases. I bet you never see Nintendo make their games that complicated for the consumer. I think we need to start pushing back on it.
 
Part of the problem for consoles is that once upon a time everyone's TV had the same resolution, NTSC or PAL; now that's no longer the case.
 
Part of the problem for consoles is that once upon a time everyone's TV had the same resolution, NTSC or PAL; now that's no longer the case.
PAL was always higher resolution, with better colour resolution to boot.

But NTSC was higher frame rate, and succeeded despite being shit.
 
But NTSC was higher frame rate, and succeeded despite being shit.
A 20% higher framerate is more important than 20% higher resolution. Plus, 60 is divisible by 1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, and 60; 50 is only divisible by 1, 2, 5, 10, 25, and 50. This gives a bit more leverage when updating below the refresh rate, especially when you miss an update.
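
A quick sketch of that divisor arithmetic (not from the post): only frame rates that divide the refresh rate evenly can be locked with consistent frame pacing, so a 60 Hz display offers many more even steps than a 50 Hz one.

```cpp
#include <cstdio>

// Print the evenly paced frame rates available below a given refresh rate:
// a lock at refresh/n means presenting every n-th vblank with uniform pacing.
void PrintLockedRates(int refreshHz)
{
    std::printf("%d Hz:", refreshHz);
    for (int n = 1; n <= refreshHz; ++n)
        if (refreshHz % n == 0)
            std::printf(" %d", refreshHz / n);
    std::printf("\n");
}

int main()
{
    PrintLockedRates(60);   // 60 30 20 15 12 10 6 5 4 3 2 1
    PrintLockedRates(50);   // 50 25 10 5 2 1
    return 0;
}
```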
 