Direct3D feature levels discussion

In terms of in-game choice, players may end up locked into one method or another. This wasn't discussed in the MS DirectSR links, so I think we'll have to wait for the GDC 2024 demo/Q&A to see how it will function. Personally, I believe players will be able to choose any/all methods that developers enable and that are compatible with their cards, like today.

Yes it’s likely DirectSR works like other DirectX interfaces where the implementation is completely controlled by the driver. However there’s nothing stopping developers from providing direct integrations of DLSS, FSR etc just like they do today separately from DirectSR. It’s no different to offering MSAA (via DirectX) and FXAA/SMAA etc (via custom libs).
 
How does DirectSR remove options? If, as you say, developers should be willing to implement multiple versions of the same thing, they can continue to do so to their heart's content and choose from:

1. DirectSR based upscaling
2. FSR
3. XeSS
4. New DLSS that’s incompatible with DirectSR
5. Custom TAAU
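To illustrate the "any/all methods the player's card supports" idea, here is a minimal engine-side sketch. All names (`Upscaler`, `UpscalerRegistry`, etc.) are invented for illustration and have nothing to do with the real DirectSR API; the point is only that nothing about a registry of upscaler backends forces exclusivity.

```cpp
#include <cassert>
#include <map>
#include <string>
#include <utility>

// Hypothetical sketch: a game registers every upscaler path the developer
// chose to integrate (DirectSR-based or vendor-direct), then surfaces only
// the ones the current GPU supports -- much like today's in-game menus.
enum class Upscaler { DirectSR, FSR, XeSS, DLSS, CustomTAAU };

struct UpscalerBackend {
    std::string name;
    bool supportedOnThisGpu;  // would come from a driver/capability check
};

class UpscalerRegistry {
public:
    void add(Upscaler id, UpscalerBackend backend) {
        backends_[id] = std::move(backend);
    }
    // The player only ever sees backends their hardware supports.
    std::map<Upscaler, UpscalerBackend> available() const {
        std::map<Upscaler, UpscalerBackend> out;
        for (const auto& [id, b] : backends_)
            if (b.supportedOnThisGpu) out.emplace(id, b);
        return out;
    }
private:
    std::map<Upscaler, UpscalerBackend> backends_;
};
```

Under this sketch a DirectSR path and a direct DLSS integration coexist in the same menu, each gated only by its own capability check.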
We have no idea how DirectSR will work and if what you're describing would even be possible. Let's wait.

However there’s nothing stopping developers from providing direct integrations of DLSS, FSR etc just like they do today separately from DirectSR.

 
I don't understand the point of needing to expose an interface like Direct Super Resolution, which goes against the entire design philosophy of consistency in modern Direct3D APIs. Usually, Direct3D as a gfx API is about guaranteeing cross-vendor compatible results, notwithstanding any undefined or implementation-defined behaviour. This new API feels like something totally out of left field from Microsoft, unless the intended purpose is to provide an easier way for hardware vendors that do want to push their own proprietary solutions to interop with applications ...
 
I don't think the data inputs/outputs for DirectSR are fixed, i.e. if Nvidia quality/performance improves by including an additional parameter, then it will be added.
The only problem is that it will be on MS's timetable, not Nvidia's. There might still be a case for the upscaler process being able to circumvent DirectSR functionality.
 
I don't understand the point of needing to expose an interface like Direct Super Resolution, which goes against the entire design philosophy of consistency in modern Direct3D APIs. Usually, Direct3D as a gfx API is about guaranteeing cross-vendor compatible results, notwithstanding any undefined or implementation-defined behaviour. This new API feels like something totally out of left field from Microsoft, unless the intended purpose is to provide an easier way for hardware vendors that do want to push their own proprietary solutions to interop with applications ...
The only upside I can see is supporting a standard interface for a feature DX currently doesn't support but could one day slot into that position.
 
I don't understand the point of needing to expose ... Direct Super Resolution which goes against the entire design philosophy of consistency in modern Direct3D APIs
Only upside I can see is supporting a standard interface for a feature DX currently doesn't support

I really can't understand the drama. There are currently one open-source (AMD FSR3) and two closed-source (Nvidia DLSS and Intel XeSS) vendor-specific APIs, which are going to be superseded by a unified API based on the existing Direct3D 12 rendering path - probably an extension of the existing MSAA quality levels - that maps onto GPU-specific driver implementations. Which part of that goes "against the entire design philosophy" of Direct3D?

I don't think the data inputs/outputs for DirectSR are fixed
If DirectSR uses a custom metacommand to perform upscaling, similar to the "DirectStorage" metacommand for texture decompression, these can have arbitrary inputs and outputs, as they are not meant to be used by programmers directly but rather by the runtime API to programmatically construct the optimal code graph, much like XML schemas and other machine-readable data formats.
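The "arbitrary inputs and outputs" point can be sketched in a few lines: metacommand-style parameters travel as an extensible, schema-described key/value set that the driver interprets, so a vendor can add a new input without any API change. This is not the real D3D12 metacommand ABI, just the general shape of the idea, with invented names.

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>
#include <variant>
#include <vector>

// Hypothetical: parameters are an open-ended key/value blob, not a fixed
// struct, so the set of inputs is defined by the driver's schema, not by
// the application-facing API.
using ParamValue = std::variant<int64_t, double, std::string>;

struct MetaCommandDesc {
    std::string name;
    std::map<std::string, ParamValue> params;  // vendor-extensible
};

// Vendor A's driver may require "motion_vectors"; vendor B may later also
// want "exposure" -- neither change touches the API surface.
bool driverAcceptsCommand(const MetaCommandDesc& cmd,
                          const std::vector<std::string>& requiredKeys) {
    for (const auto& key : requiredKeys)
        if (cmd.params.find(key) == cmd.params.end()) return false;
    return true;
}
```

The runtime, not the game programmer, matches the blob against each driver's declared requirements.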

We have no idea how DirectSR will work and if what you're describing would even be possible.
FSR and XeSS upscaling will continue to work just as before; these are based on custom processing plug-ins that use HLSL shaders and aren't even limited to specific hardware from AMD or Intel (though Intel also provides an Arc Xe-optimized implementation).
 
Isn't XeSS still closed source?
Yep, there are conflicting messages from Intel: they promised an 'open source' implementation back in 2021, but current SDK releases are closed-source and only refer to 'open standards'...

 
Which part of that goes "against the entire design philosophy" of Direct3D?

I interpreted that to mean Direct3D’s philosophy is to provide common APIs while guaranteeing consistent output across vendors. DLSS, XeSS and FSR certainly don’t produce the same output. I can understand the point. However it’s still a huge improvement over the current mess.

It would be great if Microsoft could dictate some sort of minimum error or SNR metric that upscalers would be required to meet but that seems near impossible to do in a universal game agnostic way.
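A metric of the kind suggested here could be something like PSNR against a native-resolution reference frame. The sketch below shows why this is easy to compute but hard to mandate universally: the number depends entirely on the content of the frame. Pixels are normalized greyscale in [0, 1] for brevity.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Minimal PSNR sketch: compares an upscaled frame against a native-res
// reference. Higher is better; identical images give infinite PSNR.
double psnrDb(const std::vector<double>& reference,
              const std::vector<double>& upscaled) {
    double mse = 0.0;
    for (size_t i = 0; i < reference.size(); ++i) {
        double d = reference[i] - upscaled[i];
        mse += d * d;
    }
    mse /= static_cast<double>(reference.size());
    // Peak signal is 1.0 for normalized pixels.
    return 10.0 * std::log10(1.0 / mse);
}
```

A fixed threshold (say, "must exceed 30 dB") would pass or fail the same upscaler depending on scene content, which is exactly the game-agnostic problem the post points out.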
 
They don’t produce the same output as each other.
Why would they? They are different techniques. Different TAAs don't produce the same output either, so why not have an API for that too?
The point, I think, is that there seems to be a different level of abstraction behind the idea of DirectSR.
One would think that the only thing missing from DX when it comes to these upscaling solutions is support for matrix multiplication h/w. Presumably this will come with WaveMMA "soon". That's the extent of support actually needed here on the API side.
 
Why would they? They are different techniques. Different TAAs don't produce the same output either, so why not have an API for that too?
The point, I think, is that there seems to be a different level of abstraction behind the idea of DirectSR.
One would think that the only thing missing from DX when it comes to these upscaling solutions is support for matrix multiplication h/w. Presumably this will come with WaveMMA "soon". That's the extent of support actually needed here on the API side.

Exactly. They are different techniques that perform the same "DirectSR" function, which is the issue I think Lurkmass is highlighting. If DirectX is intended to guarantee consistent output across vendors, then DirectSR doesn't fit that paradigm.

I don't think it's a different level of abstraction. This is no different to anisotropic filtering or MSAA.
 
I don't think it's a different level of abstraction. This is no different to anisotropic filtering or MSAA.
It's not a h/w function which the application can use (AF and MSAA are); all these upscalers are basically middleware solutions. Integrating them into DX, even at the level of a "common interface", would be similar to integrating a common interface for physics middleware there. It is a different level of abstraction, which is why it looks weird in the DX API family.
 
DLSS, XeSS and FSR certainly don’t produce the same output... It would be great if Microsoft could dictate some sort of minimum error or SNR metric that upscalers would be required to meet
I don't think it's a different level of abstraction. This is no different to anisotropic filtering or MSAA.

If image upscaling is an extension of MSAA, I'd expect all upscaling methods to be precisely defined, just like bilinear/trilinear/anisotropic texture filtering - so there would be detailed specifications on DirectX Specs and a WARP12 implementation, just like what happened with other major features introduced in the last ten years. Judging by earlier examples, the common API would certainly draw on prior proprietary technologies, but it won't simply copy previous designs to the letter.


As for motion frame generation, it's still an evolving technology and there is no universally accepted post-processing algorithm to serve as a reference implementation. Current designs are much like Phong/Gouraud color/lighting systems in the early days of 3D graphics - once hardware became more powerful, these were superseded by superior methods like textures and shader programs (which in turn are being superseded by global illumination techniques based on ray/path-tracing, which require much higher levels of processing performance).

Ideally, post-processing should be user-pluggable, so there would be a standard DirectML-based reference and proprietary driver-based implementations, each with a predefined set of quality/performance controls, as well as custom post-processing callbacks for those developers who want finer control over quality/performance...
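The pluggable design described above could look something like the sketch below: a common control surface (provider, quality preset, render scale) plus an optional callback slot for a custom pass. Every name here is invented for illustration; nothing is taken from a real API.

```cpp
#include <cassert>
#include <functional>
#include <string>

// Hypothetical common control surface for an upscaling/post-processing
// pass: the same knobs regardless of who provides the implementation.
enum class QualityPreset { Performance, Balanced, Quality };

struct UpscalePass {
    std::string provider;       // "driver", "reference", or "custom"
    QualityPreset preset;
    double renderScale;         // e.g. 0.5 = half-resolution input per axis
    std::function<void()> customCallback;  // optional developer hook
};

// Fraction of native pixel count actually rendered before upscaling.
double inputPixelFraction(const UpscalePass& p) {
    return p.renderScale * p.renderScale;
}
```

A driver implementation, the DirectML-based reference, and a fully custom pass would all be selected through the same `provider` field, keeping the quality/performance controls uniform.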
 
It's not a h/w function which the application can use (AF and MSAA are); all these upscalers are basically middleware solutions. Integrating them into DX, even at the level of a "common interface", would be similar to integrating a common interface for physics middleware there. It is a different level of abstraction, which is why it looks weird in the DX API family.

Clearly I disagree with that framing. Resolution scaling is a relatively simple 2D image manipulation function in the grand scheme of things. I would argue it's as primitive a function as anti-aliasing or texture filtering. It is nowhere in the same ballpark of complexity or lines of code as physics middleware.

Btw I'm not sure why fixed function h/w functionality is relevant here. A lot of DirectX's behavior and APIs are implemented in software on both the CPU and GPU.
 
If image upscaling is an extension of MSAA
Why do you think that it would be an extension of MSAA? There's nothing in common between MSAA and these TAA-based upscalers.

a DirectML-based reference implementation
IIRC DirectML isn't viewed as a real-time API, so using it for real-time rendering is very unlikely.

Resolution scaling is a relatively simple 2D image manipulation function in the grand scheme of things.
These upscalers have zero to do with "resolution scaling", which is a 2D image manipulation function that can be handled by the h/w and does fit into the line of MSAA and AF.
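For contrast, plain resolution scaling of the kind this post is distinguishing really is a simple, precisely definable spatial operation. A minimal bilinear upscale is shown below (greyscale, row-major, values in [0, 1]); note it uses no temporal history, motion vectors, or trained network, which is where DLSS/FSR2/XeSS differ.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

// Straightforward bilinear upscale: each output pixel is a weighted
// average of its four nearest source pixels. Purely spatial, fully
// specified, and trivially implementable in fixed-function hardware.
std::vector<double> bilinearUpscale(const std::vector<double>& src,
                                    int srcW, int srcH, int dstW, int dstH) {
    std::vector<double> dst(static_cast<size_t>(dstW) * dstH);
    for (int y = 0; y < dstH; ++y) {
        for (int x = 0; x < dstW; ++x) {
            // Map output pixel center back into source coordinates.
            double sx = (x + 0.5) * srcW / dstW - 0.5;
            double sy = (y + 0.5) * srcH / dstH - 0.5;
            int x0 = std::clamp(static_cast<int>(std::floor(sx)), 0, srcW - 1);
            int y0 = std::clamp(static_cast<int>(std::floor(sy)), 0, srcH - 1);
            int x1 = std::min(x0 + 1, srcW - 1);
            int y1 = std::min(y0 + 1, srcH - 1);
            double fx = std::clamp(sx - x0, 0.0, 1.0);
            double fy = std::clamp(sy - y0, 0.0, 1.0);
            double top = src[y0 * srcW + x0] * (1 - fx) + src[y0 * srcW + x1] * fx;
            double bot = src[y1 * srcW + x0] * (1 - fx) + src[y1 * srcW + x1] * fx;
            dst[y * dstW + x] = top * (1 - fy) + bot * fy;
        }
    }
    return dst;
}
```

Everything a TAA-based upscaler adds on top of this (history accumulation, motion-vector reprojection, rejection heuristics or a network) is what moves it out of the simple fixed-function category.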

It is nowhere in the same ballpark of complexity or lines of code as physics middleware.
Runtime performance suggests otherwise. DLSS is more complex to run than some of the PhysX GPU physics from 10 years ago.

Btw I'm not sure why fixed function h/w functionality is relevant here. A lot of DirectX's behavior and APIs are implemented in software on both the CPU and GPU.
It is not, but all these "behaviors" are basic functions, which is different in the case of DLSS/XeSS and even FSR2: these are literally s/w products that are free to use only because they are sponsored by the IHVs who made them. If you try to imagine something similar not made by an IHV, you'd arrive at a textbook definition of middleware - something licensed from a 3rd party to fit into some part of a game engine. Making an API specifically for that seems weird and unnecessary.
 
Btw I'm not sure why fixed function h/w functionality is relevant here. A lot of DirectX's behavior and APIs are implemented in software on both the CPU and GPU.
If you are looking for past DirectX behavior to compare DirectSR to, a better comparison would be the decompression part of DirectStorage. IHVs can apply their own decompression algorithm, optimized for their hardware, in the driver, but if you want, you can also use a reference implementation from Microsoft. You can also roll your own if you have special content that can benefit from specialized algorithms. All those solutions are just compute shaders/software; the ones from the IHVs are just optimized for specific architectures. So these can be seen as middleware - there even exists some middleware for transcoding textures on the GPU (e.g. the new Spark library). The difference is that with DirectStorage, Microsoft was ahead of the IHVs with a unified API, whereas with upscaling, Microsoft is late to the game.
 
These upscalers have zero to do with "resolution scaling".

We clearly have fundamentally different views on this topic.

 