Intel XeSS anti-aliasing discussion

Wow I am really surprised DirectSR is coming with an included implementation. Is there any other feature of DirectX that includes a GPU implementation in the runtime? It will be even easier for developers to include upscaling in their games now.
Haha so it's just FSR2 built into DX which is essentially useless.
But the biggest question of how a game would "enumerate" other options remains unanswered.
Will these options be a part of the driver now in which case AMD users won't be able to use XeSS?
 
Haha so it's just FSR2 built into DX which is essentially useless.
But the biggest question of how a game would "enumerate" other options remains unanswered.
Will these options be a part of the driver now in which case AMD users won't be able to use XeSS?

Good question. The driver enumeration approach would be better for Intel and Nvidia users because devs wouldn’t be required to opt-in to DLSS and XeSS. But that also means no XeSS for AMD users.

I hope it’s the former because requiring devs to bundle proprietary DLLs and deal with all the IHV marketing nonsense is no better than what we have today and will limit adoption. It also opens the door for IHVs to improve upscaling quality of older games that are no longer being updated by the developer.
 
It also opens the door for IHVs to improve upscaling quality of older games that are no longer being updated by the developer.
Newer versions of DLSS don't necessarily improve upscaling quality in older games, and they also provide several options for a game to choose from. XeSS is similar I guess. All of this would be lost with a driver-side implementation?
The way it is done now is IMO more convenient for the user than shipping just one version for all games with the driver.
And I honestly don't care about developers who can't manage to add an SDK of 5 DLLs to their projects.
 
Newer versions of DLSS don't necessarily improve upscaling quality in older games, and they also provide several options for a game to choose from. XeSS is similar I guess. All of this would be lost with a driver-side implementation?
The way it is done now is IMO more convenient for the user than shipping just one version for all games with the driver.

It doesn’t have to be just one version in the driver. Drivers already ship tons of custom application profiles.

We’ve covered all of this before. The pros of a common SDK + driver side implementations far outweigh the cons.

And I honestly don't care about developers who can't manage to add an SDK of 5 DLLs to their projects.

Well that just means you don’t have a full appreciation of what it takes to develop and ship software. /shrug
 
It doesn’t have to be just one version in the driver. Drivers already ship tons of custom application profiles.
But if that were the case, then the idea of the driver providing improvements with updates goes out the window.

We’ve covered all of this before. The pros of a common SDK + driver side implementations far outweigh the cons.
I don't see any "pros" still.

Well that just means you don’t have a full appreciation of what it takes to develop and ship software. /shrug
No, it just means that the user side of this is way more important to me than the developers who can't find a week of time to integrate these SDKs.
 
Also of note in this discussion:

[attached image: 1.png]


AMD marks the ability to integrate FSR 3.1 via an API which "unlocks" "upgradability" via DLLs as a benefit of FSR 3.1.
If DirectSR prevents this from being possible then it goes against this slide as well.
 
Also of note in this discussion:

[attached image: 1.png]


AMD marks the ability to integrate FSR 3.1 via an API which "unlocks" "upgradability" via DLLs as a benefit of FSR 3.1.
If DirectSR prevents this from being possible then it goes against this slide as well.
so if I understand that correctly, you can decouple FSR 3 Frame Generation from FSR and use DLSS + FG on RTX 2000 and RTX 3000 cards and XeSS + FG on Intel and whatever GPUs? That's excellent news if true
 
so if I understand that correctly, you can decouple FSR 3 Frame Generation from FSR and use DLSS + FG on RTX 2000 and RTX 3000 cards and XeSS + FG on Intel and whatever GPUs? That's excellent news if true
Yep. Tying FSR FG to FSR SR was one of the weird things of FSR3.
 
XeSS 1.3 out

More details, less ghosting, better temporal stability, new presets (and new scaling factors for old ones too, apparently with the idea that new and old offer similar quality with new performing better)
Preset                 Resolution scaling in     Resolution scaling
                       previous XeSS versions    in XeSS 1.3
Native Anti-Aliasing   N/A                       1.0x (native resolution)
Ultra Quality Plus     N/A                       1.3x
Ultra Quality          1.3x                      1.5x
Quality                1.5x                      1.7x
Balanced               1.7x                      2.0x
Performance            2.0x                      2.3x
Ultra Performance      N/A                       3.0x
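The preset table above can be sanity-checked with a quick sketch. The scaling factors apply per axis, so the implied render resolution is simply the output resolution divided by the factor (the XeSS 1.3 factors below are from the table; the 3840x2160 output is just a worked example):

```python
# Sketch: per-axis render resolution implied by each XeSS 1.3 preset
# at a 3840x2160 output. Factors scale each dimension, not pixel count.
presets_13 = {
    "Native Anti-Aliasing": 1.0,
    "Ultra Quality Plus": 1.3,
    "Ultra Quality": 1.5,
    "Quality": 1.7,
    "Balanced": 2.0,
    "Performance": 2.3,
    "Ultra Performance": 3.0,
}

out_w, out_h = 3840, 2160  # example 4K output
for name, factor in presets_13.items():
    w, h = round(out_w / factor), round(out_h / factor)
    print(f"{name:22s} {factor:.1f}x -> {w}x{h}")
```

For instance, Performance (2.3x) at 4K works out to roughly 1670x939, i.e. the "~900p" figure mentioned later in the thread.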
 
Would like to try XeSS 1.3 on my old 1070MQ laptop; I was pretty happy with XeSS 1.2 in Cyberpunk on that rig. I guess I shouldn't hold my breath for the patch tho :/
 
new scaling factors for old ones too, apparently with the idea that new and old offer similar quality with new performing better
It just means they rearranged their scaling factors.

Old XeSS 1.2 Performance had a scaling of 2x, meaning 1080p to 2160p, but the new XeSS 1.3 Performance goes to 2.3x, meaning ~900p to 2160p, which delivers slightly faster performance as it renders at a lower resolution, as they show in their marketing slide.

[attached image: O6Q0qyZ0qkm3M3Rd.jpg]


The old Quality preset is 1.5x scaling, meaning 1440p to 2160p; its equivalent among the new presets is Ultra Quality, and both deliver similar performance in Intel's testing (~50 fps).

[attached image: XeSS-1.3-new-presets-Cyberpunk-2077.png]
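The performance claim follows directly from pixel counts: since the factors are per axis, moving Performance from 2.0x to 2.3x cuts the rendered pixel count by roughly a quarter. A rough back-of-envelope sketch (assuming shading cost scales with pixel count and the upscaler's own cost stays fixed):

```python
# Rough sketch: relative shaded-pixel cost of the old vs new Performance
# preset. Per-axis factors mean pixel count scales with factor**2.
old_factor, new_factor = 2.0, 2.3  # XeSS 1.2 vs 1.3 Performance

pixel_ratio = (old_factor / new_factor) ** 2
print(f"XeSS 1.3 Performance shades {pixel_ratio:.0%} of the pixels "
      f"that 1.2 Performance did ({1 - pixel_ratio:.0%} fewer).")
```

That ~24% reduction in shaded pixels is where the "slightly faster" result in Intel's slide comes from, though real gains depend on how much of the frame is resolution-bound.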
 
Yeah, more important than the scaling factors is whether the image quality holds up with the "extra upscaling." Hence why I'd like to give it a shot :)
 
It just means they rearranged their scaling factors.

Old XeSS 1.2 Performance had a scaling of 2x, meaning 1080p to 2160p, but the new XeSS 1.3 Performance goes to 2.3x, meaning ~900p to 2160p, which delivers slightly faster performance as it renders at a lower resolution, as they show in their marketing slide.

[attached image: O6Q0qyZ0qkm3M3Rd.jpg]


The old Quality preset is 1.5x scaling, meaning 1440p to 2160p; its equivalent among the new presets is Ultra Quality, and both deliver similar performance in Intel's testing (~50 fps).

[attached image: XeSS-1.3-new-presets-Cyberpunk-2077.png]
Yes, but with the improved quality of 1.3, at least I got the impression that each old preset and its new counterpart offer similar quality, with the new one being faster due to the lower rendering resolution.
Not random shuffling.
 
No, it just means that the user side of this is way more important to me than the developers who can't find a week of time to integrate these SDKs.
I’ve worked on many FSR (2+3) integrations at this point, and properly integrating it (or XeSS or DLSS for that matter) takes a bit longer than that. Initial integration to get up and running can be quick, depending on engine architecture, but the polish to get to shipping quality can be time consuming.

DirectSR isn’t a panacea, but it can be a time, complexity, and maintenance saving for developers that have other priorities. That’s who it’s aimed at primarily, and widening upscaling tech adoption in general is a good idea.
 
DirectSR isn’t a panacea, but it can be a time and complexity and maintenance saving for developers that have other priorities. That’s who it’s aimed at primarily, and widening upscaling tech adoption in general is a good idea.
On the other hand it even further solidifies interpolation, which is a dead end for VR.
 
The framerate at which interpolation is being used is rarely enough for VR even without an extra frame of latency; with the extra frame it's a lost cause.

You can decouple view movement from the main framerate (with a bit of bodging for the stereo) but any interaction other than view movement will still have the latency.

A drop-in TAA replacement is too brain-dead; renderers have to get a lot smarter about sample reuse for VR.
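The "extra frame of latency" point is easy to quantify: frame interpolation has to hold back one rendered frame until the next one arrives before it can generate the in-between frame, so the added delay is roughly one full render-frame interval. A small sketch with some common VR base render rates (example numbers, ignoring pipeline and display latency):

```python
# Sketch: extra latency from holding one rendered frame for interpolation,
# at a few example base render rates (interpolated output is 2x these).
for base_fps in (36, 45, 60):
    added_ms = 1000 / base_fps  # one render-frame interval held back
    print(f"{base_fps} fps base -> {base_fps * 2} Hz output, "
          f"~{added_ms:.1f} ms added latency")
```

At the low base framerates where interpolation is typically deployed, that is 20+ ms of extra motion-to-photon latency, which is why it is a poor fit for VR.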
 