Intel XeSS anti-aliasing discussion

I feel like death looking at that thumbnail.

Imagine being a PhD computer scientist who has poured years of work into an implementation, only to see the word "fail" slapped next to it on YouTube over some minor visual artefacting. Clearly, there are better ways to describe visual artefacts that don't require hyperbole and "shitting on".
 
I know, right? This is just so disrespectful to the engineers at Intel, and I admit it makes me pretty angry. This is how HUB gets their clicks: they want you to feel strong emotions instead of reviewing the hardware in a neutral, objective manner. They always use words like "terrible", "fail", "unusable", "shitty", etc.
 
It seems a bit strange to leave Arc GPUs out of the comparison; having all three vendors' solutions in the same video would make it comprehensive (from a performance standpoint, at least). Granted, they did say they'll cover Intel in another video, and this one is already 30+ minutes. Maybe a performance comparison video followed by an image quality comparison video would have been more cohesive? I guess we'll see what they do later.
 
Well, whenever I see a recommended video of theirs, I'm going to click "Don't recommend this channel". I did the same with JayzTwoCents back in the day, when he accused Nvidia of using cheap capacitors on the RTX 3000 GPUs at launch that allegedly caused them to freeze, just because another person had theorized about it. That person turned out to be wrong, but at least he came up with the theory himself, and he usually knows what he's talking about even when he's mistaken.
 
Imagine Dr. Cutress, who spent three years earning a PhD in chemistry but now makes money testing computers on YouTube.
 
List of games that support Intel XeSS (list updated September 20th):


As long as your GPU has DP4a support, it is fully compatible with XeSS in any game that implements it. This is XeSS running on a GTX 1630:

 

It doesn't even need hardware DP4a support, just Shader Model 6.4; DP4a only accelerates it. That means it should also work on Navi 10 (RX 5700, etc.), which is the only Navi generation without DP4a, I think?
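For reference, DP4a is a single instruction that computes a four-way dot product of packed 8-bit integers plus an accumulator, which is why it speeds up the int8 inner loops an upscaler like XeSS relies on. Here is a rough Python sketch of its semantics, assuming signed 8-bit lanes; the function name and packing are illustrative, not taken from Intel's SDK:

```python
def dp4a(a: int, b: int, c: int) -> int:
    """Emulate a DP4a-style instruction: treat a and b as four packed
    signed 8-bit lanes, multiply lane-wise, sum, and add accumulator c."""
    total = c
    for shift in (0, 8, 16, 24):
        # Extract one 8-bit lane from each operand.
        xa = (a >> shift) & 0xFF
        xb = (b >> shift) & 0xFF
        # Sign-extend the 8-bit values.
        if xa >= 128:
            xa -= 256
        if xb >= 128:
            xb -= 256
        total += xa * xb
    return total

# Example: lanes (1, 2, 3, 4) dotted with (1, 1, 1, 1), accumulator 10.
# 0x04030201 packs lanes 1, 2, 3, 4 from lowest byte to highest.
print(dp4a(0x04030201, 0x01010101, 10))  # 1+2+3+4+10 = 20
```

On hardware with native DP4a, this entire loop is one instruction, which is where the speedup over plain SM 6.4 fallback code comes from.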
 
AFAIK the RX 5700 doesn't have DP4a support, yeah. As for what you mention, I've seen videos running XeSS on a GTX 1060. Performance is still better than native resolution, although not as good as on DP4a-capable or native Intel GPUs.

In this video they mention the idea of making XeSS part of the GPU drivers (the video should start playing at that very moment). I wonder if they could force XeSS on in all games via the drivers. I'd certainly enable it most of the time.


On another note:
 
So XeSS XMX (on Arc GPUs) is far superior to FSR 2.1 and close to DLSS 2. It's funny that some people here still doubt the efficacy of AI upscaling compared to traditional methods.

For those in doubt, this is the ranking of the three upscaling methods:
1. DLSS 2
2. XeSS XMX
3. FSR 2

When upscaling from 1080p to 4K, both DLSS 2 and XeSS XMX pull even further ahead of FSR 2.
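To put the 1080p-to-4K case in numbers: that corresponds to a 2x scale per axis, i.e. the upscaler reconstructs 4x the pixel count. A small Python sketch of the render resolutions behind each preset; the per-axis scale factors are my recollection of the publicly documented XeSS presets, so treat them as assumptions:

```python
# Approximate per-axis scale factors for XeSS quality presets
# (assumed values, quoted from memory of the XeSS SDK docs).
PRESETS = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Return the internal render resolution for a given output
    resolution and preset (output dimensions divided by the scale)."""
    s = PRESETS[preset]
    return round(out_w / s), round(out_h / s)

# 4K output in Performance mode renders internally at 1080p:
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

Note that a 2x-per-axis preset means only one in four output pixels comes from the current frame's raw render, which is why artifacts in Performance mode say so much about upscaler quality.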
 
I just enabled XeSS in Shadow of the Tomb Raider, and if I could use it in every game I wouldn't use anything else (except DLSS, where available). Now I can understand why DF recommends these reconstruction techniques over native rendering.
I've NEVER seen a game so clean of jaggies :) At times it feels like watching an animated movie in the cinema.

I'm not talking about graphical realism; there are more graphically impressive games than Tomb Raider. But the cleanliness of the image is such that I haven't seen anything like it even in the best-looking games, except CoD Modern Warfare 2, which also uses XeSS.

I remember when I saw Digital Foundry's XeSS video, I wondered what the trick was: they were upscaling a native 720p image (all blur and jaggies) with XeSS, and in the upscaled output there wasn't a single jaggy, even on super-thin elements. Specifically this image (and that's in Performance mode):

69v3FEi.png


Rnbn4wv.png
 
That being said, some not-so-good news: I reported on the Intel support forums that XeSS doesn't work with Shadow of the Tomb Raider in the most recent beta driver, for whatever reason. I had to install the last stable version from October to be able to use XeSS in the game.

I also noticed that at 4K, using XeSS Performance or Balanced, Lara's hair disappeared at times while running the benchmark. Oddly enough, that didn't happen the first time I benchmarked the game. I'm playing with XeSS Quality and ray-traced shadows at Ultra, and the framerate is good.
 

And most importantly, XeSS support for one of my favourite games ever: Resident Evil 2 Remake.

 
I got both Judgment and Lost Judgment by Sega, which feature XeSS.

Characters' hair has a strange halo around it, but I guess it's an Intel A770 artifact rather than an issue with the game. It happens whether the game runs at native 4K or with XeSS.

This is a comparison between native 4K and XeSS Quality. The hair aliasing is gone with XeSS Quality. Native 4K shows crisper textures, especially on the leather jacket, but that's to be expected. XeSS Quality is usually my favourite setting, but any setting is fine as long as I don't have to play at native 4K, which burns resources with no clear image quality gain over XeSS Quality or Ultra Quality, plus worse anti-aliasing.

The comparison uses the same camera angle and position; only the lighting changes slightly.

XeSS Quality

uQBVz9M.jpg


Native 4K

os9yFPc.jpg


Does anyone know if that issue with the hair happens with other GPUs?
 
Modders took the occasion of Skyrim's 11th anniversary on 11-11-2022 to create a mod that adds FSR 2, DLSS, and XeSS support to Skyrim.


More detailed info, with a video from an articulate guy, in the DLSS thread.

 