Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

At least Deliver Us the Moon has a built-in resolution scale, yet they don't bother doing the most relevant test: is DLSS any better than what you can achieve with raw scaling and a little sharpening?
I think @Dictator best described it in this post:
Just because I am a stickler: AMD RIS does something vastly different from DLSS. It is merely an image sharpener, a smarter one, but nothing more. It cannot generate pixels, smooth lines, or do anything that image reconstruction like checkerboarding or DLSS will do. It actually increases aliasing.
 
I think @Dictator best described it in this post:
Of course it's just an image sharpening tool; the point was that, at least previously, that "mere image sharpening tool" paired with a lower rendering resolution has matched or sometimes even beaten DLSS in both final image quality and performance (and IIRC NVIDIA's sharpening could achieve similar results to AMD's RIS).
 
Of course it's just an image sharpening tool; the point was that, at least previously, that "mere image sharpening tool" paired with a lower rendering resolution has matched or sometimes even beaten DLSS in both final image quality and performance (and IIRC NVIDIA's sharpening could achieve similar results to AMD's RIS).
TBH it never was "in both"; it always was either/or: you could've gotten either better performance than DLSS or better quality, not both.
People are way too fast to trust those who say that DLSS "v1" was universally bad. It wasn't, and there were more than enough examples where it actually did better than the game's own TAA at fighting aliasing.
 
TBH it never was "in both"; it always was either/or: you could've gotten either better performance than DLSS or better quality, not both.
People are way too fast to trust those who say that DLSS "v1" was universally bad. It wasn't, and there were more than enough examples where it actually did better than the game's own TAA at fighting aliasing.
When the image quality is better (as it was at least before they reverted to shader-DLSS) and performance is within the margin of error (an FPS or two), I'd call it "in both". Of course there were specific scenarios where DLSS was shining, but that wasn't the overall picture in those early big DLSS games. The problem for DLSS is that it needs to go to a lower resolution to match the performance of normal scaling, which makes it harder for it to match the quality when there is less data to build from.
 
I think Nvidia's image sharpener would be a more appropriate comparison for AMD's RIS.
DLSS has always had a different categorization and function, and with DLSS 2.0 now providing improvements over ground-truth image quality, that is typically not something you experience with "image sharpeners".
 
The problem for DLSS is that it needs to go to a lower resolution to match the performance of normal scaling, which makes it harder for it to match the quality when there is less data to build from.
This is no longer the case with the new DLSS version: it provides faster performance than resolution scaling, and far better image quality, to the point of rivaling or exceeding native resolution.

Image Sharpening is not even in the same league and can be used in conjunction with DLSS if that is your thing.
is DLSS any better than what you can achieve with raw scaling and a little sharpening?
Yes it is; this was tested in the latest analysis from Hardware Unboxed.
 
One is native with TAA, the other one is DLSS:

[Screenshot 1: 12_wolfenstein_youngblood_rtx_test_wydajnosci_ray_tracingu_i_dlss_nc3_b.jpg]

[Screenshot 2: 12_wolfenstein_youngblood_rtx_test_wydajnosci_ray_tracingu_i_dlss_nc1_b.jpg]
Shouldn't be hard to say which is which, right?
 
This is no longer the case with the new DLSS version: it provides faster performance than resolution scaling, and far better image quality, to the point of rivaling or exceeding native resolution.
So you're suggesting that AI scaling and reconstruction of an image is faster than basic GPU scaling? That sounds awfully unlikely. Pre-shader-DLSS era DLSS cost about 10% in performance (I think it was someone from UL Benchmarks (Futuremark) who said it), and now you're suggesting it would actually improve performance by itself?
Then why bother with the scaling? Just forget the scaling part and get DLSS X2: free anti-aliasing and improved performance, if what you're suggesting is true.

edit:
@DegustatoR top is DLSS? Simply based on what seem to be anomalies in lighting. (I haven't played the game myself, so no idea how it's supposed to look there; it could be that what I think are anomalies are supposed to be there too and the other one is lacking something.)
 
Top is DLSS. It over-emphasises lines and has some nasty ringing artifacting that betrays some kind of convolution job. The bottom one does the TAA thing where any single frame looks way more smoothed out than it seems in motion.
 
So you're suggesting that AI scaling and reconstruction of an image is faster than basic GPU scaling? That sounds awfully unlikely. Pre-shader-DLSS era DLSS cost about 10% in performance (I think it was someone from UL Benchmarks (Futuremark) who said it), and now you're suggesting it would actually improve performance by itself?
You guys have some sort of a misunderstanding.

50% resolution scale is obviously faster than 50% resolution scale + DLSS processing (aka DLSS Performance). Maybe @DavidGraham means DLSS is faster at comparable image quality (i.e. 75% raw vs 50% + DLSS)?
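
A rough back-of-the-envelope sketch of that comparison (the 4K output and the 75%/50% per-axis scale factors are just the numbers mentioned above, used purely for illustration):

```python
# Rough pixel-budget comparison at a 4K output.
# Per-axis resolution scale, so the rendered pixel count goes with scale**2.
output_w, output_h = 3840, 2160

def rendered_pixels(scale):
    return int(output_w * scale) * int(output_h * scale)

raw_75 = rendered_pixels(0.75)   # rendered, then plain upscale + sharpening
dlss_50 = rendered_pixels(0.50)  # rendered, then DLSS reconstruction on top

print(f"75% raw scaling: {raw_75 / 1e6:.2f} MP per frame")
print(f"50% + DLSS:      {dlss_50 / 1e6:.2f} MP per frame "
      f"({dlss_50 / raw_75:.0%} of the 75% workload, before DLSS's own cost)")
```

The 50% render only shades roughly 44% as many pixels as the 75% render, and that saving is the budget the DLSS pass has to fit into for the combination to come out ahead.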

Btw those shots @DegustatoR posted are DLSS Balanced and TSSAA 8x. Quality mode has less artifacting.
 
The game has a sharpening strength setting; it can be adjusted to remove the ringing altogether. DLSS is much sharper by default, hence the default TSSAA 8x sharpening strength doesn't make any sense for DLSS.

It's not the typical "oversharpened" ringing that you can get rid of by tweaking sharpening settings. Look around the windows on the left, or any other high-delta transition, and you'll see periodic ringing.
 
now you're suggesting it would actually improve performance by itself?
I am not suggesting it, I am stating it: that's what Hardware Unboxed's tests revealed.

In the case of DLSS @4K:

If against 1800p scaling (I assumed this is what you meant originally), performance is faster and image quality is better.

If against 1440p scaling, performance is almost the same (a difference of 83 fps vs 80 fps), while image quality is way better.

And the old DLSS cost about 30%, not 10%.
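
For a sense of scale behind those comparisons, here's a small sketch of the rendered pixel counts involved. I'm assuming DLSS at 4K renders internally at roughly 1440p (as commonly cited for its Quality mode), so treat that entry as an assumption rather than a measured figure:

```python
# Rendered pixel counts behind the 4K comparisons above (illustrative only).
native_4k = 3840 * 2160

candidates = {
    "native 4K":                       (3840, 2160),
    "1800p scaling":                   (3200, 1800),
    "1440p scaling":                   (2560, 1440),
    "DLSS @ 4K (assumed 1440p input)": (2560, 1440),  # internal res is an assumption
}

for name, (w, h) in candidates.items():
    px = w * h
    print(f"{name:33} {px / 1e6:5.2f} MP  ({px / native_4k:.0%} of native 4K)")
```

If that internal resolution is right, DLSS shades far fewer pixels than 1800p scaling (hence the clear performance win there) and about the same as 1440p scaling, with the reconstruction pass accounting for the few-fps gap.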
 
And the old DLSS cost about 30%, not 10%
This depends on the framerate, because the DLSS processing cost is closer to a fixed time per frame than a steady percentage of frametime.

So in low-framerate situations (3DMark is a good example) those x milliseconds are closer to a 10% addition to the frametime, and in high-framerate situations closer to 30%.

This is why DLSS provides such massive gains at 4K and less significant ones at 1080p in pretty much every benchmark out there.
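
To put numbers on that, here's a quick illustration; the ~3 ms fixed DLSS cost is a made-up round number for the sake of the example (the real figure depends on the GPU and the output resolution):

```python
# A fixed per-frame cost becomes a very different percentage depending on framerate.
# 3 ms is a hypothetical DLSS processing cost, chosen only for illustration.
DLSS_COST_MS = 3.0

for fps in (30, 60, 100):
    frametime_ms = 1000.0 / fps
    added_share = DLSS_COST_MS / frametime_ms
    print(f"{fps:3d} fps -> {frametime_ms:5.1f} ms/frame, "
          f"fixed {DLSS_COST_MS:.0f} ms adds ~{added_share:.0%} to the frametime")
```

Same fixed cost, very different relative hit, which lines up with the ~10% figure at low framerates and ~30% at high framerates mentioned above.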
 
One is native with TAA, the other one is DLSS:

Shouldn't be hard to say which is which, right?
I don't know which is which, but the second one loses a lot of the detail centre screen. I'd prefer the first one, all other things being equal.

[Image: upload_2020-2-18_14-35-47.png]

Without being well versed in the two methods, I'd have assumed the bottom (second) image was DLSS, since it's missing fine details like it did in the early days with 'painterly' artefacts.
 
I prefer the top image over the bottom. I also believe the top is DLSS and the bottom is TAA.
I've always found TAA to feel smooth. DLSS always had a weird vibe to the image, like it wasn't a traditional output from modern AA techniques. But in this case, you can see significantly more detail.
 