Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

A is DLSS. It's easy to tell from the nicely completed lines on the monitor and the lack of jaggies on the edge of the lamp in the top middle.
 
@Man from Atlantis
For scene 1:
Your image examples are dead on, and I would have guessed the third image is DLSS Quality. The text on the sheet of paper inside the folder on the desk is more legible than in the other images.
Also, on the round metal objects (fans) on the back wall, the edge detail seems more defined.
 
Blind test for DLSS... Feel free to tell me which is DLSS and which is native; the third one, you can guess what it is. These are not mine, since I don't have a DLSS-capable card... In case anyone knows where I got them from, I've also scrambled them to make sure they differ from the original source.

A:
QdXWz5E.jpg


B:
SBxie6A.jpg


C:
ohosrUE.jpg

A. DLSS Quality
B. Native
C. Native TAA

Image A has no jaggies around the fluorescent lights, B has the most, and C still has some, but it looks like TAA is working.
 
A is much softer but has fewer jaggies. C is sharper with more jaggies. How do you tell which is DLSS and which is TAA?
 
Deliver Us the Moon, (my own shots)
2700X @PBO, 4*8GB 3066MHz CL12,
RTX 2060 @2025-2040/8000MHz

2560*1080 no AA
4ka8jku.png


2560*1080 TAA High
4ktaahighz5jrr.png


2560*1080 DLSS Quality
4kdlssaoj2n.png

2560*1080 no AA
noaapzjhs.png


2560*1080 TAA High
taahighokjnz.png


2560*1080 DLSS Quality
dlssqzzkxs.png

Thank you!

In Scene 1 DLSS wins for me as it's doing a better job on fine texture detail and on fine geometry (e.g. the lamp).

In Scene 2 TAA wins, as DLSS has some weird artifacts (e.g. the shadow of the yellow pole under the WSA sign on the wall is all effed up).
 
Thank you!
In Scene 2 TAA wins, as DLSS has some weird artifacts (e.g. the shadow of the yellow pole under the WSA sign on the wall is all effed up).
Are those artifacts just above the lettering on the WSA wall sign, or is that sunlight streaming through the gaps (grating) on the "WbIFY" sign? It's totally missing from the TAA High image, though it appears in the no-AA and DLSS images.
Edit: Might be related to the position of the sun when the images were taken.
 
Deliver Us the Moon, (my own shots)
2700X @PBO, 4*8GB 3066MHz CL12,
RTX 2060 @2025-2040/8000MHz

2560*1080 no AA
4ka8jku.png


2560*1080 TAA High
4ktaahighz5jrr.png


2560*1080 DLSS Quality
4kdlssaoj2n.png

2560*1080 no AA
noaapzjhs.png


2560*1080 TAA High
taahighokjnz.png


2560*1080 DLSS Quality
dlssqzzkxs.png

5120*2160
no AA
S3NOAA.png


TAA High
s3taahighxwkxf.png


DLSS Quality
s3dlssqevjpc.png

5120*2160
no AA
s4noaaq0j27.png


TAA High
s4taahigh9ojsw.png


DLSS Quality
s4dlssq8lkzf.png
 
This is correct.

So native is clearly trash, and DLSS and TAA have trade-offs depending on your distaste for jaggies or softness. No objective winner.

Are those artifacts just above the lettering on the WSA wall sign, or is that sunlight streaming through the gaps (grating) on the "WbIFY" sign? It's totally missing from the TAA High image, though it appears in the no-AA and DLSS images.
Edit: Might be related to the position of the sun when the images were taken.

Yeah, you're right. The weird shadowing is there in the native shot, so it seems TAA is just blurring it away. In that case I'd give the win to DLSS in Scene 2.
 
In all the heat of the discussion we forget what DLSS is all about ;) The fact that it's competing with native images and sometimes even besting them says enough. Anyway, it's there to improve performance to begin with.
And that's why I keep insisting (ad nauseam) that DLSS comparisons should be performed in an iso-performance setup. The non-DLSS options will have to make some serious quality cutbacks to make up for the performance deficit.
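To put a rough number on that deficit, here's a minimal sketch of the internal render resolutions involved, assuming the commonly cited per-axis scale factors for the DLSS 2.x modes (the mode table and helper below are my own illustration, not something measured in this thread):

```python
# Rough sketch of the render-resolution reduction behind DLSS's performance
# advantage. The per-axis scale factors below are the commonly cited values
# for DLSS 2.x modes; treat them as approximate.

DLSS_SCALE = {
    "Quality": 2 / 3,           # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int, float]:
    """Return the internal render resolution and the fraction of native pixels shaded."""
    s = DLSS_SCALE[mode]
    w, h = round(out_w * s), round(out_h * s)
    return w, h, (w * h) / (out_w * out_h)

# The 2560*1080 shots above: DLSS Quality renders roughly 1707*720,
# i.e. only ~44% of the pixels a native 2560*1080 frame has to shade.
print(internal_resolution(2560, 1080, "Quality"))
```

That ~44% pixel count at 2560*1080 DLSS Quality is the gap the non-DLSS settings would have to claw back in an iso-performance comparison.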
 
And that's why I keep insisting (ad nauseam) that DLSS comparisons should be performed in an iso-performance setup. The non-DLSS options will have to make some serious quality cutbacks to make up for the performance deficit.

Blame the current review scene. Nearly everyone just benches at ultra settings and calls it a day. There is little discussion of the trade-offs between different quality settings. I really only see that from HWUB and DF.

Back in the day [H]ardOCP really focused on best playable settings. Those days are long gone, but it would be really interesting to see that methodology applied in a world where DLSS exists.
 
I also think it's kind of weird that gamers don't consider frame rate a factor in image quality. They see a clear distinction between image quality and performance, with frame rate filed under performance, but that distinction isn't really true at all. Temporal resolution and MPRT are real things, and they're image quality problems.
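For a sense of scale, here's a quick back-of-the-envelope sketch, assuming a full-persistence sample-and-hold display where MPRT is roughly the frame time (the pan speed and helper function are illustrative assumptions, not measurements from these games):

```python
# Why frame rate is an image-quality factor, not just a "performance" number:
# on a sample-and-hold display with full persistence, eye-tracked motion blur
# is roughly (motion speed in px/s) * (persistence in s), and persistence is
# roughly the frame time.

def motion_blur_px(speed_px_per_s: float, fps: float) -> float:
    """Approximate eye-tracked blur width in pixels on a full-persistence display."""
    persistence_s = 1.0 / fps  # MPRT ~= frame time when each frame is held for the whole refresh
    return speed_px_per_s * persistence_s

for fps in (30, 60, 120):
    # a 960 px/s pan smears ~32 px at 30 fps, ~16 px at 60 fps, ~8 px at 120 fps
    print(f"{fps:>3} fps -> ~{motion_blur_px(960, fps):.0f} px of blur")
```

Halving the frame time halves the eye-tracked smear, which is why a higher frame rate reads as a sharper image in motion, not just a smoother one.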
 
An issue is that the advantages of temporal gains are basically impossible to really convey or compare unless you have the audience physically "on site", so to speak. Since most (really almost all) discussion/content is purely online, this effectively defaults to static comparisons as the path of least resistance.
 
An issue is that the advantages of temporal gains are basically impossible to really convey or compare unless you have the audience physically "on site", so to speak. Since most (really almost all) discussion/content is purely online, this effectively defaults to static comparisons as the path of least resistance.

That's a reason for me to rely on reviewers who actually played the games. And now that I have a 3070, I can make decisions for myself by figuring out what I like. People who just make decisions based on zoomed-in stills are not a very reliable source, especially as the zoomed-in stills are often taken from problematic areas and don't necessarily match 99% of the gaming experience.
 