Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

I've been doing some tests on my 3090 with the bundled CoD game, with DLSS 2.0 and raytracing at 4K.
It's still as terrible as it has ever been on high-contrast near-horizontal and near-vertical edges, like those from windows and doors.
Horrible undersampling of those edges and terrible staircase aliasing artifacts (and that with the highest quality DLSS setting).
It immediately throws you out of the illusion that you are looking at a high-resolution rendered game.
For me it's a no-go. The 3090 is fast enough to skip DLSS; it occasionally dips below 60 FPS at true 4K, but I prefer that to poor rendering quality.
 
It's disappointing DLSS 2.0 does not appear to be universally great lately, as seen with CoD, War Thunder and Watch Dogs.

@Voxilla have you updated your drivers?
 
I'll say left is DLSS; the lettering on some of the articles pinned to the command "board" seems more legible, less blurred.
 
I'd say left is native, because the reflection looks better.
Both are DLSS Quality, the left one with the default texture LOD bias, the right one with the LOD bias set to -3 in Inspector.
In UE4 you can choose the mip bias for TAAU - https://docs.unrealengine.com/en-US/Engine/Rendering/ScreenPercentage/index.html
A negative bias is necessary to match higher-resolution rendering after reconstruction. The same applies to DLSS.
Here is native (left) with the default driver LOD bias vs DLSS + LOD bias set to -3 in Inspector (right) - https://imgsli.com/MzA4NzE
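For reference, the usual rule of thumb for that bias (my rough sketch, not something from the UE4 page above) is log2(render resolution / output resolution). A hypothetical little helper, assuming DLSS Quality renders at roughly 2/3 of the output height:

import math

def texture_lod_bias(render_height, display_height):
    # Negative mip bias so texture sampling matches the reconstructed
    # output resolution instead of the lower internal render resolution.
    return math.log2(render_height / display_height)

# Assumed example: DLSS Quality at 4K (2160p output, ~1440p internal)
# works out to roughly -0.58; the -3 used above is much more aggressive.
print(texture_lod_bias(1440, 2160))

A small negative bias like that keeps textures about as sharp as native rendering, while a large value like -3 can risk texture shimmering.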
 
Very cool find - thanks for that!
 
VSYNC is locked on purpose because the tearing was horrible in this video. Most of the time the GPU usage is at its limit, so it wouldn't really surpass 60 fps that often anyway. DLSS 2.1 Ultra Performance was used because that's the only preset with a solid 60 FPS experience, most of the time at least. The Screen Space Reflection setting is still bugged; I couldn't set it to max, which is why it is on low.
Call of Duty: Black Ops Cold War 8K | RTX ON | RTX 3090 | i9 10900K 5.2GHz | Ultra Settings | DLSS - YouTube
 
I like the fact that he was "torn" about how to review the cards since one has an upscaling solution and the other doesn't. It was interesting hearing his opinion about that.
Notice they tested Death Stranding but didn't use FidelityFX CAS?

I found it really odd since they sing the praises of Sony's checkerboarding, but I find FidelityFX CAS does a better job than that.
 
I didn't even know it was a thing. Why is AMD not pushing this instead of a new upscaling tech that isn't ready yet?
 
Notice they tested Death Stranding but didn't use FidelityFX CAS?

I found it really odd since they sing the praises of Sony's checkerboarding, but I find FidelityFX CAS does a better job than that.

Maybe because they have tested CAS before in the game and found it a bit lacking compared to DLSS 2.0?
(One of his points is that DLSS 2.0 in Death Stranding gives better image quality than native resolution, something CAS does not.)
 
DLSS also introduces artifacts and removes some effects, something that CAS doesn't do.

It's odd because, having used it, I have to say I agree with Hardware Unboxed.

Digital Foundry only reviewed it once.

It's been over a year, but they seem to be fine covering DLSS through all its changes.
 
Let's be honest, unless blind tests are done there is going to be a lot of bias in terms of what actually subjectively "looks better."

At least my general impression is that opinions on upscaling techniques tend to correlate with one's feelings about the vendor/platform that the upscaling technique is associated with.
 
Maybe because they have tested CAS before in the game and found it a bit lacking compared to DLSS 2.0?
Many reviews found DLSS (the original version) subpar compared to CAS, and despite that the two were considered and tested as alternatives. The number of reviews focused on a detailed DLSS 2.0 / CAS comparison is very limited, yet CAS seems to be getting ignored.
 
DLSS also introduces artifacts and removes some effects, something that CAS doesn't do.


You take it up with Digital Foundry then?
Their stance is that DLSS 2.0 gives better-than-native image quality (overall) and CAS gives worse (overall), and they present good examples of why.

You do not deny that, but you try a "whataboutism"; that is not a "debate" I have any desire to get into (aka a waste of time).
 
I've pointed out other reviewers claiming otherwise. Even in the video you posted about DLSS they compared it to PS4 Pro checkerboarding, but not to Radeon FidelityFX CAS + upscaling.

I'm just saying, if they are willing to use upscaling and sing its praises in DLSS videos or PS5 BC videos, why aren't they using it for the RX 6800 series?

It's already bad enough that it took what, two weeks, for us to get a video on the 6800 series, but then they couldn't be bothered to test a feature?
 