Nvidia DLSS 3 antialiasing discussion

Scott_Arm

Legend
DLSS 3 is putting GPUs on a nice path. Work smarter, not harder. Increasing raw performance is becoming more and more ridiculous. On top of that, it looks like they've (hopefully) addressed some of the CPU-side issues by making BVH building faster and less memory-intensive with the 40 series. DLSS now helps in CPU-limited situations, so with Nvidia Reflex added in you no longer have to care about anything if you're playing something competitive. You turn on Reflex, you turn on DLSS 3 and away you go. Hopefully they actually release a 4050/4060 with capable DLSS 3 support for playing on 1080p screens. I bet with DLSS 3, 1080p/500Hz monitors could be doable for quite a few games with low/medium settings.

Edit: To clarify this a bit more, a lot of people play competitive games with low settings for visual clarity, but then you end up CPU-limited. In those cases, DLSS did nothing for you, so competitive gamers normally turned it off and considered it a useless feature. Now you can get an improvement in frame rate from DLSS 3 in that scenario, and you have Reflex on top to handle the input latency situations when your GPU gets near 100% because of particular visuals. It's a winning combo.
 
Last edited:

gamervivek

Regular
ATM I am not working *directly* on a follow-up video, but I have already done some performance capture and such on non-Intel GPUs to prep for such a video in the future. I also made some quality comparison videos for the non-Intel Arc version of XeSS.

But to be blunt, I am a bit tired of working on IQ comparison videos. I feel like I have done soooo many in the last few months... I am looking forward to the eventual RTX 4*** series launch whenever that is because it hopefully means I can get away from all this IQ stuff for a bit lol

Real sneaky there lol

 

DegustatoR

Veteran
Ehm, I'm not sure that it's wise to create a separate DLSS3 thread, as DLSS3 seems to be an extension of the DLSS2 SDK - meaning that games with DLSS will still be "games with DLSS", but the Ada frame interpolation feature will simply be greyed out on Turing and Ampere.
 

BRiT

(>• •)>⌐■-■ (⌐■-■)
Moderator
Legend
Alpha
Ehm, I'm not sure that it's wise to create a separate DLSS3 thread, as DLSS3 seems to be an extension of the DLSS2 SDK - meaning that games with DLSS will still be "games with DLSS", but the Ada frame interpolation feature will simply be greyed out on Turing and Ampere.
92 pages and 4 years is more than long enough.

But this should be where the new Frame Interpolation feature is focused. There has to be some balance so new posters can find a reasonable jump-in-point to a topic.
 

DegustatoR

Veteran
92 pages is more than long enough.

But this should be where the new Frame Interpolation feature is focused. There has to be some balance so new posters can find a reasonable jump-in-point to a topic.
Then you should close the old thread and link to it from the OP in this one I think.
 

BRiT

(>• •)>⌐■-■ (⌐■-■)
Moderator
Legend
Alpha
Then you should close the old thread and link to it from the OP in this one I think.

Not a bad idea at all, so I'm pondering that. I didn't want to close it down immediately; I want to see how things take shape first.
 

gamervivek

Regular
This can be used in games that don't have DLSS2 or even TAA for that matter. Probably under a different branding.

Also, Turing/Ampere did have something for Optical flow, but not good enough for realtime?

Optical Flow functionality in Turing and Ampere GPUs accelerates these use-cases by offloading the intensive flow vector computation to a dedicated hardware engine on the GPU silicon, thereby freeing up GPU and CPU cycles for other tasks. This functionality in hardware is independent of CUDA cores.
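To make the distinction concrete, here is a toy sketch of the optical flow idea in pure Python. Everything in it (the 1-D "frames", the brute-force matcher, the function names) is invented for illustration and has nothing to do with NVIDIA's actual hardware engine; the point it shows is that motion is estimated purely by comparing two finished frames, with no game-supplied motion vectors.

```python
# Toy illustration only: estimate motion by comparing two finished
# "frames" (here 1-D pixel rows), then synthesize a frame halfway
# between them. No engine-supplied motion vectors are involved,
# which is the point of the optical flow approach.

def estimate_shift(prev, curr, max_shift=4):
    """Brute-force search for the shift that best maps prev onto curr."""
    n = len(prev)
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err = sum((prev[i] - curr[(i + s) % n]) ** 2 for i in range(n))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

def interpolate_midpoint(prev, curr):
    """Warp prev halfway along the estimated motion toward curr."""
    n = len(prev)
    half = estimate_shift(prev, curr) // 2  # integer half-step for the toy
    return [prev[(i - half) % n] for i in range(n)]

frame_a = [0, 0, 10, 0, 0, 0, 0, 0]  # bright pixel at index 2
frame_b = [0, 0, 0, 0, 10, 0, 0, 0]  # same pixel, now at index 4
print(interpolate_midpoint(frame_a, frame_b))  # pixel lands at index 3
```

A real implementation does this per pixel at full resolution and in real time, which is what the dedicated hardware unit is for.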

 

gamervivek

Regular
Should have been clearer: DLSS3 would be using both motion vectors and optical flow, but Nvidia could expose only the latter in their drivers for any game, like they did with DLDSR.
 

DegustatoR

Veteran
Should have been clearer: DLSS3 would be using both motion vectors and optical flow, but Nvidia could expose only the latter in their drivers for any game, like they did with DLDSR.
But would it work? Or does the interpolation require the data fed into DLSS/DLAA to work?
 

gamervivek

Regular
It is entirely separate from the motion vectors needed for DLSS2; the link mentions that optical flow is better for effects, while motion vectors are better for geometry. Though it might not be as good without them, like the TAA implemented in one of the ReShade shaders that tried to work without motion vectors but ended up just blurring the image in most cases.

edit: from the Ada thread, it should be quite good even as a standalone.

@yamaci17 There is a new hardware unit in the RTX 40 series called the optical flow generator. It is a requirement of DLSS 3.0. For some context:

 
Last edited:

DegustatoR

Veteran
It is entirely separate from the motion vectors needed for DLSS2; the link mentions that optical flow is better for effects, while motion vectors are better for geometry. Though it might not be as good without them, like the TAA implemented in one of the ReShade shaders that tried to work without motion vectors but ended up just blurring the image in most cases.

edit: from the Ada thread, it should be quite good even as a standalone.
Hm, as I understand this, it is basically required to supply motion vectors with the frame for this feature to work, as it predicts movement based on motion vectors. This means that it likely uses the same data as DLSS does and wouldn't really work without it - at least the prediction approach wouldn't.

What you're saying is that you could implement interpolation between frames through OF by calculating the motion vectors via comparing two frames. This is basically how frame interpolation works, but such an approach is totally different, as it would a) add input lag and b) most likely produce artifacts.
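The input lag point follows from simple arithmetic (illustrative numbers, not measurements): to show a frame between N and N+1, the interpolator has to hold finished frame N until N+1 exists, so N reaches the screen roughly one native frame time late.

```python
# Illustrative arithmetic, not measured data: an interpolator that
# inserts a frame between N and N+1 cannot display N until N+1 has
# finished rendering, so every real frame is held for about one
# native frame time before it reaches the screen.

def interpolation_hold_ms(native_fps):
    """Extra hold time (ms) added by waiting for the next real frame."""
    return 1000.0 / native_fps

for fps in (60, 120, 240):
    print(f"{fps} fps native -> ~{interpolation_hold_ms(fps):.2f} ms of added hold time")
```

The absolute cost shrinks as the native frame rate rises, which is why the penalty matters most at low base frame rates.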
 

hkultala

Regular
DLSS 3 is putting GPUs on a nice path. Work smarter, not harder. Increasing raw performance is becoming more and more ridiculous. On top of that, it looks like they've (hopefully) addressed some of the CPU-side issues by making BVH building faster and less memory-intensive with the 40 series. DLSS now helps in CPU-limited situations, so with Nvidia Reflex added in you no longer have to care about anything if you're playing something competitive. You turn on Reflex, you turn on DLSS 3 and away you go. Hopefully they actually release a 4050/4060 with capable DLSS 3 support for playing on 1080p screens. I bet with DLSS 3, 1080p/500Hz monitors could be doable for quite a few games with low/medium settings.

Edit: To clarify this a bit more, a lot of people play competitive games with low settings for visual clarity, but then you end up CPU-limited. In those cases, DLSS did nothing for you, so competitive gamers normally turned it off and considered it a useless feature. Now you can get an improvement in frame rate from DLSS 3 in that scenario, and you have Reflex on top to handle the input latency situations when your GPU gets near 100% because of particular visuals. It's a winning combo.

DLSS3 just makes input lag worse, so it's bad for any competitive gaming.

That it gives more FPS is irrelevant when those extra FPS come at the cost of worse input lag.
 

Scott_Arm

Legend
DLSS3 just makes input lag worse, so it's bad for any competitive gaming.

That it gives more FPS is irrelevant when those extra FPS come at the cost of worse input lag.

If you're playing a game like Warzone and you have 200 fps, and you can double that to 400 fps with DLSS 3 to take advantage of a new 500Hz monitor, is that worse?

In the case of CS or Valorant you wouldn't want it because you can hit the refresh rate of the display or more anyway.
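A rough back-of-envelope of the trade-off (a simplified model with made-up numbers, not measurements): interpolation doubles what the display shows, but input is still only sampled on native frames, and the interpolator holds each native frame for roughly one extra frame time.

```python
# Simplified model, illustrative numbers only: interpolation doubles the
# displayed frame rate, but input is only sampled on native frames, and
# the interpolator holds each native frame for about one extra frame time.

def summarize(native_fps, interpolated):
    native_ft_ms = 1000.0 / native_fps
    shown_fps = native_fps * 2 if interpolated else native_fps
    # One native frame time on the frame path, plus one more of hold
    # time if the interpolator must wait for the next frame first.
    latency_ms = native_ft_ms * (2 if interpolated else 1)
    return shown_fps, latency_ms

for label, interp in (("native 200 fps", False), ("200 fps interpolated to 400", True)):
    fps, lat = summarize(200, interp)
    print(f"{label}: {fps} fps shown, ~{lat:.1f} ms on the frame path")
```

At a 200 fps base the added hold is only ~5 ms, which is the crux of the disagreement: whether that small latency cost is worth the extra motion smoothness on a 500Hz panel.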
 