Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Encouraging, but we all know the reality of PR vs accuracy. God knows we get enough Nvidia marketing here, let alone on tech sites.
This is the first time DLSS has been delivered at variable resolutions from 720p to 8K, and the first time it hasn't put restrictions on the resolution or the GPU used. Previously, 1080p DLSS was restricted to the 2060/2070, and in some games the 2080 Ti only had access to 4K DLSS. All of these problems were a result of the old model of per-game training, so getting rid of them aligns with the claim that highly specific training is no longer needed. DLSS is now also delivered in three user-selectable settings, which was not possible under the old model, which further reinforces their claim.
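For illustration, a minimal Python sketch of how those user-selectable modes might map to an internal render resolution; the roughly 2/3 (Quality) and 1/2 (Performance) scale factors are inferred from resolutions quoted later in this thread, and the Balanced value is purely a guess, not an official figure:

# Hypothetical sketch: map DLSS 2.0 quality modes to an internal render resolution.
# The ~2/3 (Quality) and ~1/2 (Performance) factors are inferred from resolutions
# quoted later in this thread; the Balanced value is only a guess.
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 1 / 2}

def internal_resolution(out_w, out_h, mode):
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(2560, 1440, "quality"))      # -> (1707, 960)
print(internal_resolution(2560, 1440, "performance"))  # -> (1280, 720)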

Anyway, the only thing that matters to users is the quality delivered and the amount of performance uplift, the rest are logistics that NVIDIA needs to worry about, not us users.
 
Anyway, the only thing that matters to users is the quality delivered and the amount of performance uplift, the rest are logistics that NVIDIA needs to worry about, not us users.
It's certainly looking like a feasible working product now, which is great for Nvidia users going forward. Hopefully those logistics can bring support for it in a much shorter time to market than before.
 
I hope they don't bother. I don't want some algorithm (tensors or not) guesstimating what I should see on my screen; I want artists deciding what should be on the screen.

I agree, but isn't the end result more important? If DLSS can guesstimate an image that looks higher resolution than native, then more power to Nvidia.

Emotionally speaking though, DLSS 2X would be very welcome.
 
Some 1440p comparison shots from Bright Memory (UE4 indie game) with DLSS, no AA and FXAA:

Comparisons:
DLSS & No AA https://cdn.knightlab.com/libs/juxt...html?uid=f1430580-54a5-11ea-b9b8-0edaf8f81e27
DLSS & FXAA https://cdn.knightlab.com/libs/juxt...html?uid=0f0c1506-54a7-11ea-b9b8-0edaf8f81e27

Originals:
https://ibb.co/G3gMJcX
https://ibb.co/tJB092M
https://ibb.co/KDHv49y

The nvngx_dlss.dll version is 2.0.3.0, so a bit newer than Youngblood's. Only an on/off toggle is available for DLSS; no idea what the internal resolution here is.

Those tree branches in the foreground are in constant motion. Seeing how DLSS handles the foliage quite nicely makes me think it would have been good to have this in RDR2, with its low framerates and poor AA that blurs the trees pretty badly.
 
It's a shame they don't allow a comparison against the state of the art of UE4 AA.

Especially since they mostly outperformed id Tech's TSSAA, there's a good chance they'd do the same against TAAU. This artificial hamstringing of Unreal Engine games on PC is annoying. You can bet your ass the console game has TAAU (and dynamic resolution, but that's a bit hard on PC due to timing issues).
 
Yeah, it's a sad state on PC if the only AA method they included was FXAA. It's especially bad for PC gamers if devs are going to end up not caring about AA because "Nvidia will take care of it for us".
 
Yeah, it's a sad state on PC if the only AA method they included was FXAA. It's especially bad for PC gamers if devs are going to end up not caring about AA because "Nvidia will take care of it for us".
They still need to build support for TAA; DLSS requires it to work.
 
TAA is still behind the times compared to TAAU though.

As I said, DLSS compares very favourably to TSSAA ... so it can probably still win against TAAU too, with a smaller margin, but also with less of a foul taste.
 
https://www.techspot.com/article/1992-nvidia-dlss-2020/

Did Nvidia ever confirm that DLSS is back on tensors?

Concretely, switching back to tensor cores and using an AI model allows Nvidia to achieve better image quality, better handling of some pain points like motion, better low resolution support and a more flexible approach. Apparently this implementation for Control required a lot of hand tuning and was found to not work well with other types of games, whereas DLSS 2.0 back on the tensor cores is more generalized and more easily applicable to a wide range of games without per-game training.
 
https://www.techspot.com/article/1992-nvidia-dlss-2020/

Did Nvidia ever confirm that DLSS is back on tensors?
Yeah, with Wolfenstein: Youngblood and Deliver Us the Moon it is back on tensor cores. You can even run NVtrace on it and compare it with Control to see the int8 usage difference and where it happens (at the end of the frame).
Pretty sure they will talk about it at GDC.
It's a shame they don't allow a comparison against the state of the art of UE4 AA.

Especially since they mostly outperformed id Tech's TSSAA, there's a good chance they'd do the same against TAAU. This artificial hamstringing of Unreal Engine games on PC is annoying. You can bet your ass the console game has TAAU (and dynamic resolution, but that's a bit hard on PC due to timing issues).

There is an "Unreal Engine Unlocker" available on PC, made by Frans Bouma, which allows you to get the console back and most of the commands as well, if you want.
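With the console restored you could, for instance, try UE4's built-in temporal upsampling for a rough comparison against DLSS; a minimal sketch, assuming the game hasn't stripped these stock cvars (behaviour will vary per title):

r.TemporalAA.Upsampling 1
r.ScreenPercentage 67

A 67% screen percentage roughly matches the internal resolution DLSS quality mode appears to use elsewhere in this thread.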
 
There is an "Unreal Engine Unlocker" available on PC, made by Frans Bouma, which allows you to get the console back and most of the commands as well, if you want.
Oh nice, this also allows changing DLSS quality modes in Bright Memory, which only has an on/off toggle in the menu. :yes:

Sadly I can't inject this when running the game through GPU Trace. It would have been interesting to see a GPU Trace frame profile, since the different DLSS modes seem to have a different frametime cost. For example, 1080p to 1620p (quality mode) runs faster than 1080p to 2160p (performance mode). In Control the output resolution didn't seem to have an impact on performance (tested only briefly though).
 
Sadly I can't inject this when running the game through GPU Trace. It would have been interesting to see a GPU Trace frame profile, since the different DLSS modes seem to have a different frametime cost.
Running as admin was required... :rolleyes:

Looks like the DLSS 2.0 processing time is wholly dependent on output resolution. Tensor cores were active for about 1.1ms on both 720p to 1440p and 960p to 1440p modes. 0.7ms for 720p to 1080p.
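If the cost scales with the output pixel count, those numbers are roughly in line; a quick sanity check in Python, using only the timings above:

# Quick sanity check: does the measured tensor-core time track the output pixel count?
px_1440p = 2560 * 1440
px_1080p = 1920 * 1080
print(px_1440p / px_1080p)  # ~1.78x more output pixels at 1440p
print(1.1 / 0.7)            # ~1.57x more measured tensor-core time

Not an exact match, but roughly consistent with the cost being tied to the number of output pixels.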
 
Very high chance of my next GPU being Nvidia if AMD doesn't offer something similar. Anything that'll keep me close to 144Hz with 1440p output.
 
DLSS 2x 'do it yourself' using Nvidia's DSR (Dynamic Super Resolution) :) :
https://www.pcgameshardware.de/Nvid...orce-Tuning-Tipps-DLSS-2x-aktivieren-1346292/
Some nice upscaled comparison shots using different settings.

Seeing moiré patterns remain no matter what crazy downsampling you try reminds me of the idea of UV jittering for texture sampling.
I guess in combination with TAA this could give some advantages for free, similar to the subpixel jitter AA we know from raytracing.
It should fix moiré issues, but it might need one more texel of texture dilation to prevent issues at seams. (It could possibly also confuse and hurt things like DLSS.)

I wonder if this has been done already somewhere.
 
DLSS 2x 'do it yourself' using Nvidia's DSR (Dynamic Super Resolution) :) :
https://www.pcgameshardware.de/Nvid...orce-Tuning-Tipps-DLSS-2x-aktivieren-1346292/
Some nice upscaled comparison shots using different settings.

Seeing moiré patterns remain no matter what crazy downsampling you try reminds me of the idea of UV jittering for texture sampling.
I guess in combination with TAA this could give some advantages for free, similar to the subpixel jitter AA we know from raytracing.
It should fix moiré issues, but it might need one more texel of texture dilation to prevent issues at seams. (It could possibly also confuse and hurt things like DLSS.)

I wonder if this has been done already somewhere.

Would be nice if they did some performance comparisons to estimate the native rendering resolution at various DSR factors. I don't think Nvidia has published scaling factors for each DLSS quality level.
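For a rough estimate without measurements: DSR factors multiply the pixel count (so each axis scales by the square root), and combining that with the approximate DLSS scale factors inferred earlier in the thread gives something like the Python sketch below (the exact DLSS factors are still an assumption):

# Hypothetical sketch: estimate the DLSS internal resolution when stacking DSR on top.
# DSR factors multiply the pixel count, so each axis scales by sqrt(factor);
# the DLSS scale factors (~2/3 Quality, ~1/2 Performance) are inferred from this thread.
from math import sqrt

def dsr_output(w, h, factor):
    return round(w * sqrt(factor)), round(h * sqrt(factor))

def dlss_internal(w, h, scale):
    return round(w * scale), round(h * scale)

for factor in (1.78, 2.25, 4.00):
    ow, oh = dsr_output(1920, 1080, factor)
    print(factor, (ow, oh),
          "quality:", dlss_internal(ow, oh, 2 / 3),
          "performance:", dlss_internal(ow, oh, 0.5))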
 