Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Today, NVIDIA has made Deep Learning Super Sampling (DLSS) easier and more flexible than ever for developers to access and integrate into their games. The latest DLSS SDK update (version 2.2.1) enables new user-customizable options, delivers Linux support and streamlines access.

New User & Developer Customizable Options

This DLSS update offers new options to developers during the integration process. A new sharpening slider allows users to make an image sharper or softer based on their own personal preferences. DLSS Auto Mode picks the preset intended to give optimal image quality for a given output resolution: at or below 1440p DLSS Auto uses Quality, at 4K it uses Performance, and at 8K it uses Ultra Performance. Lastly, an auto-exposure option gives developers an automatic way to calculate exposure values, which can potentially improve the image quality of low-contrast scenes.

Linux Support Available Now

Last month, NVIDIA added DLSS support for Vulkan API games on Proton, enabling Linux gamers to boost frame rates on Proton-supported titles such as DOOM Eternal, No Man’s Sky, and Wolfenstein: Youngblood. Today, the NVIDIA DLSS SDK is adding support for games running natively on Linux with x86. We are also announcing DLSS support for ARM-based platforms.

Easier Access for Developers

Accessing the DLSS SDK is now easier than ever — no application required! Simply download the DLSS SDK 2.2.1 directly from the NVIDIA Developer website, access the Unreal Engine 5 and 4.26 plugin from the marketplace, or utilize DLSS natively in Unity 2021.2 beta.
https://videocardz.com/press-releas...ailable-for-all-developers-with-linux-support
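To make the Auto-mode mapping from the press release concrete, here is a minimal sketch of how a game might pick a preset from the output resolution. The enum and helper names are illustrative only and not taken from the DLSS SDK; the actual SDK exposes presets through its own quality enum, and per-title cutoffs may differ.

```cpp
// Hypothetical helper illustrating the DLSS Auto mode mapping described above
// (<=1440p -> Quality, 4K -> Performance, 8K -> Ultra Performance).
// Names are illustrative, not SDK identifiers.
enum class DlssPreset { Quality, Performance, UltraPerformance };

DlssPreset PickAutoPreset(int outputWidth, int outputHeight)
{
    const int pixels = outputWidth * outputHeight;
    if (pixels <= 2560 * 1440)           // 1440p or lower -> Quality
        return DlssPreset::Quality;
    if (pixels <= 3840 * 2160)           // up to 4K -> Performance
        return DlssPreset::Performance;
    return DlssPreset::UltraPerformance; // 8K and above -> Ultra Performance
}
```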
 
It’s hard to believe developers still haven’t figured out they shouldn’t base LOD on internal render resolution when upscaling.

TAA: higher LOD, less fine detail, more blur, less jaggies
DLSS: lower LOD, more fine detail, less blur, more jaggies

Take your pick. What a shit show.
Considering that there are still engines which have resolution-dependent DoF and such, I do not find this surprising.
"A new sharpening slider allows users to make an image sharper or softer based on their own personal preferences."

Presumably this isn't an input into the DL model but a post-process sharpen. So more of a convenience for something we can already do today.
Which is quite nice if it means that we can finally get rid of the common over-sharpening.
 
From the DLSS programming guide:

If ExposureValue is missing or DLSS does not receive a correct value (which can vary based on eye adaptation in some game engines), DLSS may produce a poor reconstruction of the high-resolution frame, with artifacts such as:

1. Ghosting of moving objects.
2. Blurriness, banding, or pixelation of the final frame, or it being too dark or too bright.
3. Aliasing especially of moving objects.
4. Exposure lag.

And NVIDIA added an Auto Exposure feature in DLSS 2.2:

Using the main exposure parameter (see above) is the preferred method; however, in some situations, taking advantage of the optional auto-exposure feature in the DLSS library can provide improved results.
To use the auto-exposure values computed inside the DLSS library instead of pInExposureTexture, the developer must:

1. Set the NVSDK_NGX_DLSS_Feature_Flags_AutoExposure parameter during DLSS Feature Creation

NOTE: Depending on the version of the DLSS algorithm, GPU and output resolution, there may be a small increase in processing time (~0.02ms) when auto-exposure is enabled.

This explains why the image quality has improved in DLSS 2.2.
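For context, here is a rough sketch of what that feature-creation step can look like in an NGX-based integration. Only NVSDK_NGX_DLSS_Feature_Flags_AutoExposure comes from the guide text quoted above; the surrounding struct, field and flag names are recalled from the SDK headers and may differ between SDK versions, so treat this as illustrative rather than copy-paste ready.

```cpp
// Sketch: enabling DLSS auto-exposure at feature-creation time.
// Assumes the DLSS SDK headers (e.g. nvsdk_ngx_helpers.h) are included.
// Identifiers other than NVSDK_NGX_DLSS_Feature_Flags_AutoExposure are from
// memory of the SDK and may vary by version.
void SetupDlssFeatureParams(unsigned renderWidth, unsigned renderHeight,
                            unsigned outputWidth, unsigned outputHeight,
                            NVSDK_NGX_DLSS_Create_Params& createParams)
{
    int featureFlags = 0;
    featureFlags |= NVSDK_NGX_DLSS_Feature_Flags_MVLowRes;      // motion vectors at render res
    featureFlags |= NVSDK_NGX_DLSS_Feature_Flags_AutoExposure;  // step 1 from the guide:
                                                                // let DLSS compute exposure itself

    createParams = {};
    createParams.Feature.InWidth        = renderWidth;   // internal (input) resolution
    createParams.Feature.InHeight       = renderHeight;
    createParams.Feature.InTargetWidth  = outputWidth;   // output (display) resolution
    createParams.Feature.InTargetHeight = outputHeight;
    createParams.InFeatureCreateFlags   = featureFlags;

    // With auto-exposure active, pInExposureTexture can be left null when the
    // feature is evaluated each frame.
}
```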
 
The NVIDIA DLSS Overview & Game Integrations presentation was nice.

They went over the basic gist of how DLSS 2.0+ is trained, and how that training relates to the real-time work happening on the GPU in the game with temporal samples and motion vectors. They walk through what a game needs for integration in straightforward steps. They then stress heavily through the rest of the presentation the common mistakes they see in existing DLSS implementations that devs need to look out for and correct. They state that devs should be more thorough than just being content that DLSS "works" after first integration, and should actually do A:B comparison screenshots for various aspects of rendering to make sure the integration is correct. "DLSS should by design look extremely close to native, not just OK with discrepancies"

0. Making sure the jitter matrix is correct, using their on-screen debugger.
1. Texture negative LOD bias not accounting for the internal resolution differing from the output resolution (see the sketch after this list).
2. LOD calculations for decals, geometry, etc. not accounting for internal res differing from output res.
3. Post-processing not being resolution-scale aware, causing different aperture size or per-pixel coverage for depth of field, chromatic aberration, etc.
4. Motion vectors not being generated properly or resolving incorrectly (they have a debug view for this).
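For items 0 and 1, the commonly used fixes are a negative mip/LOD bias derived from the ratio of internal to output resolution, and a low-discrepancy sub-pixel jitter sequence (Halton is a typical choice). A minimal sketch of both, using generic variable names rather than anything from a specific engine:

```cpp
#include <cmath>

// Negative texture LOD bias so texture sampling sharpness matches the *output*
// resolution rather than the lower internal render resolution (item 1 above).
// log2(renderWidth / outputWidth) is negative whenever renderWidth < outputWidth.
float ComputeMipLodBias(float renderWidth, float outputWidth)
{
    return std::log2(renderWidth / outputWidth);
}

// Example sub-pixel jitter (item 0): a plain Halton(2,3) low-discrepancy sequence,
// offset to the [-0.5, 0.5] pixel range. Shown only as one common choice; the
// important part is that the same offsets go into the projection matrix and are
// reported to DLSS.
float Halton(int index, int base)
{
    float result = 0.0f, f = 1.0f;
    while (index > 0) {
        f /= base;
        result += f * (index % base);
        index /= base;
    }
    return result;
}

void GetJitterOffset(int frameIndex, int phaseCount, float& jitterX, float& jitterY)
{
    const int i = (frameIndex % phaseCount) + 1; // Halton is usually taken 1-based
    jitterX = Halton(i, 2) - 0.5f;
    jitterY = Halton(i, 3) - 0.5f;
}
```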

They mention at the end how DLSS is still under active research, with improvements being looked at for future versions:
1. Performance improvements at same quality
2. Visual stability and AA improvements for disocclusions, and even for RT reflections
3. DLSS awareness and improved reconstruction for elements of the scene that lack motion vectors, like most particles for example

NV seems generally really aware of what makes a good DLSS implementation and which existing ones are lacking... which is reassuring, rather than devs just slapping it into a game poorly and calling it a day.
 
The DLSS SDK released by Nvidia allows you to view and adjust debug values for DLSS in DLSS 2.X games : nvidia (reddit.com)
One of the interesting things that came from it was the fact that Nvidia released a "development" version of nvngx_dlss.dll and a tool you can use to activate the debug overlay and shortcuts for DLSS.
With the debug tools and overlay you can change certain DLSS settings, such as whether it uses sharpening or auto-exposure, and you can adjust the game's jitter pattern. Along with this, you can also view some debug information from the game: motion vectors, depth buffers, the rendered image prior to being processed by DLSS, etc.

You can also turn on jitter accumulation debugging, which results in some interesting visuals (the YouTube video linked below has an example about halfway through).
This can be useful for tweaking the visuals of DLSS to your liking (although it's rather limited), for simply experimenting, or for debugging issues. For example, there is a ghosting trail on the floating rocks in Death Stranding when DLSS is enabled with the default implementation. It was believed that the cause of this was the rocks lacking motion vectors, and looking at the debug information confirms it.

I also took a look at Control. If I replace the DLL to use DLSS 2.2.11, it looks like DLSS is broken. Using the debug information from this version of DLSS allowed me to find out that the cause of the broken-looking DLSS is that Control isn't jittering the pixels, something that might have been broken when I added DLSS 2.2.11 to Control?

Anyway, I just thought I'd share this. To see an example of me playing around with some of these settings, check out this video

 
Escape from Naraka Out Today, Features RTXGI and NVIDIA DLSS Support (2x Performance at 4K) (wccftech.com)
Escape from Naraka, not to be confused with NARAKA: BLADEPOINT, is a first-person action platformer made by the very small team at XeloGames located in Indonesia.

It's out today on Steam, priced at €13.49 thanks to the 10% launch discount (though Fanatical is selling it at 20% off, so you might want to purchase it there instead), and it's also the latest PC game to support both ray tracing (including RTX Global Illumination, ray-traced shadows, and reflections) and NVIDIA DLSS.

Yosua Bayu S, Developer at XeloGames, also discussed the importance of NVIDIA DLSS in the game:
"Adding NVIDIA DLSS to the game was fast and easy with the UE4 plugin, providing our players maximum performance as they take on all the challenges Escape From Naraka has to offer."
 
Nvidia's DLSS magic finally makes Red Dead Redemption 2 playable at 8K on an RTX 3090 | TechRadar
August 2, 2021
So, we were keen to see what a difference DLSS makes, especially at the ultra-high 8K resolution. As usual, we played the game using our 8K gaming PC provided by Chillblast, along with Dell's UltraSharp UP3218K monitor. Check out the boxout on the right for the full spec of the PC we used.

After hitting 25.3fps at 8K with all graphical settings set to their highest levels but DLSS off, we kept everything the same, but turned DLSS to 'Auto'. The results were drastic, with it now hitting 50fps on average – effectively doubling the frame rate.

While still not hitting 60fps on average, it instantly makes the game actually playable at 8K, something it wasn't without DLSS, and it's a potent example of how much of a difference DLSS can make.

The minimum fps also rose from 21.5 to 29.7 fps, while the max frame rate doubled from 31.3 to 62.2 fps. This leap shows that with a bit of tweaking we should be able to get to 60fps on average at 8K – which was once considered unobtainable.

Visually, it looked fantastic, if perhaps a little softer in distant details. Of course, Red Dead Redemption 2, like most games, hasn't been designed for 8K, so textures aren't going to benefit from the high resolution, but overall, DLSS has shown just what it's capable of.
 
Great. Now both guys playing at 8K can start with RDR2

I wonder how good it would look downscaled to 4K via DSR though, i.e. if it looks better than native 4K while still being more performant than whatever native res is equivalent then it'd be worthwhile for a wider audience. An image quality and performance analysis of DLSSQ 4k, DLSSUQ 8K downscaled to 4K and native 4K would be pretty interesting.
 
I wonder how good it would look downscaled to 4K via DSR though, i.e. if it looks better than native 4K while still being more performant than whatever native res is equivalent then it'd be worthwhile for a wider audience. An image quality and performance analysis of DLSSQ 4k, DLSSUQ 8K downscaled to 4K and native 4K would be pretty interesting.

You mean DLSS ultra performance at 8K right? That's 1440p internal resolution which is less than the 1800p that DLSSQ 4K uses. Might be an uphill battle.
 
You mean DLSS ultra performance at 8K right? That's 1440p internal resolution which is less than the 1800p that DLSSQ 4K uses. Might be an uphill battle.

Sorry, yes, I did mean ultra performance (at 8K). DLSS Quality at 4K is also 1440p though, so it's the same internal res for each.
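For reference, here's the arithmetic with the commonly cited per-axis scale factors (Quality = 1.5x, Performance = 2x, Ultra Performance = 3x); a quick standalone sketch, noting that actual per-title values can vary slightly with rounding:

```cpp
#include <cstdio>

// Internal render resolution for a DLSS mode, using the commonly cited per-axis
// scale factors (Quality = 1.5x, Performance = 2x, Ultra Performance = 3x).
void PrintInternalRes(const char* label, int outW, int outH, float scale)
{
    std::printf("%s: %dx%d -> %dx%d internal\n",
                label, outW, outH,
                static_cast<int>(outW / scale), static_cast<int>(outH / scale));
}

int main()
{
    PrintInternalRes("4K DLSS Quality          ", 3840, 2160, 1.5f); // 2560x1440
    PrintInternalRes("8K DLSS Ultra Performance", 7680, 4320, 3.0f); // 2560x1440
    return 0;
}
```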
 
August 14, 2021
Experience the ultimate sci-fi horror experience in Chernobylite with NVIDIA DLSS for up to 80% faster performance. You take on the role of a physicist, a former employee of the Chernobyl nuclear power plant, and research the mysterious disappearance of your beloved. Get ready for an exciting adventure full of survival, conspiracy, horror, love and obsession in Chernobylite.
 