Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Yes, DLAA clears up the shimmering on the tree shadows and steps quite nicely, but there's a definite drop in sharpness, though there are obviously ways around that.
 
4K or 1080p are meaningless terms on their own on PC. Texture MIP LODs must be adjusted according to the output resolution when running DLSS, not the rendering resolution.
Thanks, just wanted to confirm. But IIRC texture sizes do matter for LOD levels: if you have a 4096x4096 texture, you have more LOD levels than with 512x512. So isn't this somewhat related? I guess it's related to texture resolution and not so much rendering resolution?
 
Texture LODs (MIP levels) are generated automatically. It's only a question of setting up the renderer to do LOD bias correction according to the output (display) resolution instead of the rendering resolution. Even if your textures are 16x16, you will still have visible surfaces where a lower MIP bias would produce a sharper image on a display with more physical pixels.
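For reference, Nvidia's DLSS programming guide expresses this as a texture LOD bias of log2(render resolution / output resolution). A minimal C++ sketch of the idea (the helper name and the sampler wiring are illustrative, not from any SDK):

```cpp
#include <cmath>

// Illustrative helper: LOD (MIP) bias correction for an upscaler, following
// the guideline bias = log2(renderWidth / displayWidth). A negative bias
// selects sharper MIP levels, so texture sampling behaves as if the scene
// were rendered at the output resolution rather than the internal one.
float ComputeMipLodBias(float renderWidth, float displayWidth)
{
    return std::log2(renderWidth / displayWidth);
}

// Example: 2560x1440 output rendered internally at 67% (DLSS Quality):
//   ComputeMipLodBias(2560.0f * 0.67f, 2560.0f)  ->  roughly -0.58
// That value would then feed the material samplers' MIP bias, e.g.
// D3D12_SAMPLER_DESC::MipLODBias on D3D12.
```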
 
Just tried Avengers on Game Pass PC, and it doesn't have the classic quality modes anymore, only "off" and "dynamic". The cool thing is that dynamic will do a static 100% input resolution when exceeding the target framerate. So I'm seeing 1440p->1440p DLSS here. :oops:

Must be a recent change, as just a couple of months back people were testing this game against FSR by going through the different quality modes.
 
Game Developer Conference Sessions | GTC Nov 2021 | NVIDIA

Sebastian Tafuri, Senior Rendering Engineer, Frostbite
DLSS upscaling has become a very attractive way to improve the performance of games. While integrating and using the SDK itself is straightforward, there are some common challenges with a sophisticated engine like Frostbite. These include correct handling of motion vectors, particle effects, half-resolution effects, and texture quality. It's also important to address the depth buffer, which is currently not upscaled by DLSS. We'll discuss how we have integrated DLSS with the latest Battlefield installment, the challenges we encountered, and the solutions learned.
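To make those integration points a bit more concrete, here's a conceptual C++ sketch (not the actual NGX API; all names are illustrative) of what a DLSS-style temporal upscaler consumes each frame:

```cpp
// Conceptual only -- not the real NGX interface. The fields mirror the
// challenges called out in the session abstract above.
struct Texture;  // opaque GPU resource handle

struct UpscalerFrameInputs {
    Texture* color;          // jittered, low-resolution HDR color
    Texture* depth;          // low-resolution depth; DLSS does not upscale
                             // it, so full-resolution passes that need depth
                             // afterwards require their own strategy
    Texture* motionVectors;  // per-pixel motion in the convention the SDK
                             // expects; particles and transparents often
                             // lack proper vectors and need special handling
    float jitterX, jitterY;  // sub-pixel camera jitter offset this frame
    float mipLodBias;        // texture LOD bias, per the discussion above
};
```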
 
Oh, I found the old quality modes too. Dynamic resolution, which is on by default, was blocking them from appearing. So maybe not a recent change after all, and "DLAA" has been in the game longer, dunno.
Avengers has always had this DLSS option, AFAIR.
But it seems to be just switching between the same old DLSS presets based on the current framerate, so it's not a true "dynamic resolution" solution.
 
I'm actually seeing resolutions between 67% and 100% when entering a suitable framerate target.

The 100% input res is also definitely working (by entering for example 30fps as framerate target). Onscreen indicator shows 1440p->1440p, framerate is a bit lower than with native + TAA and image quality is better than with quality mode.
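A rough sketch of the heuristic this behaviour suggests (entirely hypothetical, just to illustrate the clamp-at-100% idea in C++):

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical dynamic-resolution controller: steer the input resolution
// scale toward a frame-time target and clamp at 100%, at which point the
// upscaler effectively runs as DLAA (the 1440p -> 1440p case above).
float PickInputScale(float frameTimeMs, float targetFrameTimeMs,
                     float currentScale)
{
    const float minScale = 0.67f;  // lowest input scale observed in-game
    const float maxScale = 1.00f;  // 100% input == DLAA-like behaviour
    // GPU cost grows roughly with pixel count (scale squared), so adjust
    // the per-axis scale by the square root of the frame-time ratio.
    float next = currentScale * std::sqrt(targetFrameTimeMs / frameTimeMs);
    return std::clamp(next, minScale, maxScale);
}
```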
 
Valve Enables Experimental Nvidia DLSS Support For DirectX 12 in Proton | Tom's Hardware (tomshardware.com)
October 1, 2021
Valve, the company behind the highly-anticipated Steam Deck handheld console, has posted an update (via Phoronix) for both the standard and experimental Proton compatibility layers that allow Windows games to run on Linux. The experimental version supports Nvidia Deep Learning Super Sampling (DLSS) in DirectX 12 games, while the regular version broadens support to more game titles.
...
Previously, Valve only used the DLSS function in games that ran on the Vulkan API. However, Valve's experimental support now allows DirectX 12 games to run DLSS without a problem, and it will arrive in the stable Proton branch after further testing.
...
To get this to run, you'll need to compile the latest version of the Proton Experimental branch, install the latest Nvidia drivers, and set the "PROTON_ENABLE_NVAPI=1" environment variable.
 
It's DLSS for both. It doesn't add DLSS to anything; it just allows you to use it on Linux in DX12 games that already support it on Windows.

Oh. I thought it was adding it to everything. So it's just enabling whatever options the game would already have if run natively on Windows.
 
Although DLSS is a valid option for RTX card users in Alan Wake Remastered, potential is left untapped. First, Remedy and d3t (like many developers before them) don't seem to adhere to Nvidia's DLSS guidelines regarding texture LOD determination: in Alan Wake Remastered, DLSS reduces texture quality, especially on distant objects (such as trees), which points to a lack of adaptation. Nvidia stipulates that DLSS should work with the texture LOD of the target resolution; in practice, however, most games use the MIP maps of the internal resolution. Second, there's the question of whether it's needed at all: according to PCGH tests, Alan Wake Remastered runs very smoothly even in Ultra HD (benchmarks to follow), so the performance relief from DLSS isn't really required. From our point of view, Nvidia would be well advised to enable Deep Learning Anti-Aliasing (DLAA) in all DLSS games.

https://www.pcgameshardware.de/Alan...Alan-Wake-Remastered-Benchmarks-Test-1380758/
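Put in terms of the bias formula from earlier in the thread, what PCGH describes amounts to leaving the correction at zero. Hypothetical numbers in C++, assuming a 4K output at a 50% internal scale:

```cpp
#include <cmath>

float renderWidth  = 3840.0f * 0.5f;  // 1920 internal
float displayWidth = 3840.0f;         // 4K output

// What most games reportedly ship: no correction, so MIP selection
// matches the 1920-wide internal image and distant textures go soft.
float biasUsed   = 0.0f;
// What the guideline asks for: log2(1920 / 3840) = -1.0, i.e. sample
// exactly one MIP level sharper, as if rendering 3840 wide.
float biasWanted = std::log2(renderWidth / displayWidth);
```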
 