Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Why would Nvidia release the DLSS source (training code and an unencrypted, documented network)? [removed attempt at diversion]. On the other hand, there is all the incentive in the world to release a plugin allowing anyone to use the proprietary DLSS 2.0 implementation in Unreal 4.26.
[...]

You linked a tech talk slide, so you had to have read it. Using an API =/= source code for whatever is behind that API.

The GPL license doesn't allow distributing closed-source DLLs with the game.
NVIDIA has added TAAU for the game, though.

Exactly. It should have been obvious that DLSS implementation in a game engine (the topic of the post I quoted from Dampf) was the issue at hand here. IMO, Nvidia has to get ahead of half-baked DLSS work based on cargo-culting the CDPR leaks, since before that it was more or less (mostly less, IMO) Nvidia-invite-only to use it in your project.
 
It should have been obvious that DLSS implementation in a game engine (the topic of the post I quoted from Dampf) was the issue at hand here. IMO, Nvidia has to get ahead of half-baked DLSS work based on cargo-culting the CDPR leaks
That's a GPL license issue; it has nothing to do with the technical implementation of DLSS. And obviously the CDPR leaks have nothing to do with the DLSS plugin release. Do you imagine them making this plugin in a week, just because someone's source leaked?
Of course that's not the case; I can only suggest rereading manux's posts.
 
You linked a tech talk slide, so you had to have read it. Using an API =/= source code for whatever is behind that API.



Exactly. It should have been obvious that DLSS implementation in a game engine (the topic of the post I quoted from Dampf) was the issue at hand here. IMO, Nvidia has to get ahead of half-baked DLSS work based on cargo-culting the CDPR leaks, since before that it was more or less (mostly less, IMO) Nvidia-invite-only to use it in your project.

I'm interested in technology. This recent discussion was about whether the CDPR (Cyberpunk 2077) source leak led to the Unreal plugin being published for everyone. No, it did not.

Nvidia/closed source/... bad and other non-technical stuff has been gone over many times before. I'm not participating in that, as it's a waste of time. No conclusion will be found, and haters and fanboys will emerge. Those things belong in another, non-technical thread.
 
There's also tensor/DLSS hardware involved on the GPUs themselves.

If one reverse-engineered the neural net, it would be possible to run and optimize a version for shader cores. Where that would lead, who knows. Likely slower than on tensor cores, but how much slower is another question.
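
To make that concrete, here is a rough sketch (a generic 3x3 convolution layer, not NVIDIA's actual network) of the kind of multiply-accumulate work such a net boils down to, written as a plain CUDA kernel that only needs ordinary shader/CUDA cores:

Code:
// Sketch only -- this is a generic 3x3 convolution layer, not NVIDIA's actual
// DLSS network. The point: CNN inference is ordinary multiply-accumulate math,
// so a reverse-engineered network could in principle run on regular
// shader/CUDA cores, just without the tensor-core speedup.
__global__ void conv3x3_relu(const float* in, const float* weights, const float* bias,
                             float* out, int width, int height, int inCh, int outCh)
{
    int x  = blockIdx.x * blockDim.x + threadIdx.x;
    int y  = blockIdx.y * blockDim.y + threadIdx.y;
    int oc = blockIdx.z;                                  // one output channel per z-block
    if (x >= width || y >= height || oc >= outCh) return;

    float acc = bias[oc];
    for (int ic = 0; ic < inCh; ++ic)                     // input channels
        for (int ky = -1; ky <= 1; ++ky)                  // 3x3 kernel window
            for (int kx = -1; kx <= 1; ++kx)
            {
                int sx = min(max(x + kx, 0), width  - 1); // clamp at image borders
                int sy = min(max(y + ky, 0), height - 1);
                float v = in[(ic * height + sy) * width + sx];
                float w = weights[((oc * inCh + ic) * 3 + (ky + 1)) * 3 + (kx + 1)];
                acc += v * w;                             // plain FP32 multiply-accumulate
            }
    out[(oc * height + y) * width + x] = fmaxf(acc, 0.0f); // ReLU
}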

The great thing about the Unreal plugin is that AMD can use the same APIs to create their own super-resolution plugin. If AMD is smart, whatever they do will work with all Unreal 4.26 titles using the same plugin mechanism and APIs that Nvidia now uses. This could be one reason why the plugin took time to make. Perhaps Epic also consulted Intel and AMD on how to expose the necessary data to a plugin doing scaling before committing to a specific approach.
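
Something like the sketch below is what I mean; the names here are purely hypothetical (loosely inspired by the idea of a third-party temporal upscaler hook in UE 4.26), not the actual Unreal or NVIDIA API:

Code:
// Purely hypothetical sketch -- none of these names are the real Unreal or
// NVIDIA API. The idea: the engine exposes one upscaler interface plus the
// inputs every temporal upscaler needs (jittered color, depth, motion vectors,
// jitter offsets), and any vendor registers its own implementation behind it.
struct FUpscaleInputs
{
    const void* LowResColor;      // jittered render at ScreenPercentage resolution
    const void* Depth;
    const void* MotionVectors;
    float       JitterX, JitterY; // sub-pixel jitter used for this frame
    int         RenderWidth, RenderHeight;
    int         DisplayWidth, DisplayHeight;
};

class IThirdPartyUpscaler
{
public:
    virtual ~IThirdPartyUpscaler() = default;
    virtual const char* GetName() const = 0;              // e.g. "DLSS", "SuperResolution"
    virtual void Upscale(const FUpscaleInputs& Inputs, void* DisplayResColorOut) = 0;
};

// Hypothetical registration hook: a plugin registers its upscaler once at
// startup and the engine calls Upscale() in place of its built-in TAAU pass.
void RegisterThirdPartyUpscaler(IThirdPartyUpscaler* Upscaler);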
 
There's also tensor/DLSS hardware involved on the GPUs themselves.
Yes and no.
Yes, as in the current version of DLSS runs on tensor cores.
No, as in nothing would prevent one from running the same code on CUDA cores or AMD's Compute Units for that matter (or rather, the same calculations; they could use some internal API which obviously wouldn't run on AMD hardware as it is).
 
Yes and no.
Yes, as in the current version of DLSS runs on tensor cores.
No, as in nothing would prevent one from running the same code on CUDA cores or AMD's Compute Units for that matter (or rather, the same calculations; they could use some internal API which obviously wouldn't run on AMD hardware as it is).

Not disputing that. But I can imagine DLSS being the most performant on the dedicated hardware for a while to come. Obviously, using CUs/compute does eat into performance there.
On the other hand, I welcome the idea of seeing DLSS across AMD and Intel GPUs, which would be a good thing for everyone, even though it will most likely remain the most performant on NV's hardware.
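
For reference, the warp-level tensor-core path looks roughly like the sketch below (real CUDA WMMA API, with the surrounding setup deliberately minimal): one mma_sync call does a whole 16x16x16 half-precision matrix multiply-accumulate in hardware, which is the throughput a pure CU/compute port has to make up for with scalar loops.

Code:
// Real CUDA WMMA API, but the setup is deliberately minimal/illustrative.
// One warp executes a 16x16x16 half-precision matrix multiply-accumulate per
// mma_sync call -- the throughput advantage dedicated tensor hardware buys over
// looping over scalar multiply-accumulates on ordinary CUDA cores / CUs.
#include <cuda_fp16.h>
#include <mma.h>
using namespace nvcuda;

__global__ void matmul16x16x16_tensor(const half* A, const half* B, float* C)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> aFrag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> bFrag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> cFrag;

    wmma::fill_fragment(cFrag, 0.0f);
    wmma::load_matrix_sync(aFrag, A, 16);              // leading dimension = 16
    wmma::load_matrix_sync(bFrag, B, 16);
    wmma::mma_sync(cFrag, aFrag, bFrag, cFrag);        // 16x16x16 MAC on tensor cores
    wmma::store_matrix_sync(C, cFrag, 16, wmma::mem_row_major);
}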
 
No, as in nothing would prevent one from running the same code on CUDA cores or AMD's Compute Units for that matter (or rather, the same calculations; they could use some internal API which obviously wouldn't run on AMD hardware as it is).
This is a bit like saying that nothing prevents you from running Minecraft RTX on a GTX9800.
 
This is a bit like saying that nothing prevents you from running Minecraft RTX on a GTX9800.
That's a bit of an extreme example :) Control showed that DLSS 2 performance on compute was feasible, but it's obviously somewhat faster on Tensor cores.
 
NVIDIA DLSS for Unreal Engine (pugetsystems.com)

Written on February 17, 2021 by Kelly Shipman
Up until now, to get access to DLSS in Unreal Engine, you needed to get approval from NVIDIA and download a custom branch of Unreal that had DLSS enabled. Even then, the DLSS features would only show in a packaged product, as in a game, not in the editor. Now, NVIDIA has released DLSS as a free, publicly available plugin for Unreal Engine. So far, it has been working beautifully.
...
To test DLSS, I used two of the free Megascans projects from the Unreal Marketplace. Specifically the Abandoned Apartment and Goddess Temple. Both of these projects are pretty demanding, especially at 4K. I recorded the FPS at 1080p, 1440p, and 4k by selecting Play > New Editor Window. You’ll see similar performance improvements in the regular viewport window.
...
Visual quality does take a minor hit, especially at “Ultra Performance”. How much of a hit will depend on your specific project. Because this is rendering at a lower resolution and then upscaling, fine lines will see the most noticeable impact, and will probably want to run in the Quality setting. However, if you are running a LED wall, that isn’t highly detailed or in focus, you might be able to use a more aggressive Performance mode. The included documentation includes lots of details to tweak various image quality settings. You’ll be able to dial in the right settings for your needs.
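
For a sense of what those modes mean in raw pixel counts, here's a small standalone sketch using the per-axis scale factors commonly quoted for DLSS 2's quality modes (treat them as approximate, not an official table):

Code:
// Standalone illustration of internal render resolution per DLSS quality mode.
// The per-axis scale factors are the commonly quoted ones (Quality ~66.7%,
// Balanced ~58%, Performance 50%, Ultra Performance ~33.3%); treat them as
// approximate, not an official table.
#include <cstdio>

enum class EDlssMode { Quality, Balanced, Performance, UltraPerformance };

static float ScaleFor(EDlssMode Mode)
{
    switch (Mode)
    {
        case EDlssMode::Quality:          return 0.667f;
        case EDlssMode::Balanced:         return 0.580f;
        case EDlssMode::Performance:      return 0.500f;
        case EDlssMode::UltraPerformance: return 0.333f;
    }
    return 1.0f;
}

int main()
{
    const int DisplayW = 3840, DisplayH = 2160;  // 4K output target
    const EDlssMode Modes[] = { EDlssMode::Quality, EDlssMode::Balanced,
                                EDlssMode::Performance, EDlssMode::UltraPerformance };
    for (EDlssMode Mode : Modes)
    {
        const float s = ScaleFor(Mode);
        std::printf("scale %.3f -> internal render %dx%d, upscaled to %dx%d\n",
                    s, int(DisplayW * s), int(DisplayH * s), DisplayW, DisplayH);
    }
    return 0;
}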
 
Did Epic rename TAAU to TXAA?
No, TXAA and TAAU are different techniques.
TXAA, for all I could find, refers to Unreal Engine's Temporal AA without the upsampling component, while TAAU is Temporal AA with upscaling.
In addition to Primary Spatial Upscale, a second upscaling technique is also supported for the primary screen percentage: Temporal Upsample. Instead of performing temporal integration with the Temporal Anti-Aliasing (TAA) and then doing a primary spatial upscale, both happen at the same time in the Temporal Anti-Aliasing Upsample (TAAU) shader.
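
Schematically (stub code, not engine source), the difference is just where in the chain the upscale happens:

Code:
// Stub code, not engine source -- just the ordering difference described above.
struct FImage { int Width; int Height; /* pixel data omitted */ };

// Stand-ins for the real passes.
FImage TemporalAccumulate(const FImage& /*Jittered*/, const FImage& /*History*/,
                          int OutW, int OutH) { return FImage{ OutW, OutH }; }
FImage SpatialUpscale(const FImage& /*In*/, int OutW, int OutH) { return FImage{ OutW, OutH }; }

// Legacy path: TAA resolves at render resolution, then a separate pass upscales.
FImage TaaThenPrimarySpatialUpscale(const FImage& Jittered, const FImage& History,
                                    int RenderW, int RenderH, int DisplayW, int DisplayH)
{
    FImage AntiAliased = TemporalAccumulate(Jittered, History, RenderW, RenderH);
    return SpatialUpscale(AntiAliased, DisplayW, DisplayH);
}

// TAAU path: the temporal shader accumulates and upscales in one step,
// writing directly at display resolution.
FImage TemporalUpsample(const FImage& Jittered, const FImage& History,
                        int DisplayW, int DisplayH)
{
    return TemporalAccumulate(Jittered, History, DisplayW, DisplayH);
}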
 
TXAA, for all I could find, refers to Unreal Engine's Temporal AA without the upsampling component, while TAAU is Temporal AA with upscaling.
TXAA is originally a technique developed by NVIDIA with TAA + MSAA components. It preceded TAA during the Kepler architecture era, had 2X, 4X, and 8X levels, and was featured in games such as GTA V, Assassin's Creed Unity, and Crysis 3.
 