Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

KeokeN on NVIDIA DLSS Advantages over TAA; Console Port of Deliver Us The Moon - Fortuna
We talked with Koen Deetman, Founder and Game Director at KeokeN Interactive, who explained the advantage of using DLSS over simply combining a lower native resolution with Temporal Anti-Aliasing (TAA). He also provided an update on the status of the console port of Deliver Us The Moon – Fortuna.

Can you quantify the performance improvements players will see after activating NVIDIA DLSS at 4K resolution? How does it fare in terms of image quality compared to the normal, non-DLSS version of Deliver Us The Moon: Fortuna?
Upscaling to 4K with DLSS will give better image quality than what was possible before with the same performance budget of running the game at about 1440p. Running the game in native 4K will be the only way to outperform the DLSS upscaling.
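As a rough illustration of that "1440p performance budget" claim, the shading work scales with the internal pixel count. This is a simplification that ignores fixed per-frame costs and the cost of the upscaling pass itself:

```python
# Rough pixel-count arithmetic behind the "about 1440p" performance budget.
# Simplified: real frame cost is not purely proportional to pixel count.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 pixels
internal = pixels(2560, 1440)    # 3,686,400 pixels

ratio = native_4k / internal
print(f"Native 4K shades {ratio:.2f}x as many pixels as 1440p")  # 2.25x
```

So DLSS has roughly a 2.25x smaller shading workload to reconstruct from, which is where the headroom over native 4K comes from.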

Based on the few tech demos available, some have suggested that NVIDIA DLSS delivers similar performance and image quality to simply running the game at 1800p resolution with TAA. Do you agree?
It will be close in performance, but comparing image quality, more detail is visible with DLSS. The main takeaway is that TAA introduces ghosting artifacts, while DLSS does not.
https://wccftech.com/keoken-nvidia-dlss-advantages-over-taa/
 
Is it not true that DLSS needs to be implemented in the game engine itself?
I'm curious as to why it's not something that can be added through NVIDIA's drivers or control panel?
From what I understand, you need to build it into the pipeline like other reconstruction techniques.
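A minimal sketch of why driver-level injection doesn't fit (all names here are hypothetical, not NVIDIA's actual API): reconstruction needs intermediate buffers such as motion vectors and depth that only exist inside the engine's frame, before UI composition.

```python
# Hypothetical frame loop showing where a DLSS-style upscaler sits.
# A driver-level injection only sees the final swapchain image; it has no
# access to the motion vectors and depth generated mid-pipeline.

from dataclasses import dataclass

@dataclass
class FrameBuffers:
    color: str           # low-resolution lit color
    motion_vectors: str  # per-pixel screen-space motion, engine-generated
    depth: str           # scene depth

def render_scene(internal_res):
    # The engine renders the 3D scene at the lower internal resolution.
    return FrameBuffers(color=f"color@{internal_res}",
                        motion_vectors=f"mv@{internal_res}",
                        depth=f"depth@{internal_res}")

def upscale(buffers, output_res):
    # Reconstruction consumes the intermediate buffers -- this is the step
    # that cannot be bolted on from outside the engine.
    return (f"reconstructed@{output_res} from "
            f"({buffers.color}, {buffers.motion_vectors})")

def composite_ui(image, output_res):
    # UI is drawn after upscaling, at full output resolution.
    return f"{image} + ui@{output_res}"

frame = composite_ui(upscale(render_scene("1440p"), "4K"), "4K")
print(frame)
```

A driver only ever sees the final composited image, so it has none of these inputs to work with.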
 
http://benchmark.finalfantasyxv.com/na/

NVIDIA DLSS
This application allows you to check your PC's compatibility with the latest graphical effects provided by NVIDIA.

  • NVIDIA GeForce RTX 2070, 2080 or 2080 Ti is required to run NVIDIA® DLSS.
  • NVIDIA® Ansel is disabled when NVIDIA® DLSS is enabled.
  • In order to enable NVIDIA® DLSS, you need to set the display resolution to 4K.

Edit: You have to go into "Custom" to enable DLSS, and that automatically disables regular AA.

 
I got a score of 4972 ("Fairly High") with those settings on my RTX 2080, using the "Curve" overclock plus 700 MHz on the memory. The benchmark crashed a few times before that, maybe because of the more aggressive overclock I use in every other game, or maybe because of Afterburner's overlay. So I disabled the overlay and went back to the original overclock that NVIDIA's utility discovers (its curve).
i7-6700, non-K.

The demo looks nice enough, though it seemed pretty casual. The anti-aliasing was top notch, though I noticed a brief amount of aliasing around the rear window of the car at the very beginning. I've seen worse in my games at 4K, even with other forms of AA on top of that, so good job here, imo. I was frequently hovering below 50 fps, but the motion looked very smooth, though there was some very brief hitching here and there.

I installed the game to an SSD (480 GB Mushkin Striker) that isn't the one my OS is on, fwiw. Even so, I think there was a bit of pop-in here and there, though it wasn't distracting.

We can debate the budgeting of transistors towards tensor cores, but seeing as they're there, I judge the use of them as a positive, rather than a negative or "a push" of no benefit.

I guess I should compare by looking at the game using TAA and 3200 x 1800, but I'm already pleased by the lack of aliasing that I've seen.

Edit: I got a score of 6178 at 2560 x 1440 using TAA. 3200 x 1800 wasn't available, even though I have that custom resolution enabled and tested it just now in Windows. I ran the demo twice with Afterburner's RivaTuner overlay enabled and it didn't crash, so maybe it was my overclock, or DLSS still having a bug or two, that caused it previously. What pop-in there was consisted mostly of shadows, btw.

Hard to compare the IQ when 2560 x 1440 gives a better framerate, but I'm guessing that even against 3200 x 1800 using TAA, DLSS will get the win. It's sharp, and has good clarity and saturation.

I just noticed the "Assets" setting. I hadn't turned it on while setting everything else to the highest setting.

OK, so I tested again with that on, and got 6106 at 2560 x 1440 with TAA.
 
Battlefield V and Anthem to officially receive DLSS support, more DXR optimizations are coming for BFV too.

Electronic Arts and DICE will shortly release an update to Battlefield V that will incorporate DLSS support, as well as additional optimizations for real-time ray tracing. Pairing DLSS with ray tracing allows gamers to get both amazing performance and ray-traced image quality.

Testing with early builds shows that RTX performance with DLSS and ray tracing simultaneously enabled can provide comparable frame rates to playing with ray tracing disabled.
https://www.nvidia.com/en-us/geforce/news/geforce-rtx-anthem-battlefield-v-bundle/
 
Atomic Heart will implement DLSS alongside RT. DLSS requires the engine to generate motion vectors, plus some additional buffers as well.

As we are using UE4, many of the requirements for DLSS, such as generating motion vectors and running the UI at a different resolution from the rest of the game, were already supported by the engine. That simplified things for us. Then we just had to tag the right buffers and provide a training build to NVIDIA for data verification and network training. NVIDIA creates all the neural network training and supplies this back to us; this is generated on supercomputers at NVIDIA HQ.
https://wccftech.com/atomic-heart-qa-nvidia-rtx-dlss-pvp/
 
DLSS has now been added to Port Royal, with a performance improvement.
And it's largely irrelevant for games: it's the best-case scenario for the technology. Especially in a canned benchmark, how is it useful to anyone? For benchmarking DLSS performance differences between RTX models?

Edit: I don't really mean to come across so negative. I'm not against the tech, and in some cases like this it's an improvement in both visual quality and performance. There just aren't enough useful working examples to really call it a great feature.
 
And it's largely irrelevant for games: it's the best-case scenario for the technology. Especially in a canned benchmark, how is it useful to anyone? For benchmarking DLSS performance differences between RTX models?

Edit: I don't really mean to come across so negative. I'm not against the tech, and in some cases like this it's an improvement in both visual quality and performance. There just aren't enough useful working examples to really call it a great feature.
The only thing this really shows is how bad Port Royal's TAA solution is. I mean, did even Quincunx blur that much?
I just thought they'd gone for a blurrier look for some artistic reason, but apparently not.
 
The only thing this really shows is how bad Port Royal's TAA solution is. I mean, did even Quincunx blur that much?
Yeah, their TAA is quite bad, but I'm not sure anything could be as terrible as Quincunx, lol. TAA is just bad anyway; it's never an option I choose unless the only alternatives are no AA or FXAA. Unfortunately, more and more titles nowadays implement very limited AA solutions and go with post-AA only.

I said when DLSS was announced that TAA isn't exactly a good baseline to compare against, but if it works out well then it's certainly better than just a TAA option.

Are we really that far removed from multi-sampling solutions worked into the engine's rendering pipeline, with great options like TXAA? Has the general move to a few engines en masse meant we're no longer going to have such things on PC?
 
Are people grouping reconstruction techniques under TAA these days, or are other reconstruction methods just not being considered in these DLSS comparisons?
 
Are people grouping reconstruction techniques under TAA these days, or are other reconstruction methods just not being considered in these DLSS comparisons?
Personally I think the latter.

Have we had any good upsampling temporal reconstruction AA implementations on PC yet? PC games don't usually render at a lower resolution and upsample. Maybe Forza Horizon?
 
Personally I think the latter.

Have we had any good upsampling temporal reconstruction AA implementations on PC yet? PC games don't usually render at a lower resolution and upsample. Maybe Forza Horizon?
I don't know how well it qualifies, but Quantum Break defaults to upsampling temporal reconstruction on PC too.
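For anyone unfamiliar with what upsampling temporal reconstruction actually does, here's a toy 1D sketch: each frame renders at half the output resolution with an alternating sub-pixel jitter, and an exponential history buffer accumulates the samples into full-resolution output. This is a deliberate simplification; real implementations add motion-vector reprojection and history clamping to handle movement.

```python
# Toy 1D temporal reconstruction: half-resolution jittered sampling
# accumulated over frames into a full-resolution history buffer.

def sample_signal(x):
    # Ground-truth "scene": a simple ramp we try to reconstruct.
    return 2.0 * x

def reconstruct(frames, out_res):
    history = [0.0] * out_res
    blend = 0.5  # exponential accumulation factor
    for frame in range(frames):
        jitter = frame % 2                # odd frames cover odd output pixels
        for i in range(out_res // 2):     # render at half resolution
            target = 2 * i + jitter       # output pixel this sample lands on
            x = target / out_res
            history[target] += blend * (sample_signal(x) - history[target])
    return history

result = reconstruct(frames=8, out_res=8)
# Each pixel converges toward sample_signal at its position.
print([round(v, 3) for v in result])
```

After a few frames, every output pixel has been visited by a jittered sample and converges to the underlying signal, which is how a half-resolution render ends up covering full-resolution detail over time.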
 