Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

I think this must be true. What's also important to note is that it's the faint detail where the breakup/loss of quality occurs (most visibly at least) when the camera moves.
DLSS manages to draw those red lights much more accurately from a far smaller input resolution, which is quite a feat - without AA some of them are missing, and with TAA most are.
...
Start moving the camera left and right and successive frames become less alike, so they can no longer be reused as effectively. As a result these red lights will start to break up or turn off here and there.
Is it all red lights, or just the closest or furthest ones, that break up? I think I noticed a similar effect but dismissed it as a rendering quirk, though it might be related to @OlegSH's motion blur comment above.
 
Sold my Turing already so can't really check.

Half memory, half assumption: the smaller the light the more it will "flicker" in movement. Those red dot lights will turn on/off, the red lines will lose some of their length and regain it. It's the stuff reconstructed from subpixel data that suffers most. If the line is drawn well with TAA/no AA, DLSS won't have many issues with it.
 
3DMark gets DLSS 2.0.

We're thrilled to announce that we've added DLSS 2 support to the NVIDIA DLSS feature test to help you test and compare performance and image quality with and without DLSS processing on existing NVIDIA RTX graphics cards and the new GeForce RTX 30-Series GPUs.

You can now choose to run the NVIDIA DLSS feature test using DLSS 2 or DLSS 1.
 
Sounds like async DLSS will give an additional 5-10% performance increase when it's supported. Some reviewers had a Wolfenstein Youngblood beta that supports it.

In theory it will depend on card tier and performance target. It could be much more for high refresh rates. Remember the 2080 Ti takes 1.5ms for 4k DLSS to run and the 2060 takes 2.5ms, and we can probably assume* similar runtimes for similarly fast Ampere cards. At 60 fps (16ms frametime) that's indeed less than 10% on the 2080 Ti, but 15% for the 2060. If targeting 144Hz (7ms frametime) on the other hand it's 21% and 35% increase respectively.

* Ampere's per-SM tensor flops increase over Turing (with sparsity) matches its FMA increase: 2x per SM. Neither is likely to reach the full 2x in practice; the assumption is that the per-SM performance gain will be similar for both rasterisation and DLSS.
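The percentages above are just the DLSS runtime taken as a fraction of the target frame budget; a quick sketch of that arithmetic (the 1.5ms/2.5ms runtimes are the Turing 4K figures quoted above):

```python
# Fraction of the frame budget freed if the DLSS pass could run
# asynchronously instead of serially with the rest of the frame.
# Runtimes are the Turing 4K figures quoted in the post.
def async_dlss_gain(dlss_ms: float, target_fps: float) -> float:
    frametime_ms = 1000.0 / target_fps
    return dlss_ms / frametime_ms

for card, dlss_ms in [("2080 Ti", 1.5), ("2060", 2.5)]:
    for fps in (60, 144):
        print(f"{card} @ {fps} fps: {async_dlss_gain(dlss_ms, fps):.0%}")
```

At 144Hz the exact values round to 22% and 36%; the 21%/35% figures are the same numbers truncated.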
 

Presumably if games take advantage of Ampere's parallel DLSS/CUDA execution capability then there's no impact at all on the main rendering time.
 
Why would 3DMark bother having DLSS 1.0 available in the benchmark?

It was probably easy to do since it was already there, and it's useful for curious folks who want to compare the two implementations. I doubt any reviewers will bother though.
 
I wish they'd just unlock higher base resolutions. DLSS already runs at up to 93.3% input resolution in Control, which would suggest the 67% max is just an arbitrary limit that could easily be removed.

I'll be pretty disappointed if we're stuck with 67% max on Ampere. :cry:
Looks like my wish is coming true. From NvRTX/UnrealEngine github repo:

Setting DLSS Quality
  • r.NGX.DLSS.Quality 0...4
    • 0 Performance
    • 1 Balanced
    • 2 Quality
    • 3 Ultra Performance
    • 4 Ultra Quality
:yes:
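For anyone wanting to try it, the cvar in the repo snippet above should be settable like any other UE console variable; a sketch assuming the standard Engine ini mechanism (only the cvar name and value mapping come from the repo listing - the placement is an assumption):

```ini
; DefaultEngine.ini - standard UE [SystemSettings] section (assumed placement)
[SystemSettings]
; 4 = Ultra Quality, per the r.NGX.DLSS.Quality mapping above
r.NGX.DLSS.Quality=4
```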
 
I wonder if Ultra Quality will be like supersampling. I only have a 1080p144 monitor right now and a 3080 would pretty much be overkill if I cap games for G-Sync. Supersampling is a great option for that kind of setup.
 
Ultra Performance is render at 1440p, upscale to 8K, right? Maybe Ultra Quality is render at 4K, upscale to 8K.
Ultra Performance is 33%, yes; not sure if it's limited to 8K use only. 4K to 8K would be a 50% resolution scale, aka "Performance", unless they've changed the naming scheme. I would assume Ultra Quality is somewhere between 67% and 100%. There's definitely a reason for this mode to exist!
 

What’s Balanced?

Ultra Performance = 33% (1440p -> 8K)
Performance = 50% (1080p -> 4K)
Balanced = ???
Quality = 67% (1440p -> 4K)
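The mode names above are just per-axis scale factors applied to the output resolution; a quick sketch of the mapping (the 58% figure for Balanced is NVIDIA's published scale for that mode, not something stated in this thread):

```python
# DLSS 2 per-axis input scale factors. Balanced's 0.58 is NVIDIA's
# published figure, added here for completeness; the rest match the
# percentages listed in the post.
SCALES = {
    "Ultra Performance": 1 / 3,
    "Performance": 1 / 2,
    "Balanced": 0.58,
    "Quality": 2 / 3,
}

def input_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and mode."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(input_res(7680, 4320, "Ultra Performance"))  # 8K output
print(input_res(3840, 2160, "Quality"))            # 4K output
```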
 