Nvidia DLSS 3 antialiasing discussion

I find the latency conversation around DLSS pretty weird, to be honest. I've never seen any of the YouTuber reviewers care about latency before. I've never seen them compare latency when evaluating GPUs in reviews. They've never considered latency as part of performance when reviewing, just average fps, 1% lows, and 0.1% lows. As a purely hypothetical scenario, say there's a game X and they're benchmarking a 3080 vs a 6800. Would you ever hear them say, "the 6800 gets 100 fps and the 3080 gets 90 fps, but the 3080 has Nvidia Reflex, leading to 10 ms lower latency, therefore it is performing better and receives our recommendation"? Have you ever heard latency brought up in terms of the relative performance of games? When people talk about how well "optimized" games are, do they ever measure latency? The general perception is that if it scales with hardware and the fps numbers are high, then it's "optimized," but you never see them measure latency. Some games have a lot more latency than others at any given frame rate.

Many people would be shocked to find out that the click latency or motion latency of mice can vary by as much as 20 ms, even with mice that are marketed for gaming. People switch mice all the time and never notice. I think a G Pro Superlight is close to the best performer, if not the best. If you show them a chart of the relative latency of peripherals, suddenly they start replacing them, but if they've been gaming a long time they've probably switched from low-latency to higher-latency peripherals without knowing or caring. That's why showing charts comparing latency with frame generation on or off can be tough: you can see the difference between the numbers, but unless you use it, you won't know whether you can actually feel it. I'm not saying people can't, but it will vary by person. On top of that, you can lower latency on mice by increasing CPI.

I am particularly latency sensitive. I might actually notice the differences with frame generation and not be willing to accept it. I'm very accustomed to gaming at 100+ fps, usually closer to 200 fps, and that was true even when I had mid-range cards like the GTX 1060. What did I do? I lowered the settings. The idea that DLSS3 might not be viable on a 4050 or 4060 is weird to me, because you just lower the settings to hit 60 fps and then add the frame generation. Some people would play at ultra settings on those GPUs, some would not. It's just another tool for people to take advantage of.

Even being sensitive to latency there is probably a point where I stop being able to tell differences. I think it's probably around the 120 fps mark vs anything higher in the same game. I can definitely tell when my gpu hits 100% and latency starts piling up, even at high frame rates. That's why I'll frame limit all of my games if they don't support nvidia reflex.
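For anyone unfamiliar, a frame limiter just caps how fast frames are submitted so the GPU never saturates and the render queue stays short. A minimal sketch of the idea, purely illustrative (real limiters built into drivers, RTSS, or engines work at a much lower level than this):

```python
import time

TARGET_FPS = 116                 # illustrative: a few fps below the display's refresh rate
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted per frame

def limited_loop(render_frame):
    """Run render_frame at most TARGET_FPS times per second.

    Keeping the GPU below 100% load keeps the render queue short,
    which is what keeps input latency down."""
    while True:
        start = time.perf_counter()
        render_frame()
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # real limiters busy-wait here for better precision
```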

One thing I need to see is comparisons of DLSS3 at 1440p and 1080p. Those are still the most common resolutions. Hopefully someone will test it when the 1080p 500Hz displays are launched, as well as the 1440p 360Hz displays.

Edit: One thing I'll add is that they're, weirdly, making the argument that AMD GPUs are vastly inferior. On the input latency graphs they show native vs native (Reflex off). The Reflex-off numbers are probably around where the AMD GPUs would sit, assuming frame rates are similar, because AMD doesn't have a comparable technology. But for people who supposedly care so much, they've never made that comparison before. It's weird.

Edit: Some data on the difference in latency between similar-class Nvidia and AMD GPUs. I miss Battlenonsense's content.

 
Even being sensitive to latency there is probably a point where I stop being able to tell differences. I think it's probably around the 120 fps mark vs anything higher in the same game. I can definitely tell when my gpu hits 100% and latency starts piling up, even at high frame rates. That's why I'll frame limit all of my games if they don't support nvidia reflex.
Yeah, my own experiments show that I kinda stop feeling the input difference above 90 fps in general. So if a game can hit 90 fps and I can add FG on top of that to get "180" fps then I honestly don't think that latency in particular will be of any concern to me. IQ of generated frames though that's a different story.
 
Yeah, my own experiments show that I kinda stop feeling the input difference above 90 fps in general. So if a game can hit 90 fps and I can add FG on top of that to get "180" fps then I honestly don't think that latency in particular will be of any concern to me. IQ of generated frames though that's a different story.

Yeah, I think I'd like to see real full-speed 120 fps footage (at least) to get a better idea. Not sure why 4K has been the issue here. There should be capture cards that can do 1440p at 120Hz/144Hz. I'd also just like to see 1440p comparisons. I'm assuming that, like DLSS2, frame generation may look worse at lower resolutions.
 
So basically, you could either play at 70 fps that feels like 70 fps without DLSS3 or play at 110 fps that feels (input-latency-wise) like 45 fps with DLSS3, hmm...
 
This "real price tag" would likely prevent DLSS3 adoption in games making it essentially useless. There is a reason why GPU features are "free".
So everyone has to pay for it? That's not what "a nice bonus" is. Because make no mistake, you're paying for the many, many hours of software engineering as well.
 
So basically, you could either play at 70 fps that feels like 70 fps without DLSS3 or play at 110 fps that feels (input-latency-wise) like 45 fps with DLSS3, hmm...
No? You could either play at 70 fps which feels like 70 fps without DLSS3 or at 140 fps which feels like 70 fps with it.

So everyone has to pay for it?
Yes, because this is how new features get promoted and used.
And do you have any basis for saying that making it paywalled would cut $200 off the 4090's price? What if it's actually $20?
 
No? You could either play at 70 fps which feels like 70 fps without DLSS3 or at 140 fps which feels like 70 fps with it.


Yes, because this is how new features get promoted and used.
And do you have any basis for saying that making it paywalled would cut $200 off the 4090's price? What if it's actually $20?
Please re-read what I wrote: "Both under the premise of the card being a bit cheaper, say, 200$ less". Notice the "say"? So please don't put words in my mouth that I did not say ("would cut off") - thx!
 
For example, a latency increase from 50 ms to 60 ms is just a 20% increase, but the same 10 ms added on top of 20 ms, taking it to 30 ms, is a 50% increase. Would you notice a 20% increase? Maybe not. Would you notice a 50% increase? Maybe. That makes the DLSS3 analysis kind of subjective: it depends on the type of game, the base frame rate, whether you're playing with a mouse or a gamepad, the monitor refresh rate, how close you sit, etc. Can there be an end-all, be-all verdict when it comes to DLSS3? I don't know.
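A tiny illustration of that point, using only the numbers from the post (nothing measured here):

```python
def relative_increase(base_ms: float, added_ms: float) -> float:
    """Percentage increase in latency when added_ms is stacked on base_ms."""
    return 100.0 * added_ms / base_ms

print(relative_increase(50, 10))  # 20.0 -> 50 ms to 60 ms is a 20% increase
print(relative_increase(20, 10))  # 50.0 -> 20 ms to 30 ms is a 50% increase
```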
 
Please re-read what I wrote: "Both under the premise of the card being a bit cheaper, say, 200$ less". Notice the "say"? So please don't put words in my mouth that I did not say ("would cut off") - thx!
Hence why I'm asking what you would say if it wasn't, say (sic), $200 and was, say, $20? (Which seems like a much more realistic, if still a bit high, valuation of DLSS3's development cost inside Ada's overall "BOM".)
 
Edit: One thing I'll add is that they're, weirdly, making the argument that AMD GPUs are vastly inferior. On the input latency graphs they show native vs native (Reflex off). The Reflex-off numbers are probably around where the AMD GPUs would sit, assuming frame rates are similar, because AMD doesn't have a comparable technology. But for people who supposedly care so much, they've never made that comparison before. It's weird.
Exactly! They never once made the point that NVIDIA GPUs have this feature that reduces latency in dozens of games, a feature that is not available on AMD GPUs at all.

Furthermore, the inclusion of DLSS3 will accelerate the adoption of Reflex immensely, which will increase the advantage of NVIDIA GPUs, yet nobody will talk about this except in the context of DLSS3.

Instead we have stuff like this.

 
DLSS3 definitely adds latency as seen from both DF and Hub video, so it would definitely not feel like 70fps with the added latency, more like 50fps.
The input latency is not the equivalent of half the output frame rate and then some; it is only that way if you are at VSync.

It is taking a while, I think, for people to understand exactly what the held-back frame is all about. I think about it like this: DLSS Frame Generation's added latency is exactly one frame (so the frame-time of that held frame) plus the processing cost of DLSS3's frame generation. That cost should be around 3 milliseconds on the fixed-function unit plus the ML program, if we follow what Nvidia said. The 3 millisecond cost affects the final frame rate, so it is rarely actually close to double the frame rate. Therefore the real latency it induces is also not simply the frame time of half the output frame rate. As an example in a theoretical scenario:

100 fps output frame rate with DLSS3 frame generation on = 10 milliseconds of cadence per displayed frame. But!
Those 10 milliseconds per frame already include the cost of the DLSS3 run time. The internal render time of the traditional frame is of course higher than 10 ms, but not double!

For this reason, if you are GPU limited and turn on DLSS3, it does not offer exactly 2x the average frame rate, due to the processing cost. For that same reason, though, it is not exactly 2x the input latency either, since the time it takes to internally render a frame is not actually 2x the displayed frame time.
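A back-of-envelope sketch of that arithmetic, assuming (as above) a fixed ~3 ms generation cost that simply adds to each render cycle; that's a simplification of how the hardware actually overlaps the work:

```python
FG_COST_MS = 3.0  # assumed per-generated-frame cost, per Nvidia's ballpark figure

def with_frame_generation(native_fps: float) -> tuple[float, float]:
    """Return (output_fps, ms_between_rendered_frames) with frame generation on.

    Each render cycle now displays two frames (one rendered + one generated)
    but also has to absorb the generation cost, so output is a bit under 2x,
    and the rendered-frame cadence, which input latency tracks, is a bit
    under 2x the displayed frame time."""
    native_frame_ms = 1000.0 / native_fps
    cycle_ms = native_frame_ms + FG_COST_MS   # real frame + generation cost
    output_fps = 2 * 1000.0 / cycle_ms        # two displayed frames per cycle
    return output_fps, cycle_ms

# Roughly matching the 100 fps example above: ~59 fps native -> 100 fps displayed
# (10 ms cadence), with a real rendered frame every ~20 ms rather than every 10 ms.
out_fps, cycle_ms = with_frame_generation(1000.0 / 17.0)
print(f"{out_fps:.0f} fps output, rendered frame every {cycle_ms:.0f} ms")
```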

edit: I typed a dumb thing lol. Note to self, do not drink coffee
 
Hence why I'm asking what would you say if it wasn't, say (sic), $200 and was, say, $20? (Which seems like a much more realistic if not still a bit high valuation of DLSS3 development cost inside Ada's overall "BOM").
It wouldn't matter. The question is, is DLSS3 a feature people would pay extra for or not.
 
Criticism (even when it is somewhat excessive..) should be welcome.
It keeps companies and individuals honest, and if a product is good it is very likely it will eventually shine through.

Having used DLSS3 I can say that, IMHO, the vast majority of users will switch it on, enjoy the much smoother experience, and never go back.
 
Exactly! They never once made the point that NVIDIA GPUs have this feature that reduces latency in dozens of games, a feature that is not available on AMD GPUs at all.

Furthermore, the inclusion of DLSS3 will accelerate the adoption of Reflex immensely, which will increase the advantage of NVIDIA GPUs, yet nobody will talk about this except in the context of DLSS3.

Instead we have stuff like this.

As pointed out in the thread, HUB does that style of thumbnail often, and they're not exclusively making AMD look good or NVIDIA bad. Both are presented in good and bad lights depending on which video's thumbnail you pick.
 
I wonder if an interesting future application would be selective frame generation, as in the generated frames are only used if the next "real" frame isn't available in time.

Interestingly enough, VR has been doing this for half a decade, and it's (for the most part) a solved problem.

Both Oculus and the SteamVR stacks do this automatically for any title that uses their APIs:


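To illustrate the general idea, here is a hypothetical compositor loop, not the actual Oculus or SteamVR API: the synthesized frame only fills the slot when the application misses its vsync deadline, which is the "selective" behaviour described above.

```python
import time

REFRESH_HZ = 90
VSYNC_PERIOD = 1.0 / REFRESH_HZ

def compositor_loop(poll_app_frame, synthesize_frame, present, get_head_pose):
    """Hypothetical VR-style compositor: present the app's frame when it
    arrives on time, otherwise fill the slot with a synthesized frame
    (e.g. a reprojection of the last frame using the newest head pose)."""
    last_frame = None
    while True:
        deadline = time.perf_counter() + VSYNC_PERIOD
        frame = poll_app_frame(timeout=deadline - time.perf_counter())
        if frame is not None:
            last_frame = frame          # the "real" frame made it in time
            present(frame)
        elif last_frame is not None:
            # App missed vsync: generate a stand-in from the previous frame
            # and the latest head pose so motion stays smooth.
            present(synthesize_frame(last_frame, get_head_pose()))
        # else: nothing to show yet, wait for the first real frame
        sleep_left = deadline - time.perf_counter()
        if sleep_left > 0:
            time.sleep(sleep_left)
```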
 