Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

It may confuse him more tho. What we know here is performance with DLSS ON, so 10ms would be with DLSS included: 10ms - 3ms = 7ms, not the other way around (10ms + 3ms).

edit: You did add another data point in the middle tho :) for 77 fps with DLSS on it's 100 fps without (+23 fps). Haha

Yah, I'm not going to redo it though ;) I think it demonstrates the general idea pretty well.
 
100fps is 10ms, but it's not contextually correct. You can't say that 10-3=7 and 7ms is ~143fps, therefore the GPU can produce way more than 41fps.

I'm seriously at a loss here.

Also, just relating to other posts: if DLSS Performance outputting 4K is the same as rendering natively at 4K, then why not only ever use 4K Ultra Performance, since that also outputs at 4K?

In fact, why do Nvidia even offer options if the output is the same? /s

I'd say this is where you need to better define your argument, as the original claim was simply that the PS5 is outperforming a 2060(S) "with DLSS". What DLSS quality level? What output resolution? I've made the assumption that you were referring to the 4K DLSS Performance output presented by Digital Foundry, on the basis that it couldn't maintain 60fps whereas the PS5 could, but if you're making a different argument then I'm happy to discuss it. Certainly higher quality levels and lower output resolutions reduce the performance uplift of DLSS. Alex said he measured only a 28% uplift at 1440p Quality mode, which would put a 2060 just under 2070S performance, or a 2060S into 2080S territory.
 
But if the game is rendering 41fps, that 3ms/frame is the same budget as if it's rendering at 100fps. The majority of the processing is rendering the image, not upscaling to 4K.

So you can only gain 3ms/frame at 41fps by not using DLSS. 3*41 = 123ms saved per second. If it takes 24.39ms to render a frame, that saving buys 123/24.39 ≈ 5 extra frames, so you're only gaining ~5fps by redistributing that work to more native frames.
 
I'd say this is where you need to better define your argument, as the original claim was simply that the PS5 is outperforming a 2060(S) "with DLSS". What DLSS quality level? What output resolution? I've made the assumption that you were referring to the 4K DLSS Performance output presented by Digital Foundry, on the basis that it couldn't maintain 60fps whereas the PS5 could, but if you're making a different argument then I'm happy to discuss it. Certainly higher quality levels and lower output resolutions reduce the performance uplift of DLSS. Alex said he measured only a 28% uplift at 1440p Quality mode, which would put a 2060 just under 2070S performance, or a 2060S into 2080S territory.

Not sure the point you're trying to make here. The PS5 objectively outperforms the 2060, as I demonstrated.

The "/s" in the post you're replying to was signifying sarcasm.
 
The majority of the processing is rendering the image, not upscaling to 4K.

Not really, it depends on the final framerate. For a 2060, DLSS is a fixed ~3ms frametime cost after the rendering is done*. This cost is not in any way related to how fast frames can be rendered; it's 3ms at 5fps and 3ms at 1000fps. It "only" depends on the output resolution, in this case 4K. So for a game running at 200fps (5ms frametime) with DLSS ON, it is taking 2ms for rendering and 3ms for DLSS. If you don't do DLSS, you'd end up with a 2ms frametime, which is 500fps.

Of course reality is a bit more complicated than that, but in general terms, and for non-absurd frametimes like the ones above, it's a good approximation of how it really is.
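
To put that model in code, here's a minimal sketch assuming the ~3ms fixed cost quoted above (the function name is just for illustration):

```python
def fps_without_dlss(fps_with_dlss, dlss_cost_ms=3.0):
    """Estimate native fps by removing a fixed per-frame DLSS cost."""
    frametime_ms = 1000.0 / fps_with_dlss    # total frame time with DLSS on
    render_ms = frametime_ms - dlss_cost_ms  # what's left is pure rendering
    return 1000.0 / render_ms

print(fps_without_dlss(200))  # 5ms total -> 2ms render -> 500 fps
print(fps_without_dlss(41))   # ~24.4ms -> ~21.4ms -> ~47 fps
```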
 
But if the game is rendering 41fps, that 3ms/frame is the same budget as if it's rendering at 100fps. The majority of the processing is rendering the image, not upscaling to 4K.

So you can only gain 3ms/frame at 41fps by not using DLSS. 3*41 = 123ms saved per second. If it takes 24.39ms to render a frame, that saving buys 123/24.39 ≈ 5 extra frames, so you're only gaining ~5fps by redistributing that work to more native frames.
fps is just a simple conversion of frame time (a measure of a specific instantaneous moment), not actual frames counted over the second.
If you're doing 41fps = ~24.4ms.
If you subtract away a fixed cost of 3ms, you're at ~21.4ms.
That'll get you to ~47 fps.

You're definitely going to see some sort of range.
But for a 2060 to hold around 60fps with DLSS, it needs the complete frame rendering time to be 11-12ms before the DLSS work kicks in, and 11-12ms is about 83-90fps.

For the sake of simplicity, I would not aggregate values over the second and then try to figure out the frame time, but rather just look at ms saved per frame. It's easy to mess up the math somewhere in there with aggregates.
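
A sketch of that per-frame approach (assuming the ~3ms fixed cost discussed here; the ~5ms variant is only to show where an 11-12ms render budget would come from if the effective cost turns out larger):

```python
def render_budget_ms(target_fps, dlss_cost_ms):
    """Per-frame rendering budget left after a fixed DLSS cost."""
    return 1000.0 / target_fps - dlss_cost_ms

print(render_budget_ms(60, 3.0))  # ~13.7ms of render budget (~73 fps native)
print(render_budget_ms(60, 5.0))  # ~11.7ms (~86 fps native), the 11-12ms ballpark above
```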
 
fps is just a simple conversion of frame time, not actual frames counted.
If you're doing 41fps = ~24.4ms.
If you subtract away a fixed cost of 3ms, you're at ~21.4ms.
That'll get you to ~47 fps.

This is exactly what I've been saying.

I said that you gain 5 frames and you're calculating exactly the same thing... 41+5=47.
 
This is exactly what I've been saying.

I said that you gain 5 frames and you're calculating exactly the same thing... 41+5=47.
I think people may have had issues following the aggregate math; I don't have a stake here. I can't tell you what the frame time of a 2060 will be vs a 2060S.

I just think, with respect to calculating frame time, it's best to keep it as milliseconds per frame and go from there; it should reduce any confusion for readers.

The 2060S is clocked 15% faster and probably has more cores and more memory bandwidth, in alignment with other Super editions; they sit between the Ti and regular editions.
So it would run 15% faster, and more cores and more bandwidth could open more doors for performance. More VRAM may help as well.

The uplift on the Super may be more than just clockspeed (in this case 15%).
And frankly, a 15% improvement in clockspeed doesn't necessarily lead to a 15% improvement in performance either. We can run simple tests on our GPUs by locking the clock frequencies and still fail to find that exact linear relationship.
 
I was correcting the guy that claimed that removing DLSS processing was going to gain 9-15fps, whereas it factually only gains 5fps as both you and I have demonstrated.

Re: 2060S, not sure, we'd need to see it tested. I've put some numbers together earlier and by my calculations it doesn't get to a consistent 60fps with DLSS Performance either. But yeah, we'd need to see it in action to be certain.
 
Not sure the point you're trying to make here. The PS5 objectively outperforms the 2060, as I demonstrated.

Again, in what context? If it's 1440p DLSS Quality mode, where the 2060 is gaining only 28%, then I think you have an argument, although the margin is very small. If it's 4K Performance mode, where the 2060 is gaining over 50%, then I don't think you've shown that at all; in fact, I think the opposite is strongly suggested by the available evidence.

The "/s" in the post you're replying to was signifying sarcasm.

Nah, I was referring to this, where you explicitly referenced the 2060S:

There have been a number of assumptions in the past that the latest consoles are both approximately comparable to an Nvidia 2060S, despite both machines having more tflops available to them. It was reiterated in several Digital Foundry videos, usually around games using ray tracing.

I get the impression that a few people were surprised (myself included) that the same comparison wasn't repeated in a pure rasterisation test when a console flat-out outperformed the 2060S, even when the best upscaling solution in the industry was used.
 
Again, in what context? If it's 1440p DLSS Quality mode, where the 2060 is gaining only 28%, then I think you have an argument, although the margin is very small. If it's 4K Performance mode, where the 2060 is gaining over 50%, then I don't think you've shown that at all; in fact, I think the opposite is strongly suggested by the available evidence.



Nah, I was referring to this, where you explicitly referenced the 2060S:

By my calculation it doesn't get to a consistent >60fps in DLSS Performance mode either.

Anyone got one that can test the theory?
 
I was correcting the guy that claimed that removing DLSS processing was going to gain 9-15fps, whereas it factually only gains 5fps as both you and I have demonstrated.

Re: 2060S, not sure, we'd need to see it tested. I've put some numbers together earlier and by my calculations it doesn't get to a consistent 60fps with DLSS Performance either. But yeah, we'd need to see it in action to be certain.
The issue comes down to scaling though. So at 333fps (a 3ms frametime), adding a fixed 3ms of DLSS takes you to 6ms, cutting your frame rate by 50% to ~166fps. How much impact DLSS has comes down to how large that fixed cost is as a fraction of the total frame time. It's a little too easy to just choose arbitrary numbers to make the 2060 look worse or better than it is with DLSS. It might just be best to see if someone with a 2060S is willing to benchmark the DLSS content for you.
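
To illustrate that scaling, a quick sketch (same fixed-3ms assumption as earlier in the thread): the faster the native rendering, the bigger the relative hit.

```python
for render_ms in (3.0, 10.0, 24.4):       # ~333, 100, 41 fps native
    total_ms = render_ms + 3.0            # add the fixed DLSS cost
    loss = 1.0 - render_ms / total_ms     # share of throughput lost
    print(f"{1000/render_ms:.0f} fps native -> {1000/total_ms:.0f} fps with DLSS (-{loss:.0%})")
# 333 -> 167 (-50%), 100 -> 77 (-23%), 41 -> 36 (-11%)
```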

Generally speaking, all GPUs have some variation in them. They will do better or worse with every title. As for the original 2060-vs-PS5 question, which this thread is probably not the right place for, there are probably going to be times when these two clash for reasons that have nothing to do with the performance of the systems. (I mean, the XSX loses out to a 5700 from time to time as well, and functionally their performance bands really shouldn't overlap all that much.) It's just an unfortunate by-product of having games developed with different requirements, on different engines, by teams of different talent, with different budgets and technology constraints.
 
I was correcting the guy that claimed that removing DLSS processing was going to gain 9-15fps, whereas it factually only gains 5fps as both you and I have demonstrated.

I said 9-15 fps for the 41-60fps range, which is the low of 41 and the average of 60 fps (and as I said, I may have been slightly off due to mishandling rounding). We already corrected that 9 to 6 for the low of 41 (yeah, 6, not 5, if you're going to nitpick so will I :) ). That still doesn't mean it's going to be the same for other frametimes. For higher than 60 fps the gain is actually higher, as demonstrated to you several times in multiple different ways. It's +23 for 77fps, +43 for 100fps and so on...
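
Here's the same subtraction run over those data points (a quick sketch, assuming the fixed 3ms cost):

```python
for fps in (41, 60, 77, 100):
    without = 1000.0 / (1000.0 / fps - 3.0)
    print(f"{fps} fps with DLSS -> {without:.0f} fps without (+{without - fps:.0f})")
# 41 -> 47 (+6), 60 -> 73 (+13), 77 -> 100 (+23), 100 -> 143 (+43)
```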
 
I was correcting the guy that claimed that removing DLSS processing was going to gain 9-15fps, whereas it factually only gains 5fps as both you and I have demonstrated.

Re: 2060S, not sure, we'd need to see it tested. I've put some numbers together earlier and by my calculations it doesn't get to a consistent 60fps with DLSS Performance either. But yeah, we'd need to see it in action to be certain.

You simply can't measure the impact of DLSS like this, so it's a pointless exercise. Yes, we know how long the DLSS stage is supposed to take, but there are other costs of using DLSS that this doesn't account for. This is easily provable by looking at the following timestamps that Alex handily put together.

Here we see the same scene at both 1440p native and 4K DLSS Quality mode (1440p internal). Frame times are 14.5ms and 20ms respectively, i.e. a 5.5ms difference, not the 2-3ms being discussed here.
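
Working those numbers through (only the 14.5ms and 20ms frame times come from the video; the rest is arithmetic):

```python
native_1440p_ms = 14.5     # 1440p native frame time from the video
dlss_4k_ms = 20.0          # 4K DLSS Quality (1440p internal) frame time
overhead_ms = dlss_4k_ms - native_1440p_ms
print(overhead_ms)               # 5.5ms effective DLSS cost, not 2-3ms
print(1000 / native_1440p_ms)    # ~69 fps at native 1440p
print(1000 / dlss_4k_ms)         # 50 fps with DLSS output to 4K
```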



By my calculation it doesn't get to a consistent >60fps in DLSS Performance mode either.

Anyone got one that can test the theory?

But it doesn't have to, because the output resolution is still fixed at 4K whereas the PS5 is using VRR. There is no direct comparison here from which to draw a conclusion. However, as I pointed out earlier in the thread, a 2060S +50% (the measured performance uplift at 4K DLSS Performance) = almost 6800-level performance, so expecting the PS5 to be clearly outperforming that seems a little unrealistic at best.
 
You simply can't measure the impact of DLSS like this, so it's a pointless exercise. Yes, we know how long the DLSS stage is supposed to take, but there are other costs of using DLSS that this doesn't account for. This is easily provable by looking at the following timestamps that Alex handily put together.

Here we see the same scene at both 1440p native and 4K DLSS Quality mode (1440p internal). Frame times are 14.5ms and 20ms respectively, i.e. a 5.5ms difference, not the 2-3ms being discussed here.





But it doesn't have to, because the output resolution is still fixed at 4K whereas the PS5 is using VRR. There is no direct comparison here from which to draw a conclusion. However, as I pointed out earlier in the thread, a 2060S +50% (the measured performance uplift at 4K DLSS Performance) = almost 6800-level performance, so expecting the PS5 to be clearly outperforming that seems a little unrealistic at best.

If you're going to correct someone about the frametime cost of DLSS, then you should direct it elsewhere. I responded to someone's incorrect calculations.

My personal view is that DLSS has a lot of worth; I just don't think it's as clearcut (in this scenario) as you're suggesting. I saw a GPU tested under two scenarios, and in one the performance was lower and in the other the resolution was lower. Simple as that.

I stand by what I stated in my original post 100%. Prove me wrong with a demo and I'll happily stand corrected.
 
Folks perhaps it's time to wrap up this specific PS5-2060 discussion. People are going around in circles. I think we all appreciate the tricky nuances associated with such asymmetric comparisons. For the most part nobody is violently disagreeing with anyone else so let's just close this chapter.
 
Lovelace is coming, but by that time, whatever MS does with DX12 Ultimate and the XSX will get developers' attention.

Hardware is going to have to support industry standards for wider adoption. I suspect Nvidia knows that, and that's why it's starting to release its hardware-specific "DLSS" for game engines.
 