Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Well they are different pipelines because it's impossible to run the "exact same shaders" on different sets of hardware.
Again, the fact that you can access additional functions on DXR on Xbox doesn't mean that the DXR pipelines on Xbox and PC are different.
As an example, D3D12 FL12_0 and FL12_1 are the same pipeline; the latter has additional functions in its stages.
And unless there are more shader types / pipeline stages in Xbox DXR, the pipelines are likely the same as well.
 
Again, the fact that you can access additional functions on DXR on Xbox doesn't mean that the DXR pipelines on Xbox and PC are different.
As an example, D3D12 FL12_0 and FL12_1 are the same pipeline; the latter has additional functions in its stages.
And unless there are more shader types / pipeline stages in Xbox DXR, the pipelines are likely the same as well.

D3D12 FL 12_0 and FL 12_1 technically do have different pipelines. The rasterization stage in the latter is modified to support overestimating conservative rasterization. The pixel shader stage in the latter also has access to a new resource type (rasterizer ordered views) not found in the former.

Also, I'm not even sure the ray tracing shader stages as defined in PC DXR have any real meaning in Xbox DXR. A more accurate description is that the PC DXR pipeline is currently a subset of the Xbox DXR pipeline. Console developers are doing things not possible with PC DXR, such as custom traversal, ray caching, access to BLAS leaf triangles, etc.
 
Just so everyone is on the same page, this is what I was referring to:
"[Series X] goes even further than the PC standard in offering more power and flexibility to developers," reveals Goossen. "In grand console tradition, we also support direct to the metal programming including support for offline BVH construction and optimisation. With these building blocks, we expect ray tracing to be an area of incredible visuals and great innovation by developers over the course of the console's lifetime."
https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs
 
Which test was that? I don't recall seeing one where a 2060S is losing out at the same output resolution while also using DLSS.

I didn't say anything about output resolution since both have an output resolution of 2160p, with a mixture of scaling techniques depending on the platform.

In Alex's video he shows a 2060 using DLSS Performance Mode at 2160p (internal resolution is 50% of 2160p) and DLSS Quality at 1440p (internal resolution of 60% of 1440p). Internal resolution scales depending on which mode is selected.

I hope you're able to agree that the Performance Mode of DLSS is inferior to the Quality Mode of DLSS and definitely inferior to native rendering?

In the video, Alex stated that the DLSS Performance Mode was undesirable since it was getting framerates in the low 40s. I saw a low of 41fps in the video, and presumably there are instances where it drops lower. He recommends using the DLSS Quality Mode at 1440p (internal 60% of native) for a consistent 60fps. As I'm sure you know, GPUs need an average framerate much higher than 60 in order to maintain that framerate/frametime. You included a screenshot in one of your replies that showed an average framerate of ~60fps (can't remember the exact number). I don't think you need me to tell you that a 60fps average is not a 60fps game, as you're getting dips much lower with no vsync/cap applied. So a non-vsynced/uncapped average of 60fps is not equivalent to a vsynced/capped 60fps.

I hope you can agree that the PS5's average framerate is likely significantly higher than 60, but unfortunately we're unable to measure higher than 60 due to the correctly applied vsync and framerate cap.

In the VG Tech video you posted, it was stated that in the 4K PS5 mode (i.e., the closest match to the highest settings on PC) there were three resolutions on the PS5: 2160p, 1872p, and 1440p, with the "common" resolution being 3328x1872. The guy that created that video posts here (@VGA), so maybe he can clarify if the resolution drops are tied to the framerate drops on PS5. As we know, some games drop resolution by area, not by frametime. I'd pixel count it myself if I knew how.

Anyway, I'll go with the assumption that 1440p is genuinely the lowest native resolution (DF counted a much higher low resolution in their video) and I'll also go with the idea that the lowest resolution is tied to framerate. I notice a low of 54fps in the DF video; they assume it's a streaming issue rather than a rendering problem. VG Tech get a low of 57, which also looks like streaming, as no action happens and the character is just moving through the level. So arguably the PS5 does not drop frames due to rendering load. I'm sure you'll disagree, but that's what the videos show and what the creators seem to suggest.

So we have (at least) three internal resolutions of the PS5, as follows:

3840x2160 which calculates as 8,294,400 pixels/frame
3328x1872 which calculates as 6,230,016 pixels/frame
2560x1440 which calculates as 3,686,400 pixels/frame

The 2060 has two resolutions, as follows:

50% of 3840x2160 which calculates as 4,147,200 pixels/frame
60% of 2560x1440 which calculates as 2,211,840 pixels/frame

Since the 2060 doesn't have dynamic resolution scaling we have to compare the pixels over time. So when the 2060 is running at the higher resolution and drops to 41fps, over the course of 1 second the hardware has rendered 170,035,200 pixels.

By comparison the PS5 renders at 2560x1440 at 60fps (at its lowest), which is 221,184,000 pixels over a second. I know you'll suggest that the lowest framerate of the PS5 is actually 54fps (contrary to the videos), which calculates as 199,065,600.

This is the best possible scenario for the 2060 and it's still behind by 15%, and it's very likely that the PS5 is hitting a streaming problem with its own low. It'd be interesting to see the PS5 render that same scenario, to check the framerate and resolution and calculate the total pixels over time, but it's likely that the PS5 would still be framecapped to 60fps, so it has capacity to render more frames, and probably at a higher resolution.

The worst possible scenario for the 2060 is the PS5 rendering at full resolution at 60fps (although framerate is likely a lot higher than this and is vsynced/capped); 497,664,000 pixels/second.
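
For anyone who wants to check the arithmetic, here's a small Python sketch of the pixels-over-time comparison above. It simply reproduces the figures stated in this thread (including reading the "50%"/"60%" DLSS factors as fractions of the total pixel count, as this post does), so treat it as a sanity check of the sums rather than an authoritative measurement:

# Pixels-per-second comparison as laid out above. All figures are the ones
# quoted in this thread; the DLSS scale factors are applied to the total
# pixel count, which is how this post reads them.

def pixels_per_frame(width, height, scale=1.0):
    # One frame's pixel count, optionally scaled by a fraction of the total.
    return int(width * height * scale)

# PS5 (native rendering, dynamic resolution)
ps5_full = pixels_per_frame(3840, 2160)          # 8,294,400
ps5_mid  = pixels_per_frame(3328, 1872)          # 6,230,016
ps5_low  = pixels_per_frame(2560, 1440)          # 3,686,400

# 2060 internal resolutions as described in this post
pc_perf    = pixels_per_frame(3840, 2160, 0.50)  # 4,147,200
pc_quality = pixels_per_frame(2560, 1440, 0.60)  # 2,211,840

print(pc_perf * 41)    # 170,035,200 - 2060 DLSS Performance at its 41fps low
print(ps5_low * 60)    # 221,184,000 - PS5 at its lowest resolution, 60fps
print(ps5_low * 54)    # 199,065,600 - PS5 at its lowest resolution, 54fps low
print(ps5_full * 60)   # 497,664,000 - PS5 at full resolution, 60fps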

Is the image quality as good as PS5 when using DLSS Performance at 2160p (internal 50% of 4K), with lows of 41fps? The answer is no. The image quality of the Performance Mode isn't a match for native resolution, and on top of that it runs at a ~20fps deficit.

Is the image quality as good as PS5 when using the DLSS Quality mode at 1440p (internal 60% of 1440p) with a matched framerate of 60? No, as the internal resolution is significantly lower than the PS5's and even the scaled DLSS output resolution is much lower.
 
Since the 2060 doesn't have dynamic resolution scaling we have to compare the pixels over time. So when the 2060 is running at the higher resolution and drops to 41fps, over the course of 1 second the hardware has rendered 170,035,200 pixels.

Except that's not true, because DLSS takes up to 3ms to run, which takes away from the frametime in which rasterisation can take place. In other words, if the card didn't have to do the DLSS pass after the rasterisation pass, it would be much faster. Instead of doing 41-60fps, it would do 50-75fps with the same settings.

And then there's the fact that the 2060S was mentioned, which is 15% faster than the non-S version shown in the video. So 57-86fps in the same situation.
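
For what it's worth, here's a quick sketch of that arithmetic, taking the ~3ms DLSS cost and the ~15% 2060S uplift quoted in this thread at face value. It lands slightly lower than the 50-75fps range given above (closer to 47-73fps), and it assumes removing DLSS frees up frametime one-for-one:

# "Subtract the DLSS pass, then add ~15% for the 2060S." Both figures are the
# ones quoted in this thread, not measured values.
DLSS_COST_MS = 3.0
SUPER_UPLIFT = 1.15   # claimed 2060S advantage over the 2060

def fps_without_dlss(fps_with_dlss, dlss_ms=DLSS_COST_MS):
    # Remove the DLSS pass from the frametime and convert back to fps.
    return 1000.0 / (1000.0 / fps_with_dlss - dlss_ms)

for fps in (41, 60):
    base = fps_without_dlss(fps)
    print(f"{fps}fps with DLSS -> ~{base:.0f}fps without, "
          f"~{base * SUPER_UPLIFT:.0f}fps on a 2060S")
# 41fps with DLSS -> ~47fps without, ~54fps on a 2060S
# 60fps with DLSS -> ~73fps without, ~84fps on a 2060S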
 
Except that's not true, because DLSS takes up to 3ms to run, which takes away from the frametime in which rasterisation can take place. In other words, if the card didn't have to do the DLSS pass after the rasterisation pass, it would be much faster. Instead of doing 41-60fps, it would do 50-75fps with the same settings.

You think an extra 3ms will equate to an extra 10-15fps? 41fps has a frame time of 24.39ms.

You're wrong on your second point too: 41/100*15 = 6.15, so the framerate in the same situation would be 47fps.
 
You think an extra 3ms will equate to an extra 10-15fps? 41fps has a frame time of 24.39ms.

You're wrong on your second point too: 41/100*15 = 6.15, so the framerate in the same situation would be 47fps.

Yes, depending on the frametime it makes a huge difference. 100fps is a 10ms frametime, and 10-3 = 7ms frametime equates to 142fps, just as an example (extreme in this case, but not actually that extreme).
In regards to 47 vs 50, I may have made a typo or bad rounding somewhere, not that it makes a difference. Comparing the lowest seen framerate on the PC is disingenuous at best anyway.
 
I hope you can agree that the PS5's average framerate is likely significantly higher than 60, but unfortunately we're unable to measure higher than 60 due to the correctly applied vsync and framerate cap.

No I don't agree with this. This is specifically what DRS (dynamic resolution scaling) is designed to eliminate. When the framerate is significantly higher than 60fps the resolution should be maxed out (so there may be instances at 4K where this is the case); however, at every resolution lower than 4K, the game is running below 4K because it can't achieve 60fps at that resolution, and thus if the DRS is working properly it should be dropping resolution just enough to keep the framerate just above 60fps.

So in those instances where the PS5 has to run at 1872p, or even 1440p, we should assume it's not running at much more than 60fps, or else it would also be running at a higher resolution.

In the VG Tech video you posted, it was stated that in the 4K PS5 mode (i.e., the closest match to the highest settings on PC)

I also disagree with this. The PS5's 4K mode has significantly inferior shadow quality to the PC's max settings, so it's not a direct point of comparison. Nor is the PS5's standard mode, which has the same shadows but higher LOD than the PC. So a direct comparison is somewhere between those two modes.

maybe he can clarify if the resolution drops are tied to the framerate drops on PS5. As we know, some games drop resolution by area, not by frametime.

We know this not to be the case from the DF video which shows realtime resolution drops within a scene from a simple camera movement.

So we have (at least) three internal resolutions of the PS5, as follows:

3840x2160 which calculates as 8,294,400 pixels/frame
3328x1872 which calculates as 6,230,016 pixels/frame
2560x1440 which calculates as 3,686,400 pixels/frame

The 2060 has two resolutions, as follows:

50% of 3840x2160 which calculates as 4,147,200 pixels/frame
60% of 2560x1440 which calculates as 2,211,840 pixels/frame

This is the biggest issue I have with your analysis. Why are you making a comparison of the PS5 to a 2060 with DLSS, but then using the internal rendering resolution rather than the DLSS output resolution as your measuring stick for how much work is being done? You're either comparing to a 2060 with DLSS, in which case the comparison resolution is the DLSS output resolution, in this case 4K, or you're comparing to a 2060 without DLSS, in which case you can compare to the native/internal resolution (which, for the record, would be lower than what you state above). But no-one's arguing that the PS5 won't be faster in that latter case, so it's a moot point.

Since the 2060 doesn't have dynamic resolution scaling we have to compare the pixels over time. So when the 2060 is running at the higher resolution and drops to 41fps, over the course of 1 second the hardware has rendered 170,035,200 pixels.

By comparison the PS5 renders at 2560x1440 at 60fps (at its lowest), which is 221,184,000 pixels over a second. I know you'll suggest that the lowest framerate of the PS5 is actually 54fps (contrary to the videos), which calculates as 199,065,600.

Except the pixels drawn on screen by a 2060 with DLSS in 4K performance mode aren't 4,147,200, they're 8,294,400. So using your own math that's 340,070,400 pixels per second at the 41fps low. Well in excess of the PS5 numbers you calculated above.
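
A minimal sketch of that alternative counting, i.e. crediting the 2060 with the DLSS output resolution rather than its internal one:

# Same pixels-over-time comparison as earlier in the thread, but counting the
# 2060's DLSS *output* resolution (3840x2160), as argued in this post.
dlss_output = 3840 * 2160   # 8,294,400 pixels per output frame
print(dlss_output * 41)     # 340,070,400 pixels/s at the 41fps low
print(2560 * 1440 * 60)     # 221,184,000 pixels/s - PS5 at its lowest resolution, 60fps
print(3840 * 2160 * 60)     # 497,664,000 pixels/s - PS5 at full resolution, 60fps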

To be clear, I don't think this is a good way of directly comparing performance so I'm not drawing any specific conclusions from it other than to say the evidence doesn't strongly suggest the PS5 is outperforming a 2060 in DLSS 4K performance mode here, let alone a 2060S, and certainly not by a clear and substantial margin.

The worst possible scenario for the 2060 is the PS5 rendering at full resolution at 60fps (although framerate is likely a lot higher than this and is vsynced/capped)

Why is that the worst scenario for the 2060? There are clearly scenes where the 2060 at 4K DLSS(P) is running at over 60fps. It's likely these are the same scenes that the PS5 is also able to run at its highest resolution of 4K.

Is the image quality as good as PS5 when using DLSS Performance at 2160p (internal 50% of 4K), with lows of 41fps? The answer is no. The image quality of the Performance Mode isn't a match for native resolution, and on top of that it runs at a ~20fps deficit.

Is the image quality as good as PS5 when using the DLSS Quality mode at 1440p (internal 60% of 1440p) with a matched framerate of 60? No, as the internal resolution is significantly lower than the PS5's and even the scaled DLSS output resolution is much lower.

This is a different argument. Your previous argument was that the PS5 is outperforming the 2060(S) even with DLSS engaged, essentially saying that the PS5 can match whatever output resolution DLSS is offering and still deliver a higher framerate. The question of whether said DLSS mode is offering superior image quality to native is an altogether different one.
 
With respect, I don't think we can discuss this any more. I literally don't have the time to continually go around in circles explaining every point.
 
No, it isn't.

Wrong again.

Not sure if you're kidding here or... (don't make me say it)

100 frames per second times 10 milliseconds per frame = 1000 milliseconds = 1 second.
142 frames per second times 7 milliseconds per frame = 994 milliseconds ~ 1 second.

This is how we've always measured performance

No, it's the average framerate which mostly drives the gaming experience, and to a lesser degree the 99th percentile, neither of which is the number you provided. Besides, taking the lowest fps on PC when comparing hardware is disingenuous because it is basically a byproduct of worse optimization on PC rather than of hardware capabilities*.

*Especially on a game that uses dynamic resolution on console as a means to effectively maintain fps.
 
100fps is 10ms, but it's not contextually correct. You can't say that 10-3=7 and 7ms is ~143fps, therefore the GPU can produce way more than 41fps.

I'm seriously at a loss here.

Also, just relating to other posts: if DLSS Performance outputting 4K is the same as native 4K, then why not only ever use 4K Ultra Performance, because that also outputs at 4K?

In fact, why do Nvidia even offer options if the output is the same? /s
 
100fps is 10ms, but it's not contextually correct. You can't say that 10-3=7 and 7ms is ~143fps, therefore the GPU can produce way more than 41fps.

It is 100% correct. If a 2060 is taking 10ms (100fps) to render a frame with DLSS ON in a particular game/scene, it's doing the rasterisation part in 7ms, which would translate to 143fps. Not only that, with DLSS ON it's more than likely doing at least part of the post effects at full resolution (4K), so the actual framerate would be even higher if it rendered entirely at the lower resolution.
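
Put as a couple of lines of Python, assuming the flat ~3ms DLSS cost used throughout this thread and that removing it frees the frametime one-for-one:

# Frametime with DLSS on = rasterisation time + ~3ms DLSS pass (thread's figure).
def split_frametime(fps_with_dlss, dlss_ms=3.0):
    total_ms  = 1000.0 / fps_with_dlss
    raster_ms = total_ms - dlss_ms
    return total_ms, raster_ms, 1000.0 / raster_ms   # fps if the DLSS pass were free

print(split_frametime(100))   # (10.0, 7.0, ~142.9) -> ~143fps without DLSS
print(split_frametime(41))    # (24.4, 21.4, ~46.8) -> ~47fps without DLSS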

@ThePissartist Why is 10ms contextually incorrect? Legit curious.

It simply isn't.
 
It is 100% correct. If a 2060 is taking 10ms (100fps) to render a frame with DLSS ON in a particular game/scene, it's doing the rasterisation part in 7ms, which would translate to 143fps. Not only that, with DLSS ON it's more than likely doing at least part of the post effects at full resolution (4K), so the actual framerate would be even higher if it rendered entirely at the lower resolution.



It simply isn't.

@Scott_Arm does this make sense to you?
 
This is obviously oversimplified, but it's what he's getting at.

View attachment 5334

720p + 30% would be well under 1440p. I only used performance mode as an example because the math is simple: 1/4 resolution.

It may confuse him more tho. What we know here is the performance with DLSS ON, so 10ms would be with DLSS included, and 10ms - 3ms = 7ms, rather than the other way around (10ms + 3ms).

edit: You did add another data point in the middle tho :) for 77fps with DLSS on, it's 100fps without (+23fps). Haha
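
Checking that edit with the same ~3ms assumption:

# 77fps with DLSS on -> subtract the ~3ms DLSS pass -> roughly 100fps without it
print(1000.0 / (1000.0 / 77 - 3.0))   # ~100.1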
 