Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

OT: I don't think money solves all the problems when time seems to be the most critical one.
If you're willing to pay for more depth, then basically you're willing to pay for extra content, i.e. they'd have to release a second, more thorough version four days later.

But you're right, it's a time issue for them.
 

Will the S fix it?

I'm still not seeing how you're drawing the equivalence. The 2060 is running at max PC settings, which most closely equate to the PS5's quality mode (there is no exact settings match). In quality mode the PS5 runs significantly below 4K almost all of the time, and in the most taxing scenes it can drop as low as 1440p. You're comparing this to a DLSS output of fixed 4K?


In fact, if you want a direct comparison you need look no further than the same Nioh2 video you posted above, but look at the 1440p DLSS section. This represents equivalence with the lowest resolution the PS5 drops to, i.e. those scenes where it gets so close to 60fps that it has to drop all the way back to 1440p to stay above it. If the 2060 using DLSS were clearly slower, then we should see it dropping below 60fps, or at least getting very close to it, in this mode, shouldn't we? And yet what we see in the video, and what Alex explains, is that it's always comfortably above 60fps. This is far from conclusive, but based on the available evidence it's reasonable to assume that the 2060 using DLSS quality mode is performing at least as well as the PS5. And naturally a 2060S would perform noticeably better (to the tune of around 12%, according to TPU).

It's also worth noting that the performance uplift from DLSS isn't actually that large in this scenario: only 28% according to Alex, whereas more optimal scenarios can see uplifts of over 100%.
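For reference, here's a quick sketch of the internal render resolutions DLSS Quality mode implies, assuming the commonly cited ~2/3 per-axis scale factor (that factor is my assumption, not something stated in the video):

```python
# Internal render resolution implied by DLSS Quality mode, assuming the
# commonly cited ~2/3 per-axis scale factor (an assumption, not from the video).
QUALITY_SCALE = 2 / 3

def dlss_quality_internal(out_w, out_h, scale=QUALITY_SCALE):
    """Approximate internal resolution DLSS Quality renders before reconstruction."""
    return round(out_w * scale), round(out_h * scale)

for out_w, out_h in [(3840, 2160), (2560, 1440)]:
    in_w, in_h = dlss_quality_internal(out_w, out_h)
    share = (in_w * in_h) / (out_w * out_h)
    print(f"{out_w}x{out_h} output -> ~{in_w}x{in_h} internal ({share:.0%} of output pixels)")
```

Under that assumption, the 1440p DLSS output in the video is reconstructed from an internal resolution of roughly 1707x960.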
 

Nope, the PS5's quality mode runs higher settings than the PC. The PC's highest settings are the same as the PS5's 4K mode, and that is mostly at 4K. The 2060 isn't hitting a consistent 60fps even while rendering at only a quarter of the PS5's native resolution and upscaling to 4K.
 

This isn't true. At its highest settings, the PC's shadows are the closest match for the PS5's standard/quality mode. Meanwhile the PC's LOD (unchangeable) is closest to the PS5's 4K mode. So at best the PC settings sit between the PS5's 4K and standard modes, but I'd argue that shadow quality usually has a bigger hit on performance than LOD level.

Regardless, the 4K mode still tends to run around 1872p according to the VG Tech video I posted above (see the comments) while it can still drop as low as 1440p in the most demanding scenes.

So again, how is this comparable to fixed 4k DLSS output?

And as pointed out several times above, the native resolution of DLSS is entirely irrelevant for performance comparisons because the DLSS step itself has an additional performance cost.
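To put those dynamic-resolution figures in perspective, here's a small sketch of the pixel counts involved (the 1872p and 1440p figures are from the VG Tech comments cited above; the 16:9 widths are my assumption):

```python
# Pixel counts for native 4K versus the dynamic resolutions reported for the
# PS5's 4K mode (widths assume 16:9; the 1872p/1440p figures are per VG Tech).
resolutions = {
    "native 4K": (3840, 2160),
    "~1872p (typical)": (3328, 1872),
    "1440p (worst case)": (2560, 1440),
}

base = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.0%} of native 4K)")
```

In other words, even the typical 4K-mode resolution carries only about three quarters of the pixel load of a fixed 3840x2160 output, and the worst case less than half.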
 

The DLSS cost is 2 or 3 ms on a 2060 according to NV. There is surely some game-to-game variation, but it's unlikely to be substantial, I would think? Anyone with more accurate info, please jump in. Increasing to PS5 resolutions would incur a larger hit than that. DLSS looks better regardless, so I'm unsure why this title is even that interesting.
 
2 or 3 ms is fairly significant.

You need to render at roughly 73 fps (13.7 ms) to land at 16.7 ms after a 3 ms DLSS pass.
Or if you hit 16.7 ms of render time, you're now at 19.7 ms, essentially dipping to ~51 fps.

I do agree it would incur larger penalties to perform the feat natively, hence why DLSS is used in the first place.
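A minimal sketch of that frame-time arithmetic, assuming a flat 3 ms DLSS cost (the figure under discussion; the real cost varies with GPU and output resolution) and, for the last line, a naive assumption that render time scales linearly with pixel count:

```python
# Frame-time arithmetic for a fixed post-process cost such as DLSS.
# Assumptions: flat 3 ms overhead per frame (the figure discussed above)
# and, for the native-4K estimate, render time scaling linearly with pixels.
DLSS_COST_MS = 3.0

def fps_after_overhead(render_fps, overhead_ms=DLSS_COST_MS):
    """Effective frame rate once a fixed per-frame overhead is added."""
    return 1000.0 / (1000.0 / render_fps + overhead_ms)

def required_render_fps(target_fps, overhead_ms=DLSS_COST_MS):
    """Raw render rate needed (before the overhead) to still hit the target."""
    return 1000.0 / (1000.0 / target_fps - overhead_ms)

print(f"To hold 60 fps you need ~{required_render_fps(60):.0f} fps of raw rendering")
print(f"A 60 fps render drops to ~{fps_after_overhead(60):.0f} fps after DLSS")

# Naive comparison with rendering natively at 2.25x the pixels (1440p -> 4K):
render_ms_1440p = 1000.0 / 60 - DLSS_COST_MS  # ~13.7 ms budget left at 1440p
native_4k_ms = render_ms_1440p * (3840 * 2160) / (2560 * 1440)
print(f"Naive native-4K estimate: {native_4k_ms:.1f} ms (~{1000 / native_4k_ms:.0f} fps)")
```

Under those assumptions the 3 ms reconstruction pass is much cheaper than the extra ~17 ms a naive native-4K render would cost, which is the trade-off being discussed.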
 
In my opinion, comparing RTX GPUs to the PS5 without the use of RT is a moot point. Most AAA titles will have some sort of RT baked in at this point, which in turn means the 2060 will easily surpass any game on PS5 when used to its full potential, meaning with RT on and DLSS on.
 
Was comparing GTX GPUs to their AMD equivalents also pointless when not using lower level APIs? How about SAM?
 
If equal-or-better IQ is required, that invalidates a lot of DLSS testing, as more often than not the IQ is worse.
Nope:

Now, we may observe that one technique consistently produces either (a) subjectively better-looking results or (b) objectively closer-to-ground-truth results than another. I would say this is true of checkerboarding vs. native/TAA. Therefore, the consistent inferiority of checkerboarding must be taken into account in any comparisons. However, DLSS-vs-native/TAA is a different story. They produce different results. Both are approximations, neither is perfect. Whether one appears "better" than the other is subjective (and none of us have objective mean-square errors vs. ground-truth images) and depends on the observer. And so perhaps one can make the argument that it's reasonable to consider them iso-quality while focusing on performance differences.
 
So IQ in fact doesn't have to be the same for comparisons when it favors Nvidia?

This is the point I've been trying to make all along. Unfortunately some believe that it's only fair to compare the upscaling technologies when it's the PC's upscaling tech.

@pjbliverpool I'll prepare a response, but it'll likely be mid-week. If you're unable to see the PS5 genuinely performing better here, then I don't think we can have a rational discussion any longer.
 
In my opinion, comparing RTX GPUs to the PS5 without the use of RT is a moot point. Most AAA titles will have some sort of RT baked in at this point, which in turn means the 2060 will easily surpass any game on PS5 when used to its full potential, meaning with RT on and DLSS on.

Are you absolutely certain that AMD GPUs are currently making the best use of their RT capabilities, and that changes to how that RT functions won't bring improvements over time?
 
We already know from some devs that DXR is limiting what they can do on PC (on both AMD and Nvidia) versus what they're allowed to do on PS5 (maybe Xbox too, I don't remember). In that case it's, for now, an API "battle", but I guess DXR will evolve too, like DX did/does.
 
So IQ in fact doesn't have to be the same for comparisons when it favors Nvidia?

There are review sites that are disabling ray tracing because of the performance impact. So DLSS is just a different way to increase performance. IQ doesn't need to be equal; insisting on that is a cop-out used to disable features in the name of subjective parity.
 