Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Seems like the DLSS cost in frame time will be significantly lowered because of the new concurrency features. This should be good news for high-framerate situations, where the relative cost of DLSS becomes more significant.
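Quick back-of-the-envelope on why that matters more at high framerates (the 1.5 ms DLSS cost below is a made-up placeholder, not a measured number):

Code:
# A fixed post-process cost eats a larger share of the frame budget at high FPS.
# If the new concurrency lets it overlap with other work, the serial penalty shrinks.
for base_fps in (60, 144, 240):
    frame_ms = 1000.0 / base_fps      # render time without DLSS
    dlss_ms = 1.5                     # assumed fixed, fully serial DLSS cost
    total_ms = frame_ms + dlss_ms
    print(f"{base_fps} fps -> {1000.0 / total_ms:.0f} fps "
          f"({dlss_ms / total_ms:.0%} of the frame spent in DLSS)")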

I'm pretty clueless about all things rendering, but I'm surprised DLSS processing can be started before the frame is fully rendered.

[Attached image: NVIDIA GeForce RTX 30 Tech Session slide]

Aren't the tensor cores used for RT denoising? If so then this could be referring to that rather than DLSS. Note it happens at exactly the same time as the RT work (with a small lag at the start).
 
They can be used for denoising, but I don't think there's a single implementation doing that (outside possibly NVIDIA demos)
 
So it seems Ultra Performance mode can be applied at any resolution with "near native" quality, according to Nvidia. That means you can run the game internally at 720p and upscale to 4K. Insane!
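The arithmetic behind that (Ultra Performance is a 1/3-per-axis, i.e. 9x, upscale):

Code:
# Internal render resolution for DLSS Ultra Performance at a 4K output.
out_w, out_h = 3840, 2160
in_w, in_h = out_w // 3, out_h // 3   # 1/3 per axis
ratio = (out_w * out_h) / (in_w * in_h)
print(f"{in_w}x{in_h} -> {out_w}x{out_h} ({ratio:.0f}x the pixels)")
# prints: 1280x720 -> 3840x2160 (9x the pixels)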
 
Apex Legends will support DLSS. Fuck yes. I can get rid of their shitty TAA implementation and run with DLSS instead.

Edit: NO! Just realized the bar doesn't show a DLSS gain for Apex. Performance is just there as an indicator of 8K performance. God dammit. Guess I'll be trying out DSR, but I heard that doesn't work well and is blurry too.
 
Some games may not have the patch/driver yet, so they don't display the DLSS gain. My guess is they should be patched by the 17th.
 
I use DSR all the time and have never had a problem with it. Works perfectly IMO, and you can adjust the level of blur applied in the control panel.

I was actually thinking that NV will need to add additional DSR levels for 8K, since it currently only goes up to 2x on each axis.
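For reference, a sketch of what the current control-panel factors work out to on a 4K panel (the driver snaps these to standard resolutions, so the rounded numbers here are approximate):

Code:
import math

# DSR factors are total-pixel multipliers; the per-axis scale is the square root.
panel_w, panel_h = 3840, 2160
for factor in (1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00):
    axis = math.sqrt(factor)
    print(f"{factor:.2f}x -> {round(panel_w * axis)}x{round(panel_h * axis)} "
          f"({axis:.2f}x per axis)")
# 4.00x (2x per axis) is the cap, i.e. 7680x4320 from a 4K panel.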
 
I'm afraid that including a temporal component in DL'SS' will result in a less sharp image in motion. And that's something we all hate about TAA.
 
Even with the adjustable blur factor I stick to 2x with no blur for best results. It would be sweet if they enable 8K DSR + DLSS. Render at 1440p, DLSS to 8K then downsample to 4K.
 
It's kinda silly that it's necessary to play around with DSR to enable higher quality. I wish they'd just unlock higher base resolutions. They already go up to 93.3% in Control, which suggests the 67% max is just an arbitrary limit that could easily be removed.

I'll be pretty disappointed if we're stuck with 67% max on Ampere. :cry:
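For concreteness, here's what those render scales work out to at a 4K output (assuming the 67% and 93.3% figures are per-axis fractions of output resolution):

Code:
out_w, out_h = 3840, 2160
for scale in (0.667, 0.933):
    print(f"{scale:.1%} -> {round(out_w * scale)}x{round(out_h * scale)}")
# 66.7% -> 2561x1441 (~1440p), 93.3% -> 3583x2015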
 
I haven't played around with it yet, but I'd read that you need 2x scaling (4K -> 1080p) to get really clean results. If you do 1.33x scaling (1440p -> 1080p), for example, you have to play around with the blur coefficient and the results aren't necessarily great.
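A toy illustration of why the integer ratio is the clean case: at exactly 2x per axis every output pixel is the average of one 2x2 input block, while at fractional ratios output pixels straddle input pixels and need extra filtering (DSR's smoothness blur) to hide the resampling:

Code:
import numpy as np

hi = np.random.rand(2160, 3840)   # toy stand-in for a supersampled frame

# Exact 2x box downsample: average non-overlapping 2x2 blocks.
lo = hi.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))
print(lo.shape)                    # (1080, 1920)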
 
They already go up to 93.3% in Control, which suggests the 67% max is just an arbitrary limit that could easily be removed.
Isn't Control using the version of DLSS that ran as compute on the SMs and was the first test of the final algorithm that went into DLSS 2.0?
 
I haven't played around with it yet, but I'd read that you need 2x scaling (4K -> 1080p) to get really clean results. If you do 1.33x scaling (1440p -> 1080p), for example, you have to play around with the blur coefficient and the results aren't necessarily great.

I guess others may be more sensitive to it than I am, but I have a 1080p panel and run DSR at 1440p with the default 33% blur, and it looks great to me: way better than 1080p and not noticeably more blurred to my eyes.
 