Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Any details on what, if any, developers need to do to support DLSS with DRS? Is it just one more input to it, or does the Nvidia DLSS library handle all that with nothing further required from devs?
 
Marvel's Avengers seems to be the first game with DLSS to implement the new dynamic resolution scaling capability of DLSS 2.1:
https://steamcommunity.com/app/997070/discussions/0/3105763714507091254/

I'd love to understand how this works in more detail, i.e. is the output resolution fixed and the DLSS quality level variable, or is the DLSS quality level fixed and the output resolution variable? In either case it should result in some massive performance increases.
 
I'd love to understand how this works in more detail, i.e. is the output resolution fixed and the DLSS quality level variable, or is the DLSS quality level fixed and the output resolution variable? In either case it should result in some massive performance increases.
Output resolution is fixed and input varies. Mip level should vary accordingly.

Dynamic resolution is covered in the DLSS programming guide found on GitHub. You need access to view it, but that's easy to get (link your Epic account to your GitHub account, IIRC):

https://github.com/NvRTX/UnrealEngi...ty/NGX/Doc/DLSS_Programming_Guide_Release.pdf
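As an aside, the mip adjustment follows directly from the render/output ratio: the usual recommendation is a texture LOD bias of log2(renderWidth / displayWidth), which goes negative as the input resolution drops. A quick illustrative sketch of the arithmetic (not taken from the guide):

```python
import math

def dlss_mip_bias(render_width: int, display_width: int) -> float:
    """Texture LOD bias so textures are sampled at the detail level of
    the output resolution rather than the lower render resolution.
    Goes more negative (sharper mips) as the input resolution drops."""
    return math.log2(render_width / display_width)

# With dynamic resolution the render width changes per frame, so the
# bias has to be recomputed whenever the input resolution changes.
bias = dlss_mip_bias(1920, 3840)  # 1080p input for a 4K output -> -1.0
```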
 
Output resolution is fixed and input varies. Mip level should vary accordingly.

Dynamic resolution is covered in the DLSS programming guide found on GitHub. You need access to view it, but that's easy to get (link your Epic account to your GitHub account, IIRC):

https://github.com/NvRTX/UnrealEngi...ty/NGX/Doc/DLSS_Programming_Guide_Release.pdf

Cheers, so performance gains should be in line with DRS (probably a bit less due to the DLSS overhead), but output quality should be more consistently better. Arguably it's giving most of the benefit of DRS with virtually none of the cost.

I can't seem to access the guide myself despite having a GitHub account. Is there any word in there on how it behaves when running at native resolution is possible, or even close to native? I.e. does DLSS switch off at that point, or does it keep applying at least the AA component?
 
Is there any word in there on how it behaves when running at native resolution is possible, or even close to native? I.e. does DLSS switch off at that point, or does it keep applying at least the AA component?
No, not really.

An excerpt from the Dynamic resolution chapter:

To use DLSS with dynamic resolution, initialize NGX and DLSS as detailed in section 5.3. During the DLSS Optimal Settings calls for each DLSS mode and display resolution, the DLSS library returns the “optimal” render resolution as pOutRenderOptimalWidth and pOutRenderOptimalHeight. Those values must then be passed exactly as given to the next NGX_API_CREATE_DLSS_EXT() call.

DLSS Optimal Settings also returns four additional parameters that specify the permittable rendering resolution range that can be used during the DLSS Evaluate call. The pOutRenderMaxWidth, pOutRenderMaxHeight and pOutRenderMinWidth, pOutRenderMinHeight values returned are inclusive: passing values between, as well as exactly the Min or exactly the Max dimensions, is allowed.

I don't see anything that would reveal what this "permittable rendering resolution range" actually is. Also, whether passing exactly the Max dimensions means "rounding up is OK" or "go nuts", I'm not sure.

I'm not confident I understand what's being said in the latter paragraph in the first place actually... :LOL:

edit: yes, I misunderstood. I thought "passing" meant exceeding, but what's being said is that Min, Max, and any value in between are okay. Exceeding Min or Max isn't.
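So, putting the excerpt's flow together as I read it (the pOutRenderMin/Max names come from the quoted guide; the per-frame clamping is my own assumption about how an engine would use them), the DRS heuristic picks a target each frame and it just has to stay inside the inclusive range:

```python
def clamp_render_resolution(desired_w, desired_h,
                            min_w, min_h, max_w, max_h):
    """Clamp a per-frame DRS target into the inclusive range reported
    by the optimal-settings query (the pOutRenderMin/Max values in the
    excerpt). Min, Max, and anything in between are valid; exceeding
    the range in either direction is not."""
    w = max(min_w, min(desired_w, max_w))
    h = max(min_h, min(desired_h, max_h))
    return w, h

# e.g. for a hypothetical range of 1280x720 .. 1920x1080:
clamp_render_resolution(1700, 956, 1280, 720, 1920, 1080)  # unchanged
clamp_render_resolution(1100, 620, 1280, 720, 1920, 1080)  # -> (1280, 720)
```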
 
1356944963834101760

He did a quick test of the upcoming temporal upsampling version in UE4.
Looks quite nice already.
 
I am not a fan of dynamic resolution...

It's mostly a PC gamer thing, I think; I'd rather play at whatever resolution I set without it going up and down. Apex Legends has a not-so-good implementation and there probably exist better ones, but in general I think it might be worth having some kind of dynamic resolution scaling going on if it improves performance.
I rarely need it, though (2080 Ti).
 
Dynamic res on PC would make sense if it kicked in only when a game drops below some user-defined lowest acceptable fps, instead of working like a console setup, where it targets some fps and kicks in when the game goes below it.
 
Dynamic res on PC would make sense if it kicked in only when a game drops below some user-defined lowest acceptable fps, instead of working like a console setup, where it targets some fps and kicks in when the game goes below it.

Agree, that would be much more interesting. Anyway, the antialiasing discussion above somehow reminds me of Quincunx. I think it was possible to apply it to just about any game. It would make the image look softer, but much better overall (it removed quite a bit of aliasing), at virtually no performance cost on a Ti4200.
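The threshold behaviour described above could be as simple as a controller that only backs the resolution scale off when a user-chosen frame budget is missed, and creeps back toward native when there's headroom. A toy sketch of the idea (step sizes, thresholds, and the function itself are made up for illustration):

```python
def adjust_scale(scale, frame_time_ms, budget_ms,
                 step=0.05, min_scale=0.6, max_scale=1.0):
    """Drop the resolution scale only when a frame misses the
    user-defined budget; otherwise drift back toward native.
    Unlike a console-style controller, nothing happens while the
    game comfortably makes its budget at full resolution."""
    if frame_time_ms > budget_ms:
        scale -= step                      # missed budget: back off
    elif frame_time_ms < 0.9 * budget_ms:
        scale += step                      # headroom: creep back up
    return max(min_scale, min(scale, max_scale))
```

A real implementation would filter frame times over several frames rather than reacting to a single spike, but the shape of the logic is the same.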
 
It's mostly a PC gamer thing, I think; I'd rather play at whatever resolution I set without it going up and down. Apex Legends has a not-so-good implementation and there probably exist better ones, but in general I think it might be worth having some kind of dynamic resolution scaling going on if it improves performance.
I rarely need it, though (2080 Ti).

Agreed, if I wanted low/medium settings with resolution hacks I would buy a toybox.
 
1356944963834101760

He did a quick test of the upcoming temporal upsampling version in UE4.
Looks quite nice already.

When they say 60-80% resolution, I assume they mean total pixels rather than 60-80% on each axis? In any case the results look really good for a basic upscale. The most noticeable difference for me is the blurriness of the long grass in the foreground of the image. DLSS is still clearly better technically, but I can imagine that this will be more than good enough for most people, especially at TV viewing distances. Then again, you could arguably say the same about 1440p vs 4K.
 
When they say 60-80% resolution, I assume they mean total pixels rather than 60-80% on each axis? In any case the results look really good for a basic upscale. The most noticeable difference for me is the blurriness of the long grass in the foreground of the image. DLSS is still clearly better technically, but I can imagine that this will be more than good enough for most people, especially at TV viewing distances. Then again, you could arguably say the same about 1440p vs 4K.
Hard to know what's meant by that percentage. If it's going by how UE4 does its metrics, the percentage is per axis.
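The distinction matters a lot for the performance math: a per-axis percentage squares when converted to pixel counts, so "60-80%" per axis means only 36-64% of the output pixels are actually shaded. A quick check:

```python
def pixel_fraction(axis_fraction: float) -> float:
    """Fraction of output pixels actually rendered when both axes are
    scaled by axis_fraction (per-axis percentages square)."""
    return axis_fraction ** 2

# UE4-style per-axis screen percentages -> shaded-pixel fractions,
# roughly: 0.6 -> 0.36, 0.7 -> 0.49, 0.8 -> 0.64
fractions = {p: pixel_fraction(p) for p in (0.6, 0.7, 0.8)}
```

For comparison, DLSS 2's Quality mode renders at roughly 66.7% per axis, i.e. about 44% of the output pixels.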
 
When they say 60-80% resolution, I assume they mean total pixels rather than 60-80% on each axis? In any case the results look really good for a basic upscale. The most noticeable difference for me is the blurriness of the long grass in the foreground of the image. DLSS is still clearly better technically, but I can imagine that this will be more than good enough for most people, especially at TV viewing distances. Then again, you could arguably say the same about 1440p vs 4K.
But there's no temporal shimmer with DLSS, IIRC.
 
But there's no temporal shimmer with DLSS, IIRC.

Good point. *must remember to stop comparing image quality from stills*

Hard to know what's meant by that percentage. If it's going by how UE4 does its metrics, the percentage is per axis.

Pretty impressive if so, given this would equate to a larger source-to-output ratio than DLSS Quality mode.
 