Nvidia DLSS 3 antialiasing discussion

Funnily enough, all the games that support DLSS 3 so far have it under the name "Frame Generation". Yes! It's not called DLSS 3 in game.
DLSS 3 is a bundle of technologies coming together, including Reflex. It's a bit odd to call it DLSS 3 like it's one single technology, when all the independent techs can be enabled or disabled individually (especially with Frame Generation working at native resolution as well lol!).
 
Possible rumor ...

Edit:
If true, it would validate the statement made by the NVIDIA engineer that FG could work on Ampere and Turing.
NVIDIA Engineer Says DLSS 3 on Older RTX GPUs Could Theoretically Happen, Teases RTX I/O News

In the same Twitter thread, he then explained why the technology will be exclusive to the upcoming NVIDIA GeForce RTX 4000 series.

DLSS 3 relies on the optical flow accelerator, which has been significantly improved in Ada over Ampere - it’s both faster and higher quality.

The OFA has existed in GPUs since Turing. However, it is significantly faster and higher quality in Ada, and we rely on it for DLSS3. [RTX 2000 and 3000] customers would feel that DLSS 3 is laggy, has bad image quality, and doesn’t boost FPS.


That said, Catanzaro left a door open to NVIDIA DLSS 3 potentially becoming compatible with GeForce RTX 2000 and 3000 series in the future, although he stressed it wouldn't yield the same benefits seen with the new graphics cards. As a reminder, DLSS 3 games will still provide DLSS 2 + Reflex support for GeForce RTX 2000 and 3000 owners.
 
DLSS 3 is a bundle of technologies coming together, including Reflex. It's a bit odd to call it DLSS 3 like it's one single technology, when all the independent techs can be enabled or disabled individually (especially with Frame Generation working at native resolution as well lol!).

Good for marketing and the average consumer though, because publishers can just say the game supports DLSS 3, and we know it supports all 3 techs.

Also it's probably easier for NVIDIA to enforce the inclusion of all 3 techs this way (which is also good for consumers).

Some great questions around how it all works in this thread.

The way I hope to see it working on a G-Sync display: you just turn vsync on globally in the driver, then set a per-game frame-rate limit as needed, low enough for the CPU to deliver good frame pacing. That limit should then be doubled (or close to it) by Frame Generation, up to the vsync limit. Anything below the vsync limit would be handled by G-Sync.

So on a 120 Hz screen, as long as your CPU can handle 60 FPS, you can cap the game at 60 and get 120 FPS output.
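
Not an official NVIDIA tool, just a quick Python sketch of that cap arithmetic, with a made-up helper name, assuming Frame Generation roughly doubles the base frame rate:

Code:
# Hypothetical helper, not an NVIDIA API: pick a per-game frame-rate cap so
# that frame generation's roughly-doubled output lands at or just under the
# display's refresh rate, leaving G-Sync to handle any dips below it.

def base_frame_cap(refresh_hz: int, headroom_fps: int = 0) -> int:
    """Base cap such that cap * 2 <= refresh_hz, minus optional headroom."""
    return refresh_hz // 2 - headroom_fps

print(base_frame_cap(120))  # 60 -> ~120 FPS output on a 120 Hz screen
print(base_frame_cap(110))  # 55 -> ~110 FPS output on a 110 Hz screen

The headroom_fps parameter is only there for the open question in the next post: whether capping slightly under half the refresh rate avoids the vsync latency hit.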
 
The vsync issue is concerning. I understand you can just ramp up settings to stay under it, but that won't give you perfectly consistent frame times. I wonder if you can also set a frame-rate limit below the vsync limit (say 55 FPS on a 110 Hz G-Sync screen for a solid 110 FPS), or whether that would still incur the latency hit.
 
According to HW Unboxed, DLSS 3 is only really good for going from 120 FPS to 240 FPS on higher-refresh monitors. The much-increased latency and obvious artefacts at 120 and especially 60 FPS output make DLSS 3 unusable at the lower base framerates of 30 and 60 FPS, they say.

 
According to HW Unboxed, DLSS 3 is only really good for going from 120 FPS to 240 FPS on higher-refresh monitors. The much-increased latency and obvious artefacts at 120 and especially 60 FPS output make DLSS 3 unusable at the lower base framerates of 30 and 60 FPS, they say.

Yeah, your description there does not really jibe at all with what I would say.
I find that a rather extreme appraisal.
For example, the difference in input latency I measured between DLSS 2 and DLSS 3 in CP 2077 (maxed out, 4K, DLSS Performance mode), whether above or below vsync, is about 10 milliseconds.
For a bit of perspective, a 10 millisecond latency difference is one quarter of the latency difference between Doom 2016 on console and Call of Duty on console, both of which are 60 FPS games. Implying that DLSS 3 adds unplayable amounts of latency really requires some serious consideration of, and comparison with, the latency in games we already play: unless you are hitting the vsync limit, DLSS 3 is not adding an intense amount of visible latency. It is really in placebo territory.
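
Laying the arithmetic out in one place, a quick back-of-the-envelope sketch in Python (the 10 ms figure is the measurement above; the ~40 ms console gap simply follows from 10 ms being a quarter of that difference):

Code:
# The 10 ms delta is the CP 2077 measurement above; the ~40 ms console gap
# is implied by that delta being 1/4 of it. Both console games run at 60 FPS,
# where a single frame takes ~16.7 ms.

dlss3_delta_ms = 10.0                    # measured DLSS 2 -> DLSS 3 difference
doom_vs_cod_gap_ms = 4 * dlss3_delta_ms  # implied Doom 2016 vs. CoD gap on console
frame_time_60fps_ms = 1000.0 / 60.0      # ~16.7 ms per frame at 60 FPS

print(f"{dlss3_delta_ms:.0f} ms is ~{dlss3_delta_ms / frame_time_60fps_ms:.2f} frames at 60 FPS")
print(f"Implied console latency gap: {doom_vs_cod_gap_ms:.0f} ms")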
 
DLSS 3 is being praised as everything from the next best thing since sliced bread, to a nice middle-ground option to have, down to something that introduces even more artifact potential and lag.

Coming from the middle ground here, I wonder whether people who really do like DLSS 3 would actually pay for it if it were an optional feature on RTX 4000, for example a locked feature in the driver where you pay, say, 5 bucks a month for access (a GFE account with a payment option). Or, on the other hand, whether people who really don't like it would agree to permanently give up the option. Both under the premise of the card being a bit cheaper, say, $200 less.
 
Coming from the middle ground here, I wonder whether people who really do like DLSS 3 would actually pay for it if it were an optional feature on RTX 4000, for example a locked feature in the driver where you pay, say, 5 bucks a month for access (a GFE account with a payment option). Or, on the other hand, whether people who really don't like it would agree to permanently give up the option. Both under the premise of the card being a bit cheaper, say, $200 less.
Do not give Jensen any ideas. The Bean Counters are reading this as we speak.
 