Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

I'd estimate it would sooner be a problem with motion vectors or surface IDs (assuming they are using those). If DLSS is just an interpolator, then even when trained on different content it should look pretty good.
 
Remnant: From the Ashes to support DLSS
August 19, 2019
Remnant: From the Ashes was confirmed a while ago to feature NVIDIA DLSS technology on PC.
Is that still the case and if so, will it be available at launch? What was your experience with DLSS and what kind of performance improvements can players expect?
We gave NVIDIA sample data for Remnant to run through their DLSS training network, and the
results should be automatically downloaded the first time you launch the game. We don’t have final
numbers for the performance improvements yet, though.

Did you look into adding real-time ray tracing or would that be something for your next game?
Ray tracing support was added to Unreal pretty late in our development, so we experimented with it but didn’t have time to properly take advantage of it. We do plan to look into it for our next game, though.
https://wccftech.com/remnant-from-t...-love-crossplay-targeting-4k-on-ps4-pro-xb1x/
 
When trying to determine if DLSS is better than standard upscaling.
DLSS is being compared to the absolute baseline you can do for upscaling. Why would fairness matter?

And I would debate whether the upscaling actually uses less power (at least by such a large margin that fairness becomes a discussion). The upscale uses a standard upscaling algorithm, but AA is still processed at that resolution and therefore has a performance impact.

DLSS is trained from aliased images and then constructs the AA and the upsample. As I understand it, it uses a modified render path to work: one that skips the AA step to produce raw aliased frames for DLSS to work from.
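
To make that concrete, here is a minimal sketch of such a modified render path. This is my own illustrative C++, with placeholder function names, stubs, and resolutions, not NVIDIA's actual API:

#include <cstdint>

struct Frame { uint32_t width = 0, height = 0; };  // pixel data omitted

// Stubs standing in for real engine passes.
Frame renderScene(uint32_t w, uint32_t h)                   { return {w, h}; }
Frame temporalAA(const Frame& f)                            { return f; }
Frame standardUpscale(const Frame&, uint32_t w, uint32_t h) { return {w, h}; }
Frame dlssUpscale(const Frame&, uint32_t w, uint32_t h)     { return {w, h}; }

Frame renderFrame(bool dlssEnabled) {
    // The scene is rendered at the lower internal resolution either way.
    Frame raw = renderScene(2560, 1440);

    if (dlssEnabled) {
        // DLSS path: the engine's own AA pass is skipped. DLSS was trained
        // on aliased frames, so it takes the raw image and performs the
        // anti-aliasing and the upsample to 4K in a single step.
        return dlssUpscale(raw, 3840, 2160);
    }

    // Standard path: AA still runs at the internal resolution (and still
    // costs GPU time there), followed by a cheap spatial upscale to 4K.
    return standardUpscale(temporalAA(raw), 3840, 2160);
}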
 
DLSS is trained from aliased images and then constructs the AA and the upsample. As I understand it, it uses a modified render path: one that skips the AA step to produce raw aliased frames for DLSS to work from.

It's possible that DLSS is being implemented differently here, but in all previous instances it incurred a non-trivial performance cost, making it substantially slower than standard upscaling from the same resolution. As for why it matters, what use is comparing DLSS IQ to something that's outputting 30% higher fps?
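
For concreteness, here is the frame-time arithmetic behind that kind of gap. The numbers are illustrative assumptions of mine, not measured figures:

#include <cstdio>

int main() {
    const double renderMs  = 12.8; // hypothetical 1440p scene render
    const double upscaleMs = 0.1;  // standard spatial upscale is near-free
    const double dlssMs    = 3.9;  // assumed fixed cost of the DLSS pass

    const double upscaleFps = 1000.0 / (renderMs + upscaleMs); // ~77.5 fps
    const double dlssFps    = 1000.0 / (renderMs + dlssMs);    // ~59.9 fps

    // With these assumed numbers the plain upscale outputs ~30% more frames
    // from the same internal resolution, which is why an image-quality-only
    // comparison is lopsided.
    std::printf("upscale: %.1f fps, DLSS: %.1f fps, ratio: %.2fx\n",
                upscaleFps, dlssFps, upscaleFps / dlssFps);
    return 0;
}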
 
As for why it matters, what use is comparing DLSS IQ to something that's outputting 30% higher fps?
I just assumed that is up to the users to decide: those who want higher resolution and are willing to trade off frame rate for it versus those who are not willing to trade frame rate for better graphical quality.

The core of the debate around DLSS is how close it can get to its target with as little performance impact as possible. Users will naturally choose whatever trade-off they feel benefits them.

It's been this way for moving above 1080p in resolution: you're getting minor improvements at a cost in frame rate. Some people can't see the difference even if it's measurably there. PC users get to choose the experience they want.
 
As for why it matters, what use is comparing DLSS IQ to something that's outputting 30% higher fps?
In the case of Control, DLSS runs at the same speed as the regular upscaling solution.

Performance-wise, DLSS performs almost identically to the resolution it is upscaled from. For example, 4K with DLSS set to 1440p runs almost identically to a native 1440p render. The same also applies to other resolutions. For the most part, DLSS-enhanced images appear clearer and smoother than normal resolution upscales, though the aforementioned issues/artifacts are worth remembering.
https://www.overclock3d.net/reviews/gpu_displays/control_rtx_raytracing_pc_analysis/7
 
Has anyone compared it to render scaling with the new sharpen filter in Control? So far that looks like a better solution than DLSS in some other titles.
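
For reference, "render scale plus sharpen" boils down to something like the unsharp-mask kernel below. This is a generic sketch of my own, not NVIDIA's actual Freestyle/driver filter:

#include <algorithm>
#include <cstddef>
#include <vector>

// Generic unsharp-mask sharpening over a single-channel image, meant to run
// after a standard upscale; strength ~0.5 is a typical setting.
std::vector<float> sharpen(const std::vector<float>& img,
                           std::size_t w, std::size_t h, float strength) {
    std::vector<float> out(img);
    for (std::size_t y = 1; y + 1 < h; ++y) {
        for (std::size_t x = 1; x + 1 < w; ++x) {
            const float center = img[y * w + x];
            const float cross  = img[(y - 1) * w + x] + img[(y + 1) * w + x] +
                                 img[y * w + (x - 1)] + img[y * w + (x + 1)];
            // Boost the difference between the pixel and its local average
            // to restore edge contrast lost in the upscale.
            const float highPass = center - cross / 4.0f;
            out[y * w + x] = std::clamp(center + strength * highPass, 0.0f, 1.0f);
        }
    }
    return out;
}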
 
Looking forward to the Digital Foundry performance review. Dictator answered that DLSS warrants analysis in a separate review from the core RTX review:

Separate video I think at this point - I will touch on it briefly though. We got this game good and early, but Gamescom meant we were not working on it at all... until this week. That means we cannot go 100% in without taking too much time, and it also means prioritising what is important for all versions (DX11 vs. DX12 perf is decidedly less important than the elephants in the room).

As I see it though, DLSS is a more interesting story in this title than DX11 vs. DX12. DLSS looks good in this game.
https://www.resetera.com/threads/control-pc-performance-thread.137471/page-12#post-23996786
 
Haven't seen it in motion, but in stills it's still blurry
Fortunately we don't play games in stills. I'd be curious to know if this application of DLSS is any different from the existing applications in other DLSS games, i.e. whether anything is different in the implementation on the developers' or Nvidia's end.
 
Yeah it looks a lot better than native res in this game and is pretty cheap - definitely something we want to look at further in another video. :D
IIRC Quantum Break had this issue as well; their temporal solution was very blurry.
 
We have very little information about how devs can implement it, apart from the fact that they can now select the native resolution.

Can they use it without motion vectors? Without surface IDs? Those will add a few bytes per pixel to the g-buffer and, with naive implementations, perhaps a significant performance penalty. (Though in principle it shouldn't be a big deal.)
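
As a back-of-the-envelope illustration of that cost (the channel formats are assumptions of mine, not from any shipping engine):

#include <cstdint>
#include <cstdio>

int main() {
    const uint32_t w = 2560, h = 1440;  // internal resolution
    const uint32_t motionVecBytes = 4;  // e.g. RG16F motion vectors (assumed)
    const uint32_t surfaceIdBytes = 2;  // e.g. R16_UINT surface IDs (assumed)

    const uint64_t pixels = uint64_t(w) * h;
    const uint64_t extra  = pixels * (motionVecBytes + surfaceIdBytes);

    // ~21 MB of extra g-buffer data per frame at 1440p with these formats:
    // cheap if written alongside existing passes, a real cost mainly if a
    // naive implementation adds a separate full-screen pass for it.
    std::printf("extra g-buffer data: %.1f MB per frame\n",
                extra / (1024.0 * 1024.0));
    return 0;
}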
 
IIRC Quantum Break had this issue as well; their temporal solution was very blurry.

Yeah, I think this is an important qualifier - "native" in Control is not quite native. I haven't seen any mention of an option to disable the upscaling, like Quantum Break had. Granted, there's probably no system alive that could run it decently regardless of RTX, but it would be nice to see the difference.

So I think this has to be taken into account, at least to the extent that Control is being used as an example of DLSS improving. It may very well be improving, but this may be a one-off case: Control is inherently blurry due to its reconstruction technique, so DLSS fits perfectly here because that blur masks the artifacts DLSS can bring. This is not a knock against DLSS - if it works here, it works - but it may not be an indicator that the technique has significantly improved going forward for other engines.
 
Haven't seen it in motion, but in stills it's still blurry
Are you certain? DLSS upscaling from 1080p to 4K looks pretty sharp in this game.

IIRC Quantum Break had this issue as well; their temporal solution was very blurry.

This game, unlike Quantum Break, is not upscaling from a lower internal resolution by default, so the internal rendering res is the res it says it is.
 