Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

I don't have a 1440p monitor to test, but the internal resolution of 1440p performance mode and 1080p quality mode is 1280x720 in both cases. Does that mean they perform equally? Like, if you were to take the same system and benchmark both in the same game, would you get the same performance? I know the DLSS pass probably costs a bit more at the 1440p output, but I'm wondering how much. It should be hidden a bit if games start supporting async DLSS. I'm also a bit curious, from an image quality perspective, whether 1440p DLSS performance looks better or worse than 1080p DLSS quality. They have the same internal resolution to work from, but does DLSS start to break down the further you try to reconstruct?

I'm thinking of getting a 1440p monitor in the future, but there will be some games where I just want to go performance mode, image quality be damned, because I want frames. Just trying to decide whether it's better to stay at 1080p and get a 240 Hz monitor or go 1440p and get a 144 or a 240. 1440p240 is looking like it's going to be pretty much a pipe dream with the exception of some really old games. Even 1440p144 is looking like a stretch for anything next-gen unless I'm playing on low settings.
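For reference, those internal resolutions fall out of the commonly cited per-axis DLSS scale factors (roughly 0.667 for Quality, 0.58 for Balanced, 0.5 for Performance, 0.333 for Ultra Performance; exact values can vary by game and SDK version). A quick Python sketch, assuming those ratios:

[CODE]
# Approximate internal render resolution per DLSS mode.
# The scale factors are the commonly cited per-axis ratios,
# not taken from any official SDK documentation.
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(1920, 1080, "Quality"))      # (1280, 720)
print(internal_res(2560, 1440, "Performance"))  # (1280, 720)
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
[/CODE]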
 

TVs can do some 120 Hz frame interpolation, which is usually hit or miss, but if a game incorporates that at the render level (I believe a Star Wars game, The Force Unleashed, tested it on PS3 back in the day), we could see AI frame-rate upsampling as well within a few years. They would only need to clean up the in-between artifacts, which I believe the systems could be trained for.
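To illustrate what cleaning those in-between artifacts would involve: the naive interpolated frame is just a blend of its two neighbours, and the ghosting that blend produces on anything that moves is exactly the kind of artifact a model would have to be trained to fix. A toy sketch (purely illustrative, not how TV interpolators or any hypothetical DLSS-style version actually work):

[CODE]
import numpy as np

def naive_inbetween(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Average two (H, W, 3) uint8 frames to fake a middle frame.

    Moving objects end up semi-transparent in two places (ghosting),
    which is the artifact a learned interpolator would need to clean up.
    """
    mid = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return mid.astype(np.uint8)
[/CODE]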

I recently started gaming and viewing UHD Blu-rays on a 120-inch projector while in lockdown, and I am pretty embarrassed that it turns out the projector only has a native 1080p chip inside. I thought it was 4K because the HDMI connection pop-up said so... At 4 meters I should have seen that it was not 4K, but the image quality was so good, and the images perfectly sharp without pixelation, that I am starting to wonder if (native) resolution above 1080p is severely overrated.

Movies are essentially supersampled, so those do not count, but for games like Call of Duty: Infinite Warfare, which I played, the image was perfectly clear even though it does not render anywhere near 4K.

I fully applaud DLSS and other upsampling techniques, as it would be the biggest waste to render anything at native 4K, IMO.
 
A challenge with frame-rate upsampling would be how to keep input lag and user "connectedness" (not sure how else to phrase it) in check. I know you mention that it was done once before, but the expectations around those two aspects are different for PC gaming than for consoles, due to a variety of factors (for the console user base, the shift to >30 fps and wide awareness of TV input lag and the settings that affect it are only somewhat recent).

Also, interestingly (and somewhat related to what's going on in this discussion), if/when something like that comes to market, it's going to be even trickier to compare online, as you're adding another vector, in this case user input, to an information medium that relies primarily on static comparisons.
 
[QUOTE]I don't have a 1440p monitor to test, but the internal resolution of 1440p performance mode and 1080p quality mode is 1280x720 in both cases. Does that mean they perform equally? [...] I'm also a bit curious, from an image quality perspective, whether 1440p DLSS performance looks better or worse than 1080p DLSS quality.[/QUOTE]
I did a quick test.

1080p + Quality -> 78 FPS
1440p + Performance -> 78 FPS
2160p + Ultra Performance -> 73 FPS

I think the base resolution is 720p in all three, but there is some performance cost converting to 2160p. I am pretty close to CPU limits, though: I only render the scene at 87 FPS at native 720p, and DLSS doesn't help performance at that resolution.
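Some rough frame-time math from those numbers (very approximate, given the scene is close to CPU-limited):

[CODE]
# Back-of-the-envelope estimate of the DLSS pass cost from the
# benchmark numbers above; these are my figures, not official ones.
fps = {
    "720p native": 87,
    "1080p Quality": 78,       # 720p internal
    "1440p Performance": 78,   # 720p internal
    "2160p Ultra Perf": 73,    # 720p internal
}
ms = {k: 1000.0 / v for k, v in fps.items()}

# Cost of the DLSS pass itself vs. plain native 720p rendering:
print(ms["1080p Quality"] - ms["720p native"])       # ~1.3 ms
# Extra cost of reconstructing all the way to 2160p:
print(ms["2160p Ultra Perf"] - ms["1080p Quality"])  # ~0.9 ms
[/CODE]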

I think the best route is to go from render resolution to the native resolution of your monitor using DLSS. If your monitor is 4K and you use DLSS Quality at a 1440p output, then somewhere that 1440p image is being upscaled to 2160p with a dumb upscaling method. I would always target the native resolution of your monitor with DLSS and just change the DLSS level to reach your desired performance.

I wish I could see what their DLSS model looks like. The output near high contrast edges just doesn't look right.

[QUOTE="Scott_Arm]Ultra Performance really doesn't look that bad considering. It'll probably become a nice option to have with DLSS enabled games further into this generation of games.[/QUOTE]
I should clarify that the heavy shimmering that happens with Ultra Performance is very noticeable when looking at shrubbery with lots of high-contrast edges swaying in the wind.
 
Judging by the Cyberpunk reports (no personal experience), I wonder if a hybrid approach to training models might be beneficial going forward? I know DLSS 2.0 moved away from per-game training, but was that to a single model or multiple models? At least judging by non-game image upscaling, there are some gains to having varied models for differing types of images. Is something like Cyberpunk enough of a visual outlier that it might benefit from a different model? Would it even be feasible for a major title like Cyberpunk to have its own model?

That's kinda the whole point of DLSS. Using DLSS while also rendering in sub-native resolution of your display makes very little sense.

Although I wonder if it'd be interesting to do a comparison between something like 4k DLSS vs. 1080p DLSS + integer scaling to 4k.
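Integer scaling is just pixel duplication (each source pixel becomes an NxN block), so 1080p DLSS output integer-scaled to 4K keeps its hard edges but gains no new detail. A minimal sketch of what that second step does, assuming a numpy image:

[CODE]
import numpy as np

def integer_scale(img: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upscale: duplicate every pixel into a factor x factor block."""
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # placeholder image
frame_2160p = integer_scale(frame_1080p, 2)              # shape (2160, 3840, 3)
[/CODE]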
 
That's kinda the whole point of DLSS. Using DLSS while also rendering in sub-native resolution of your display makes very little sense.
Do you mean "while scaling to subnative resolution of your display"? Because per definition, DLSS lives by using sub-selected resolution for rendering, which in most cases equals sub-native resolution, and scaling it back up to native resolution. If not, I don't understand what you mean by that post.
 
Do you mean "while scaling to subnative resolution of your display"? Because per definition, DLSS lives by using sub-selected resolution for rendering, which in most cases equals sub-native resolution, and scaling it back up to native resolution. If not, I don't understand what you mean by that post.
You use DLSS to output at your monitor's resolution, not at some lower resolution which is then upscaled by the display.
So, for example, if you have a 4K screen, you use 4K with DLSS UP, which will render at 1080p but reconstruct to 4K, instead of using, say, 1440p with DLSS B, which will render at the same 1080p and reconstruct to 1440p, which will then be upscaled by the display to 4K.
This is what the post I've replied to suggested as well - but I don't see why anyone would use the second option.
 
You use DLSS to output at your monitor's resolution, not at some lower resolution which is then upscaled by the display.
So, for example, if you have a 4K screen, you use 4K with DLSS UP, which will render at 1080p but reconstruct to 4K, instead of using, say, 1440p with DLSS B, which will render at the same 1080p and reconstruct to 1440p, which will then be upscaled by the display to 4K.
This is what the post I've replied to suggested as well - but I don't see why anyone would use the second option.
Exactly because you're not rendering in native resolution with DLSS, you're upscaling to it.
WRT the 2nd option (output with DLSS to a sub-native resolution): maybe because of general performance considerations? I haven't tried, but maybe it's more consistent and nicer to have your RTX 2060 work towards a 1440p output with DLSS Quality than upscale to 2160p with DLSS Performance or Ultra Performance?
 
Exactly because you're not rendering in native resolution with DLSS, you're upscaling to it.
I suggest using "reconstructing" in this case (and in case of checkerboarding, temporal supersamling, etc) to differentiate this from "upscaling" which is what monitor scalers, GPU drivers and something like Fidelity CAS do.
The processes are different enough in both the approach and the resulting IQ to not pile them all up into "upscaling".

WRT the 2nd option (output with DLSS to a sub-native resolution): maybe because of general performance considerations? I haven't tried, but maybe it's more consistent and nicer to have your RTX 2060 work towards a 1440p output with DLSS Quality than upscale to 2160p with DLSS Performance or Ultra Performance?
Nah, it will most definitely look worse, and while it may perform better, the difference is unlikely to be huge.
This, however, is something some media outlet should probably look into in detail.
 
I suggest using "reconstructing" in this case (and in case of checkerboarding, temporal supersamling, etc) to differentiate this from "upscaling" which is what monitor scalers, GPU drivers and something like Fidelity CAS do.
Then I, in turn, suggest not using "rendering in native" in this context, because that's not what you/DLSS are doing; the whole point of DLSS (apart from finding a gaming use for Tensor cores) is decreasing the rendering resolution in order to output higher fps.
 
DLSS will still produce good, 'hard' edges and pixel-level detail. When you output below native res on a fixed-pixel display, it will never look as good. I have never seen a monitor with good scaling. TVs can sometimes do it very well, but at the cost of a few additional frames of input latency.
 
https://www.reddit.com/r/Crysis/comments/keygd6/crysis_remastered_pc_update_20/gg5bz06/ said:
Just to follow up,

we want to thank you all for the feedback provided please keep it coming as it really helps us to ensure the next patches are even better. You might also like to know that DLSS will be also part of our next major PC update coming in the new year.

Thanks.

~The Crysis team.
 
One of the bad aspects of DLSS is that it often comes way too late for most players.
 
One of the bad aspects of DLSS is that it often comes way too late for most players.

It would be interesting to see sales trends over time. Some games definitely front-load sales close to launch, but The Witcher 3, for example, is still a big seller.
 
It would be interesting to see sales trends over time. Some games definitely front-load sales close to launch, but The Witcher 3, for example, is still a big seller.

From a user-experience-impact standpoint, ongoing player counts and hours played would be just as important a factor (if not more so) to consider.

But I'm not sure the general premise that launch players mostly miss out on DLSS holds true, especially going forward with the technology and development pipeline being more mature. If we look at 2020 release titles, 9 out of 11 (well, 9 out of 12 now with the Crysis Remastered announcement) had DLSS support at launch. The two that didn't were Avengers and F1, both of which added it about 1.5 months after launch.

It's worth keeping in mind that DLSS 2.0 wasn't officially revealed until March 2020, so the pre-2020 development/adoption timeline likely isn't as representative of what to expect going forward.
 