Nvidia DLSS 3 antialiasing discussion

@troyan Reading the article, the conclusions come across as even weirder than in the video. In the games that are GPU-limited, the input latency of DLSS + frame generation is lower than native rendering with Reflex turned on. It's kind of like saying games are only playable if you're running with DLSS on. There are a lot of people happily playing their games with DLSS off because of the way DLSS looks in that particular game, etc. There are also people happily playing games that don't support Reflex at 60fps, which likely have much higher input latency than the games they tested. Overall, just a weird conclusion.

I think some visual comparisons between DLSS Quality + frame gen on and DLSS Performance + frame gen off would help, but I guess that's hard to do right now. The close performance of those modes in some of these games gives you an interesting choice. Whether frame gen is the better option may vary from title to title, depending on how many additional frames it actually gets you.
 
It is baffling to me at times that the PC industry has so many outlets covering "game performance" with GPUs and CPUs, and very few, if any, talking about an endemic problem in PC games which greatly affects their presentation, playability, and performance: shader compilation stutter. It feels like I am the only one, really. Perhaps it has to do with the fact that for my reviews I *actually play* the games extensively.

I regret I have only one like to give to this post.

This has also been such a frustrating omission from these sites, and I don't mean I want Linus/Steve to start doing game 'reviews' in terms of story/gameplay. But good lord, at least occasionally address how a popular game is actually functioning on a technical basis! The software is why we buy the hardware, after all.

Understanding exactly how a game functions and where exactly the bottlenecks to a consistent experience are is just as, if not more, important than discovering CPU hotspots, and understanding those bottlenecks will ultimately lead to better hardware coverage as well. The actual impact on the PC gaming experience of, say, not bothering to pore over a 30-cooler CPU review and instead just slapping your new Ryzen into Eco mode out of the box is completely immaterial vs. the gameplay experience of dealing with something like shader stuttering, but the former will get 1000X more coverage than the latter. God forbid the PC industry ever switches to Arm and we don't need AIO cooler reviews anymore; 90% of their content would be eradicated.

There was a brief moment when game experience and presentation actually were center stage in PC reviews, so why did that go away? I am not sure. Any theories?

I think your nod towards 'scandalization' is a big part of it. It's just part and parcel of the YouTube monetization model, and of video in general. The "EXPOSED!!" angle just gets more clicks. But hey, it's not like you have to abandon that approach entirely to focus on these problems; certainly one particular engine that starts with a U has been a prime contributor to a less than stellar experience. Let's have some campus ambush videos of Epic. :)

I think ultimately, as well, these types of sites are run by guys who basically want to do Top Gear, but for tech. The appeal is the tinkering aspect, and they're just far more comfortable with hardware vs. software. There's plenty of tinkering you can do with software too, but ultimately a video about shader stuttering won't really have any immediate solutions, at least not ones akin to showing you how to undervolt your CPU for 10 degrees lower temps.
 
How many likes can I give this post? Not enough😅
 
@Dictator @Flappy Pannus preach on brothers!

If the bigger YouTube channels such as LinusTechTips started bringing more attention to PC gaming issues like shader compilation stuttering, actually shone a light on it consistently and periodically... and advised consumers to avoid games that launch with this egregious issue... we'd be in a MUCH MUCH better place by now.

The thing that kinda pisses me off about it is that Linus has built his channel on the back of PC building and reviewing/tinkering with hardware designed for PC gaming. He proclaims to be a PC gamer and always talks it up over console gaming... yet he does almost nothing to put pressure on game publishers and developers to actually fix these issues. His channel alone has massive influence, and you can 100% guarantee that him consistently bringing an issue forward would get a response.

For things to get better... people have to start admitting there's a damn problem to begin with... and it seems like some of these guys don't want to admit there's a serious issue with the platform, maybe because they feel it would dissuade the PC gamers who watch their content. I dunno. I don't like talking trash about my favorite platform either... but I understand that the only way to get things done is to call it out every time... and I'll continue to do so because I know it doesn't have to be this way. It can be much better than it is.
 
He only cares about getting the maximum views he can for each video. It's all about maximizing profits, nothing else. I doubt there's much deep care about anything gaming related. There might have been in his early days, but when things grow to the size they are today, that fades quite easily, I think.
 
The lack of uproar over shader compilation is probably a byproduct of how diffuse and fragmented the PC games market is. If one were to take the top X PC titles on Steam by player time, how many of them exhibit shader comp stutter? An awful lot of PC games fall into the buckets of esports, MMORPGs, simulation, survival, management, etc. Esports titles tend to be small enough in scope not to have these problems, and the latter genres are kind of expected to have erratic performance. If you're using your $1600 GPU chiefly to play the latest AAA console games, then you're absolutely in a weird spot -- you're paying for a 1st-class ticket, but you're a 2nd-class customer as far as the developers are concerned, so the chance of getting a 2nd-rate experience is always going to be there.
 
Generally true regarding ports, though not those from Nixxes/Sony -- there you get some solid improvements aimed at those with capable enough hardware: higher fidelity, ultrawide support, DLSS/FSR, increased RT, and proper resolution and frame rate scaling upwards (yes, ports with awkward res/fps scaling did exist). For other, regular ports, yeah, you usually get higher settings, resolutions and framerates, but not these 'PC exclusive' features directly.
I also don't think something like an RTX 4090 with 24GB of 1TB/s GDDR RAM is really a product for the gamer; heck, it's too damn fast for today's CPUs to begin with. It's a product that should see dual use: creators who like to play games or vice versa, developers perhaps, enthusiast gamers wanting maxed 8K experiences today... Someone paying $1600 for a GPU alone shouldn't expect games to be tailored perfectly to just their GPU either.
Most gamers are looking at 3060/Ti-class GPUs, which tbh is a very, very nice position for the market to be in. According to Steam it's the most popular GPU... and it's ballpark the performance of my 2080 Ti, which is quite amazing -- above the baseline these AAA ports are coming from. It's not like the average gamer needs a 4080 or 4090... those are close to halo products, huge monsters.
 
The Hardware Unboxed review of DLSS3 noted that C2077 has ~101ms latency without Reflex and ~63ms with Reflex turned on. DLSS3 boosts the framerate 2.5x over native while keeping latency at ~62ms. But the reviewer said this is still bad, because the latency did not change for the better; in his opinion, Reflex-on is the only proper baseline for comparison.
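For what it's worth, here is a rough back-of-envelope sketch of those numbers (the ~45fps native figure is my own assumption for illustration; the latencies and the 2.5x uplift are the ones quoted above):

```python
# Back-of-envelope comparison using the latency figures quoted above.
# The 45 fps native frame rate is an assumed number for illustration only.

native_fps = 45                 # assumed GPU-limited native frame rate
fg_uplift = 2.5                 # framerate uplift reported for DLSS3 frame generation

modes = {
    "Native, no Reflex":  (native_fps,             101),   # ms, from the review
    "Native + Reflex":    (native_fps,              63),
    "DLSS3 frame gen":    (native_fps * fg_uplift,  62),    # Reflex is forced on with FG
}

for name, (fps, latency_ms) in modes.items():
    frame_time_ms = 1000 / fps
    print(f"{name:18s} {fps:6.1f} fps ({frame_time_ms:4.1f} ms/frame)  latency ~{latency_ms} ms")
```

The takeaway being that the generated frames raise the displayed frame rate without shortening the input-to-photon path, which is why latency sits near the Reflex-on figure instead of scaling down with the higher fps.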

Am I mistaken, or does C2077 even support Reflex right now? Or did CDPR just add it to the unreleased DLSS3 build of C2077 that reviewers had early access to?
 
So I was wondering: there are hacks which let you disguise your current GPU as a different GPU in Device Manager. Maybe if you disguised a Turing or Ampere GPU as a 4090, you could enable DLSS 3 in supported games?
 
That was IMO a weird stance to adopt. Will HWUnboxed now also throw all Radeon users under the bus because of their higher latency? :D
They have since backtracked a little, saying that DLSS3's latency is fine, but only on Twitter. And with their usual flair of arrogance.


It's always fun to watch them fumble and go back and forth with their reasoning. I mean, they are fine with sacrificing all of the ray tracing visual improvements for the sake of performance... but they also refuse DLSS3 and DLSS2 because they sacrifice visual quality a little, despite adding tons of performance.

I mean, this double-standards dance is ridiculous. Either you are a visual quality purist at heart and don't compromise on visual features (whether resolution or ray tracing), or you are a performance purist, in which case you would accept DLSS3 and DLSS2, as they sacrifice little for a huge performance win, and could safely discard ray tracing. In the end you have to stick to your guns, but this pick-and-choose dance is pretty dishonest in my opinion.

Of course, the latency of DLSS3 is nothing but the latest symptom of their double standards.
 
Tried FG in Superpeople - can't feel any difference compared to it being off.

Now I need displays with higher Hz. 175Hz (144Hz with HDR) at 3440x1440 is not enough anymore...

That seems to be a major stumble. I've seen a lot of people who really like HFR (as in, 'omg I need HFR even for work, non-HFR is terrible', etc. etc.) feel that going beyond about 144-165Hz really drops off in terms of any appreciable benefit outside VR.

DLSS3 does seem neat, but it seems too limited for many to care much. Needing a $1k+ card to work at all(?), not workable on some titles because you really don't want to see it on moving foliage, for the most part making latency worse rather than better, and working best at boosting games past what most monitors can display and past what even most HFR fans might care about.

It's cool that it exists, but I'm not sure it's a "killer app" that AMD and Intel will feel compelled to copy like with DLSS2.
 
Just wait until BlurBusters posts their zomg amazing article about the benefits to motion resolution and clarity on high refresh displays. That’s in my opinion the only real benefit of DLSS3.

If your “sample and hold” LCD/OLED monitor can do 144Hz but your GPU can only pull 70fps, DLSS3 is a very good thing to have.
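The motion clarity argument comes down to sample-and-hold persistence: the eye-tracked blur is roughly one frame of persistence wide, so doubling the presented frame rate roughly halves the smear. A tiny sketch (the pan speed is an arbitrary number picked for the example, not something from the thread):

```python
# Rough sample-and-hold persistence estimate: at full persistence the perceived
# motion smear is roughly (content speed) x (frame time). The pan speed below
# is an arbitrary example value.

pan_speed_px_s = 1920           # how fast content scrolls across the screen

for fps in (70, 140):           # GPU-limited feed vs. frame-generation-doubled feed
    frame_time_s = 1 / fps
    blur_px = pan_speed_px_s * frame_time_s
    print(f"{fps:3d} fps -> ~{blur_px:4.1f} px of eye-tracked smear")
```

So feeding that 144Hz panel ~140 presented frames instead of 70 roughly halves the perceived smear, which is presumably the benefit such an article would focus on.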
 
FG boosts performance in a DLSS Quality kind of way. Lovelace users have more choice in how to play games. I gladly trade higher latency for better image quality: playing Spider-Man native with DLAA looks much better than with DLSS Quality. And FG nearly doubles frames in CPU-limited scenarios.

I have played several hours of Spider-Man with DLSS 3 at 5160x2160 (downsampled to 3440x1440) with DLAA at over 100FPS. I can say: it works really well. Cut scenes especially are improved by the higher frame rate. And I find it highly impressive that you can't really see the generated frames.

FG is a step toward making use of >200Hz displays.
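A minimal sketch of why frame generation pays off most when CPU-limited, as described above; all the per-frame timings here are made-up illustrative numbers, not measurements from Spider-Man:

```python
# Illustrative only: generated frames skip the game's CPU work, so the presented
# frame rate can nearly double even when the simulation thread is the bottleneck.

cpu_frame_ms = 12.0             # assumed per-frame CPU cost (game logic, draw submission)
gpu_frame_ms = 7.0              # assumed per-frame GPU cost (fast card running DLAA/DLSS)
fg_cost_ms = 1.0                # assumed cost of generating the interpolated frame

rendered_fps = 1000 / max(cpu_frame_ms, gpu_frame_ms)                    # CPU-bound here
presented_fps = 2 * 1000 / (max(cpu_frame_ms, gpu_frame_ms) + fg_cost_ms)

print(f"rendered: {rendered_fps:5.1f} fps, presented with FG: {presented_fps:5.1f} fps")
```

Since the interpolated frames never touch the CPU, the presented rate scales almost independently of the simulation cost, which matches the "nearly twice the frames when CPU-limited" observation.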
 