Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Ignore is bliss
No, sorry, my message has been clear since my first post. DLSS is a good feature that can help achieve good framerates, with a low impact on IQ, at resolutions that are normally not playable on a given graphics card. So if you need it, use it.
What I'm contesting is that someone can say it's even better than the native solution "because Jensen says so" when it's (quite advanced) upscaling, and as upscaling it has its share of issues.

If you're referring to me, I'm pretty clear on this: only use DLSS 2.0 when you need it or when you like it. Matters of taste are often very personal, like the oversharpening artifacts in DLSS: some like them, some really don't. And there really are odd exceptions like Death Stranding, where native sucks. And then there are the Brian Karis tweets I linked; he is one of the UE5 developers. I'm also pretty clear that temporal algorithms cannot realistically be understood by looking at stills; by their nature they require watching video or playing the game. Things can break or come together in motion,...
 
One really shouldn't judge DLSS 2.0 without playing the games. A lot of the errors don't happen all the time and everywhere, and a lot of these comparison pictures are the result of hunting for errors/differences. Native or TAA has errors in these kinds of comparisons too. Also, looking at zoomed-in stills is a very different experience from a high-framerate moving image, especially with temporal algorithms. I'd much rather go by either playing the game or looking at Digital Foundry, PCWorld, Hardware Unboxed, etc., and draw conclusions based on their reviews. A lot of these reviewers recommend turning DLSS on in Minecraft, Control,... as it is the best option considering performance + quality. It's always a compromise, and not everyone will prefer the same compromise. In some games DLSS 2.0 doesn't work out well, and that is also clearly visible in reviews (F1 2020's DLSS sucks).
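To make the stills-vs-motion point concrete, here's a toy sketch of mine of the exponential history blending at the heart of temporal AA. All values and the blend factor are made up for illustration; real TAA/DLSS adds motion-vector reprojection and history clamping on top of this:

```python
# Toy model of temporal accumulation: each frame's sample is blended
# into a running history buffer (illustrative values, not real TAA).

def accumulate(samples, alpha=0.1):
    """Blend each new sample into an exponential history."""
    history = samples[0]
    out = [history]
    for s in samples[1:]:
        history = alpha * s + (1.0 - alpha) * history
        out.append(history)
    return out

# Static pixel: jittered samples average out over frames, so a still
# screenshot looks clean and stable.
static = accumulate([1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0])

# Sudden change (something moved): the history lags behind the new
# value for several frames -- this is ghosting, and it only shows up
# in motion, never in a cherry-picked still.
moving = accumulate([0.0] * 8 + [1.0] * 8)
```

Eight frames after the change, the blended value is still only about 0.57 instead of 1.0, which is exactly why these algorithms have to be judged in motion.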

I would also tend to trust comments like this from developers


The comments are almost all against TAA for its blurriness, no matter how many samples it can accumulate. It seems to be a decent solution for consoles, where the picture is viewed from farther away, but on PC monitors it looks downright horrible.

In RDR2 it is like a shitty watercolor. Thankfully Rockstar provides an MSAA option, which looks much, much better, with a good performance hit of course.
 
The comments are almost all against TAA for its blurriness, no matter how many samples it can accumulate. It seems to be a decent solution for consoles, where the picture is viewed from farther away, but on PC monitors it looks downright horrible.

In RDR2 it is like a shitty watercolor. Thankfully Rockstar provides an MSAA option, which looks much, much better, with a good performance hit of course.

MSAA is going the way of the dodo. Some games can still use it, but its future is not bright.

 
The comments are almost all against TAA for its blurriness, no matter how many samples it can accumulate. It seems to be a decent solution for consoles, where the picture is viewed from farther away, but on PC monitors it looks downright horrible.

In RDR2 it is like a shitty watercolor. Thankfully Rockstar provides an MSAA option, which looks much, much better, with a good performance hit of course.

If you're gaming at lower resolutions (esports), TAA just looks downright terrible in the majority of games. FidelityFX CAS or Nvidia Sharpening is pretty much a requirement to compensate for the blur if you go the TAA route.
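For what it's worth, the sharpening those tools apply is essentially an unsharp mask: add back the difference between the image and a blurred copy of it. Here's a 1-D toy version of mine; the blur kernel and strength are made up, and real CAS adapts its strength per pixel:

```python
# 1-D unsharp-mask sketch: sharpened = signal + amount * (signal - blur).
# Kernel and amount are illustrative, not CAS's actual math.

def box_blur(signal):
    """3-tap box blur with edge clamping."""
    n = len(signal)
    return [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)]) / 3.0
        for i in range(n)
    ]

def unsharp(signal, amount=0.8):
    blurred = box_blur(signal)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]  # a hard edge
soft = box_blur(edge)                  # TAA-like softened edge
sharp = unsharp(soft)                  # local contrast restored at the edge
```

Push `amount` too far and the output overshoots on both sides of the edge; that halo is the oversharpening artifact people complain about.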
 
MSAA is going the way of the dodo. Some games can still use it, but its future is not bright.


That might be so, but you should stop quoting that Twitter thread; I confused it with the previous page, where the talk was of IQ and you had quoted the very same thing.

If you're gaming at lower resolutions (esports), TAA just looks downright terrible in the majority of games. FidelityFX CAS or Nvidia Sharpening is pretty much a requirement to compensate for the blur if you go the TAA route.

Well, 1440p is a high enough resolution. I don't remember trying many games at 4K, but I might check later whether it really helps.

Now this game could've used some DLSS for sure.

It should help. I'm not sure if this was R*'s first attempt at TAA and they botched it, but it looks really bad.
 
That might be so, but you should stop quoting that Twitter thread; I confused it with the previous page, where the talk was of IQ and you had quoted the very same thing.

I get your frustration about my linking tweets, though it's a completely different tweet from the same person. I try to back up my claims with sources instead of relying on "because I said so" arguments.
 
What's the original? No AA or TAA? We're talking about PC, where user settings are configurable. Some people hate TAA and turn it off. TAA can have significant ghosting and blur depending on the implementation, and textures will always look sharper with no AA than with TAA, but then you get aliasing. So what's "the original", and which is "native"?

No TAA, and if you get significant aliasing nowadays at 4K, it means a game programmed with the arse.
 
The comments are almost all against TAA for its blurriness, no matter how many samples it can accumulate. It seems to be a decent solution for consoles, where the picture is viewed from farther away, but on PC monitors it looks downright horrible.

In RDR2 it is like a shitty watercolor. Thankfully Rockstar provides an MSAA option, which looks much, much better, with a good performance hit of course.

RDR2 had a sharpening option though, which worked pretty well IMO.
 
No TAA, and if you get significant aliasing nowadays at 4K, it means a game programmed with the arse.
WHAT? Aliasing is pretty much an inherent problem in all digital signals. Any game that doesn't use a form of AA will look like crap even at 8K, with massive amounts of flickering and pixel shimmer, not to mention shader aliasing and pixel crawl.
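To illustrate (a toy example of mine, with a made-up test pattern): point-sampling detail finer than the pixel grid gives hard on/off pixels that flicker as anything moves, while averaging several sub-samples per pixel, the idea behind supersampling, resolves the same detail as stable coverage values:

```python
import math

def pattern(x):
    """High-frequency detail: flips sign faster than once per pixel."""
    return 1.0 if math.sin(40.0 * x) > 0 else 0.0

def point_sample(n_pixels):
    # One sample at each pixel centre: pure 0/1 results, prone to shimmer.
    return [pattern((i + 0.5) / n_pixels) for i in range(n_pixels)]

def supersample(n_pixels, subsamples=4):
    # Average several sub-samples per pixel, like SSAA coverage.
    out = []
    for i in range(n_pixels):
        subs = [pattern((i + (j + 0.5) / subsamples) / n_pixels)
                for j in range(subsamples)]
        out.append(sum(subs) / subsamples)
    return out

aliased = point_sample(8)    # only 0.0 or 1.0 per pixel
averaged = supersample(8)    # fractional greys where the detail straddles a pixel
```

Bumping the resolution just moves the problem to finer detail; some form of filtering or sample averaging is always needed.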

Digital Foundry is not the only outlet that praised DLSS 2; Hardware Unboxed did the same, as did Overclock3D, ComputerBase, DSOGaming, GamersNexus and many others. Go have a look.
 
WHAT? Aliasing is pretty much an inherent problem in all digital signals. Any game that doesn't use a form of AA will look like crap even at 8K, with massive amounts of flickering and pixel shimmer, not to mention shader aliasing and pixel crawl.

Digital Foundry is not the only outlet that praised DLSS 2; Hardware Unboxed did the same, as did Overclock3D, ComputerBase, DSOGaming, GamersNexus and many others. Go have a look.

Lol. They praised DLSS for the performance increase versus the image-quality loss (so do I, if it's not clear), not for having better IQ than anything else. I had a look; you hear only what you want to hear.
I know about aliasing, but aliasing is a problem that can be solved in many ways, not simply with DLSS or maybe other, non-proprietary examples of ML antialiasing (and in the thread our friend Manux linked there are other solutions than temporal accumulation of samples...). One example is using different shading engines/methods than traditional forward or deferred rendering. That is, why build a system that accumulates temporal samples if the game engine already has, or can calculate, those data? For instance, there are proposals about frame-rate amplification and object-space rendering.
 
WHAT? Aliasing is pretty much an inherent problem in all digital signals. Any game that doesn't use a form of AA will look like crap even at 8K, with massive amounts of flickering and pixel shimmer, not to mention shader aliasing and pixel crawl.

Digital Foundry is not the only outlet that praised DLSS 2; Hardware Unboxed did the same, as did Overclock3D, ComputerBase, DSOGaming, GamersNexus and many others. Go have a look.

I would add PCWorld to this. I really enjoy their Full Nerd podcasts and written reviews; they do really nice and fairly unbiased reviews. They stomped all over the Turing launch for ray tracing not being available, and later on DLSS 1.0 for being shit. Now they have turned around.
 
When you stop with the "better IQ than anything else" crap. We have eyes, too. And btw I said "because Jensen says so"

Then we don't see the same things in some games? Why say "no, you're wrong!" when someone here says "in game X or Y I prefer DLSS 2.0 vs native, it looks better"? You seem very irritated by that... Some bad memory with a leather jacket involved?
 
No TAA, and if you get significant aliasing nowadays at 4K, it means a game programmed with the arse.

????
Lol. They praised DLSS for the performance increase versus the image-quality loss (so do I, if it's not clear), not for having better IQ than anything else. I had a look; you hear only what you want to hear.
I know about aliasing, but aliasing is a problem that can be solved in many ways, not simply with DLSS or maybe other, non-proprietary examples of ML antialiasing (and in the thread our friend Manux linked there are other solutions than temporal accumulation of samples...). One example is using different shading engines/methods than traditional forward or deferred rendering. That is, why build a system that accumulates temporal samples if the game engine already has, or can calculate, those data? For instance, there are proposals about frame-rate amplification and object-space rendering.

Are you really saying TAA is bad because of object-space rendering and "frame-rate amplification", when there isn't a single commercially available game (as far as I'm aware) that does either of those things?

Also, what does object-space shading have to do with temporal accumulation? Is frame-rate amplification supposed to be temporal upscaling of the 30->60 fps kind of thing? If you think DLSS has artifacts ...
 
Then we don't see the same things in some games? Why say "no, you're wrong!" when someone here says "in game X or Y I prefer DLSS 2.0 vs native, it looks better"? You seem very irritated by that... Some bad memory with a leather jacket involved?

Because there are obvious artifacts and blurring in several places, as I've shown with real examples, whereas you bring absolutely nothing of the sort except "I heard so", "Jensen says so", or even "reviews say so", when the reviews praise the good compromise of performance and IQ but don't say "it looks better than everything else". I am not irritated; I am astonished someone can repeat this mantra again when there are obvious examples of the opposite.
As said, I find it a very good compromise and I see no practical problems in activating it should you need it. Oh, and btw, I already have, on Control. A game programmed with the arse, performance-wise.
 
If you're gaming at lower resolutions (esports), TAA just looks downright terrible in the majority of games. FidelityFX CAS or Nvidia Sharpening is pretty much a requirement to compensate for the blur if you go the TAA route.
You mean RIS, aka Radeon Image Sharpening. CAS is a separate thing implemented by game devs; RIS is something you can add yourself, like NVIDIA Image Sharpening.
 
????


Are you really saying TAA is bad because of object-space rendering and "frame-rate amplification", when there isn't a single commercially available game (as far as I'm aware) that does either of those things?

Also, what does object-space shading have to do with temporal accumulation? Is frame-rate amplification supposed to be temporal upscaling of the 30->60 fps kind of thing? If you think DLSS has artifacts ...

I am saying that TAA has its issues, like other AA methods; that there are other methods of antialiasing that can be integrated directly into the game engine; and that DLSS is not flawless, besides being quite new, proprietary, and implemented in only a handful of titles.
 