Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

It pretty much never looked worse than basic monitor upscaling when normalized for performance. HUB was the only source which made such nonsense claims.
What it did do is introduce an obvious "ML hallucination" style into the upscaled image, which may not have been to everyone's liking.
I think you are forgetting how bad DLSS 1 was.


It didn't even manage to look better than a 1440p standard upscale despite being much slower.
 
That's one game, and the first one to get both DLSS and RT. Again, these claims can't be made universally based on such a limited sample size. On average it did manage to look better than a standard upscale, while being slower of course. And HUB isn't a source I consider trustworthy in these things.
 
Do you think they are tampering with the footage/lying about what footage is what?
 
I think that when their analysis is telling me something which I know is wrong, or which my own eyes don't see, then I can't trust their analysis. They are fine for CPU/motherboard and monitor reviews, but their GPU coverage over the last couple of years has been seriously lacking.

You also seem to have missed the part about making generalized claims based on one sample.
 

There weren't many samples during that time frame. Metro was also bad, but a patch made DLSS at least equal to the built-in upsampling. It was just those two and FFXV for a while. I believe the next game came half a year or so later, when Tomb Raider added it in. It wasn't any better there either. Which DLSS 1 titles showed the technology in a positive light?
 
A selection of gaming press articles that say differently doesn't prove your point. Other press, and posters on technical forums, were not impressed. DLSS 1.9 and 2.0 were Nvidia's response to the generally shared disappointment in 1.0.

Good grief :LOL: Our recollections clearly differ greatly.

In which case it should be easy to link counterexamples, right? I mean, at least he made the effort to link gaming press articles to back up his claim. He even asked people to link gaming press articles from that time frame with opinions that differed from the ones he linked, because he couldn't find any himself.

I honestly have no idea whether there are or aren't, as I don't care enough to look. But considering he provided links to back up his narrative, and you haven't... Well, it should be easy for you to find some, right?

Instead of being snarky, just link some counter-examples to prove him wrong.

Regards,
SB
 
There weren't many samples during that time frame.
So maybe you should look beyond that time frame and beyond HUB?

Metro was also bad, but a patch made DLSS at least equal to the built-in upsampling. It was just those two and FFXV for a while. I believe the next game came half a year or so later, when Tomb Raider added it in. It wasn't any better there either.
Do the research.

Which DLSS 1 titles showed the technology in a positive light?
Depends on what counts as "positive light" to you.
DLSS 1 was better in quality than simple upscaling (+TAA) while being a bit slower (this is still true for DLSS 2, but the quality difference became a lot bigger).
It allowed RT to be used on h/w and at resolutions which wouldn't be usable otherwise.
The FFXV and SOTTR implementations were okay; BFV and ME were worse. Beyond these, 1.0 was used in Anthem and MH World, but I don't have those so I can't say anything about them - I don't remember seeing them analyzed anywhere either.
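
"Better in quality" is also something you can put a number on: capture a native-res reference frame, then score each upscaler's output against it with a metric like PSNR. A toy sketch in Python (the file names are placeholders for your own captures):

```python
# Toy quality comparison: upscaled frames vs a native-res reference.
# File names are placeholders; any same-size RGB captures will do.
import numpy as np
from PIL import Image

def psnr(reference: np.ndarray, test: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

ref = np.asarray(Image.open("native_4k.png").convert("RGB"))
for name in ("bilinear_1440p_to_4k.png", "dlss_quality.png"):
    img = np.asarray(Image.open(name).convert("RGB"))
    print(f"{name}: {psnr(ref, img):.2f} dB")
```

PSNR alone won't settle arguments about temporal stability or "ML hallucination" artifacts, but it at least takes the comparison out of the realm of recollection.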
 
So I was wondering about something:

Are we really sure DLSS 1.9 ran entirely on the shader cores? I remember running Nsight in 2019 with Control, and once DLSS 1.9 was activated it did actually use the tensor pipeline.

So I am a bit confused. If it really was not using tensor cores, then Nvidia could make it available on older cards and on AMD - a DLSS Lite, so to speak, as a direct answer to FSR. It would look a ton better than FSR, I imagine.
 
Nvidia routes all FP16 calculations through the tensor cores, so perhaps that was what you were seeing. Just a guess.
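
One way to check would be to capture plain FP16 work in Nsight and see which pipes report activity. A throwaway sketch to generate such work (PyTorch just for convenience, arbitrary sizes, needs a CUDA-capable GPU):

```python
# Throwaway snippet to generate FP16 work for a profiler capture
# (sizes arbitrary; requires PyTorch and a CUDA-capable GPU).
import torch

a = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)
b = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)
elementwise = a * b + b   # plain FP16 math, no matrix units strictly required
matmul = a @ b            # FP16 GEMM - HMMA work on the tensor cores
torch.cuda.synchronize()  # make sure the kernels actually executed
print(elementwise.sum().item(), matmul.sum().item())
```

If the tensor pipes light up even for the elementwise case on Turing, that would support the routing explanation rather than DLSS 1.9 itself using tensor-core inference.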
 
So I am a bit confused. If it really was not using tensor cores, then Nvidia could make it available on older cards and on AMD - a DLSS Lite, so to speak, as a direct answer to FSR. It would look a ton better than FSR, I imagine.
DLSS is a feature of new h/w; they use it to sell new GPUs. If they wanted to make something like FSR, capable of running on any GPU out there, they would have made it years ago.
Another point is that they promote DLSS as a way to keep the RT performance hit minimal - and RT isn't really available on older GPUs.
But yeah, it would be cool if they made some NRDLSS (with NR standing for "not really") capable of running on all GPUs on the market.
 

Nvidia kind of tried it with DLSS 1.0, which didn't have a temporal component. DLSS 2.0 works more like how UE5, CP2077 and many other games/engines (consoles included) use temporal algorithms to recover details (rough sketch below).

Will be interesting to see if AMD has figured out something nobody else has so far. Maybe long term AMD will add a temporal component to improve FSR.

This is probably a worst case. I hope FSR is better than this.
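
The core of these temporal approaches is simple: reproject the previous frame's accumulated history using per-pixel motion vectors, then blend in a small fraction of the new frame. A heavily simplified sketch in Python (all names made up; real upscalers add jittered sampling, history clamping and disocclusion rejection on top of this):

```python
# Heavily simplified temporal accumulation: single channel,
# nearest-neighbour reprojection, fixed blend factor.
import numpy as np

def temporal_accumulate(history, current, motion, alpha=0.1):
    """history/current: (H, W) float frames; motion: (H, W, 2) pixel offsets (x, y)."""
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: look up where each pixel came from in the previous frame.
    src_y = np.clip(np.round(ys - motion[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - motion[..., 0]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    # Exponential blend: keep most of the history, mix in a bit of the new frame.
    return (1.0 - alpha) * reprojected + alpha * current

# Static scene, zero motion: the accumulated image converges over frames.
hist = np.zeros((4, 4))
for _ in range(10):
    hist = temporal_accumulate(hist, np.ones((4, 4)), np.zeros((4, 4, 2)))
print(hist[0, 0])  # approaches 1.0 as frames accumulate
```

The extra detail comes from each frame sampling slightly different (jittered) positions, so the history ends up containing more information than any single low-res frame.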

 
I think that what Nv should do instead is just port DLSS to DirectML. This would make it compatible with any DML-compatible GPU out there while keeping the performance advantage on those with dedicated ML h/w.
Not sure that current DML is able to handle such a port though.
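
In practice such a port would presumably mean running the network through something like ONNX Runtime's DirectML execution provider rather than a literal DLSS build. A minimal sketch of what that looks like, assuming you had an exported upscaler network ("upscaler.onnx" and the tensor name here are entirely hypothetical):

```python
# Minimal sketch of inference via the DirectML execution provider.
# Requires the onnxruntime-directml package; "upscaler.onnx" and its
# input name are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "upscaler.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)
frame = np.random.rand(1, 3, 720, 1280).astype(np.float32)  # low-res input
(output,) = session.run(None, {"low_res_frame": frame})
print(output.shape)  # e.g. (1, 3, 1440, 2560) for a 2x upscale
```

DML would run the same network on any compatible GPU, falling back to shader-core math where there's no ML h/w - which is exactly where the performance gap would show up.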
 