Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

An issue is that the advantages of temporal gains are basically impossible to convey or compare unless you effectively have the audience physically on site. Since almost all discussion and content is purely online, comparisons default to static screenshots as the path of least resistance.

Yah, it's true. Unless people have 120Hz displays, you can't see the benefits of 100 fps over 60 or 70 fps etc. I find reviewers are very focused on just getting 4k60 on ultra settings and have little to no interest in the perspectives of esports that want 1080p240+ or 1440p240. I think 1080p144 is probably the most common pc display, if I had to guess, so I'm not sure what the infatuation with 4k60 is.
 
Probably a mix of 4k60 PC monitors being much cheaper than 4k120, and 4k60 TVs being much more common than 4k120. Even my Samsung TV from 2015 supports 4k60 on one input, and you can bet I have used it.
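As a rough illustration of the refresh-rate point above: a display can only show a new frame once per refresh interval, so fps above the panel's refresh rate is invisible. A minimal sketch of the per-frame time budget (function name is just for illustration):

```python
# Frame time budget in milliseconds for a given frame rate.
# A 60 Hz panel refreshes every ~16.7 ms, so a game running at
# 100 fps (10 ms/frame) shows no visible benefit on that panel.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for hz in (60, 100, 120, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")
```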
 
It probably pays off to find a reviewer with the same tastes as one has and then trust that source. Understanding and noticing biases makes it easier to parse the content as a consumer. For me anything above 60 fps is good, VRR takes care of possible dips, and I just want to add eye candy. I don't care anymore for competitive multiplayer gaming; it's all in the single player. Someone else might be all about competitive gaming, increasing fps, reducing eye candy in favor of more easily seeing enemies, etc.

I played so much Unreal Tournament and Quake 2 that I just can't take anything like that anymore. Destiny, CoD, Dota, LoL... I have absolutely no interest in them.
 
Ok. I'll spill the beans a little bit. Two of them are 4K: one has some form of AA (I will reveal what it is later), one has no AA, and the other is DLSS.
It smells awfully lot like jebait, but 1) TAA 2) no AA 3) DLSS.
The 4K No AA looks awfully bad though, do you know if it's really 4K through-and-through or are parts of the rendering process done at lower resolutions?
 
Cyberpunk 2077, Minecraft with RTX, and 4 New Games Add NVIDIA DLSS This December
December 8, 2020
Minecraft with RTX is now out of beta and officially released for Windows 10, introducing stunning path-traced ray tracing and NVIDIA DLSS to the world’s most popular game.

Cyberpunk 2077, the year’s most anticipated game, launches on December 10th with NVIDIA DLSS and several ray-traced effects that enhance image quality, immersion and realism.

And with the release of a new update coming this month, Mount & Blade II: Bannerlord will feature NVIDIA DLSS, accelerating performance by up to 50% at 4K.

Darkflow Software’s CRSED: F.O.A.D. (formerly Cuisine Royale) is a free-to-play multiplayer last-man-standing shooter with realistic weaponry, mystic traps and demonic rituals. In an update released last week, CRSED: F.O.A.D. added NVIDIA DLSS support, boosting performance by up to 40% at 4K, allowing gamers to hit 90+ FPS across all GeForce RTX GPUs:

Midwinter Entertainment’s Scavengers is a free-to-play strategic shooter where players form squads of three to fight for survival and dominance in a hybrid of sandbox-style PVE and class-based PVP. The game is currently running tech tests and has recently added DLSS, which on average can boost performance by up to 40% at 4K.

Now, Moonlight Blade has added support for NVIDIA DLSS, which can more than double your framerate at 4K with ray tracing and DLSS on.
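For a sense of what the quoted "up to X% at 4K" figures mean in frame rates, here is a quick arithmetic sketch. The baseline fps values below are hypothetical, chosen only to illustrate the percentages from the announcement:

```python
# Convert a percentage uplift into a resulting frame rate.
# Baselines are made-up examples, not measured numbers.
def with_uplift(base_fps: float, uplift_pct: float) -> float:
    return base_fps * (1.0 + uplift_pct / 100.0)

for base, pct in [(60, 50), (65, 40), (30, 100)]:
    print(f"{base} fps +{pct}% -> {with_uplift(base, pct):.0f} fps")
```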
 
Does Minecraft RT support RDNA2 then?

Yeah 6900xt is around half the 3090’s perf according to LTT.

 

And that's without DLSS, which should be included, as DF noted in their latest video. It's hard enough for people to tell which screenshots have DLSS on or off when shown side by side, that's how good it is. If it boosts performance that much, why not use it when supported?
I have said it before: DLSS is just as important as RT, if not more so. Or better said, they go hand in hand; the performance RT takes away, DLSS can make up for. The tensor hardware taking up die space shouldn't go unused either.
 

Maybe it's included later on the review ? A straight rt vs rt is good too imo, just for curiosity.
 
It definitely should NOT be enabled. Reviews are supposed to be objective with even playing ground for everyone, not subjective with different settings for each card with features affecting image quality, rendering resolution etc.
How good or bad DLSS is is subjective, but it is objectively always worse than native even if you happen to prefer it over native.
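One concrete reason "4K DLSS" and native 4K are hard to compare directly is the rendering workload. DLSS shades fewer pixels internally and upscales; the 2560x1440 internal resolution for DLSS Quality mode at 4K output used below is the commonly cited figure, taken here as an assumption rather than from this thread:

```python
# Pixel-count comparison: why "4K DLSS" is a different workload than native 4K.
# 1440p as the DLSS Quality internal resolution at 4K output is an assumed,
# commonly cited figure.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)     # pixels shaded per frame at native 4K
dlss_quality = pixels(2560, 1440)  # pixels shaded internally, then upscaled
print(f"native 4K shades {native_4k / dlss_quality:.2f}x the pixels")
```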
 
It's quite simple: it's so RT performance can be directly compared. Of course gamers would use DLSS in the actual game, and there are many reviews showing both DLSS-on and DLSS-off comparisons mixed in. No one is going to turn off DLSS if available when using RT unless they specifically don't like it and are fine with the performance with it off.

This argument needs to stop appearing here, there's nothing wrong with directly comparing RT performance in benchmarks.
 
It definitely should NOT be enabled. Reviews are supposed to be objective with even playing ground for everyone, not subjective with different settings for each card with features affecting image quality, rendering resolution etc.

I think what you mean is that both should be tested. Reviews are not about even playing grounds. A good review represents a potential buyer’s actual real world experience using the product.

How good or bad DLSS is is subjective, but it is objectively always worse than native even if you happen to prefer it over native.

We’ve been over this ad nauseam. DLSS off isn’t the same as native. So it’s incorrect to say DLSS on is objectively worse than DLSS off.
 
@PSman1700 said "DLSS should be included", not "native should be replaced by DLSS".

Gamers Nexus, Hardwareluxx and some other reviewers have been doing this in their recent reviews, and I don't see why they shouldn't.
If it's in clearly separate section, sure, but "4K" DLSS doesn't belong anywhere near the same graph as 4K results
 

I agree on this. DLSS and other scaling mechanisms should be handled separately. It is important to detail both the positives and negatives those algorithms produce and make a subjective call on whether the scaling mechanisms are useful or not. I tend to follow those reviewers who have the same tastes as I have.
 