Nvidia DLSS 3 antialiasing discussion

It wouldn't matter. The question is whether DLSS3 is a feature people would pay extra for or not.

I'd certainly pay more towards the card for the feature. I'd never pay a subscription fee for a particular feature though.

Essentially, we already have this choice (or will soon). A 4080 12GB will for the most part just be a 3080 Ti with frame generation capability. That said, we may yet see it stretch its legs in RT-heavy titles, particularly if SER and the other RT-exclusive features are used.
 
Exactly! They never once made the point that NVIDIA GPUs have this feature that reduces latency in dozens of games, a feature that is not available on AMD GPUs at all.

Furthermore, the inclusion of DLSS3 will accelerate the adoption of the Reflex feature immensely, which will increase the advantage of NVIDIA GPUs, yet nobody will talk about this except in the context of DLSS3.

Instead we have stuff like this.


Hardware Unboxed does, at times, post inflammatory YouTube thumbnails aimed at Nvidia. But can you really blame them?

Nvidia directly tried to screw with Hardware Unboxed's entire business about a year ago, just because they found that HU wasn't focusing on RTX marketing points in their video card coverage. Nvidia only reversed course because they were shamed and pressured by other YouTubers.

Even after all of that, believe it or not, Hardware Unboxed is a great benchmarking and gaming PC review channel. Much like every other tech YouTube channel, they interpret their results and then frame their content in the way they would like it to be perceived by their audience. Gamers Nexus, Digital Foundry, Linus Tech Tips and so on all do the same thing. Anyone who writes and reads from a script is framing the content in the way they would like it to be perceived, and there will inherently be bias within that.

I watched Digital Foundry's DLSS 3 coverage, and I watched Hardware Unboxed's DLSS 3 coverage. They were both very good videos, and ultimately they said very similar things about the pros and cons of the technology.

DF spoke about the tech in a more purely technological/'SIGGRAPH' style, while HU spoke about it a touch more practically and presented a simulation of how it would run on middle-of-the-stack RTX 40 hardware.

Individually, the reviewers of the technology have different levels of tolerance or forgiveness for its shortcomings, but those are personal preferences anyway.

Regarding reviewers talking or not talking about input lag: it's still a relatively new area of focus within PC gaming. Tech like Nvidia Reflex has only been around for two years. Before then, for the longest time, game-engine input latency on PC was tied either to proper in-game framerate caps (which are few and far between) or to high framerates, which essentially meant smoother and faster response.

So, here we are today: a ton of new SDKs, visual settings, display/monitor tech, ray tracing, a third GPU manufacturer and more things to focus on, so what is a PC reviewer to do? Well, DLSS 3 has put a spotlight on input latency with its mandatory Reflex inclusion, so input latency has become part of the conversation.

I hope that helps to explain why Reflex and input latency have jumped to the forefront of the conversation within this generation of reviews, and why HU does what they do while still being able to remain a good review channel.
 
I don't understand. Are you praising the proprietary nature of Reflex?
Something wrong with that? AMD has their proprietary Boost/Anti-Lag.
On the same note, I think it's good that OptiX is a proprietary product for use by pro creators.
 
So, here we are today: a ton of new SDKs, visual settings, display/monitor tech, ray tracing, a third GPU manufacturer and more things to focus on, so what is a PC reviewer to do? Well, DLSS 3 has put a spotlight on input latency with its mandatory Reflex inclusion, so input latency has become part of the conversation.
Input latency with DLSS3 is still lower than on AMD running native resolution. They should make that abundantly clear. Reflex with DLSS3 is better than no Reflex at all, which is the situation on all AMD GPUs, and on all NVIDIA GPUs whenever the user doesn't switch Reflex on (many people don't, by the way, as they don't know the tech even exists).
I don't understand. Are you praising the proprietary nature of Reflex?
Both AMD and NVIDIA have latency-reducing tech, and all of it is proprietary. AMD has Anti-Lag, which decreases latency by a mild amount. NVIDIA actually has three: Low Latency and Ultra Low Latency modes, which also decrease latency mildly, and then there is Reflex, which is the most potent among them and reduces latency by a VAST amount; it also boosts GPU clocks to the max during CPU-limited scenes. AMD has had no match for Reflex for two years straight. This is not NVIDIA's fault. Reflex just happens to be the latest in a long line of software innovations from NVIDIA, stuff like Ansel, ShadowPlay Highlights, FreeStyle Game Filter and Auto Game Optimizations. None of these has any kind of answer from AMD, and that doesn't mean they are bad just because they happen to be "proprietary".
 
Preferably, the concepts of Reflex/Boost/Anti-Lag should be standardized into the API (DX/Vulkan), right? Anything that lessens the cross-development effort for developers should be the target. That's how we gamers will get a better experience.

Radeon Boost and Anti-Lag are completely different technologies and as far as I know just operate at the driver level. Radeon Anti-Lag is similar to, but came before, Nvidia Ultra Low Latency. Radeon Boost does some weird thing where it lowers resolution while the camera is moving, which is why no one uses it.

As far as I know, Nvidia Reflex is the only one that requires an SDK and engine integration. Hopefully there's eventually some cross-platform alternative that works with Nvidia, AMD and Intel, because it's the only solution that's actually effective at reducing input lag when GPU-limited.
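
To make the "engine integration" point concrete, here's a rough C++ sketch of the shape a Reflex-style integration takes inside a game loop. To be clear, the function names below are hypothetical stand-ins (the real SDK goes through NVAPI), but the pattern is the point: the engine delays the start of each frame until the GPU is almost ready for new work, then samples input immediately, and brackets each stage with markers so the driver can pace the next frame.

```cpp
// Hypothetical Reflex-style integration sketch. LowLatencySleep() and
// SetLatencyMarker() are illustrative stand-ins for the vendor SDK,
// not real NVAPI entry points.
#include <cstdint>

enum class Marker {
    SimulationStart, SimulationEnd,
    RenderSubmitStart, RenderSubmitEnd,
    PresentStart, PresentEnd
};

// --- stand-ins for the vendor SDK and the engine's own work ----------
void LowLatencySleep() { /* block until the ideal frame-start time */ }
void SetLatencyMarker(uint64_t /*frameId*/, Marker /*m*/) { /* report stage timing to the driver */ }
void SampleInput() {}
void RunSimulation() {}
void SubmitRenderWork() {}
void Present() {}
// ----------------------------------------------------------------------

void GameFrame(uint64_t frameId) {
    // Instead of letting the CPU run ahead and queue several frames,
    // wait until the driver says the GPU is nearly ready for new work.
    // Input sampled right after this wait is as fresh as it can be.
    LowLatencySleep();

    SetLatencyMarker(frameId, Marker::SimulationStart);
    SampleInput();          // the latency-critical moment
    RunSimulation();
    SetLatencyMarker(frameId, Marker::SimulationEnd);

    SetLatencyMarker(frameId, Marker::RenderSubmitStart);
    SubmitRenderWork();
    SetLatencyMarker(frameId, Marker::RenderSubmitEnd);

    SetLatencyMarker(frameId, Marker::PresentStart);
    Present();
    SetLatencyMarker(frameId, Marker::PresentEnd);
}

int main() {
    for (uint64_t f = 0; f < 3; ++f) GameFrame(f);
}
```

That wait-then-sample step is exactly why this needs engine integration: only the engine knows where input sampling and simulation begin. A purely driver-level toggle like Anti-Lag or Ultra Low Latency can only approximate the same effect from the outside, by limiting how many frames the CPU is allowed to queue.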
 
As for the "Hardware Unboxed is showing bias by only focusing on latency now" argument (because they didn't cover Nvidia Reflex heavily before in reviews), I think it's reasonable to bear in mind that Reflex was still relatively sparsely adopted in games before now. That will obviously improve significantly with DLSS3 support going forward. It's a great feature even for cards that can't run frame generation! I was pleased when I heard that it's part and parcel of DLSS3, even though I probably won't have a card that can use frame generation anytime soon, precisely because Reflex is so rarely supported in games atm.

So back to HU's supposed Reflex blindfold: I looked at their last pre-4090 review, which was for Arc. Their game test suite consisted of:

  • Rainbow Six Siege
  • F1 2021
  • Horizon Zero Dawn
  • Watch Dogs Legion
  • Shadow of the Tomb Raider
  • Hitman 3
  • Far Cry 6
  • Cyberpunk
  • Dying Light 2
  • Halo Infinite
  • Spiderman
  • Counterstrike

Games, pre-DLSS3, that had Reflex available in that suite? Some I have first-hand experience with; others I've searched, so I may be wrong on this, but so far I count one: Rainbow Six.

I don't think it has been updated with the announced DLSS3 games, but Nvidia's own total list now consists of 44 games with Reflex support, and quite a few of those you will never see in a GPU comparison video because they're either poor choices for GPU benchmarks or because nobody plays them. That's 44 games in two years, with few flagship titles. It simply hasn't been that widely adopted.
 
stuff like Ansel, ShadowPlay Highlights, FreeStyle Game Filter and Auto Game Optimizations. None of these has any kind of answer from AMD, and that doesn't mean they are bad just because they happen to be "proprietary".

Shadowplay Highlights? Game Optimizations? I mean, come on, man. Shadowplay Highlights is a flop in actual adoption, and it's 5 years old now; AMD isn't 'answering' that because hardly anyone used it. Game optimizations may be helpful to newbies, sure, but based on the recommendations I've seen it give, I wonder how many experiences it's actually fucking up. Have you ever seriously used it instead of just going into a game and adjusting its settings? Game Filter is a less useful Reshade; I used it when it first came out, then just gave up, as too many games didn't support it and Reshade was far more compatible and had better shaders.

Nvidia's software suite does have some advantages, no doubt; stuff like background removal for streamers and RTX Voice are valuable. But it's also a complete mishmash of UX across various applets, and their Control Panel is a laughable abomination compared to a fully integrated suite like Adrenalin.
 
@The-Nastiest Hardware Unboxed does great monitor reviews, although I guess that's its own channel now. Their GPU reviews aren't bad, but their CPU reviews are not great. Then they make stuff like this. That's just a bad video: comparing DDR4-3200 to DDR5 and saying it's time to leave DDR4 is just bad advice. You can buy Viper Steel DDR4 kits for cheap that go up to DDR4-4000. Pretty sure a 12600K can handle DDR4 at that speed no problem. And for memory overclockers, the Viper Steels are B-die, so you can tighten timings A LOT and get very low latency. Like every other channel, they have good and bad content, but they respond VERY poorly to criticism.

The content of their DLSS3 video was overall pretty good, but they made a bunch of assumptions that I just don't think are true. The 4050, if it comes out, should be able to handle 60fps pretty easily, unless you're trying to run 4K or Ultra settings. The interesting question with a 4050 is how well frame generation works at lower resolutions. How many people are trying to game on 4K monitors with a 4050? They also say it's better to run DLSS2 Performance than DLSS3 Quality with frame generation because it "looks better" and has lower input lag. The "looks better" part I'll have to see, because I think DLSS Performance looks pretty bad, at least on a 1440p screen. There seems to be an assumption that people would never play a game on medium settings and use frame generation to get to 144, 165, 185, 240 or 360 Hz. Like, would 120 fps plus frame generation on medium settings to hit your refresh rate not make sense?

4K monitors are 2.5% of the population on Steam. 1080p is 66% and 1440p is 11%. I wish the Steam survey had refresh rate as part of the data. I wouldn't be surprised if there are more high-refresh-rate 1080p screens than there are total 4K screens.
 
@Flappy Pannus

Reflex has been supported in Call of Duty (four iterations), Apex Legends, Fortnite, Overwatch and Valorant. The player base for those games is enormous. More games are added all the time, like God of War.

I don't think reviewers are biased in a malicious way, but the games they select and how they test them reflect their biases. They don't test latency. The simple fact is that the way they benchmark GPUs is to turn up the settings so the GPU is absolutely maxed out, and then they show you the bar graph with average and 1% lows and say the one with the longer bars is better. But if the GPU is maxed out, the CPU runs ahead and frames pile up in the render queue, so you're adding a ton of latency, which is exactly what Reflex addresses. Typically you'll see your frame rate drop slightly with Reflex, but the latency is greatly reduced. So in that case, does the bar graph win, or does the latency win? They don't even talk about it.

Now there are DLSS3 titles that include mandatory Reflex support, and suddenly 10 or 20 ms matters, even though native without Reflex is how they do all of their benchmarks. It's just weird. It's not how they approach GPU reviews or recommendations. In every game that supports Reflex, the Nvidia GPUs are going to win in terms of latency when the game is GPU-limited, regardless of whether you're running native, DLSS or FSR. The best option on AMD and Intel GPUs is to cap your frame rate or lower your settings so you stay below maybe 97% utilization.
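
For anyone wondering what that capping looks like in practice, here's a minimal sketch (my own illustration, not from any vendor SDK) of the idea: budget each frame so the GPU stays a few percent below saturation, and burn the spare time before sampling input rather than after, so the input that reaches the screen is as fresh as possible.

```cpp
// Minimal frame-cap sketch (C++). Capping a few percent below what the
// GPU can sustain keeps the render queue empty, which is where most of
// the GPU-bound latency lives. Purely illustrative.
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Stand-ins for the engine's per-frame work.
void SampleInputAndSimulate() {}
void RenderAndPresent() {}

void RunCappedLoop(double capFps) {
    const auto frameBudget = std::chrono::duration_cast<Clock::duration>(
        std::chrono::duration<double>(1.0 / capFps));
    auto nextFrameStart = Clock::now();

    for (;;) {
        // Sleep BEFORE sampling input, not after presenting: the time we
        // have to burn is burned while the input is still unread, so the
        // sample we act on is the freshest one available.
        std::this_thread::sleep_until(nextFrameStart);
        nextFrameStart += frameBudget;

        SampleInputAndSimulate();
        RenderAndPresent();
    }
}

int main() {
    // e.g. cap just below a measured ~145 fps GPU-bound average.
    RunCappedLoop(138.0);
}
```

In-game limiters and tools like RTSS do a more careful version of this; the point is just that staying under that rough 97% utilization mark stops frames from piling up in the queue.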
 
with few flagship titles. It simply hasn't been that widely adopted
Few? Try all the flagship titles: Call of Duty Modern Warfare, Warzone, Black Ops Cold War, Vanguard, Battlefield 2042, Valorant, Overwatch, Fortnite, PUBG, Apex Legends, Crossfire, Enlisted, Destiny 2, Rainbow Six Siege, War Thunder. Pretty much most competitive shooters and the most popular titles on PC. That is SIGNIFICANT.

I think it's reasonable to bear in mind that Reflex was still relatively sparsely adopted in games before now
44 famous titles are anything but sparse.

pre-DLSS3, that had Reflex available in that suite? I count one
Please don't absolve them of this; they've tested many Reflex titles in the past and neglected to even mention Reflex once. They tested lots of Call of Duty games, Battlefield, Fortnite, PUBG, etc., and these games have had Reflex since 2020. Not once did they mention the advantage Reflex gives to NVIDIA GPUs, and not once did they compare latency on NVIDIA vs AMD GPUs.

Anyway, now they will have Spider-Man, Cyberpunk and many others. Let's see if they change tactics.

It's designed to be implemented in competitive games; you won't see it anywhere else. And it's still useful today.

have you ever seriously used it instead of just going into a game and adjusting its settings?
I have tested it, and it works well; many newbies find it very helpful too.

Game Filter is a less useful Reshade; I used it when it first came out, then just gave up, as too many games didn't support it and Reshade was far more compatible and had better shaders.
On the contrary, Reshade needs too much work to get set up; FreeStyle is just two or three clicks and you are set. It gives you easy access to a lot of effects: lots of AO, SSGI, depth of field, color mapping/grading, sharpening and even motion blur.

AMD isn't 'answering' that because hardly anyone used it
Let's not pretend AMD answers everything NVIDIA does (software- or hardware-wise), because they clearly don't. It's not a matter of adoption; their priorities are completely different. Let's see them respond to RTX Remix!
 
Like every other channel, they have good and bad content, but they respond VERY poorly to criticism.
I fully agree. You just demonstrated it with your video choice. As a blanket statement, saying "ditch DDR4 and get DDR5" is bad advice. I don't care to watch that specific video to see if there was any more context, but once again, I agree with your statement above.

The interesting question with a 4050 is how well frame generation works at lower resolutions. How many people are trying to game on 4K monitors with a 4050? They also say it's better to run DLSS2 Performance than DLSS3 Quality with frame generation because it "looks better" and has lower input lag. The "looks better" part I'll have to see, because I think DLSS Performance looks pretty bad, at least on a 1440p screen.
Again, I'm in line with your thinking here as well. I'm just happy that the HU overview of DLSS 3 even mentioned how the tech will scale with lower-end cards. It's an intro to DLSS 3 as a tech, so it's good to try to give a full picture.

It's easy to view DLSS 3 at the top of the stack and see it handle high resolutions and high framerates well, but as you point out, there may be a lot of users who play at lower resolutions with lower-refresh-rate monitors. So, while HU's thoughts aren't definitive, it is good that they mentioned potential issues that may arise and then gave a practical example of how a low/mid-tier card at lower fps may be more distracting, due to longer individual frame persistence for each generated frame.

(Also, you mentioned frame generation at lower resolutions; I will raise you one further and say that I don't think DLSS 2 looks good at anything lower than 1440p Quality. Personally, I don't think the ML model has enough inputs to generate a good enough image for my tastes.)


There seems to be an assumption that people would never play a game on medium settings and use frame generation to get to 144, 165, 185, 240 or 360 Hz.

It's fully possible that people will do that. However, historically, people lower settings to get higher framerates and lower input latency; that has always been the correlation. In your example, the user is going to lower their settings so they can 'natively' hit a high framerate, then turn on frame generation to visually simulate an even higher refresh rate while keeping, or only slightly improving, their game's responsiveness?

So while your hypothetical is possible (I mean, it's PC, so you can do whatever you want), it would be a very odd trade-off to make.

4K monitors are 2.5% of the population on Steam. 1080p is 66% and 1440p is 11%. I wish the Steam survey had refresh rate as part of the data. I wouldn't be surprised if there are more high-refresh-rate 1080p screens than there are total 4K screens.
Yeah, by all public accounts, and probably our own anecdotal experiences as well, we can say that 4K adoption is low relative to everything else.

But since we only have the top card, what's the point of looking at any res below 4K? When the low/mid-tier RTX 40 cards come out, I'm sure channels will revisit DLSS 3 to give a more accurate representation of how it handles.
 
Reflex has been supported in Call of Duty (four iterations), Apex Legends, Fortnite, Overwatch and Valorant. The player base for those games is enormous. More games are added all the time, like God of War.

Sure, the competitive shooters do, but those make up a tiny fraction of most GPU benchmark suites used in reviews, largely because they're so easy on the GPU and are better suited to CPU benchmarks. The fact of the matter is that, before now, its penetration was very small across the wide range of games people play, and especially across the graphically demanding games used in GPU reviews.

I don't think reviewers are biased in a malicious way, but the games they select and how they test them reflect their biases. They don't test latency.

Before DLSS3 coverage, though, who did this as a standard part of their GPU reviews? Can you point to an established site where this is commonplace? There are definitely separate videos on Reflex as a tech, but even Digital Foundry hasn't made this a focus, as long as you don't count literal Nvidia-sponsored videos. Like I said, chances are the majority of games used in a benchmark suite don't support Reflex anyway, so it's not even an option.

There are DLSS3 titles that include mandatory Reflex support, and suddenly 10 or 20 ms matters, even though native without Reflex is how they do all of their benchmarks. It's just weird. It's not how they approach GPU reviews or recommendations. In every game that supports Reflex, the Nvidia GPUs are going to win in terms of latency when the game is GPU-limited, regardless of whether you're running native, DLSS or FSR. The best option on AMD and Intel GPUs is to cap your frame rate or lower your settings so you stay below maybe 97% utilization.

I think HU is overplaying the latency concern on the 4090 at the framerates they're targeting, sure. On the other hand, I think you may be overplaying Reflex's benefit as well, to the point where you think it's egregious bias that latency hasn't been a standard concern for GPU reviewers.


A significant point of reviews, if not their only point, is to determine "Does this product in actual use validate the claims of its manufacturer?"

Latency is being brought up now because Nvidia is highlighting it as part of their Ada/DLSS3 marketing. It is being brought up now because Nvidia, through their performance-percentage graphs for Ada, are effectively saying "these performance improvements with DLSS3 are directly comparable to the previous gen." HU's angle seems to be largely focused on that argument: that this is not the equivalent of the benefits you get from pre-DLSS3 technologies at the same frame rate. They see the situations where DLSS3 can benefit, and they are also extrapolating to note their concerns with how this will effectively scale down to the product class that most people will actually be able to afford. That, I think, is where part of their focus on latency comes from as well: it's kind of a problem now (my words; again, I think they're overstressing it in the context of the 4090), but it may be far more significant when it will inevitably be marketed as a massive performance-enhancing feature for the lower-end cards.

Premature? Yeah, perhaps. By the time the 4060/4070 come out, maybe some of these drawbacks will be minimized further with updates; that's possible. I think part of the disagreement here, though, is that HU is approaching this primarily as a critique of what the feature brings to the value proposition of a product line, not necessarily of the technology itself. That's difficult, as its benefits will ebb and flow depending on so many factors: the latency of the game to begin with, the starting frame rate, the type of motion, etc.

I definitely think DF is a better resource for getting a handle on what exactly the tech is doing, and I think that's true in many cases compared to the DIY PC tech sites. But they have different approaches; GN/HU come at things more from a consumer-value, "what are you doing for me now" perspective, and I think that is part of valid coverage too.
 
Game Filter is a less useful Reshade; I used it when it first came out, then just gave up, as too many games didn't support it and Reshade was far more compatible and had better shaders.

I think FreeStyle is awesome, and you've actually just reminded me of another reason why I will probably choose to stay with Nvidia for the next few years unless AMD pulls another 9700 Pro.

It's way, way more convenient and user-friendly than Reshade, and while I agree it's not as powerful, for simple colour-grading changes or sharpening/de-sharpening effects, which is often all I want in games, it's super easy to just fire up and see the results in real time.

I do find the latency arguments with DLSS 3 interesting. It's worth remembering that DLSS 3 doesn't inherently increase latency; it actually reduces latency in every single game it's enabled in. DLSS 3 is a huge latency win, and yet it's somehow being reviewed in some quarters as borderline pointless because of latency increases. The reason, of course, is that every game with DLSS 3 features Reflex, and users are under no obligation to activate frame generation alongside it. So DLSS 3 = lower latency than the competition in every game, with the option of trading in that gain for much smoother framerates. But I've not seen any site frame it in that way yet.
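
To spell out that bookkeeping (with purely illustrative symbols, not measurements from any review): call the native no-Reflex latency L, the Reflex saving R, and the frame-generation cost F (roughly one extra buffered frame). Then:

  • native, no Reflex: L
  • native + Reflex: L - R
  • DLSS 3 frame generation + Reflex: L - R + F

As long as R > F, the full DLSS 3 path still lands below the no-Reflex baseline that AMD cards, and untouched NVIDIA defaults, sit at. Whether R > F actually holds in a given game at a given frame rate is exactly the kind of thing reviewers should be measuring.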
 
Latency is being brought up now because Nvidia is highlighting it as part of their Ada/DLSS3 marketing
NVIDIA has been highlighting massive latency reductions through Reflex for two years across dozens of titles, but HU never cared for it UNTIL NOW. Pretty strange, don't you think? Just like ray tracing has been raging on for four years and their opinion is still the same: "it's useless".

They have this tendency to pick and choose based on whatever arbitrary reasons, not on what the current or future landscape is or will be. They tested lots of early DX12 titles because that was "the future", but DXR? No dice! They downplayed DLSS1 and DLSS2 to hell and back, and only accepted DLSS2 after it got such wide adoption; they then praised FSR1 and called it a DLSS alternative, then praised FSR2 all the same!
 
Few? Try all the flagship titles: Call of Duty Modern Warfare, Warzone, Black Ops Cold War, Vanguard, Battlefield 2042, Valorant, Overwatch, Fortnite, PUBG, Apex Legends, Crossfire, Enlisted, Destiny 2, Rainbow Six Siege, War Thunder. Pretty much most competitive shooters and the most popular titles on PC. That is SIGNIFICANT.

It's largely irrelevant for GPU reviews, which actually cover a wide variety of games with demanding graphics loads beyond competitive shooters. Read the post to see the actual context.

Also, how significant is it for these games? What kind of latency reductions are we talking about when you're at 200 fps before enabling it? For a Fortnite player enabling RT, absolutely! But for the other games on your list?



Please don't absolve them of this; they've tested many Reflex titles in the past and neglected to even mention Reflex once.

They covered Reflex in exactly the same way every other review channel covered it: they did a separate video on it, but they don't include latency as part of their GPU reviews.

They tested lots of Call of Duty games, Battlefield, Fortnite, PUBG, etc., and these games have had Reflex since 2020. Not once did they mention the advantage Reflex gives to NVIDIA GPUs, and not once did they compare latency on NVIDIA vs AMD GPUs.

The simple solution to prove their bias on this front is to point to a well-known YouTube review channel that has included latency as part and parcel of their GPU reviews. Give me one.

It's designed to be implemented in competitive games; you won't see it anywhere else.

(Re: Shadowplay highlights support)

Raiders of the Broken Planet!
Red Faction: Guerrilla Remastered Edition!
My Time at Portia!
Anthem!
Shadow of the Tomb Raider!


Ok, I'm being a smartass. However, is there a reliable list of currently released games that support Highlights? The problem is that there is a listing for GeForce Now games, but those are handled differently: Nvidia has done extra work for GeForce Now games in the cloud to enable features like these, so the list for locally installed games may not be up to date.

And it's still useful today.

...ok? What does this mean? Is someone arguing that it doesn't work in those games anymore? I'm saying its inclusion in games is a rarity; I am arguing against it being a significant detriment vs AMD because of that.

On the contrary, Reshade needs too much work to get set up

When was the last time you actually used it, really? It literally pops up a window and populates it with your installed games. You click the game, select the API, and select your pack. Boom.

Let's not pretend AMD answers everything NVIDIA does (software- or hardware-wise), because they clearly don't. It's not a matter of adoption; their priorities are completely different. Let's see them respond to RTX Remix!

What a weird comment in response to what I said. I swear some on this forum can make r/nvidia blush at points.
 