Nvidia DLSS 3 antialiasing discussion

With all of this reflex and latency talk, I may be getting confused. Can someone answer this scenario for me?

'Native' engine generated 120fps output, reflex on = 'looks like 120fps, plays like 120fps'

DLSS 3 120fps output = 'looks like 120fps, plays like 'native' 60fps, reflex on'

Is that how it works and if so, would you consider that the same experience?
 
NVIDIA has been highlighting massive latency reductions through Reflex for two years across dozens of titles,

Citation needed. Its worthiness keeps being trotted out due to its inclusion in competitive shooters... but from the numbers I've seen, it actually provides relatively small benefits outside of enabling RT in Fortnite - which, uh, I'm not sure how many players in that game actually do. Can you point to other benchmarks?

but HU never cared for it UNTIL NOW, pretty strange, don't you think?

Here's their video on Reflex. Again, just point me to an established GPU reviewer that includes latency measurements as standard practice in their GPU reviews.

Just like ray tracing has been raging on for 4 years and their opinion is still the same: "it's useless".

They have this tendency to pick and choose based on whatever arbitrary reasons, not what the current or future landscape is or will be .. they tested lots of early DX12 titles because it's the future, but DXR? No dice! They downplayed DLSS1

I agree with the criticism of their Radeon reviews where DXR is nonexistent, they were ridiculous. But why even mention DLSS1 being 'downplayed'? I mean, you're talking about bias and then throw that out there? They were correct, it fucking sucked.

they then praised FSR1 and called it a DLSS alternative, then praised FSR2 all the same!

Yup! Their FSR 1.0 video was shit. I have my gripes with DLSS and some negative aspects I think have been downplayed a bit, but I've tried FSR 1.0 in many games that have DLSS support as well and it's not remotely comparable.
 
So DLSS 3 = lower latency than the competition in every game - with the option of trading in that gain for much smoother framerates. But I've not seen any site frame it in that way yet.

I guess Nvidia shouldn't have fucked up by marketing 'DLSS3' as joined at the hip with the frame generation part then? I mean come on now, any site covering "DLSS3" is obviously going to be focusing on the frame generation; Reflex is 'nice' as a part of it, but it's clearly not the impetus for bumping DLSS to another version number.

If Nvidia had made major improvements to DLSS's reconstruction before the frame generation, they could have called that DLSS3 and made FG a separate thing. But this is how they want you to see DLSS3. Tough cookies.

And again, it's fine to tout DLSS3 as a free bonus - even HB does that. But Nvidia touted it as part of a performance uplift against the previous gen. Reviewers are going to evaluate the most bombastic claims by a company; that's what they should do.
 
With all of this reflex and latency talk, I may be getting confused. Can someone answer this scenario for me?

'Native' engine generated 120fps output, reflex on = 'looks like 120fps, plays like 120fps'

DLSS 3 120fps output = 'looks like 120fps, plays like 'native' 60fps, reflex on'

Is that how it works and if so, would you consider that the same experience?

"Same"? Perhaps not, just as DLSS, even at quality, isn't the 'same' as native either. It's definitely worth mentioning in a review and showing the numbers, then you have to spend time contextualizing those. DLSS3 just makes this more complex.

Reminder that latency can vary wildly between games even at the same framerate; it's still largely in the hands of the developer from the outset whether they want to prioritize it. The latency of one game at 60fps may suck so much that the improvement to 120 hits you smack in the face. In another, it may be more subtle and the improvement in motion resolution is far more apparent. This changes with the type of game, the input method, and how that input is handled (e.g. gamepads in some games can have a significantly different 'feel' based solely on the acceleration curve).
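
To put rough numbers on the 'looks like 120, plays like 60' framing from the question, here's a minimal sketch with assumed values (it ignores the extra overhead frame generation itself adds, which varies per game):

```python
# Illustrative only: frame generation separates the displayed frame rate from
# the rate at which your input actually affects rendered frames.
# The numbers below are assumptions for the sake of the example.

rendered_fps = 60                  # frames the engine actually simulates from input
displayed_fps = rendered_fps * 2   # frame generation inserts one frame between each pair

print(f"Displayed frame time:  {1000 / displayed_fps:.1f} ms")  # ~8.3 ms  -> motion looks like 120fps
print(f"Input sample interval: {1000 / rendered_fps:.1f} ms")   # ~16.7 ms -> responsiveness closer to 60fps
```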
 
@Flappy Pannus

Reflex has been supported in Call of Duty (four iterations), Apex Legends, Fortnite, Overwatch and Valorant. The player base for those games is enormous. There are more games all the time, like God of War.

I don't think reviewers are biased in a malicious way, but the games they select and how they test them reflect their biases. They don't test latency. The simple fact is that the way they benchmark GPUs is to turn up the settings so the GPU is absolutely maxed out, then show you the bar graph with average and 1% lows and say the one with the longer bars is better. But if the GPU is maxed out, you're adding a ton of latency, which is what Reflex addresses. Typically you'll see your frame rate drop slightly with Reflex, but the latency is greatly reduced. So in that case, does the bar graph win, or does the latency win? They don't even talk about it.

Now there are DLSS3 titles that include mandatory Reflex support, and suddenly 10 or 20ms matters, even though native without Reflex is how they do all of their benchmarks. It's just weird. It's not how they approach GPU reviews or recommendations. In every game that supports Reflex, the Nvidia GPUs are going to win in terms of latency when the game is GPU limited, regardless of whether you're running native, DLSS or FSR. The best option on AMD and Intel GPUs is to cap your frame rate or lower your settings so you stay below maybe 97% utilization.
From what I read, it provides solutions for both GPU and CPU bound scenarios. When a game is CPU bound and the GPU is underutilized, Reflex will force the GPU to keep its clocks high so that any work from the CPU arriving late and in large batches is readily handled.
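
As a back-of-the-envelope illustration of the GPU-bound case - why a pegged GPU piles on latency and what keeping the render queue empty buys you - here's a minimal sketch; all numbers are assumptions, not measurements:

```python
# Very simplified model: end-to-end latency grows with frame time and with how
# many frames the CPU is allowed to queue ahead of a fully loaded GPU.
# Reflex's main benefit in GPU-bound games is keeping that queue near zero.

def input_to_photon_ms(frame_time_ms: float, queued_frames: int) -> float:
    return frame_time_ms * (1 + queued_frames)

frame_time = 1000 / 60  # ~16.7 ms per frame at 60 fps

print(input_to_photon_ms(frame_time, queued_frames=3))  # ~66.7 ms with a deep render queue
print(input_to_photon_ms(frame_time, queued_frames=0))  # ~16.7 ms with just-in-time submission
```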
 
It's largely irrelevant for GPU reviews
It's not for HU; they use lots of these competitive games in their one vs one GPU videos, like 5700XT vs 2060 Super for example. You will see lots of Fortnite, PUBG, Warzone and War Thunder testing, lots of handwaving for ray tracing, and lots of pretending that FSR2 equals DLSS2 and that this somehow puts AMD on equal footing with NVIDIA, even though RTX GPUs support both DLSS2 and FSR2.

What kind of latency reductions are we talking about when you're at 200fps before enabling it? For a Fortnite player enabling RT - absolutely! For the other games in your list?
See here, Reflex often shaves off 50% of latency in titles such as Fortnite (no RT), Cold War, God Of War, Destiny 2 ..etc. It's substantial.


but they don't include latency as part of their GPU reviews
Yes, and that's exactly the crux of the matter: they covered Reflex in isolation, but never included it as a purchase factor in reviews, and never made it a comparison point between AMD and NVIDIA.
....ok? What does this mean, is someone arguing that it doesn't work in those games anymore? I'm saying its inclusion in games is a rarity; I am arguing against it being a significant detriment vs AMD because of that.
Nobody called it a detriment against AMD; it's just a bonus feature of the NVIDIA ecosystem that is not available on AMD. Let me just tell you that its usefulness in laptops is beyond significant.

What a weird comment in response to what I said.
I was responding to your argument that AMD doesn't respond to NVIDIA features on the basis of adoption rate. That's simply not true: AMD doesn't respond because either they can't (as in they lack the technology), or they won't because they are on a tight budget or have other priorities.

When was the last time you actually used it, really? It literally pops up a window and populates it with your installed games. You click the game, select the API, and select your pack. Boom.
With FreeStyle you select the effect from a drop-down menu and see your changes in real time. Much, much more convenient. Also, FreeStyle has already integrated several of the most famous ReShade features.

is there a reliable list of currently released games that support highlights?
I don't have it, and it's beside the point anyway.

I agree with the criticism of their Radeon reviews where DXR is nonexistent
but I've tried FSR 1.0 in many games that have DLSS support as well and it's not remotely comparable.
I am glad we are on the same page regarding these two points.
 
I find the latency conversation around DLSS pretty weird, to be honest. I've never seen any of the youtuber reviewers care about latency before. I've never seen them compare latency when evaluating GPUs in reviews. They have never considered latency as part of performance when reviewing, just average fps, 1% lows and 0.1% lows. This is a purely hypothetical scenario, but say there's a game X and they're benchmarking a 3080 vs a 6800. Would you ever hear them say the 6800 has 100 fps and the 3080 has 90 fps, but the 3080 has Nvidia Reflex leading to 10ms lower latency, therefore it is performing better and receives our recommendation? Have you ever heard latency brought up in terms of relative performance of games? When people are talking about how well "optimized" games are, do they ever measure latency? The general perception is if it scales with hardware, and if the fps numbers are high, then it's "optimized", but you never see them measure latency. Some games have a lot more latency than others at any given frame rate.

Many people would be shocked to find out that the click latency or the motion latency of mice can vary by as much as 20 ms, even with mice that are marketed as being for gaming. People switch mice all of the time and never notice. I think a G Pro Superlight is close to, if not, the best performer. If you show them a chart of the relative latency of peripherals, suddenly they start replacing them, but if they've been gaming a long time they've probably switched from low latency to higher latency peripherals without knowing or caring. That's kind of why showing charts comparing latency with frame generation on or off can be tough: you can see the difference between the numbers, but unless you use it you won't know if you can actually feel it. I'm not saying they can't, but it'll vary by person. On top of that, you can lower latency on mice by increasing CPI.

I am particularly latency sensitive. I might actually notice the differences in frame generation, and not be willing to accept it. I'm very accustomed to gaming at 100+ fps, usually closer to 200 fps, and that was true even when I had mid range cards like the gtx 1060. What did I do? I lowered the settings. The idea that DLSS3 might not be viable on a 4050 or 4060 is weird to me, because you just lower the settings to hit 60fps and then add the frame generation. Some people would play at ultra settings on those gpus, some would not. It's just another tool for people to take advantage of.

Even being sensitive to latency there is probably a point where I stop being able to tell differences. I think it's probably around the 120 fps mark vs anything higher in the same game. I can definitely tell when my gpu hits 100% and latency starts piling up, even at high frame rates. That's why I'll frame limit all of my games if they don't support nvidia reflex.
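
The way I pick the cap is nothing scientific - just leave a few percent of headroom below what the GPU can sustain so it never sits at 100% and starts queuing frames. A tiny sketch of that rule of thumb (my own heuristic, with made-up numbers):

```python
# Hypothetical helper: choose a frame cap a few percent below the GPU's
# sustained average so utilization stays under roughly 95-97%.

def suggested_cap(uncapped_avg_fps: float, headroom: float = 0.05) -> int:
    return int(uncapped_avg_fps * (1 - headroom))

print(suggested_cap(144))  # -> 136 fps cap for a game that averages ~144 fps uncapped
```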

One thing I need to see is comparisons of DLSS3 at 1440p and 1080p. Those are still the most common resolutions. Hopefully someone will test it when the 1080p 500Hz displays are launched, as well as the 1440p 360Hz displays.

Edit: One thing I'll add is they're weirdly making the argument that AMD GPUs are vastly inferior. On the input latency graphs they show native vs native (Reflex off). The Reflex-off result is probably around where the AMD GPUs would sit, assuming frame rates are similar, because they don't have a comparable technology. But if they care so much, why have they never made that comparison before? It's weird.

Edit: Some data on the difference in latency between similar class Nvidia, AMD gpus. I miss Battlenonsense's content.


You would be amazed by what people know today while still being ignorant about everything else. My son knows little to nothing about a GPU. But he knows frame rates and latency. Why? Because he played a lot of Fortnite. He has asked for gaming mice and keyboards that he vetted personally by reading reviews and other sources. He would use all the technical jargon when espousing why he needed these products. Not because he had dreams of being a professional, but rather for bragging rights amongst his friends.

SMH. And this started when he was 10 or 11 with a request for a KBM setup for his console.
 
I fully agree. You just demonstrated it with your video choice. As a blanket statement, saying 'ditch DDR4 and get DDR5' is bad advice. I don't care to watch that specific video to see if there was any more context, but once again I agree with your statement above.


Again, I'm in line with your thinking here as well. I'm just happy that the HU overview of DLSS 3 even mentioned how the tech will scale with lower end cards. It's an intro to DLSS 3 as a tech, so it's good to try and give a full picture of it overall.

It's easy to view DLSS 3 at the top of the stack and see it handle high resolutions and high framerates well, but as you point out below, there may be a lot of users who play at lower resolutions with lower refresh rate monitors. So, while HU's thoughts aren't definitive, it is good that they mentioned potential issues that may arise and then gave a practical example explaining how a low/mid-tier card at lower fps may be more distracting due to the longer persistence of each generated frame.

(Also, you mentioned frame generation at lower resolutions; I'll raise you one further and say that I don't think DLSS 2 looks good at anything lower than 1440p Quality. Personally, I don't think the ML model has enough inputs to generate a good enough image for my tastes.)




It's fully possible that people will do that; however, historically, people lower settings to get higher framerates and lower input latency. That has always been the correlation. In your example the user is going to lower their settings so they can 'natively' hit a high refresh rate... then turn on frame generation to visually simulate a higher refresh while keeping or slightly increasing their game's responsiveness?

So while your hypothetical is possible (I mean, it's PC so you can do whatever you want), it would be a very odd trade-off to make.


Yeah, by all public accounts, and probably our own anecdotal experiences as well, we can say that 4K adoption is low relative to everything else.

But since we only have the top card, what's the point of looking at any res below 4K? When the low/mid-tier RTX 40 cards come out I'm sure channels will revisit DLSS 3 to give a more accurate representation of how it handles.

Yah, I agree about DLSS at less than 1440p. When I first got my 3080 I had a 1080p 144Hz monitor and even DLSS Quality did not look great. I think that's part of the reason why esports players think DLSS looks bad. They just haven't seen it on higher resolution displays.

In terms of the tradeoff between latency and frame rate, I think there's a pretty good use case on high-end cards for hitting 240 Hz, 360 Hz or 480 Hz. It is incredibly hard to get CPUs that will actually not become the bottleneck when targeting very high refresh displays. Oddly enough, there are people with 3080/3090 Tis and 1080p 360Hz displays right now. If you're an actual esports competitor, you'll probably want to turn off the frame generation to get the absolute lowest input latency. If you're just a fan of Call of Duty Warzone, even at the absolute lowest graphical settings and 1080p it's very hard to stay over maybe 220 fps with a heavily overclocked 12900K and highly tuned DDR4. Maybe a good trade is to increase latency a bit so you can max out your 360Hz or higher display? You're probably still going to have very good latency if your starting point is, say, 150 fps or higher.
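
A rough sketch of that trade with assumed numbers (real frame generation scaling is less than a clean 2x, and the added latency cost isn't modeled here):

```python
# Illustrative: using frame generation to saturate a 360 Hz panel when the
# CPU/GPU caps you around 180 fps. Values are assumptions for the example.

base_fps = 180                 # what the system can actually sustain
generated_fps = base_fps * 2   # idealized doubling from frame generation
refresh_hz = 360

print(f"Rendered frame time:  {1000 / base_fps:.1f} ms")       # ~5.6 ms
print(f"Displayed frame time: {1000 / generated_fps:.1f} ms")  # ~2.8 ms
print(f"Refresh interval:     {1000 / refresh_hz:.1f} ms")     # ~2.8 ms -> panel saturated
```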

I think the most common resolution going forward is going to be 1440p, even for people with 4080s and 4090s, but the majority of people, even those with high-end GPUs, are still at 1080p. I think some image quality analysis there is important.
 
It's not for HU; they use lots of these competitive games in their one vs one GPU videos, like 5700XT vs 2060 Super for example. You will see lots of Fortnite, PUBG, Warzone and War Thunder testing, lots of handwaving for ray tracing, and lots of pretending that FSR2 equals DLSS2 and that this somehow puts AMD on equal footing with NVIDIA, even though RTX GPUs support both DLSS2 and FSR2.


See here, Reflex often shaves off 50% of latency in titles such as Fortnite (no RT), Cold War, God Of War, Destiny 2 ..etc. It's substantial.


Nvidia chose the PS5 as the basis for the comparison here, so the display restriction will be 60Hz displays; this shows the improvements from Reflex in the best light, which is why you don't use sponsored videos to accurately judge the benefits of a feature, come on man. This is especially true with competitive shooters, where people are usually playing on high refresh rate displays.

Note how they don't show the latency improvements with vsync off by itself, only with vsync off + reflex.

Yes, and that's exactly the crux of the matter: they covered Reflex in isolation, but never included it as a purchase factor in reviews, and never made it a comparison point between AMD and NVIDIA.

Again, still waiting for that link where other GPU reviewers have done this.

With FreeStyle you select the effect from a drop-down menu and see your changes in real time. Much, much more convenient. Also, FreeStyle has already integrated several of the most famous ReShade features.

You see the results in 'real time' with ReShade too, from an overlay. It actually works with all games to boot. Like, how do you think it works? You sound like you're describing the early versions from years ago.

I don't have it, and it's beside the point anyway.

Whether a feature is actually supported in games is relevant to its worth as a selling point over the competition. Any supposed innovation is worthless in a vacuum.
 
Sure, the competitive shooters do - but those make up a tiny fraction of most GPU benchmark suites used in reviews, largely because they're so easy on the GPU and are better used for CPU benchmarks. The fact of the matter is that, before now, its penetration has been very small across the wide range of games people play, and especially across the graphically demanding games in GPU reviews.

Before DLSS3 coverage though, who does this as a standard part of their GPU reviews? Can you point to an established site where this is commonplace? There are definitely separate videos on Reflex as a tech, but even Digital Foundry hasn't made this a focus, as long as you don't count literal Nvidia sponsored videos. Like I said, chances are the majority of games used in a benchmark suite don't support Reflex regardless, so it's not even an option.

I think HU is overplaying the latency concern on the 4090 with the framerates they're targeting, sure. On the other hand, I think you may be overplaying Reflex's benefit as well, to the point where you think it's egregious bias if latency hasn't been a standard concern for GPU reviewers.

<image removed>

A significant point of reviews, if not their only point, is to determine "Does this product in actual use validate the claims of its manufacturer?"

Latency is being brought up now because Nvidia is highlighting it as part of their Ada/DLSS3 marketing. It is being brought up now because Nvidia, through their performance % graphs for Ada, is effectively saying "These performance improvements with DLSS3 are directly comparable to the previous gen". HU's angle seems to be largely focused on that argument - that this is not the equivalent of the benefits you get from pre-DLSS3 technologies at the same frame rate. They see the situations where DLSS3 can benefit, and are also extrapolating that to note their concerns with how this will effectively scale down to the product class that most people will actually be able to afford (?). That, I think, is where part of their focus on latency is coming from as well - that it's kind of a problem now (my words; again I think they're overstressing it in the context of the 4090), but it may be far more significant when it will inevitably be marketed as a massive performance-enhancing feature for the lower-end cards.

Premature? Yeah, perhaps - by the time the 4060/4070 come out, maybe some of these drawbacks will be minimized further with updates; that's possible. I think part of the disagreement here, though, is that HU is approaching this primarily as a critique of what the feature brings to the value proposition of a product line, not necessarily as a critique of the technology itself. It's difficult, as its benefits will ebb and flow depending on so many factors - the latency of the game to begin with, the starting frame rate, the type of motion, etc.

I definitely think DF is a better resource for getting a handle on what exactly the tech is doing, and I think that's true in many cases compared to the DIY PC tech sites. But they have different approaches; GN/HB come at things more from a consumer value perspective and are more of a 'what are you doing for me now'. I think that is a part of valid coverage too.

I've never said there's egregious bias. I said there's non-malicious bias. Everyone has biases, because we all have preferences. Some people just don't care about latency, and to be honest, that's fine. It's just weird to suddenly care about the impact of latency, when you never did before. If latency isn't part of the standard review process, a lot of context is missing in how big the impacts of frame generation latency really are.


Battlenonsense was the first person I saw making content about input latency. He started testing it when Radeon anti-lag and Nvidia NULL came out. He discovered he could outperform those technologies in latency reduction by capping frames. He forwarded that information to Nvidia and about a year after that they released Nvidia Reflex (not that I'm giving him credit for them making it).

His vid about capping frame rate to reduce input lag

His video comparing Nvidia Reflex to Radeon Boost

On modest GPUs at the time, the input latency savings are notable, but become smaller as frame rate gets higher. Results will vary game to game depending on how many frames they'll queue when the GPU is bottlenecked, and how much other lag the different games' systems add. Reflex gives you the most benefit the lower your frame rate, so by HU's latency concerns it should be of great importance to people buying lower end cards like Nvidia's x050 and x060. Like, if latency is a concern, then they should be recommending turning off frame generation and buying Nvidia over AMD until AMD comes out with a similar solution.

From skimming the Battlenonsense video again, the Reflex savings are as follows:

GTX 1050 vs RX 580, 99% GPU load @ around 65 fps
Overwatch: -36 ms from native, -34 ms from AMD native
Fortnite: -28 ms from native, -30 ms from AMD native
Apex: -20 ms from native, -20 ms from AMD native

GTX 1650 Super vs RX 5500 XT, 99% GPU load @ around 143 fps
Overwatch: -20 ms from native, -16 ms from AMD native
Fortnite: -14 ms from native, -15 ms from AMD native
Apex: -8 ms from native, -9 ms from AMD native

Edit: As for who includes latency comparisons in their reviews? I don't think anyone does. If latency is important, then reflex is a feature that's existed for two years, not a new thing with DLSS 3. It's absolutely fine to do the comparison of frame generation on/off to see what the impact is. It's actually ideal. They've just been forgetting some of the most played games have had reflex when using Nvidia gpus, and the other gpu vendors do not have a competing technology.

Edit2: Battlenonsense is an absolute legend and it's a shame his videos didn't blow up. His content is way more detailed and higher quality than a lot of the benchmark farmers. I started watching him because of his netcode analysis of Battlefield 4, which moved on to netcode analysis of other games, explanations of bufferbloat and mitigations, and then to input lag from games and peripherals. When he made the video about capping frames, the only time I'd used a framerate limiter was to keep the laptop I was playing Overwatch on from overheating. Once I got a real PC, I was playing uncapped with vsync off to get "the least lag", as everyone said. It was common knowledge for PC gamers to do this, and it was entirely wrong until Battlenonsense actually exposed it (at least I'd never seen anyone else make a video about it). It's a shame he's retired from making videos, because he was a perfect example of objective scientific inquiry and professionalism when it came to gaming.
 
Note how they don't show the latency improvements with vsync off by itself, only with vsync off + reflex.

In Cyberpunk, native without Reflex is 108ms, with Reflex it's 62ms, DLSS3 performance is 54ms. So NVIDIA is providing better latency than AMD, even with DLSS3. This distinction is important. Portal RTX shows similar things, native without Reflex is 129ms, with Reflex it's 95ms, and DLSS3 performance is 53ms .. so same story.

You can't simply claim that DLSS3 makes the latency worse for players when AMD players have been playing with worse latency than DLSS3 for years and no one complained.


You can see the same thing in the HU video as well, but HU conveniently left that info out. He even claims in the video that no one would consider playing with Reflex off, as it is a very unrealistic scenario! Well, all AMD GPU owners are playing with Reflex off. So IT IS a very realistic scenario. If you are going to suddenly care about latency, then you need to factor that in, not ignore it. You need to tell people that DLSS3 is not making their latency worse (in the absolute sense), because AMD users are running their games with far worse latency without much of a complaint .. do you get it?


Whether a feature is actually supported in games is relevant to its worth as a selling point over the competition.
Shadowplay Highlights, Ansel, Ansel RTX, FreeStyle, Reflex, Auto Game Optimizations, AI video/audio effects .. all have had, and still have, varying degrees of support among dozens of games. We'll see how it goes with RTX Remix.
 
Edit: As for who includes latency comparisons in their reviews? I don't think anyone does. If latency is important, then reflex is a feature that's existed for two years, not a new thing with DLSS 3. It's absolutely fine to do the comparison of frame generation on/off to see what the impact is. It's actually ideal. They've just been forgetting some of the most played games have had reflex when using Nvidia gpus, and the other gpu vendors do not have a competing technology.

If no one does it, then it's not a particular discrepancy with Hardware Unboxed, so it makes little sense to call them out specifically for highlighting latency in this video when nearly every other reviewer who looks at DLSS3 will also be mentioning latency, and also won't have a past history of caring much about it before now. I mean, Optimum Tech specifically said the latency addition is noticeable and a negative mark against it; does he have a history of covering latency? Nope.

Nvidia has made two prominent claims for DLSS3:

1) DLSS3 provides a massive % performance boost. Look at these graphs! This is like jumping 2 generations!
2) DLSS3's latency addition is extremely minimal, don't worry about it.

HU's view on these seems to be as follows:

1) It's not the same thing, as an equivalent performance increase without frame generation would also bring a reduction in latency, which is one of the main benefits of increased framerate and which DLSS3 doesn't give. That, and there are added image artifacts that you don't get with 'just' reconstruction.
2) Nvidia is making claims of only a very slight latency increase with DLSS3, but they're doing so by comparing against games without Reflex. HU's angle here is that since every game with DLSS3 will support Reflex, that's unfairly masking the latency addition - or rather, the lack of reduction - compared to what you would get without the generated frames.

The critique is about how honestly Nvidia has sold this feature to the public.

Now again, it's good that DLSS3 will enable Reflex for all RTX GPUs; if that means more support across games and, in turn, more focus on latency and more pressure to make it a priority, hell yes! If there's one benefit of that Nvidia sponsored video from DF, it's that it highlights how poor latency can be on the PC in some games when using a fixed refresh rate display at ~60fps with vsync. It certainly has been my experience more often than I'd like when comparing controller response in the same game on PC vs PS5 with my TV, and I think this subset of PC gaming could definitely be improved (albeit I realize I'm in a distinct minority amongst PC gamers, and VRR/HFR adoption makes this far less critical). Maybe the focus on this will engender some kind of vendor-agnostic approach?

But hey, this is the bed Nvidia made! As a way to assuage fears about DLSS3's impact on latency, they're also potentially greatly expanding the reach of their solution to that problem, one that can raise (lower?) the latency baseline even without FG. I think it's only fair for reviewers to say "Ok then, let's look at what the latency is with Reflex enabled, as we now have that option in every game where we can look at DLSS3".

Now, in the future when Hardware Unboxed eventually does their 4060/4070 video, for example, and they have a bunch of mainstream games where DLSS3 is available (which will likely be the case), especially in situations where the framerate at a certain res is around 60fps - if they don't mention Reflex when it provides a significant benefit in a review involving AMD? Sure, that would be bullshit, and they should be called out for it. I sure as shit want to know if card A is getting 40% worse latency in 50% of the games tested vs card B in a GPU review.

My argument is I just don't think the number of games currently with Reflex support - especially games people play at a resolution/frame rate where Reflex would actually make a noticeable difference - is large enough to expect it to be factored into GPU reviews, which is likely why this hasn't been done by anyone until now. There's the potential for HB to be massive hypocrites going forward; I just don't think the lack of focus on this before now is really damning when no one in this space has been doing it.

Edit2: Battlenonsense is an absolute legend and it's a shame his videos didn't blow up. His content is way more detailed and higher quality than a lot of the benchmark farmers. I started watching him because of his netcode analysis of Battlefield 4, which moved on to netcode analysis of other games, explanations of bufferbloat and mitigations, and then to input lag from games and peripherals. When he made the video about capping frames, the only time I'd used a framerate limiter was to keep the laptop I was playing Overwatch on from overheating. Once I got a real PC, I was playing uncapped with vsync off to get "the least lag", as everyone said. It was common knowledge for PC gamers to do this, and it was entirely wrong until Battlenonsense actually exposed it (at least I'd never seen anyone else make a video about it). It's a shame he's retired from making videos, because he was a perfect example of objective scientific inquiry and professionalism when it came to gaming.

Oh no doubt, I've linked to his videos often whenever I see the 'You have gsync, turn vsync off!' recommendation for years.

As for why he didn't 'blow up', that's likely a combination of limited content; his videos are extremely thorough and this resulted in more sparse output than other channels. That, and content largely focusing on high-refresh rate PC gamers with competitive shooters is just going to have a more limited reach vs more generalized tech/gamers channels. I mean sure, a lot of people play these games - but that's also across a wide variety of platforms that don't even have the option to tinker with the aspects he covers.
 
@Flappy Pannus Optimum Tech talks about latency more than other reviewers. He even built a Battlenonsense-style rig to measure motion latency for the mice he reviews. He talks about latency with monitors etc. He's also a Masters-level Apex player, so his opinions about that kind of stuff carry some weight. He even made a video about strategies for optimizing latency pretty recently.


I'm also not really calling out Hardware Unboxed. They can do what they want. I just think there's context missing from their discussions about latency that would be beneficial for their audience. I don't think there's anything inaccurate about their video. The same applies for any other review site that's done the same thing.

As for Reflex support being prevalent enough to be included as a talking point in reviews, it's not about the number of titles, it's which titles have it. The combined player base of Call of Duty, Overwatch, Fortnite, Apex Legends, Destiny 2 and whatever is huge. COD is a pretty graphically demanding game too. Even Apex and Overwatch 2 can stress out GPUs with the settings turned up. I think the bigger axe I have to grind with reviewers is that they generally don't review games that are popular and have large player bases. They benchmark games that are easy to benchmark, so you see Tomb Raider endlessly because it "scales." They turn old games into synthetic benchmarks.

Getting too off topic here, I think. My overall problem with how the DLSS3 talk is being handled is that it's being handled in isolation. For example, they could give the impression that you should turn off DLSS3 frame generation because it will increase latency to an unacceptable level, and their audience sees that and thinks it's a junk feature or a gimmick, but meanwhile that audience is happily playing games that don't support Reflex with their GPUs pegged at 99% utilization and more input lag than any DLSS3 game would have. Or maybe they're playing on an AMD card with the GPU at 99% in all their games and they're 100% happy with it because it doesn't feel bad to them. All of the points Hardware Unboxed raised were valid, and I don't think there was anything inaccurate. I just wish there was more of a focus on things outside avg fps and 1% lows in general.
 
@Flappy Pannus Optimum Tech talks about latency more than other reviewers. He even built a Battlenonsense-style rig to measure motion latency for the mice he reviews. He talks about latency with monitors etc. He's also a Masters-level Apex player, so his opinions about that kind of stuff carry some weight. He even made a video about strategies for optimizing latency pretty recently.

Ah ok, fair enough, I mainly know him for his SFF rig reviews. In the context of this being done in GPU vendor comparisons though, it actually ends up supporting my point - his last GPU review before the 4090 was his video on 'The fastest GPU's under $600'. Latency numbers? Not mentioned at all.

So even when titles like Apex Legends and Overwatch are benchmarked, he doesn't feel Reflex is worth any distinction. Incorrect? Maybe! Like I said though, this is just not common in GPU comparisons - to the point where even a guy that has latency as a prime concern ends up showing the 6700 XT as the best value per FPS dollar (relegating RTX and DLSS to 'if you're into that kind of thing' as an addendum at the end of the review helps with that).

As for Reflex support being prevalent enough to be included as a talking point in reviews, it's not about the number of titles, it's which titles have it. The combined player base of Call of Duty, Overwatch, Fortnite, Apex Legends, Destiny 2 and whatever is huge. COD is a pretty graphically demanding game too. Even Apex and Overwatch 2 can stress out GPUs with the settings turned up. I think the bigger axe I have to grind with reviewers is that they generally don't review games that are popular and have large player bases. They benchmark games that are easy to benchmark, so you see Tomb Raider endlessly because it "scales." They turn old games into synthetic benchmarks.

Well yes, I mentioned this - that's one of the reasons you don't see latency measurements in GPU reviews. Hence why DLSS3 may change this, as it makes it far more likely that the types of games being benchmarked will have Reflex, and as part of measuring latency when they benchmark the DLSS3 portion, they had certainly better compare that to the same game on Radeon without it.

Also, while those games can all stress GPUs to some extent (COD certainly more than others), they also scale remarkably well downwards, far better than most games, which is a large part of the reason they have such large playerbases. As other latency benchmarks I've linked to have shown, the latency advantage Reflex can provide in these games can be quite minor if you're already at high frame rates without vsync, which is an option these games provide to a far wider hardware base than the big budget games that aren't competitive shooters. I mean, if you told an Overwatch player "Hey, at 60Hz Reflex will reduce your latency by 40%", don't be surprised if they respond with "Who the fuck plays Overwatch at 60Hz on PC!?"

All of the points Hardware Unboxed raised were valid, and I don't think there was anything inaccurate. I just wish there was more of a focus on things outside avg fps and 1% lows in general.

I brought something similar up wrt shader compilation stutter. Linus did a short video in partnership with DF recently on this; my critique at the time was why did it take Alex hammering on this for a year+ before a site with a huge audience of primarily PC gamers brought it up at all? Why isn't Linus gathering developers to comment on this? Latency, the frametime consistency of various framerate limiter methods, using DXVK to help older games - all interesting topics that can have a significant impact on the PC gaming experience, and that these sites rarely cover.
 
I brought something similar up wrt shader compilation stutter. Linus did a short video in partnership with DF recently on this; my critique at the time was why did it take Alex hammering on this for a year+ before a site with a huge audience of primarily PC gamers brought it up at all? Why isn't Linus gathering developers to comment on this? Latency, the frametime consistency of various framerate limiter methods, using DXVK to help older games - all interesting topics that can have a significant impact on the PC gaming experience, and that these sites rarely cover.
It is baffling to me at times that the PC industry has so many outlets covering "game performance" with GPUs and CPUs, and yet very few, if any, are talking about an endemic problem in PC games which greatly affects their presentation, playability, and performance: shader compilation stutter. It feels like I am the only one, really. Perhaps it has to do with the fact that for my reviews I *actually play* the games extensively.

It feels almost like the "experience of playing games" is not what is important, rather other things are? IDK, bar charts?

This does not have much to do with DLSS3. But tertiarily, I would say that there historically can be a massive swing that occurs in how coverage is done due to "scandalisation", as I will call it. A good example is what happened with FCAT. Everything was about the bar charts, then FCAT comes out (which we still use btw as it is the most reliable way to judge frame performance on the display!) and for a brief time the entire focus of reviews switches to covering frame health. But then it disappears and we return to bar charts for a long time (to this day).

There was a brief moment in time when game experience and presentation actually were center stage in PC reviews, so why did it go away? I am not sure. Any theories?

Perhaps scandalisation can also happen with regard to input latency. I do not think DLSS 3 will start it at the moment, as it is "optional" and Nvidia only, so people are comparing against NV-only solutions. But I can imagine how scandalisation could occur with DLSS3. For example, if one were to measure in a game that DLSS 3 FG latency is the same as or better than a competitor GPU due to Reflex, or that the Reflex latency in a game was like half that of a competitor GPU. Something like that. That sounds like a topic ripe for "scandalisation", which could briefly cause PC review outlets to start focusing on PC latency for a while... but I doubt for very long, as the eternal call of bar charts is there.
 
Someone has posted the video from Digital Foundry, here is the link again:

Reflex is cutting latency in half in God of War and Destiny. Hardware Unboxed literally said that every game on an AMD card is only playable at over 100FPS. I hope they will go forward with "latency first, smoothness second" in the future. It will be very interesting to see how their audience accepts it.
 
Just watched the Alex video on DLSS3, congratulations on the awesome work.

It bothers me that you never know what the results will be. Some sequences seem fine, others have weird behaviour/artefacts. I don't want that when I play a game :/ Maybe with time Nvidia can improve on that, but given we still have ghosting and co. in regular DLSS, I doubt it.

It's weird, I find the tech incredible, but not good enough at the same time.
 
Just watched the Alex video on DLSS3, congratulations on the awesome work.

It bothers me that you never know what the results will be. Some sequences seem fine, others have weird behaviour/artefacts. I don't want that when I play a game :/ Maybe with time Nvidia can improve on that, but given we still have ghosting and co. in regular DLSS, I doubt it.

It's weird, I find the tech incredible, but not good enough at the same time.

I guess that's common when doing real-time 3D graphics. For example, shadow maps, especially at lower resolutions, can have a lot of artifacts (shadows appearing where they shouldn't be), but games still use them because they are good enough for most situations. If we accepted only "perfect" solutions we'd still be rendering texture-less triangles, as even the Z-buffer has artifacts in some situations.

In the end, some techniques won and became commonplace and others fell into obscurity (e.g. stencil shadows in Doom 3 used to be a thing). Obviously we can't predict the future, but sometimes it can be quite easy to see which ones are likely to be popular going forward.
 
Bingo. Hardware Unboxed's bias put them into a corner they can't really escape anymore.

/edit: Latency in Spider-Man is the same between DLSS3 and native 4K with Reflex:

I played yesterday and can't feel a difference between FG on/off. Even with downsampling, DLAA and 50 FPS native, latency feels the same. The difference in smoothness, on the other hand, is clearly visible.

/edit: This is from their website:


DLSS Performance has only 33% lower latency while producing over 2x the frames. Going by their logic, DLSS Performance is generating fake frames, too...
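
That 33% figure is roughly what you'd expect, since end-to-end latency includes fixed costs that don't shrink with frame rate. A quick sketch with made-up numbers (the fixed 20 ms and two frames in flight are assumptions, not measurements):

```python
# Illustrative only: total input-to-photon latency = a fixed component
# (input polling, simulation, display processing) + a few frame times,
# so doubling the frame rate does not halve latency.

def total_latency_ms(fps: float, frames_in_flight: float = 2, fixed_ms: float = 20) -> float:
    return fixed_ms + frames_in_flight * 1000 / fps

lat_60 = total_latency_ms(60)    # ~53 ms
lat_120 = total_latency_ms(120)  # ~37 ms
print(f"{lat_60:.0f} ms vs {lat_120:.0f} ms -> {(1 - lat_120 / lat_60) * 100:.0f}% lower")  # ~31%, not 50%
```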
 