Reflex has been supported in Call of Duty (four iterations), Apex Legends, Fortnite, Overwatch and Valorant. The player base for those games is enormous. More games are adding it all the time, like God of War.
Sure, the competitive shooters do - but those make up a tiny fraction of most GPU benchmark suites used in reviews, largely because they're so easy on the GPU and are better suited to CPU benchmarks. The fact of the matter is that, until now, its penetration has been very small across the wide range of games people play, and especially across the graphically demanding games used in GPU reviews.
I don't think reviewers are biased in a malicious way, but the games they select and how they test them reflect their biases. They don't test latency.
Before DLSS3 coverage though,
who does this as a standard part of their GPU reviews? Can you point to an established site where this is commonplace?
There are definitely separate videos on reflex as a tech, but even Digital Foundry hasn't made this a focus, I guess as long as you don't count
literal Nvidia sponsored videos. Like I said, chances are the majority of games used in a benchmark suite don't support Reflex regardless, so it's not even an option.
There are DLSS3 titles that include mandatory Reflex support, and suddenly 10 or 20 ms matters, even though native without Reflex is how they do all of their benchmarks. It's just weird. It's not how they approach GPU reviews or recommendations. In every game that supports Reflex, the Nvidia GPUs are going to win in terms of latency when the game is GPU-limited, regardless of whether you're running native, DLSS or FSR. The best option on AMD and Intel GPUs is to cap your frame rate or lower your settings so you stay below maybe 97% utilization.
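The mitigation described above - capping frame rate so the GPU never saturates and the CPU never queues frames ahead of it - can be sketched as a simple deadline-based pacing loop. This is an illustrative sketch only; `render_frame`, `fps_cap`, and the function name are mine, not from the post, and real games would use a limiter like RTSS or an in-engine cap rather than code like this:

```python
import time

def frame_limited_loop(render_frame, fps_cap, num_frames):
    """Run a render loop capped below the GPU's uncapped throughput.

    Keeping GPU utilization under ~97% means the render queue stays
    empty, which is what reduces input-to-photon latency on GPUs
    without Reflex.
    """
    target = 1.0 / fps_cap  # seconds per frame at the cap
    timestamps = []
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        timestamps.append(time.perf_counter())
        render_frame()  # stand-in for submitting one frame's work
        next_deadline += target
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # idle instead of queuing another frame
    return timestamps
```

Using absolute deadlines (`next_deadline += target`) rather than sleeping a fixed amount each frame keeps the pacing even when individual frames vary in cost.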
I think HU is overplaying the latency concern on the 4090 with the framerates they're targeting, sure. Otoh, I think you may be overplaying Reflex's benefit as well, to where you think it's egregious bias if latency hasn't been a standard concern for GPU reviewers.
A significant point of reviews, if not their
only point, is to determine "
Does this product in actual use validate the claims of its manufacturer?"
Latency is being brought up now because Nvidia is highlighting it as part of their Ada/DLSS3 marketing. Through their performance % graphs for Ada, Nvidia are effectively saying "These performance improvements with DLSS3 are directly comparable to the previous gen". HU's angle seems to be largely focused on that argument - that this is
not the equivalent of the benefits you get from pre-DLSS3 technologies at the same frame rate. They see the situations where DLSS3 can benefit, and are also extrapolating that to note their concerns with how this will effectively downscale to the product class that most people will actually be able to afford (?). That I think is where part of their focus on latency is coming from as well - that it's kind-of a problem now (my words, again I think they're overstressing it in the context of the 4090), but may be far more significant when it will inevitably be marketed as a massive performance-enhancing feature for the lower-end cards.
Premature? Yeah, perhaps - by the time the 4060/4070 come out maybe some of these drawbacks will be minimized further with updates, that's possible. I think part of the disagreement here, though, is that HU is approaching this primarily as a critique of what the feature brings to the value proposition of a product line, not necessarily as a critique of the technology itself. It's difficult because its benefits will ebb and flow depending on so many factors - the latency of the game to begin with, the starting frame rate, the type of motion, etc.
I definitely think DF is a better resource for getting a handle on what exactly the tech is doing, and I think that's true in many cases compared to the DIY PC tech sites. But they have different approaches: GN/HU come at things more from a consumer-value perspective and are more of a "what are you doing for me now" - I think that is a valid part of the coverage too.