I find the latency conversation around DLSS pretty weird, to be honest. I've never seen the youtuber reviewers care about latency before. They've never compared latency when evaluating gpus in reviews; performance has always meant average fps, 1% lows, and 0.1% lows. Purely hypothetical scenario: say there's a game X and they're benchmarking a 3080 vs a 6800. Would you ever hear them say the 6800 gets 100 fps and the 3080 gets 90 fps, but the 3080 has nvidia reflex giving it 10 ms lower latency, therefore it's performing better and gets our recommendation? Have you ever heard latency brought up when talking about the relative performance of games? When people talk about how well "optimized" a game is, do they ever measure latency? The general perception is that if it scales with hardware and the fps numbers are high, then it's "optimized," but you never see them measure latency, and some games have a lot more latency than others at any given frame rate.
Many people would be shocked to find out that the click latency or motion latency of mice can vary by as much as 20 ms, even among mice marketed for gaming. People switch mice all the time and never notice. I think a g pro superlight is close to, if not, the best performer. Show them a chart of relative peripheral latency and suddenly they start replacing things, but if they've been gaming a long time they've probably switched from low-latency to higher-latency peripherals without knowing or caring. That's why showing charts comparing latency with frame generation on or off can be tough: you can see the difference between the numbers, but unless you use it you won't know whether you can actually feel it. I'm not saying people can't, but it'll vary by person. On top of that, you can lower mouse latency by increasing cpi.
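To put that ~20 ms spread in perspective, here's some quick back-of-the-envelope arithmetic (my own numbers, nothing measured):

```python
# Quick frame-time arithmetic to put a ~20 ms mouse latency spread in
# perspective. These are just 1000 / fps conversions, not measurements.
for fps in (60, 120, 144, 240, 360, 500):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:4.1f} ms per frame")
```

In other words, a 20 ms gap between two mice is bigger than an entire frame at anything above 60 fps, and people still didn't notice when they switched.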
I am particularly latency-sensitive. I might actually notice the differences with frame generation and not be willing to accept it. I'm very accustomed to gaming at 100+ fps, usually closer to 200 fps, and that was true even when I had mid-range cards like the gtx 1060. What did I do? I lowered the settings. The idea that DLSS3 might not be viable on a 4050 or 4060 is weird to me, because you just lower the settings to hit 60 fps and then add the frame generation. Some people would play at ultra settings on those gpus, some wouldn't. It's just another tool for people to take advantage of.
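Roughly how I think about the math (a toy model; the held-back frame and the overhead number are my assumptions, not anything published):

```python
# Toy model of "lower settings until ~60 fps, then turn on frame gen".
# Assumes interpolation holds back one rendered frame plus a fixed
# overhead; overhead_ms is a guess for illustration, not a measurement.
def rough_latency_ms(render_fps, frame_gen=False, overhead_ms=5.0):
    frame_time = 1000 / render_fps
    extra = frame_time + overhead_ms if frame_gen else 0.0
    return frame_time + extra

for fps in (40, 60, 90):
    native = rough_latency_ms(fps)
    fg = rough_latency_ms(fps, frame_gen=True)
    print(f"render {fps} fps: ~{native:.0f} ms native, "
          f"~{fg:.0f} ms with frame gen (~{2 * fps} fps displayed)")
```

The exact numbers don't matter; the point is that how it feels tracks the underlying render rate, which is why getting to ~60 fps first and then adding generated frames seems like the sane way to use it on those cards.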
Even being sensitive to latency, there's probably a point where I stop being able to tell the difference. I think it's around the 120 fps mark vs anything higher in the same game. I can definitely tell when my gpu hits 100% and latency starts piling up, even at high frame rates. That's why I frame limit all of my games if they don't support nvidia reflex.
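Here's a rough sketch of why the cap helps, with made-up numbers (the 2-frame queue depth is an assumption for illustration):

```python
# Toy model of render-queue latency: when the GPU is pegged at 100%,
# the CPU runs ahead and every queued frame adds a full frame time of
# delay. A queue depth of 2 is assumed purely for illustration.
def pipeline_latency_ms(fps, queued_frames):
    return (1 + queued_frames) * 1000 / fps

print(f"GPU-bound at 140 fps, 2 frames queued: ~{pipeline_latency_ms(140, 2):.0f} ms")
print(f"Capped at 138 fps, queue stays empty:  ~{pipeline_latency_ms(138, 0):.0f} ms")
```

That's roughly the gap reflex closes automatically, which is why I only bother with a manual cap in games that don't support it.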
One thing I still need to see is DLSS3 comparisons at 1440p and 1080p, since those are still the most common resolutions. Hopefully someone tests it when the 1080p 500 Hz displays launch, as well as the 1440p 360 Hz ones.
Edit: One thing I'll add is that they're weirdly making the argument that AMD gpus are vastly inferior. On the input latency graphs they show native vs native (reflex off), and the reflex-off numbers are probably about where the AMD gpus would sit at similar frame rates, because they don't have a comparable technology. But for all that they supposedly care about latency, they've never made that comparison before. It's weird.
Edit: Some data on the difference in latency between similar-class Nvidia and AMD gpus. I miss Battlenonsense's content.