I like that they showed some of the edge cases where DLSS/FSR break down (albeit a minority of what's actually in the games they tested), and unsurprisingly it usually involves post-process effects. The massive aliasing in the GoW cutscene with the vegetation in the background, for example, comes from depth of field: a low-resolution effect that hasn't been compensated for the upscale. This is so often missed. Sure, DLSS Performance can have issues with moiré patterns, but more often than not it's the post-process effects scaling disproportionately that stand out.
It can be done. Metro Exodus and Deathloop are games I play at 4K/DLSS Performance on my 3060, and they rarely suffer the kind of glaring artifacts other games have. I'm sure there's a performance cost to rendering post-process effects at output resolution; maybe there should be a reconstruction-specific option in games going forward. But when they're not accounted for, they give a bad impression of reconstruction that could easily be remedied.
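To make the "compensated for scaling" point concrete, here's a minimal sketch of what an engine has to account for. The scale factors below are the commonly cited approximate values for DLSS 2's quality modes (they can vary per title and SDK version), and the log2 bias is the standard recommendation for making texture sampling match the output resolution rather than the lower internal one. The same idea applies to sizing post-process buffers like DoF: they should track the output resolution, not the internal render resolution. The function names here are illustrative, not any real engine API.

```python
import math

# Approximate, commonly cited internal-resolution scale factors for
# DLSS 2 quality modes (assumption: exact values vary per title/SDK).
SCALE_FACTORS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def render_resolution(display_w, display_h, mode):
    """Internal resolution the game actually renders at before upscaling."""
    s = SCALE_FACTORS[mode]
    return round(display_w * s), round(display_h * s)

def texture_mip_bias(display_w, render_w):
    """Negative LOD bias so texture detail matches the *output* resolution.
    Post-process effects need the analogous treatment: pick buffer sizes
    from the output resolution, not the internal one."""
    return math.log2(render_w / display_w)

w, h = render_resolution(3840, 2160, "Performance")
print(w, h)                       # 1920 1080
print(texture_mip_bias(3840, w))  # -1.0
```

A half-resolution DoF buffer sized off the 1920x1080 internal resolution ends up at 960x540 on a 4K output, which is exactly the kind of disproportionate scaling that produces the artifacts above.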
Minor quibble with the video, as it's somewhat outside its scope: I would have liked to see the equivalent performance setting for traditional upscaling in some of those games too, to show what DLSS actually brings to the table vs. older methods at the same performance target.
The main takeaway for me from this video is that a lot of games have bad AA, and TAA is terrible, like I've said multiple times.
They found only one game (Death Stranding) where DLSS provided better image quality than native, so I'm not sure how you came to that conclusion (and even then they didn't touch on motion blur or DoF in that game, where DLSS breaks down compared to native TAA). Their critiques focused on image stability, which would be magnified substantially with SMAA/MSAA, and the latter carries a huge performance penalty in modern engines.
I mean, you can test it right now: enable SMAA in Spiderman. It's just a sea of blinking pixels and moiré patterns on skyscrapers as you swing by; there's too much sub-pixel detail for those AA methods to handle. You need the temporal component to deal with shader aliasing.
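A toy model of why the temporal component matters: a sub-pixel feature that a single sample per frame either hits or misses flickers between full-on and full-off under any purely spatial AA, while temporally accumulating jittered samples (an exponential moving average, the core of TAA-style accumulation) converges toward the true coverage and bounds the frame-to-frame flicker. The numbers here (30% coverage, 0.1 blend weight) are illustrative assumptions, not values from any particular engine.

```python
import random

# Toy model: a thin sub-pixel feature covers 30% of a pixel. One jittered
# sample per frame hits it with probability 0.3, so without history the
# pixel flickers between 0.0 and 1.0 every frame.
random.seed(0)
COVERAGE = 0.3
FRAMES = 200
ALPHA = 0.1  # weight given to the current frame's sample (assumed value)

history = 0.0
deltas = []
for _ in range(FRAMES):
    sample = 1.0 if random.random() < COVERAGE else 0.0
    # Temporal accumulation: blend the new jittered sample into history.
    new_history = ALPHA * sample + (1 - ALPHA) * history
    deltas.append(abs(new_history - history))
    history = new_history

# Raw samples can jump by 1.0 per frame; the accumulated value can move
# by at most ALPHA per frame, which is the flicker suppression.
print(max(deltas))
```

The trade-off is exactly the one the video's stability critiques circle around: a lower ALPHA means less flicker but more ghosting when the history is stale, which is why reprojection and history rejection are where implementations differ.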
And it took them only three years to reach this conclusion. Everyone who has used DLSS 2 since January 2020 knows that...
DLSS only recently, with 2.5+, made significant strides against ghosting, and as the video showed, there are still plenty of cases where it falls short of native. As they said, it can vary significantly per game, per DLSS setting, and per output resolution. DLSS has improved quite a bit since 2020, and the implementations in games have improved too.
HB have definitely had some dumb takes with regard to DLSS, either not factoring it into GPU reviews at all or downplaying it (especially when comparing it to FSR 1). But it's simply not some truism that DLSS has been near-imperceptible from, or superior to, a game's TAA for years. There are just too many variables in settings and implementation quality to treat that as an obvious fact some people have chosen to ignore. It depends.
They (deservedly) get a lot of criticism on this front, but on the other hand I'm not aware of another channel that has devoted this much time to examining DLSS/FSR2 in the detail they have recently. Yes, they should have done it sooner, but better late than never, especially when they back it up with actual evidence.