This is what always happens: every card gets tested at the same settings, so max settings regardless of whether it's practical or not.
Well, in this particular case they're not even doing that - they're adjusting the resolution as well as the DLSS quality for each tier of card. It's a very strange way to demonstrate scalability, adjusting only those while keeping things like max RT on.
This is why I'm not too keen on these performance-destroying Ultra options. It makes people without technical knowledge think that the game straight up cannot run well with RT on lower-end cards. And people like Steve from Hardware Unboxed parrot that thought without questioning it, and then come to the conclusion that "everything below a 3080 is not suitable for RT" or that "the 2060 Super has no advantage over the 5700 XT because RT is unusable anyway." And of course, their community will then parrot that as well without a second thought.
It really annoys me.
Yeah, I've raised this concern before as well. It's great to have scalability in PC games; it's one of the key advantages of the platform, after all. I want Ultra options that can deliver a better experience when I get new hardware and revisit a game, without having to beg devs for an upgrade patch to take advantage of that hardware.
But developers need to actually provide a meaningful visual benefit to warrant the performance cost. Just upping the precision for the sake of having an 'Ultra' setting that halves your frame rate without providing any tangible visual difference ultimately just gives your game unnecessarily bad PR.
God of War is a good example. On my 3060 at 4K, I have to run with the Original settings, aside from textures/aniso at max, to (mostly) maintain 60fps at DLSS Balanced. I'll likely upgrade my GPU in the coming year, which will allow me to run at Ultra settings. But here, they actually mean something - Ultra shadows produce a very tangible upgrade, to the point where they can significantly alter the lighting of entire scenes. They're costly, but you can see why. The game doesn't look like shit without them, but it looks so much better with them. That's scalability. Ultra means something here, not just a word representing a numerical value.
I would defy anyone to tell the difference between RT reflections and Ultra RT reflections in Sackboy without very zoomed-in, side-by-side comparisons - and even then it's difficult. Again, give me those system-destroying options - but you have to actually make the advantages they bring visible. If you can't, then I see little point in including them. Honestly, benchmarking Sackboy at 200% resolution scaling would make more sense, since that's at least a setting that provides an actual visual benefit.
(Also, focusing on these meaningless Ultra settings draws attention away from other performance aspects of the game that actually affect everyone regardless of GPU, such as the shader compilation stutters - something that sites like Hardware Unboxed should have been talking about long before the 'problem' of the high cost of RT.)