For quite a long time, nearly any game on UE4.
I get you're not big on specifics, but try it sometime.
Yup. You've abstracted away differences in the hardware and expect similar results from somewhat disparate shader workloads.
It's a question of what counts as a reasonable expectation. I think it's certainly possible (perhaps even desirable) for games to have varying performance across GPUs in the same class when those GPUs use different architectures - obviously, if they were identical you'd just flip a coin to decide your next purchase. It's the degree of that difference which is the outlier now, with the rendering load being equal. Being an 'outlier' doesn't imply fraudulent/artificial, mind you, at least not in my eyes - there could very well be some significant architectural advantages to RDNA that just haven't been utilized well due to market conditions, without AMD's assistance, sure.
However, it is also perfectly reasonable for gamers to see this particular level of performance discrepancy, after years of comparable data points between these GPU families, and wonder 'uh...wtf?!'. If we had more examples of this kind of chasm, then sure - you could say "Eh, different architectures, deal with it, duh" - but we just don't. If those examples exist, please point me towards those games.
For example, when it was released, AMD had a clear edge in AC: Valhalla. I routinely saw people wonder what was up with that title on Nvidia GPUs, and some Nvidia fanboys complained when it was included in benchmarks as it would 'unfairly' skew results toward Radeon (which was silly - it was the latest release in a huge franchise). Eventually, driver updates from Nvidia negated much of that earlier lead. That 'large' AMD advantage at launch, though? Well:
The 6800XT had a 15% advantage at 1440p over the 3080, and was all but identical at 4k.
That was what defined an "AMD optimized" title in years past. 30-40%? That is decidedly new territory in rasterized titles; it's understandable for people to see that and be surprised.
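(Just to be explicit about the arithmetic, since "15%" vs "30-40%" is the whole point: a minimal sketch in Python, with made-up average-fps numbers purely for illustration - nothing below is real benchmark data.)

```python
# Sketch of how these percentage gaps are derived from average fps.
# The fps figures are hypothetical, chosen only to show the scale of
# a ~15% gap versus a 30-40% gap.

def advantage_pct(fps_a: float, fps_b: float) -> float:
    """Percent by which card A outperforms card B, based on average fps."""
    return (fps_a / fps_b - 1.0) * 100.0

# A ~15% gap - the old "AMD optimized" ceiling at launch.
print(advantage_pct(115.0, 100.0))  # ~15.0

# A ~35% gap - the kind of chasm being discussed here.
print(advantage_pct(135.0, 100.0))  # ~35.0
```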