It's still using RT even if you don't think it's using it enough, while you claimed there weren't any RT games in those non-DLSS3 benchmarks.

It's using it in a way where it manages to run "fine" on AMD h/w, which means that it is likely shading bound even on Ampere, which in turn means that whatever advantages Ada has in RT won't show up there.
We don't know if either one is bandwidth starved in those benchmarks. Having less bandwidth doesn't mean anything if you're being held back by something else.

How can we assume that a 192-bit GPU will be more bandwidth starved than a 384-bit one? I mean, there is no precedent of such issues in 4K on GPUs with a large LLC, right? (sigh)
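For reference, raw bandwidth follows directly from bus width and memory data rate; what complicates the 192-bit vs 384-bit comparison is how much traffic the large L2 absorbs. A rough sketch (21 and 19.5 Gbps are the announced GDDR6X data rates for these cards; the 50% LLC hit rate is purely an illustrative assumption, not a measured figure):

```python
def raw_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw DRAM bandwidth in GB/s: (bus width in bytes) x (per-pin data rate)."""
    return bus_width_bits / 8 * data_rate_gbps

# 4080/12: 192-bit bus @ 21 Gbps GDDR6X
# 3090:    384-bit bus @ 19.5 Gbps GDDR6X
ada = raw_bandwidth_gbs(192, 21.0)      # 504.0 GB/s
ampere = raw_bandwidth_gbs(384, 19.5)   # 936.0 GB/s

def effective_bandwidth_gbs(raw: float, llc_hit_rate: float) -> float:
    """Effective bandwidth when a fraction of requests hit the LLC:
    only misses reach DRAM, so raw bandwidth is amplified by 1/(1 - hit_rate)."""
    return raw / (1.0 - llc_hit_rate)

# Hypothetical 50% hit rate doubles effective bandwidth for the 192-bit part,
# landing it in the same ballpark as the 384-bit card's raw number.
print(ada, ampere, effective_bandwidth_gbs(ada, 0.5))
```

So whether the narrower bus actually starves the GPU at 4K depends entirely on the (unknown) LLC hit rate in these specific games, which is exactly why neither side can assume it from the bus width alone.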
And you're calling me a liar? I've never claimed anything like that.

Really. Because all your posts here are colored by your personal bias to a point where an obvious lie is preferable to you - like the one where the 4080/12 is somehow slower than a 3090 in all games shown thus far.
I've only disputed your claim of the 4080/12 being considerably faster than the 3090, based on the 4080/12 being slower than the 3090 Ti in NVIDIA's own picked benchmarks without DLSS3, and on the small performance difference between the 3090 and the 3090 Ti.