How exactly does Nvidia tank performance on AMD GPUs?
The blame needs to go on the developers of the games.
NVIDIA will offer development support to game developers, often working on-site in the developers' offices to "optimize" for Nvidia hardware. What effect those "optimizations" have on AMD hardware probably varies, I would guess.
Nvidia has a lot more money to offer developer support than AMD does.
I'm more of a lurker, but I figured I'd chime in on raytracing performance on AMD cards, specifically Minecraft, which is horrible on RDNA2. Nvidia has spent a lot of time and money developing and reworking Minecraft's render pipeline to work well with RTX cards:
It's a good listen since it's an interview with four Nvidia developers who are working full time on Minecraft RTX, especially if you listen from the perspective of asking whether it "just works" or whether optimizations are required for a specific hardware architecture. You'll find that, even though it's path-traced, certain things needed to be done, and still need to be done, to make it work best with RTX hardware. The same kind of tuning would be needed for RDNA2, but since the optimization work is being done by Nvidia staff on Nvidia hardware, it's not hard to see why performance stinks on RDNA2.
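To give a concrete flavor of what "optimizing for a specific architecture" can look like: an engine might detect the GPU vendor at startup and pick a different ray tracing code path per vendor. The Vulkan sketch below is purely an illustration of that pattern, not anything from Minecraft RTX; the "inline ray queries on AMD" heuristic in the comments is my own assumption for the example.

```c
/* Sketch: pick a vendor-specific ray tracing path at startup.
 * The per-vendor choices are hypothetical, for illustration only. */
#include <stdio.h>
#include <stdlib.h>
#include <vulkan/vulkan.h>

int main(void) {
    VkApplicationInfo app = { .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
                              .apiVersion = VK_API_VERSION_1_2 };
    VkInstanceCreateInfo ici = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
                                 .pApplicationInfo = &app };
    VkInstance instance;
    if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "failed to create Vulkan instance\n");
        return EXIT_FAILURE;
    }

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, NULL);
    if (count == 0) {
        fprintf(stderr, "no Vulkan-capable GPUs found\n");
        vkDestroyInstance(instance, NULL);
        return EXIT_FAILURE;
    }
    VkPhysicalDevice *gpus = malloc(count * sizeof *gpus);
    vkEnumeratePhysicalDevices(instance, &count, gpus);

    for (uint32_t i = 0; i < count; ++i) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpus[i], &props);

        /* PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = AMD */
        if (props.vendorID == 0x10DE) {
            printf("%s: using RT-pipeline path tuned on RTX\n", props.deviceName);
        } else if (props.vendorID == 0x1002) {
            /* assumption: prefer an inline ray-query path on RDNA2 */
            printf("%s: using inline ray-query path\n", props.deviceName);
        } else {
            printf("%s: using generic fallback path\n", props.deviceName);
        }
    }

    free(gpus);
    vkDestroyInstance(instance, NULL);
    return 0;
}
```

In a real engine the actual differences would sit much deeper in the renderer (e.g. choosing between VK_KHR_ray_tracing_pipeline and VK_KHR_ray_query, tuning workgroup sizes, reorganizing acceleration structures), but the point is that "path-traced" doesn't mean vendor-agnostic; somebody has to do that per-architecture work, and right now it's Nvidia staff doing it on Nvidia hardware.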
Once it's out of beta, and especially if it gets a console release, I wonder whether the console RDNA optimizations could be ported back to PC and improve performance there.