It's worse than that. Nvidia-optimized code only runs well on their latest GPUs; even their own previous generations run poorly. Just look at Pascal in the majority of titles released since Turing.
If you're familiar with GPU architectures, you know this has nothing to do with Nvidia deliberately crippling Pascal performance. Turing is a much more modern architecture: it supports concurrent FP and INT execution, hardware-accelerated raytracing, the full DX12 Ultimate feature set, INT8/INT4 acceleration, and DirectStorage. Games only slowly tap into the potential of these modern GPUs as the generation progresses, so expect the gap between Turing and Pascal to widen far more in the future.
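To make the "feature set" point concrete, here's a minimal C++ sketch (my own illustration, not from any game) that asks the D3D12 runtime which DX12 Ultimate features the installed GPU actually exposes. On Pascal the raytracing tier comes back as not supported; on Turing you get DXR 1.1 plus mesh shaders and sampler feedback:

```cpp
// Minimal sketch: query a few DX12 Ultimate features via
// ID3D12Device::CheckFeatureSupport. Link with d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter (error handling kept minimal).
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    // If a query fails, the zero-initialized structs read as "not supported".
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};  // raytracing tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};  // mesh shaders, sampler feedback
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));

    std::printf("DXR tier:             %d\n", static_cast<int>(opts5.RaytracingTier));
    std::printf("Mesh shader tier:     %d\n", static_cast<int>(opts7.MeshShaderTier));
    std::printf("Sampler feedback tier:%d\n", static_cast<int>(opts7.SamplerFeedbackTier));
    return 0;
}
```

A game that builds its renderer around these caps simply has more to work with on Turing, no driver sabotage required.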
Pascal is just an outdated architecture from 2016, and the same applies to RDNA1, which lacks that feature set too. I expect even a regular RTX 2060 to outperform the 5700 XT and 1080 Ti by a large margin, in both performance and visual quality, in next-generation games within 2-3 years, once the full DX12 Ultimate feature set is used and raytracing becomes the standard. That is, if games even boot on the 5700 XT by then; I'm a little skeptical, considering AMD still hasn't enabled DXR for it.
I also have high hopes for RDNA2; I think it will age gracefully. Sure, its RT performance is lower, but I'm fairly certain future games will ship more modest raytracing at lower quality settings to help with performance, as we've already seen in the console version of Watch Dogs: Legion. The consoles will help a ton here.
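What "more modest raytracing" means in practice is just exposing cheaper knobs. A hypothetical sketch of what such presets could look like (the names and numbers are mine, not from Watch Dogs: Legion or any real engine):

```cpp
// Hypothetical "scalable raytracing" presets, purely for illustration.
// The idea: consoles/RDNA2 get a cheaper preset (half-res reflections,
// fewer rays, one bounce) while high-end RT hardware gets the full one.
struct RayTracingPreset {
    float reflectionResolutionScale;  // fraction of native res that gets traced
    int   raysPerPixel;               // samples fed to the denoiser
    int   maxBounces;                 // recursion depth per ray
};

constexpr RayTracingPreset kConsolePreset = {0.5f, 1, 1};  // modest but playable
constexpr RayTracingPreset kUltraPreset   = {1.0f, 2, 2};  // high-end PC
```

Since the consoles are RDNA2, developers have every incentive to make that cheaper preset look good, which is exactly why I think RDNA2 cards will ride along fine.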