I wouldn't put much stock in current Cyberpunk 2077 benchmarks, much less treat them as representative of what to expect in general PC performance between the two architectures.
It's a (deeply) nvidia-funded game, meaning nvidia had access to the game's code much earlier than AMD for driver development, and, as is tradition, nvidia got their own ~~trojan horses~~ engineers on site to ~~skew performance on competing architectures~~ enhance performance on their own architecture.
Most multiplatformers will be built for RDNA2 first due to consoles, so Cyberpunk isn't setting any trends.
Disclaimer: I already own the game and will love it nonetheless. I have nothing against it or CDPR. A Polish independent studio taking money and engineering resources from wherever it can to make a game that breaks the scale on production values is just doing what needs to be done. I would probably do the same under similar circumstances, as getting a paycheck is more important than fighting tech tyranny.
Regarding the various degrees of RT adoption: a year ago I'd have been saying RT is the future and everyone needs more RT performance yesterday if we want to survive.
Nowadays, I look at the most impressive real-time demo I saw all year and it's Unreal Engine 5, without a shred of RT. It looks like Voxel GI makes RT GI worthless for anything but RT benchmarks.
I look at the released game that IMO has the best graphics out there and it's Demon's Souls. It has no RT.
Cyberpunk looks great, but it seems to look great without RT, or with only a few RT effects enabled. It looks like nvidia ~~ordered~~ suggested CD Projekt RED to just use as much RT as possible to make a showcase out of the game. The result is a wholly unoptimised mess that doesn't perform very well on any graphics card, and some effects could be traded for non-RT implementations with an immense gain in performance and only a residual difference in visual quality (which is already pretty much what happens whenever we set PC settings to Ultra, anyway).
Consoles are the current limiting factor. If a non-RT technique is more performant than RT on consoles, then it will be leveraged more in games than RT; if it isn't, it won't. And whatever gets used more on consoles becomes the de facto way to do a given effect, regardless of whether RT can do it better on PC.
Lower-end PCs have been the limiting factor for high-end PCs at least since the 8th-gen consoles released in 2013, because their launch coincided with Intel getting increasingly relaxed about its CPUs and IGPs due to lack of competition.
Seven years later, the XBone/PS4 are still running circles around the dreaded Gen9 GT2 IGP found in most PCs nowadays, or even the newer Iris Plus in Ice Lake.