Developers like RT because it eases their work. That does not mean it is actually practical for the end user on current hardware.
Gamers who think RT is a practical feature at this point in time are either content with 30fps, are simply blind Nvidia followers, or are unknowledgeable and fall for marketing. To quote one person with an actual RTX card:
"I've got a 3080, I tried the whole DLSS and RT, play with it enabled in Cyberpunk BUT at 1400p I still can't always maintain 60fps with DLSS set to Quality. I would not use any other setting because you can absolutely tell the difference, so for me RT is a bust atm. I paid £875 for my GPU and with the Quality setting in DLSS, which basically renders the game at 1100p, I don't always get 60fps!! So if anyone here is buying a GPU that is not a 3080, don't bother with the RT, because even DLSS won't save you."
https://www.techspot.com/community/...-41-game-benchmark.266962/page-3#post-1862014
So let us compare the performance to The Witcher 3 benchmarks from five years ago:
https://www.computerbase.de/2015-06...dia-titan/5/#diagramm-the-witcher-3-2560-1440
The GTX 980 Ti for $649 (roughly $700 in today's money) hit 43 FPS at 1440p in The Witcher 3. The 3080 with ray tracing hits 38 FPS at 1440p in Cyberpunk 2077: https://www.computerbase.de/2020-12...agramm-raytracing-in-cyberpunk-2077-2560-1440
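To put that comparison in cost-per-frame terms, here is a quick back-of-the-envelope sketch. The ~9% inflation factor (2015 to 2020 USD) and the $699 RTX 3080 launch MSRP are my assumptions; the FPS figures are the ones from the linked benchmarks.

```python
# Rough cost-per-frame comparison using the figures quoted above.
# Assumptions (not from the linked benchmarks): ~9% cumulative US
# inflation 2015 -> 2020, and the RTX 3080's $699 launch MSRP.

GTX_980TI_PRICE_2015 = 649      # USD, launch MSRP
INFLATION_2015_TO_2020 = 1.09   # assumed cumulative CPI factor
RTX_3080_PRICE = 699            # USD, launch MSRP (assumed)

gtx_fps = 43  # The Witcher 3, 1440p (computerbase.de, 2015)
rtx_fps = 38  # Cyberpunk 2077 with RT, 1440p (computerbase.de, 2020)

gtx_adjusted = GTX_980TI_PRICE_2015 * INFLATION_2015_TO_2020
print(f"980 Ti inflation-adjusted price: ${gtx_adjusted:.0f}")
print(f"Dollars per frame: 980 Ti ${gtx_adjusted / gtx_fps:.2f} "
      f"vs 3080 with RT ${RTX_3080_PRICE / rtx_fps:.2f}")
```

Under those assumptions the five-year-old card actually delivers more frames per dollar at 1440p than a 3080 running with ray tracing enabled, which is the point of the comparison.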