With no high-end PC games, gamers have grown accustomed during this past decade to maxing out the graphics settings and still getting high framerates at high resolutions. When an actual high-end feature shows up, they are shocked by the performance cost.
That's a rather condescending and ignorant way to look at things. While the PC gaming space isn't as extensive as it once was at the high end of AAA development, there are still developers that take advantage of advances in GPU technology.
Ashes of the Singularity took advantage of async compute and explicit multi-adapter (multiple GPUs), for example.
id Software pushed new rendering advances on both PC and consoles.
DICE continues to look at new ways to leverage advances in GPU technology and has for the past decade. RT isn't the first time DICE has done this, and it likely won't be the last.
Nixxes often pushes new GPU technology in its PC ports (the Tomb Raider series, for example). We have threads in the PC forum that discuss some of the PC-only GPU features pushed in those ports, sometimes before any other developer used them.
Square Enix-Eidos pushed and experimented with many new GPU features in the Hitman games.
Etc. etc.
Just because it wasn't RT doesn't mean they didn't push or use the latest advances in GPU tech.
Just because you particularly like RT doesn't mean it is the only significant GPU advancement of the past 10+ years, or the only one developers have been excited about, or the only one that has been (or might be) used in games.
It's significant, but excitement and tentative exploration of a feature don't lead to a paradigm shift unless the entire industry moves to it.
There have been multiple technologies that could have led to a paradigm shift, and early on looked like they might, but after multiple years they ended up not doing so. GPU physics and tessellation, to name just two.
Some are still in that process and could yet falter. Compute has actually led to a paradigm shift, but a subtle one. GPU dispatch is another that is still in the process of potentially becoming one.
People can talk about paradigm shifts all they want, but it isn't one until it actually happens.
This isn't unlike chemistry and physics. I know a Nobel Prize-winning research physicist who has, multiple times over the years, proclaimed something cool as a paradigm shift in physics... some of those are still in process and may or may not become one, and some never left the theoretical stage.
All that said, I've already stated that I think RT is likely to be a shift, but it hasn't happened yet and likely won't for a few years still... assuming it does lead to one.
Regards,
SB