Which is why the only thing that makes sense to me is to launch a genuine next gen with RT in 2024, with full BC, if you ever bother with RT hardware in the console space. It does not make sense to do mid-gen updates with RT hardware in 2024, since games would still need to run on RT-less hardware from 2020.
Will raytracing ever make sense for consoles?
The issue I have with this is that having some raytracing capability was never really the problem. The problem was to make use of it in conjunction with existing pipelines in an efficient enough manner.
The overwhelming trend in computing is, and has to be, efficiency. At a given budget in dollars and power, what are the best results that can be achieved? And if someone manages to produce similar results at lower cost, they win in the mass markets.
That a 2080Ti runs Battlefield V with raytracing at 1080p says a lot.
Ah, you say, but future generations will devote more hardware to the problem, increasing performance!
Sure, that’s possible for a few more generations, but those resources could instead be spent speeding up existing approaches, or you could choose to save power and money and thus reach more customers.
So the question then becomes what raytracing brings to the table that other approaches cannot, and for the most part that seems to be refraction (as opposed to reflections, which rasterizers already approximate reasonably well with screen-space and cube-map techniques). Which, to be honest, just isn’t particularly important even in scenes where it exists, much less for gameplay. I have a pair of glasses lying in front of me on a table with small puddles of water, and looking at them I barely notice the refraction, much less can I imagine a scenario in which it would add something significant to my gameplay.
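For what it’s worth, the refraction math itself is cheap; what makes it a raytracing feature is that the bent ray then has to be traced onward into the scene, which a rasterizer has no natural way to do. A minimal sketch of that per-hit step, Snell’s law in vector form (the Vec3 type and all names here are illustrative, not from any particular engine):

```cpp
#include <cmath>
#include <optional>

struct Vec3 {
    double x, y, z;
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
};

static double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// d:   unit incident direction, pointing into the surface
// n:   unit surface normal, pointing out of the surface
// eta: ratio of refractive indices n1/n2 (e.g. ~1.0/1.5 entering glass)
// Returns the bent ray direction, or nothing on total internal reflection.
std::optional<Vec3> refract(const Vec3& d, const Vec3& n, double eta) {
    const double cosi = dot(n, d);                     // negative when entering
    const double k = 1.0 - eta * eta * (1.0 - cosi * cosi);
    if (k < 0.0)
        return std::nullopt;                           // total internal reflection
    return d * eta - n * (eta * cosi + std::sqrt(k));  // Snell's law, vector form
}
```

The returned direction is useless on its own; the renderer must then intersect it against the full scene geometry, and that secondary traversal is the expensive part the whole efficiency argument is about.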
Before it can be demonstrated that raytracing can bring significant advances at comparable or better efficiency than existing and future alternative methods, it just isn’t competitive. I’m not sure those conditions will ever be fulfilled. Targeting low-cost, mass-market devices just lowers the probability further.