"Pathtracing" is the future simply because it's simple conceptually, just shoot rays. And yes, we're trying to match offline quality rendering, why wouldn't we? The tricky part is maintaining backwards compatibility, and forwards compatibility, and compatibility with whatever hybrid renderer is happening now, and doing so using the designs that have bene hyper refined over decades.
But that's just engineers; engineers want to do engineering, and rays are a conceptually straightforward, scalable way to get to Avatar 2 in realtime. Companies primarily need to appeal to customers, and customers want mobile: the Switch is the best-selling Nintendo console ever.
How do we do both? Probably by getting rid of black boxes wherever possible. Who needs to deal with giant ubershader register usage in a pathtracer if you've already pre-compiled all the materials down to material parameters in a DXT-compressed texture? That's a programmer trick, not a hardware one. Or who knows which acceleration structure is fastest to both traverse and rebuild? Instead of relying on hardware, let software figure it out; software can keep getting better on the same hardware.
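The material-baking idea above can be sketched roughly like this. This is a hypothetical illustration, not any engine's actual pipeline: offline, evaluate each material graph down to flat parameters and quantize them into a small fixed-layout texel (8-bit quantization here stands in for real DXT/BC block compression, which is lossier); at ray-hit time, one fetch replaces the whole branching ubershader.

```cpp
#include <array>
#include <cmath>
#include <cstdint>

// Hypothetical flat material parameters an offline bake might produce.
struct MaterialParams {
    float albedo_r, albedo_g, albedo_b;
    float roughness;
};

// Offline bake: quantize float parameters into one 4-byte texel.
// (Stand-in for writing into a DXT/BC-compressed parameter texture.)
std::array<uint8_t, 4> bake(const MaterialParams& m) {
    auto q = [](float v) { return static_cast<uint8_t>(v * 255.0f + 0.5f); };
    return { q(m.albedo_r), q(m.albedo_g), q(m.albedo_b), q(m.roughness) };
}

// Runtime fetch: decode flat parameters. No material graph, no divergent
// branching, minimal register pressure in the ray-hit shader.
MaterialParams fetch(const std::array<uint8_t, 4>& texel) {
    auto d = [](uint8_t v) { return v / 255.0f; };
    return { d(texel[0]), d(texel[1]), d(texel[2]), d(texel[3]) };
}
```

The win is that the cost of the ubershader is paid once at bake time instead of per ray, at the price of whatever quantization error the compressed format introduces.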
On PS4 we went from Killzone Shadow Fall all the way to Horizon Forbidden West, a huge leap. Give software access to as much as possible and let developers spend a decade or more improving how games look on the next generation of consoles. Yes, I've heard "but hardware is faster", with anisotropic filtering given as an example. Except software just improved on that too. Give software the opportunity, and maybe by 2035 we'll get to mostly pathtracing on mobile hardware from 2026.
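For the anisotropic filtering example, the software version is conceptually simple enough to sketch. This is a simplified toy, not production filtering code: estimate the pixel's footprint in texture space from its UV derivatives, then average several bilinear taps along the footprint's major axis instead of relying on the fixed-function hardware path. `Texture`, `sample_aniso`, and the single-channel image are all illustrative assumptions.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Toy single-channel texture with clamp addressing.
struct Texture {
    int w, h;
    std::vector<float> data;
    float texel(int x, int y) const {
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return data[y * w + x];
    }
    // Bilinear sample at UV in [0,1].
    float bilinear(float u, float v) const {
        float fx = u * w - 0.5f, fy = v * h - 0.5f;
        int x0 = (int)std::floor(fx), y0 = (int)std::floor(fy);
        float tx = fx - x0, ty = fy - y0;
        float a = texel(x0, y0) * (1 - tx) + texel(x0 + 1, y0) * tx;
        float b = texel(x0, y0 + 1) * (1 - tx) + texel(x0 + 1, y0 + 1) * tx;
        return a * (1 - ty) + b * ty;
    }
};

// Software anisotropic filter: dudx/dvdx and dudy/dvdy are the UV
// derivatives across the pixel. Tap count scales with the ratio of the
// footprint's major to minor axis, capped at max_taps.
float sample_aniso(const Texture& t, float u, float v,
                   float dudx, float dvdx, float dudy, float dvdy,
                   int max_taps = 8) {
    float lx = std::hypot(dudx * t.w, dvdx * t.h);  // footprint axis lengths
    float ly = std::hypot(dudy * t.w, dvdy * t.h);
    float major = std::max(lx, ly);
    float minor = std::max(std::min(lx, ly), 1e-6f);
    int taps = std::clamp((int)std::ceil(major / minor), 1, max_taps);
    // Step along the major axis, averaging bilinear taps.
    float su = (lx >= ly) ? dudx : dudy;
    float sv = (lx >= ly) ? dvdx : dvdy;
    float sum = 0.0f;
    for (int i = 0; i < taps; ++i) {
        float f = (taps == 1) ? 0.0f : (float)i / (taps - 1) - 0.5f;
        sum += t.bilinear(u + su * f, v + sv * f);
    }
    return sum / taps;
}
```

The point of the example: because this lives in software, the tap placement, tap count, and footprint estimate can all keep improving on the exact same hardware, which is the whole argument.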