Do these "innovations" matter in the long run if they become obsolete?
You realise you are citing every argument in favour of Larrabee, right?
We had the same arguments over whether the consoles should add RTRT hardware or rely on software, with people pointing to exciting software solutions. Those software solutions are getting faster, but they still can't compete with dedicated hardware, so RTRT hardware is totally necessary.
I don't think you can comprehend the profound impact, from a hardware complexity perspective, of implementing tons of specialized hardware. If the cost were truly that trivial, hardware vendors wouldn't bat an eye at spending the logic on these features.
Except the implementation isn't the problem. If you have custom hardware and your rivals don't, it sits idle. I remember my mate buying a PC card in the 90s with 'hardware scrolling'; it never got used in anything. The PC space mostly supports the lowest common denominator. The console space will use everything available (we hope; modern economics might mean cutting back on that).
And how does your argument square with mobile vendors, who add numerous accelerators, processors, and functional blocks to their SoCs? Their need for performance and efficiency is greater than that of consoles, and they have moved very much towards specialisation over generalisation.
The down-to-earth reality is that integrated logic is coming at more of a premium as time goes on, and console vendors will feel the pinch even more since their entire business relies on delivering high value per cost ...
Which is achieved more with specialised silicon than with general-purpose. Every console comes with hardware dedicated to video compression because it's a lot more efficient than doing that in general-purpose shaders. This line of yours is exactly the reason for more specialisation, because you get more performance and efficiency from specialisation!
You might think it's unreasonable for console vendors not to want improvement in those areas, but if that path involves doubling their die sizes or incurring a significant hardware implementation cost, then they might prefer not to give in to those black holes ...
They can't double their die size. They have a fixed size and can either spend that budget on general-purpose hardware or on more specialised functions. The more they spend on specialised hardware, the less flexible performance they have, but the more targeted performance they get. A 300mm² GPU with nothing but shaders will render games better than one with 50mm² dedicated to ML blocks, but the latter will upscale better and look better. Spend another 50mm² on RT hardware like triangle intersection and some BVH handling, and that 300mm² all-shader GPU will lack all the RT fancies of the chip with RT acceleration.
Your argument is that five-plus years later, devs won't be using that BVH method or that ML upscaling method and they'll want something else that runs on plain compute, in the same way devs moved on to deferred rendering and left the 360's eDRAM AA redundant. That's the same argument that said don't put RT hardware in PS5/XSX because devs can do it in software, an argument shown not to pan out. RT hardware hasn't been replaced with software solutions and it won't be, because ray tracing is a fundamental image construction method that we can't replace with hacks but want to be able to actually use!

Likewise, upscaling is not a fad. It's going to be essential to getting better framerates and resolutions where we can't just use more silicon to render faster and at higher res, so some form of matrix blocks is going to be necessary. Not using matrix blocks and instead using shaders will be less efficient and achieve inferior results. The only counter to that is, "maybe someone will come up with an even better solution that runs on shaders and makes those matrix blocks redundant." Even if true, and there's little reason to think so, that wouldn't take away from what those blocks would provide the consoles, as they'd still be used!