That works so long as the hardware at home can be made powerful enough to render the game. However, if you need $4,000 of hardware to run next-gen games in 2030, say, $4,000 consoles won't sell, whereas a $4,000 server serving eight different users will. Most of the time your console is powered off; a server could be powered on all the time and time-shared, so one box lets multiple different people game through the week.
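For what it's worth, here's that amortisation arithmetic spelled out as a quick sketch. The $4,000 box and eight-users-per-box figures are just the illustrative numbers from above, not real pricing:

```python
# Illustrative sketch of the time-sharing argument above. All figures are
# assumptions for the sake of the arithmetic, not real pricing.

HARDWARE_COST = 4_000        # cost of one high-end box, in dollars (assumed)
CONSOLE_USERS = 1            # a console serves one household
SERVER_USERS = 8             # a shared server box time-shared across players (assumed)

def cost_per_user(box_cost: float, users: int) -> float:
    """Hardware cost amortised over the users sharing one box."""
    return box_cost / users

print(f"Console: ${cost_per_user(HARDWARE_COST, CONSOLE_USERS):,.0f} per user")
print(f"Server:  ${cost_per_user(HARDWARE_COST, SERVER_USERS):,.0f} per user")

# Why several users per box is plausible: if the average player games ~2 hours
# a day, a single-user box sits idle ~22 hours a day, so a scheduler can pack
# multiple players' sessions onto the same hardware across the week.
avg_hours_per_day = 2
print(f"Single-user utilisation: {avg_hours_per_day / 24:.0%}")
```

So under those assumptions the same silicon comes out at $500 of hardware per player instead of $4,000, before accounting for peak-hour contention.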
Streaming is a possible solution to the end of Moore's Law. Hardware needs to get bigger and more power-hungry to improve, and that's most efficient when housed in bespoke server farms with energy recycling and a distributed user base. It's almost inevitably the future, barring a complete technology paradigm shift. In the past, streamed applications on thin clients were hampered by weak communications while local compute was plentiful. That's now inverting, with comms progressing faster than compute, and if it hasn't fully inverted yet, it will.
I guess that's how the transition will happen. You'll still be able to buy a console, but it'll cost $800 up front whereas streaming will be $25 a month. So more people will stream for cost reasons, and it'll be market dynamics that drive the transition. The following gen, home hardware will be $1,200, or you stream.
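A quick break-even sketch using those hypothetical figures ($800 console, $25/month streaming); the point isn't that streaming is cheaper over a whole generation, only that the up-front barrier is far lower:

```python
# Rough break-even sketch for the up-front console vs. subscription comparison
# above. The $800 and $25/month figures are the hypothetical ones from the post.
import math

CONSOLE_PRICE = 800          # up-front cost of the home console, dollars (assumed)
STREAM_MONTHLY = 25          # streaming subscription, dollars per month (assumed)

break_even_months = math.ceil(CONSOLE_PRICE / STREAM_MONTHLY)
print(f"Streaming matches the console's up-front cost after "
      f"{break_even_months} months (~{break_even_months / 12:.1f} years)")

# Over a typical 7-year console generation, streaming costs more in total...
generation_months = 7 * 12
print(f"Streaming over a full generation: ${STREAM_MONTHLY * generation_months:,}")
# ...but the entry price is $25 instead of $800, which is what drives the
# market-dynamics argument: more players start streaming because month one is
# cheap, even if the lifetime cost ends up higher.
```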
If you want to counter this, you need a model of home hardware that's both affordable and suitably powerful. Next gen, we're looking at ML 'hacks' to get more progress than the raw silicon advances provide. Where do you go the gen after that, when ML enhancements are already in use? If the only route to further progress is a lot more silicon, how do you provide that affordably?