Xbox Business Update Podcast | Xbox Everywhere Direction Discussion

What will Xbox do?

  • Player owned digital libraries now on cloud

    Votes: 3 23.1%
  • Multiplatform all exclusives to all platforms

    Votes: 3 23.1%
  • Multiplatform only select exclusive titles

    Votes: 8 61.5%
  • Surface hardware strategy

    Votes: 2 15.4%
  • 3rd party hardware strategy

    Votes: 2 15.4%
  • Mobile hardware strategy

    Votes: 1 7.7%
  • Slim Revision hardware strategy

    Votes: 1 7.7%
  • This will be a nothing burger

    Votes: 4 30.8%
  • *new* Xbox Games for Mobile Strategy

    Votes: 2 15.4%
  • *new* Executive leadership changes (ie: named leaders moves/exits/retires)

    Votes: 0 0.0%

  • Total voters
    13
  • Poll closed.
What if the video is streamed at something like 15 fps 1080p, then frame-generated and upscaled to 4K60? But the input is polled higher, like 120 Hz, and the game is simulated in the cloud at a higher rate as well, making input feel snappier.
I mean this sounds outright horrible and way overcomplicated.
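As a rough sketch, here are the timings implied by that proposal (all numbers are illustrative assumptions, not measurements):

```python
# Back-of-envelope timings for the idea above: stream 15 fps / 1080p,
# frame-generate and upscale to 4K60 on the client, but poll input at
# 120 Hz against a faster cloud simulation. All figures are
# illustrative assumptions.

STREAM_FPS = 15   # real frames actually transmitted
OUTPUT_FPS = 60   # frames displayed after client-side frame generation
INPUT_HZ = 120    # controller polling rate

stream_interval_ms = 1000 / STREAM_FPS  # gap between real frames
output_interval_ms = 1000 / OUTPUT_FPS  # gap between displayed frames
input_interval_ms = 1000 / INPUT_HZ     # gap between input samples

# With 60 frames shown for every 15 received, 3 out of every 4
# displayed frames are generated, not rendered from real game state.
generated_per_real = OUTPUT_FPS // STREAM_FPS - 1

print(f"real frame every {stream_interval_ms:.1f} ms")       # ~66.7 ms
print(f"displayed frame every {output_interval_ms:.1f} ms")  # ~16.7 ms
print(f"input sampled every {input_interval_ms:.1f} ms")     # ~8.3 ms
print(f"{generated_per_real} generated frames per real frame")
```

Even if input is sampled every ~8 ms, a real frame only arrives every ~67 ms, which is why the idea reads as overcomplicated: most displayed frames cannot reflect the latest input.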

The way people talk about cloud gaming makes it sound like a solution looking for a problem. All this R&D spent chasing some perfect set of streaming parameters, when it's simpler to just render the thing in the living room. It will never be better to stream a game than to render it at home; the money spent perfecting this solution ought to go towards things that matter.
 
The way people talk about cloud gaming makes it sound like a solution looking for a problem. All this R&D spent chasing some perfect set of streaming parameters, when it's simpler to just render the thing in the living room. It will never be better to stream a game than to render it at home.
So long as the hardware at home can be made powerful enough to render the game. However, if you need $4,000 of hardware to run next-gen games in, say, 2030, $4,000 consoles won't sell, whereas a $4,000 server serving 8 different users will. Most of the time your console is powered off; a server could be powered on all the time and time-shared, requiring only one box for multiple people to game through the week.

Streaming is a possible solution to the end of Moore's Law. Hardware needs to get bigger and more power-hungry to improve, and that is most efficient when housed in bespoke server farms with energy recycling and distributed users. It's almost inevitably the future, barring some complete tech paradigm shift. In the past, streamed applications on thin clients were hampered by weak communications while local compute was plentiful. That's now inverted, with comms progressing faster than compute; if not now, then in the future.

I guess that's what will happen during the transition. You will be able to buy a console, but it'll cost $800 up front, whereas streaming will be $25 a month. So more people will stream for cost reasons, and it'll be market dynamics that drive the transition. The following gen, the hardware will be $1,200 for home gaming, or you stream.

If you want to counter this, you need a model of home hardware that's affordable and also suitably powerful. Next gen we're looking at ML 'hacks' to get more progress than the raw silicon advancements provide. Where do you go the gen after that, when ML enhancements are already in use? If the only route to progress is lots more silicon, how do you provide that affordably?
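To make the time-sharing argument concrete, a minimal sketch (the 8-player figure comes from the post above; the 60-month amortization is an assumption):

```python
# Hardware cost per player: one buyer per console vs. eight players
# time-sharing one server. The amortization period is an assumption.

HARDWARE_COST = 4000   # dollars, same silicon either way
PLAYERS_SHARING = 8    # users time-sharing one server (figure from above)
LIFESPAN_MONTHS = 60   # roughly one console generation

console_cost_per_player = HARDWARE_COST          # one player pays it all
server_cost_per_player = HARDWARE_COST / PLAYERS_SHARING
server_cost_per_month = server_cost_per_player / LIFESPAN_MONTHS

print(f"console: ${console_cost_per_player} per player up front")
print(f"server:  ${server_cost_per_player:.0f} per player, "
      f"or ~${server_cost_per_month:.2f}/month amortized")
```

On these assumptions, ~$8.33/month of amortized hardware per player leaves plenty of headroom under a $25/month subscription, which is the whole economic case for time-sharing.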
 
It will never be better to stream a game than to render it at home; the money one spends perfecting this solution ought to go towards things that matter.
This is only true if your local hardware is good enough to achieve a better result. The whole premise of GeForce Now is that you can stream games from an expensive PC to a Chromebook or tablet. 'Never' is too strong a word here.
 
So long as the hardware at home can be made powerful enough to render the game. However, if you need $4,000 of hardware to run next-gen games in, say, 2030, $4,000 consoles won't sell, whereas a $4,000 server serving 8 different users will. Most of the time your console is powered off; a server could be powered on all the time and time-shared, requiring only one box for multiple people to game through the week.

Streaming is a possible solution to the end of Moore's Law. Hardware needs to get bigger and more power-hungry to improve, and that is most efficient when housed in bespoke server farms with energy recycling and distributed users. It's almost inevitably the future, barring some complete tech paradigm shift. In the past, streamed applications on thin clients were hampered by weak communications while local compute was plentiful. That's now inverted, with comms progressing faster than compute; if not now, then in the future.

I guess that's what will happen during the transition. You will be able to buy a console, but it'll cost $800 up front, whereas streaming will be $25 a month. So more people will stream for cost reasons, and it'll be market dynamics that drive the transition. The following gen, the hardware will be $1,200 for home gaming, or you stream.

If you want to counter this, you need a model of home hardware that's affordable and also suitably powerful. Next gen we're looking at ML 'hacks' to get more progress than the raw silicon advancements provide. Where do you go the gen after that, when ML enhancements are already in use? If the only route to progress is lots more silicon, how do you provide that affordably?
This is only true if your local hardware is good enough to achieve a better result. The whole premise of GeForce Now is that you can stream games from an expensive PC to a Chromebook or tablet. 'Never' is too strong a word here.
I would rather graphics stagnate where they are right now (or even 6 years ago) than stream all my games. Looking at what games people generally play, it seems most would probably be in this camp.
 
I would rather graphics stagnate where they are right now (or even 6 years ago) than stream all my games. Looking at what games people generally play, it seems most would probably be in this camp.
What if, 6 years ago, you had only played games that looked like they were from 6 years before that? You would be playing GTA V and The Last of Us while the rest of us are playing... GTA V Enhanced and The Last of Us Part I. Well, shit...
 
Streaming is likely the wave of the future, but several technical hurdles need to be ironed out first.

15 fps is unacceptable compared to 60 fps, but 120 fps is probably acceptable compared to 480 fps for 99% of gamers. It'll get to the point where streaming is good enough.
 
Streaming is likely the wave of the future, but several technical hurdles need to be ironed out first.

15 fps is unacceptable compared to 60 fps, but 120 fps is probably acceptable compared to 480 fps for 99% of gamers. It'll get to the point where streaming is good enough.
120 FPS streaming in 4K quality? Seriously? Maybe when MS launches some satellites into space and uses them to provide 1Gb internet speeds everywhere for a dedicated Xbox bundled with a parabolic dish. Welcome to Skynet!
 
Once again, the technology exists to bring latency very close to zero today. The capital cost of doing it is significant, however, and they won't move to that model until it's clear that consoles can't continue.

The 5G specification enables edge computing, which is designed for real-time applications such as networked autonomous driving; it is also designed for video game streaming. Service providers can offer edge computing services too: AWS, Azure, Cisco, etc. all offer this for specific industries that require edge computing but not over mobile networks.

Building out edge computing is costly, and it's much cheaper to use existing backhaul data centers at this time. But if the market were to expand or move, edge computing investment is likely to occur to meet players where they are headed.

There may be only one more generation of consoles after this; afterwards, the expectation is for everyone to move to cloud. By then edge computing facilities will have matured, the understanding of costs improved, etc. For the next decade they should be collecting as much data as possible on player habits per area and figuring out all the intricate steps to actually running an effective streaming service.

Then when the time is right, deploy into edge compute.
 
Cloud gaming will happen when:

• mobile providers give everyone unlimited data without additional costs

• humanity figures out how to send data faster than light

• modems have a way to send signals to your device reliably when you are one wall away

• subscription costs go down (they are going up right now)

• energy costs go down

• phones get a way to attach a controller without it being awkward as hell

Which is a way of saying that it's not happening. If we want to talk about 30 years from now, then it becomes a flying-cars argument. It's not needed, and it will remain at 1% of the market for the foreseeable future.

Sony and Nintendo will make consoles for as long as it takes; it would be like Apple abandoning the iPhone. It's not happening.
 
• mobile providers give everyone unlimited data without additional costs
That's happening, albeit slowly for full adoption everywhere, but it's not that far in the future.
• humanity figures out how to send data faster than light
That doesn't seem necessary. Edge computing has a latency of 1 ms; for video game streaming, the round trip would be 2 ms. That's already several times faster than your Bluetooth controller to your console. Video games already have a latency of well over 30-100 ms with frame generation. What are 2 more ms?
• modems have a way to send signals to your device reliably when you are one wall away
As both WiFi and mobile codecs improve, I'm not seeing this as a major issue. A hardline would solve it.
• subscription costs go down (they are going up right now)
Well, so are hardware and gaming costs.
• energy costs go down
You'll be using less energy with streaming than everyone running a high-powered console at home.
• phones get a way to attach a controller without it being awkward as hell
We're finding 5G modems built into premium laptops today, and that's going to be a mainstream feature in the next 5 years. We're also seeing cellular home networks now, as not every home can get high-speed wired internet.

Cloud streaming will be the norm when the number of people who find cloud streaming acceptable exceeds the number who don't. Once you break that 50% mark, you lose half of your console-buying base. How can you justify the logistical cost of rolling out hardware and basing your business around that? This will happen much faster than you think.
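A rough input-to-photon budget for that 2 ms edge claim (every stage figure is an illustrative assumption; real numbers vary by controller, codec, and display):

```python
# Compare a local console's latency chain against an edge-streamed one.
# All stage timings below are illustrative assumptions, not measurements.

def total_ms(stages):
    """Sum a latency pipeline given per-stage millisecond estimates."""
    return sum(stages.values())

local_console = {
    "Bluetooth controller": 8,
    "simulation + render (60 fps pipeline)": 33,
    "display processing": 10,
}

edge_streamed = {
    "Bluetooth controller": 8,
    "edge network round trip": 2,   # the 1 ms-each-way figure above
    "video encode + decode": 10,
    "simulation + render (60 fps pipeline)": 33,
    "display processing": 10,
}

print(f"local:    ~{total_ms(local_console)} ms")
print(f"streamed: ~{total_ms(edge_streamed)} ms")
print(f"delta:    ~{total_ms(edge_streamed) - total_ms(local_console)} ms")
```

On these assumptions, the network hop itself is tiny; the larger added cost is encode/decode, which is the part that actually needs engineering.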
 
That's happening, albeit slowly for full adoption everywhere, but it's not that far in the future.

That doesn't seem necessary. Edge computing has a latency of 1 ms; for video game streaming, the round trip would be 2 ms. That's already several times faster than your Bluetooth controller to your console. Video games already have a latency of well over 30-100 ms with frame generation. What are 2 more ms?

As both WiFi and mobile codecs improve, I'm not seeing this as a major issue. A hardline would solve it.

Well, so are hardware and gaming costs.

You'll be using less energy with streaming than everyone running a high-powered console at home.

We're finding 5G modems built into premium laptops today, and that's going to be a mainstream feature in the next 5 years. We're also seeing cellular home networks now, as not every home can get high-speed wired internet.

Cloud streaming will be the norm when the number of people who find cloud streaming acceptable exceeds the number who don't. Once you break that 50% mark, you lose half of your console-buying base. How can you justify the logistical cost of rolling out hardware and basing your business around that? This will happen much faster than you think.
A huge number of consumers with great connections have had the option to play cloud games for close to 15 years now: first with Gaikai, then PS Now, then Stadia, GeForce Now, and Xbox Cloud Gaming.

Local hardware isn't costly, and it's accessible to everyone. The cloud is trying to solve a problem that doesn't exist; that's why it's struggling.
 
A huge number of consumers with great connections have had the option to play cloud games for close to 15 years now: first with Gaikai, then PS Now, then Stadia, GeForce Now, and Xbox Cloud Gaming.

Local hardware isn't costly, and it's accessible to everyone. The cloud is trying to solve a problem that doesn't exist; that's why it's struggling.
Cloud currently resides in backhaul data centers, which are designed to keep costs down, not for very low latency and uninterruptible high-volume traffic. Cloud is in its infancy; it needs years to compete with a platform as mature as consoles.
Local hardware is costly; have you seen the prices outside of first-world countries? And didn't you see the supply issues, which led to scalping? Why hasn't the MSRP of hardware come down?

Cloud solves the same problem that video streaming solves: once the quality of the stream becomes nearly indistinguishable from local, you're not going to want local anymore. You have 4-10 screens per home; why would you want to be locked to just the living room? Why lock yourself to your house when you're in transit? There are hundreds of problems that cloud solves; the only one it hasn't solved is the quality issue.
 
Cloud gaming will happen when:
What about when local hardware isn't offering any improvement? The solutions for cloud gaming are workable; the solutions for the death of Moore's Law, less so. Your reference to the past 15 years of streaming failure is misplaced, as it ignores the significant differences in market conditions. Streaming did not thrive when hardware was cheap and capable, but that's not the future that cloud gaming will develop in (unless you know how local hardware is going to develop for PS7 and beyond).

Rather than arguing against the shortcomings of cloud now, you need to argue for solutions for local hardware going forward and how it will remain affordable and competitive.
 
What about when local hardware isn't offering any improvement? The solutions for cloud gaming are workable; the solutions for the death of Moore's Law, less so. Your reference to the past 15 years of streaming failure is misplaced, as it ignores the significant differences in market conditions. Streaming did not thrive when hardware was cheap and capable, but that's not the future that cloud gaming will develop in (unless you know how local hardware is going to develop for PS7 and beyond).

Rather than arguing against the shortcomings of cloud now, you need to argue for solutions for local hardware going forward and how it will remain affordable and competitive.
I'm not ignoring Moore's Law, as it affects both local and cloud markets. You can throw as many GPUs as you want at streaming with a good system; it's still going to use so much energy that the costs just aren't feasible.

Your argument also ignores whether that much power is even necessary. Most people play super-lightweight games that run on phones... Unless low-end hardware somehow disappears, cloud gaming will remain a plus, something you can use alongside local hardware.
 
I'm not ignoring Moore's Law, as it affects both local and cloud markets. You can throw as many GPUs as you want at streaming with a good system; it's still going to use so much energy that the costs just aren't feasible.
I don't think energy costs are part of the argument; I'm only looking at silicon costs. If energy costs are prohibitive for running console-level hardware in an optimised server farm with energy recycling, they'll also render local consoles prohibitively expensive to run at home.
Your argument also ignores whether that much power is even necessary. Most people play super-lightweight games that run on phones... Unless low-end hardware somehow disappears, cloud gaming will remain a plus, something you can use alongside local hardware.
I don't understand this statement at all. We're clearly talking about the future of PC/console class gaming. People who are currently gaming on console and PC will presumably be interested in better games powered by better hardware in the future, no? They are the people who'll be migrating to cloud gaming if it's not possible to provide affordable, capable local hardware economically. Are you suggesting current PC+console gamers will lose interest in better games and will be happy to stick with PS6 level hardware and games forever?
 
I don't think energy costs are part of the argument; I'm only looking at silicon costs. If energy costs are prohibitive for running console-level hardware in an optimised server farm with energy recycling, they'll also render local consoles prohibitively expensive to run at home.

I don't understand this statement at all. We're clearly talking about the future of PC/console class gaming. People who are currently gaming on console and PC will presumably be interested in better games powered by better hardware in the future, no? They are the people who'll be migrating to cloud gaming if it's not possible to provide affordable, capable local hardware economically. Are you suggesting current PC+console gamers will lose interest in better games and will be happy to stick with PS6 level hardware and games forever?
I mean, if improvements to local hardware stop because of costs and it becomes unfeasible to sell better local hardware at consumer prices, are data centers going to get magic hardware that somehow offers a better experience to the consumer for a lower cost? GeForce Now Ultimate costs €22 a month here, and that's already over the limit for mass-market adoption.

Cloud providers will want profits; you are not going to get low prices for this type of service even 10 years from now.

Just like phones have stopped providing meaningful improvements, local hardware will reach that point too. People will still buy it. We have been in an age of slow tech advances for a while, and diminishing returns will not go away.
 
What about when local hardware isn't offering any improvement? The solutions for cloud gaming are workable; the solutions for the death of Moore's Law, less so. Your reference to the past 15 years of streaming failure is misplaced, as it ignores the significant differences in market conditions. Streaming did not thrive when hardware was cheap and capable, but that's not the future that cloud gaming will develop in (unless you know how local hardware is going to develop for PS7 and beyond).

Rather than arguing against the shortcomings of cloud now, you need to argue for solutions for local hardware going forward and how it will remain affordable and competitive.

Currently, gamers subsidize console development and manufacturing costs by purchasing the console.

With cloud, all of that has to be paid up front and clawed back via subscriptions.

There is a large segment of gamers who want the best experience their money can get them. They are spending $400-$700 on hardware in the market. Sony has successfully continued to sell units at higher price points than the previous generation. The PS5 hasn't hit the $300 price point yet, and it's running neck and neck with the PS4, which was at $300 by this point in its life. The PS5 Pro is priced $300 more than the PS4 Pro was.

The real question is whether MS's path of moving to Play Anywhere will leave any gamers there to subsidize the cost of the streaming hardware. I think this upcoming generation will decide MS's gaming future. Will both console hardware and streaming thrive for them? Will console hardware die for their streaming future? Will both collapse in on each other?

Who knows. Guess we'll find out soon enough.
 
I mean, if improvements to local hardware stop because of costs and it becomes unfeasible to sell better local hardware at consumer prices, are data centers going to get magic hardware that somehow offers a better experience to the consumer for a lower cost?
Yes, because the cost is distributed across multiple users. The cost of a console has to be paid by one player, whereas the cost of a server can be paid by multiple players who time-share it.
GeForce Now Ultimate costs €22 a month here, and that's already over the limit for mass-market adoption.
Because consoles are still affordable. If the PS5 cost $1,500, GeForce Now would be seeing a lot more interest. As console prices go up, streaming becomes more viable.

Just like phones have stopped providing meaningful improvements, local hardware will reach that point too. People will still buy it. We have been in the age of slow tech advance for a while, and diminishing returns will not go away.
So you are predicting a stagnant gaming space where hardware effectively plateaus with, what, PS6?
 