Server-based game augmentations. The transition to the cloud. Really possible?

I guess people are reading different articles than me; in reference to the cloud, they specifically stated 3x CPU and storage. No mention of GPU on the servers.
That's still a lot of general purpose computational power.

I would expect that a game using the cloud would have cloud-side data (a cloud install) which may or may not differ from the local game data. Most of that data doesn't need to exist per user, so it can potentially be really big.

I'd like to see the experiments they've done, and I think it would be interesting to play with to see what you can really practically do with it. I wouldn't want to do those experiments while trying to ship a game, so I'd expect MS to provide at least some components that are relatively easy to integrate.

I would still like to understand who is paying for it. It could well be that there is a surplus of azure compute available at peak gaming hours, but I can't see that being enough.

Hmm, now we are getting somewhere. If we have a game on local storage with our classic assets, and we have a copy in the cloud with much higher-res assets, I could see the cloud being able to get a limited amount of data from a game that has fully dynamic scenery, use that data to calculate new static data, and shift it back to the console. Sadly I have way too little insight into what is needed to actually make stuff like this work, and whether it is needed in any way. The only useful place I see for cloud gaming and servers like this is worlds like MMOs, just way more advanced games with much more dynamic content, but someone still needs to explain to me how a cloud-based rendering engine could do anything for my local graphics. It just sounds like sci-fi.

Needless to say, games like this would require an internet connection and would be dead without it, which is one of my "no thanks" principles.
 
ERP said:
I wouldn't want to do those experiments while trying to ship a game, so I'd expect MS to provide at least some components that are relatively easy to integrate.

I would still like to understand who is paying for it. It could well be that there is a surplus of azure compute available at peak gaming hours, but I can't see that being enough.

These are indeed, and quite literally, the million dollar questions ...
 
So by now (with the current gen) they would have 60 million times 3 the power of an Xbox 360 in the cloud. Someone do a calculation on how much CPU power that would require. Not to mention the power bill.

Well, of those 60 million, how many do you think would be playing a title with heavy cloud compute at the same time? They are not even all in the same time zone! I think you are safe to assume that at most only 10-30% of your user base will ever be playing cloud games at the exact same time.

In the first year of Xbox One they wouldn't even need the CPU compute of 5 million Xbox Ones in the cloud to match their claims. I think we are highly overestimating how much time is spent playing games per Xbox sold.
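To make that a bit more concrete, here's a toy back-of-envelope in Python. Every number in it is an assumption I'm picking purely for illustration (install base, peak concurrency, how many Xbox One CPUs one server is worth), not anything Microsoft has stated.

```python
# Toy back-of-envelope: how much server hardware would the "3x per console" claim
# actually need at peak? All numbers below are illustrative assumptions only.

consoles_sold       = 5_000_000   # hypothetical first-year install base
peak_concurrency    = 0.20        # assume at most ~20% playing cloud-heavy titles at once
cloud_ratio         = 3           # "CPU and storage equivalent of three Xbox Ones"
equivalents_per_box = 10          # assume one modern server ~ 10 Xbox One CPUs

xbox_equivalents_at_peak = consoles_sold * peak_concurrency * cloud_ratio
servers_needed = xbox_equivalents_at_peak / equivalents_per_box

print(f"Xbox One CPU equivalents needed at peak: {xbox_equivalents_at_peak:,.0f}")
print(f"Servers needed at {equivalents_per_box} equivalents each: {servers_needed:,.0f}")
# -> 3,000,000 equivalents and ~300,000 servers under these assumptions, i.e. well under
#    the full install base, and in the ballpark of the server counts being quoted around.
```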
 
Maybe they felt selling consoles at a loss was stupid and they'd rather take the money they would have lost and invest it in something like the cloud instead.

I'm going to go back to my animate-a-flag-in-the-wind example for a second. The way Microsoft's Orleans setup seems to work is that you create a web app that is like a job system. Your client requests a job from the web app, which spawns a "grain". Grains are spawned as jobs come in, so they can be distributed across multiple servers. So, back to the flag, the model could be something like you request the "flag animation" grain for each flag in your immediate area or scene. Some small data gets pushed to the cloud for that flag. That grain sits and runs until you tell it to stop, probably because you've moved into a different area of your game. All the while it is streaming updated data for the flag to you, but you are basically uploading nothing. It would essentially be asynchronous in nature. You just keep receiving data, fit it into your rendering when you get it, and reuse it until you get the next batch. Maybe the grain handles many flags. I don't know.

It all comes down to what jobs are latency-insensitive enough, and low bandwidth, that they can be pushed to the cloud. I don't know what kinds of jobs those would be. Animating the surface of large bodies of water where waves couldn't be disrupted by the player? They suggested fog. It's not like fog gets pushed around by your body as you move through it, so it is not interactive at all, and can probably be animated (if it moves) at a very slow rate. I have no idea. Someone else on this board should know better than me how these things are processed and how much data they require. The thing to remember is that the data can persist in the cloud as well as the computation, so it doesn't necessarily have to be an upload/download per frame. It could be periodic upload with consistent download.
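To make the shape of that data flow concrete, here's a toy Python sketch. To be clear, this is not the actual Orleans API (Orleans is a .NET framework); the endpoints, names, and update rate are all made up just to show the pattern: one small upload to start a job, then a steady asynchronous download that the renderer reuses until the next batch arrives.

```python
# Hypothetical sketch of the "request a grain, then stream results" pattern.
# Assumes some aiohttp-style async HTTP session is injected; nothing here is a real service.
import asyncio
import json

class CloudFlagClient:
    """Hypothetical client for a cloud 'flag animation' job. Not the Orleans API."""

    def __init__(self, session, endpoint):
        self.session = session      # injected async HTTP session (assumed)
        self.endpoint = endpoint    # hypothetical job-service URL
        self.latest = {}            # last batch per flag; the renderer reuses this each frame
        self.active = set()         # flags we currently have grains running for

    async def start(self, flag_id, params):
        """One small upload: describe the flag so the service can spawn a grain for it."""
        self.active.add(flag_id)
        await self.session.post(f"{self.endpoint}/jobs/flag/{flag_id}", data=json.dumps(params))
        asyncio.create_task(self._poll(flag_id))

    async def _poll(self, flag_id, interval=0.5):
        """Periodic download: overwrite the latest animation data without blocking a frame."""
        while flag_id in self.active:
            resp = await self.session.get(f"{self.endpoint}/jobs/flag/{flag_id}")
            self.latest[flag_id] = await resp.json()
            await asyncio.sleep(interval)

    async def stop(self, flag_id):
        """Tell the grain to stop when the player leaves the area."""
        self.active.discard(flag_id)
        await self.session.delete(f"{self.endpoint}/jobs/flag/{flag_id}")
```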
 
I don't understand this at all.
With the cash they'll supposedly invest in all those servers, they could have put a friggin' flux capacitor on every X1, which would have had instant and tangible results, instead of this unknown 'thing' they'll get out of the 'cloud'.
What exactly is the point?
 
It's unclear how much they have actually invested; it looks like they already had the infrastructure for general cloud computing for business use. I believe the 300,000 servers quoted is just the number of servers they have to provide the Azure service, not Xbox Live specifically. As an Xbox One user you will be just another paying customer of the infrastructure; they will most likely be making a profit from it and the extra business.
 
I would still like to understand who is paying for it. It could well be that there is a surplus of azure compute available at peak gaming hours, but I can't see that being enough.

My guess: Xbox One is $299 and requires a two-year Xbox Live Gold contract at $15/month. After that you still have to pay $15/month to continue using your Xbox One; they just won't repo it anymore. Even if there is a more expensive option without the contract, there will be no way to use an Xbox One unless you are paying the new rate for Xbox Live Gold. (Limited amnesty will be offered for people who have a number of months left on their memberships at Xbox One's introduction.)
 
I don't understand this at all.
With the cash they'll supposedly invest in all those servers, they could have put a friggin' flux capacitor on every X1, which would have had instant and tangible results, instead of this unknown 'thing' they'll get out of the 'cloud'.
What exactly is the point?

Why would they do that when they plan on making every X1 act as a thin client for full cloud gaming in a few yrs anyhow? They'd have to build those server farms and invest just as much in the tech then. Might as well do it and save money on hardware costs in the interim while getting a foothold on that kind of tech approach early. It's a boost to XBLG and the platform in meaningful ways by the sounds of it. They will need to build up some trust with consumers on this type of thing before going full cloud rendering/streaming later on.

This is far more well thought out than many ppl are giving them credit for. This isn't just marketing speak.
 
Maybe I'm not interpreting this correctly, but it sounds like you think you can store and apply event and simulation data to every unique player instance across billions of pseudorandom events, millions of different cities with different geography and history, hours to days to weeks of playtime, and billions of sequences of actions, and that this is a performance and simulation improvement.

Not exactly. I'm saying that in some instances the simulations might be able to take recycled data sets from previous computations and interpolate a given answer.

If the simulation of a city is coded in a highly parallel way, it is possible to see how this data could be used to speed up calculations in some instances; in others the complete result set might already be available, in which case you simply need to present it to the new user. In a sense you would take the intermediate stored series and insert it at the branch where the new outcome needs to be derived.
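Here's a toy Python sketch of what I mean, with a deliberately trivial "simulation": the server keeps intermediate snapshots keyed by tick, and a new request is answered by resuming from the nearest stored branch rather than recomputing from scratch. Everything here is hypothetical illustration, not how any real city sim works.

```python
# Hypothetical illustration: reuse stored intermediate simulation states and only
# compute the remainder up to the requested point.

def nearest_branch(cache, requested_tick):
    """Find the latest cached tick at or before the requested one."""
    usable = [t for t in cache if t <= requested_tick]
    return max(usable) if usable else None

def serve_state(cache, requested_tick, step_fn, initial_state):
    """Answer a request for the simulation state at requested_tick, recycling cached work."""
    start = nearest_branch(cache, requested_tick)
    state, tick = (initial_state, 0) if start is None else (cache[start], start)
    while tick < requested_tick:          # only the un-cached remainder is computed
        state = step_fn(state)
        tick += 1
        cache[tick] = state               # the shared cache grows for the next user
    return state

# Trivial example "simulation": each step just adds 1.
cache = {}
serve_state(cache, 100, lambda s: s + 1, initial_state=0)   # computes ticks 1..100
serve_state(cache, 150, lambda s: s + 1, initial_state=0)   # recycles 1..100, computes 101..150
```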
 
Why would they do that when they plan on making every X1 act as a thin client for full cloud gaming in a few yrs anyhow? They'd have to build those server farms and invest just as much in the tech then. Might as well do it and save money on hardware costs in the interim while getting a foothold on that kind of tech approach early. It's a boost to XBLG and the platform in meaningful ways by the sounds of it. They will need to build up some trust with consumers on this type of thing before going full cloud rendering/streaming later on.

This is far more well thought out than many ppl are giving them credit for. This isn't just marketing speak.

OK, I admit I'm not a techie, but how come no one has brought up the fact that MS made a studio whose job isn't even to make games? Instead, they are there to help other 1st and 3rd party studios get their assets to the cloud. I forgot their name, but I think it's in the UK. I could be wrong or have read something incorrectly; I'll try to find a link.
 
That's a good point. Some MS ppl just said it was 3 times the power of the X1. Do you have a link showing them specifying that it's CPU power they mean in that loose comparison?

"We're provisioning for developers for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud," he said.

There you go.

http://www.oxm.co.uk/54748/xbox-one...e-equivalent-of-three-xbox-ones-in-the-cloud/

It was on the previous page.
 
OK, I admit I'm not a techie, but how come no one has brought up the fact that MS made a studio whose job isn't even to make games? Instead, they are there to help other 1st and 3rd party studios get their assets to the cloud. I forgot their name, but I think it's in the UK. I could be wrong or have read something incorrectly; I'll try to find a link.

OK, I should have looked first. I remembered it wrong; I was thinking of Lift London, but they will be making cloud-based games.
 
I guess people are reading different articles than me; in reference to the cloud, they specifically stated 3x CPU and storage. No mention of GPU on the servers.
That's still a lot of general purpose computational power.

I would expect that a game using the cloud would have cloud-side data (a cloud install) which may or may not differ from the local game data. Most of that data doesn't need to exist per user, so it can potentially be really big.

I'd like to see the experiments they've done, and I think it would be interesting to play with to see what you can really practically do with it. I wouldn't want to do those experiments while trying to ship a game, so I'd expect MS to provide at least some components that are relatively easy to integrate.

Yes, I suspect Microsoft didn't have time to sort out the use cases. It's a new area, so they can learn by throwing the problem at the developers too. ^_^

I would still like to understand who is paying for it. It could well be that there is a surplus of azure compute available at peak gaming hours, but I can't see that being enough.

Apple spent about $1 billion per data center. Microsoft will need to invest in similar infrastructure if they want to compete with the likes of Google and Apple. The $$$ probably come from their existing cloud revenue, web properties, royalties from Android and Windows licenses. It is impossible to deploy 300,000 servers in a short time. They probably include existing servers in their farms. There is no sense in keeping 300,000 unused servers before the console is launched.

Internally, they will have a projection for subscriber growth and cost. The capacity can be used by other services anyway. They only need to allocate new servers when new games ship, and when subscribers grow in chunks. So it's something they can control. The servers are most likely virtual since they need to build up the capacity quickly.


It looks like they were forced to pre-announce this. I suspect Google and Apple are watching this closely (if they are not already doing this). The cloud infrastructure makes more sense for tablets and phones (e.g., Siri). I expect to see Google Maps being used in games too, same for Facebook graphs. I think those are an easier way to sell a cloud gaming platform because the end users can relate better to them. Few people care about cloth simulation or extra server power.
 

Saw it right after asking. Doh! Thanks though. Appreciate it. :)




OK, I admit I'm not a techie, but how come no one has brought up the fact that MS made a studio whose job isn't even to make games? Instead, they are there to help other 1st and 3rd party studios get their assets to the cloud. I forgot their name, but I think it's in the UK. I could be wrong or have read something incorrectly; I'll try to find a link.

Yeah good point there. I noted it but have been so busy chattering about the various aspects of the platform it slipped my mind. Ha!




Since Scott_Arm wanted to start small in discussing the technical feasibility in terms of sending the info to the client, I wanted to take the opposite approach. Evidently few are interested in discussing Scott's wavy flag scenario, so I thought maybe going to the other extreme may make discussion more interesting. :p

I'm playing Quantum Break on X1. My character is located within some moderately sized region I'll call his 'sphere of influence'. Inside that, everything I can interact with is obviously dealt with locally. Outside that, say beyond some fluffy transition region perhaps, everything else is computed in the cloud. Obviously everything must be rendered locally too.

With that as the context, I look to my left and maybe a couple hundred feet away (outside my char's sphere of influence) I see this:

[Animated GIFs: 8782065639_6ea9b6dcec_o.gif, 8788640920_2c06bed529_o.gif]

That's essentially nothing but the playback of pre-computed, latency-insensitive, physics-based animations, rendered locally when I look at the bridge within the game world. The assets are already on my HDD, so presumably all I am getting is a data stream scripting how those assets animate.

Doable? Why or why not? If it is, I think we may have something big here, because clearly 'visuals' is far from just being about flops, especially for asynchronous, latency-insensitive aspects. Why do real-time cutscenes like those above look so much more convincing and believably realistic than typical in-game animations? Because of how they move. A relatively decent next-gen rendering pipeline + incredibly realistic physics-based animations could be a game changer imho. Believability in visuals is a lot more about movement at this point and less about tech graphics than ppl seem to imagine.

Obviously the big-ticket question is likely to be data caps and how much data such animations (or light maps, or AI code, etc.) need to be piped into the client to display the improved fidelity. Has anyone considered how Halo games send 'saved films' in this regard? They don't send any assets, just data scripting out a sequence of animations. Sure, they are simply a list of anims already on the disc, but still... maybe a starting point on that issue?
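To put a very rough number on the 'saved film' idea, here's a toy Python sketch of a command-stream format. The 12-byte command layout, object counts, and durations are all invented by me for illustration; this isn't how Halo actually encodes films.

```python
# Hypothetical 'animation script' stream: if all assets are already on the local HDD,
# the cloud only needs to send which asset plays which animation, and when.
import struct

# One command: time_ms (uint32), asset_id and anim_id (uint16 each), blend (float32) = 12 bytes.
COMMAND = struct.Struct("<IHHf")

def encode_script(commands):
    """Pack (time_ms, asset_id, anim_id, blend) tuples into a compact byte stream."""
    return b"".join(COMMAND.pack(*c) for c in commands)

# Made-up example: a 30-second bridge-collapse sequence, 200 debris pieces, 10 commands each.
commands = [(t * 3000, piece, 1, 1.0) for piece in range(200) for t in range(10)]
blob = encode_script(commands)
print(f"{len(commands)} commands -> {len(blob)} bytes")   # 2000 commands -> 24000 bytes
# ~24 KB for the whole sequence, which is tiny next to the local assets themselves.
```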
 
Since Scott_Arm wanted to start small in discussing the technical feasibility in terms of sending the info to the client, I wanted to take the opposite approach. Evidently few are interested in discussing Scott's wavy flag scenario, so I thought maybe going to the other extreme may make discussion more interesting. :p

I'm playing Quantum Break on X1. My character is located within some moderately sized region I'll call his 'sphere of influence'. Inside that, everything I can interact with is obviously dealt with locally. Outside that, say beyond some fluffy transition region perhaps, everything else is computed in the cloud. Obviously everything must be rendered locally too.

With that as the context, I look to my left and maybe a couple hundred feet away (outside my char's sphere of influence) I see this:

[Animated GIFs: 8782065639_6ea9b6dcec_o.gif, 8788640920_2c06bed529_o.gif]

That's essentially nothing but the playback of pre-computed, latency-insensitive, physics-based animations, rendered locally when I look at the bridge within the game world. The assets are already on my HDD, so presumably all I am getting is a data stream scripting how those assets animate.

Doable? Why or why not? If it is, I think we may have something big here, because clearly 'visuals' is far from just being about flops, especially for asynchronous, latency-insensitive aspects. Why do real-time cutscenes like those above look so much more convincing and believably realistic than typical in-game animations? Because of how they move. A relatively decent next-gen rendering pipeline + incredibly realistic physics-based animations could be a game changer imho. Believability in visuals is a lot more about movement at this point and less about tech graphics than ppl seem to imagine.

Obviously the big-ticket question is likely to be data caps and how much data such animations (or light maps, or AI code, etc.) need to be piped into the client to display the improved fidelity. Has anyone considered how Halo games send 'saved films' in this regard? They don't send any assets, just data scripting out a sequence of animations. Sure, they are simply a list of anims already on the disc, but still... maybe a starting point on that issue?

It is possible, but developers will have to do double work: once without the server, and once with the server. Why not just do an exquisite offline render? Or polish/optimize the local real-time implementation? Otherwise, maybe only the cloud subscribers can see the nice effect?

We are hearing about some challenges in early XB1 development. The developers may have to solve some teething issues to keep it on par with PC and PS4 first before looking at these nice-to-haves.
 
It is impossible to deploy 300,000 servers in a short time. They probably include existing servers in their farms. There is no sense in keeping 300,000 unused servers before the console is launched.


They have a pretty fast deployment system


They have made it clear this week (yeah, really ;) - well, it sounded like it at the engineer talk) that the 300k are newly deployed and for LIVE, and will be online later this year. And yes, they may share some.

Worth watching this as well, especially from about 3 minutes in, when they show their Gen 3 container facility and then Gen 4.

 
It is possible, but developers will have to do double work: once without the server, and once with the server. Why not just do an exquisite offline render? Or polish/optimize the local real-time implementation? Otherwise, maybe only the cloud subscribers can see the nice effect?

We are hearing about some challenges in early XB1 development. The developers may have to solve some teething issues to keep it on par with PC and PS4 first before looking at these nice-to-haves.

They'd need to have a good physics engine to handle stuff within the sphere of influence at that fidelity, sure. But that region can be relatively small depending on the game's design. And I'm envisioning exclusive games that REQUIRE the cloud for simplicity here. So no worries about stuff that has to be ported. Wanna start big (opposite of Scott's approach to building a model for conceptualizing things). :D

My thesis is that at this point in time, realistic movement, physics, and animation are a significantly more visually convincing payoff than any amount of GPU power short of path tracing, at least for any relatively next-gen rendering engine.

Any ideas how to think about the data flow, though? That is likely the bottleneck, I'd imagine. How much data flow is actually needed for local assets and local rendering to play these scripts back? Remember though, players can wait a bit if the data takes a while to all get through. Still, some guesstimates would be a nice start. Any devs out there care to take a stab at answering?
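Taking my own stab at a guesstimate while waiting for a dev to weigh in; every number below is an assumption I'm pulling out of the air just to frame the discussion.

```python
# Rough downstream bandwidth guess for streaming physics-driven animation of distant objects.

objects          = 500     # debris pieces / cloth patches outside the sphere of influence
floats_per_obj   = 7       # position (3) + rotation quaternion (4)
bytes_per_float  = 2       # assume values quantized to 16 bits for transport
updates_per_sec  = 10      # the client interpolates up to 30/60 fps locally

bytes_per_sec = objects * floats_per_obj * bytes_per_float * updates_per_sec
print(f"{bytes_per_sec / 1024:.1f} KB/s  (~{bytes_per_sec * 8 / 1_000_000:.2f} Mbit/s)")
# -> ~68 KB/s, ~0.56 Mbit/s before any compression or delta encoding: a small fraction of
#    what video streaming needs, but it adds up against monthly data caps over long sessions.
```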
 
They have a pretty fast deployment system


They have made it clear this week (yeah, really ;) - well, it sounded like it at the engineer talk) that the 300k are newly deployed and for LIVE, and will be online later this year. And yes, they may share some.

Worth watching this as well, especially from about 3 minutes in, when they show their Gen 3 container facility and then Gen 4.


300K servers later this year? I thought they were already there, or close to it, which is why I think it makes no sense and they should be virtual.

It won't be just one container. They will need to cover other countries too.
 