Server-based game augmentations. The transition to cloud. Really possible?

One question: do I understand you correctly, ERP, that Azure (the 300,000-server cloud?) is used for lots of other services at the moment as well, not only by MS but by others too?

Associated questions:

- So how many of those 300,000 servers are available and allocated for the Xbox?

I think you misunderstood. Azure does not equal 300,000 servers. Live! now has 300,000 servers (up from 3,000), likely built on their Azure platform.

That 300,000 number doesn't really give much indication of capacity though. But I can safely state it's nothing close to a 1:1 CPU and definitely not a 1:1 server relationship per user/game copy. Just to give an example, I've got a small UCS deployment I've been working on, and in it there are 20x B230 M2s, each with 2x E7-2870s and ~262 GB (or GiB :p). That's 40 CPUs, 400 cores, 800 threads, and 5.2 TB of memory. Of course, this entire pool of resources is virtualized and clustered. High-density deployments can scale pretty well. And that's just in the standard x86 blade realm, not HPC.

EDIT

Updated CPU, core and thread count (bad math).
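To make the arithmetic behind those totals explicit, here is the corrected math as a quick sketch (the per-part figures are taken from the post above; the E7-2870 is a 10-core, 20-thread part):

```python
# Totals for the example UCS deployment:
# 20x B230 M2 blades, each with 2x Intel E7-2870 and ~262 GB of memory.
blades = 20
cpus_per_blade = 2
cores_per_cpu = 10       # the E7-2870 is a 10-core part
threads_per_core = 2     # Hyper-Threading
mem_per_blade_gb = 262

cpus = blades * cpus_per_blade                 # 40 sockets
cores = cpus * cores_per_cpu                   # 400 cores
threads = cores * threads_per_core             # 800 hardware threads
memory_tb = blades * mem_per_blade_gb / 1000   # ~5.2 TB

print(cpus, cores, threads, round(memory_tb, 2))  # 40 400 800 5.24
```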
 
Siri does need an internet connection to work.


What about Xbox voice recognition? Is this what the cloud is needed for as well? Always-on for voice recognition?

I'm sure there are connected and unconnected states for Xbox commands... I still believe that onstage act by Yusuf Mehdi was staged.
 
...I'd rather say, WHY. What's the gain here? ...

Bottom line, this is a great point.

Until cloud processing is a huge selling point pushing beyond what a single console can do on its own, it's a non-starter. And getting the infrastructure built up to that point will take time and a lot of money.

The gain for the platform holder is obviously cash flow and DRM control.

The gain on the consumer end has yet to be justified.

There will likely come a point when compute is so cheap that the infrastructure can provide rich interactive experiences which dwarf what is possible in a console, but FWICS, that point is many, many years out.

In the meantime, it seems to be a marketing bullet point.
 
I think you misunderstood. Azure does not equal 300,000 servers. Live! now has 300,000 servers (up from 3,000), likely built on their Azure platform.

That 300,000 number doesn't really give much indication of capacity though. But I can safely state it's nothing close to a 1:1 CPU and definitely not a 1:1 server relationship per user/game copy.

Hm, are you sure about this?

I wonder how they do this. You cannot build a 300,000-server infrastructure overnight, that is for sure. So this means they already have big chunks of this installed and working, with more servers coming up over the coming months. If they are just dedicated to Xbox Live and not integrated into a bigger framework like Azure... are they idling around all the time until the X1 takes off? This scenario seems rather unrealistic to me. They said in the presentation they currently only need 15,000 servers to run Live, so this would be a dramatic waste of resources.

I am really interested in how they pull this off. 300,000 is such a large enterprise... just huuuuuuge!
 
I think you misunderstood. Azure does not equal 300,000 servers. Live! now has 300,000 servers (up from 3,000), likely built on their Azure platform.

That 300,000 number doesn't really give much indication of capacity though. But I can safely state it's nothing close to a 1:1 CPU and definitely not a 1:1 server relationship per user/game copy. Just to give an example, I've got a small UCS deployment I've been working on, and in it there are 20x B230 M2s, each with 2x E7-2870s and ~262 GB (or GiB :p). That's 40 CPUs, 400 cores, 800 threads, and 5.2 TB of memory. Of course, this entire pool of resources is clustered and virtualized. High-density deployments can scale pretty well. And that's just in the standard x86 realm, not HPC.

How much did this server cost you?
 
Hm, are you sure about this?

I wonder how they do this. You cannot build a 300,000-server infrastructure overnight, that is for sure. So this means they already have big chunks of this installed and working, with more servers coming up over the coming months. If they are just dedicated to Xbox Live and not integrated into a bigger framework like Azure... are they idling around all the time until the X1 takes off? This scenario seems rather unrealistic to me. They said in the presentation they currently only need 15,000 servers to run Live, so this would be a dramatic waste of resources.

I am really interested in how they pull this off. 300,000 is such a large enterprise... just huuuuuuge!

Anyone know the size of Azure? Is Live currently built on Azure?
 
I think you misunderstood. Azure does not equal 300,000 servers. Live! now has 300,000 servers (up from 3,000), likely built on their Azure platform.

That 300,000 number doesn't really give much indication of capacity though. But I can safely state it's nothing close to a 1:1 CPU and definitely not a 1:1 server relationship per user/game copy. Just to give an example, I've got a small UCS deployment I've been working on, and in it there are 20x B230 M2s, each with 2x E7-2870s and ~262 GB (or GiB :p). That's 40 CPUs, 400 cores, 800 threads, and 5.2 TB of memory. Of course, this entire pool of resources is virtualized and clustered. High-density deployments can scale pretty well. And that's just in the standard x86 blade realm, not HPC.

EDIT

Updated CPU, core and thread count (bad math).

I find it hard to believe they'd increase the server capacity of Live 100-fold in the hopes that people will leverage the servers for cloud computing. That's an incredibly expensive venture, and very risky. If no one uses it, who pays for it? How much is Xbox Live going to cost?

Apparently Xbox Live is already hosted on Azure. I think there really has been a huge expansion of the network. I wonder if, when they say "servers", they are referring to VMs? So a jump from 3,000(?) VMs to 300,000 VMs.

Apparently in April they cut prices of cloud services, storage and compute to "match" Amazon and Google.
 
Hm, are you sure about this?

I wonder how they do this. You cannot build a 300,000-server infrastructure overnight, that is for sure. So this means they already have big chunks of this installed and working, with more servers coming up over the coming months. If they are just dedicated to Xbox Live and not integrated into a bigger framework like Azure... are they idling around all the time until the X1 takes off? This scenario seems rather unrealistic to me. They said in the presentation they currently only need 15,000 servers to run Live, so this would be a dramatic waste of resources.

I am really interested in how they pull this off. 300,000 is such a large enterprise... just huuuuuuge!

First, I think it's critical that they simply said 300,000 "servers." Just servers, no indication of physical or virtual. If you take a respectable 30:1 virtualization ratio, you would need 10,000 physical hosts to handle those 300,000 servers (servers as in VMs, and that's very back-of-the-napkin math). 10,000 hosts distributed globally isn't unreasonable. It's not a small project by any means, but it's more than doable for a company with MS' infrastructure chops. They may also be counting "servers" differently based on some other internally standardized metric. Also, MS has modularized compute and storage into literal shipping containers (partly for their "hybrid cloud" deployments and tests, IIRC). I'm not sure how they do it for their globally hosted services (Azure, O365, BPOS, etc.) but they have a lot of knowledge and experience in this.
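A minimal sketch of that back-of-the-napkin math (the 30:1 consolidation ratio is an assumption used for illustration, not anything MS has stated):

```python
# If "300,000 servers" means VMs, an assumed 30:1 VM-to-host ratio gives
# the physical footprint; the same math applied to the old 3,000-server
# figure is shown for comparison.
announced_servers = 300_000
old_servers = 3_000
vms_per_host = 30  # assumed ratio; real sizing depends on workload

hosts_new = announced_servers // vms_per_host
hosts_old = old_servers // vms_per_host
print(hosts_new, hosts_old)  # 10000 100
```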

How much did this server cost you?

:LOL::LOL::LOL:

Technically, they're chassis/blade servers (just for clarification). And it didn't cost me anything, but it did cost [REDACTED] a pretty penny. I'd have to see if I can find the original invoice for this or a past customer's UCS deployment for an exact number. But offhand, it's a drop in the bucket relative to the revenue generated and dependent on it. And as I said, that's a relatively small deployment done in a traditional manner (centrally managed blades/chassis, in several racks in a datacenter). MS has other methods at their disposal for quick and modular expansion (I don't know what their Azure infrastructure looks like on the back-end, though; I'd have to dig around).

I find it hard to believe they'd increase the server capacity of Live 100-fold in the hopes that people will leverage the servers for cloud computing. That's an incredibly expensive venture, and very risky. If no one uses it, who pays for it? How much is Xbox Live going to cost?

I'm just going by what they announced at the reveal.

I wonder if, when they say "servers", they are referring to VMs? So a jump from 3,000(?) VMs to 300,000 VMs.

Could be, or, as I mentioned above, some other internally standardized method of counting a "server" for the purposes of Live!.

EDIT

Missed this part...

Apparently in April they cut prices of cloud services, storage and compute to "match" Amazon, Google.

Indeed, I believe there have been two price drops over the past couple of years. It's quite cost-competitive now, and with the incentives to partners, the ease of deployment, and the various bundles, it makes their 365 and Intune offerings a very compelling solution, IMO.
 
:LOL::LOL::LOL:

Technically, they're chassis/blade servers (just for clarification). And it didn't cost me anything, but it did cost [REDACTED] a pretty penny. I'd have to see if I can find the original invoice for this or a past customer's UCS deployment for an exact number. But offhand, it's a drop in the bucket relative to the revenue generated and dependent on it. And as I said, that's a relatively small deployment done in a traditional manner (centrally managed blades/chassis, in several racks in a datacenter). MS has other methods at their disposal for quick and modular expansion (I don't know what their Azure looks like on the back-end, though; I'd have to dig around).

Yeah, server stuff is super expensive. I'm currently looking for a mini server for about 8k euros, and recently got an offer for one with 256 cores for over 35k euros.

That is why I am so surprised about the number 300,000! I hope we get more details about this stuff... but there is so much undiscussed stuff about the cloud.
 
Hm, are you sure about this?

I wonder how they do this. You cannot build a 300,000-server infrastructure overnight, that is for sure. So this means they already have big chunks of this installed and working, with more servers coming up over the coming months. If they are just dedicated to Xbox Live and not integrated into a bigger framework like Azure... are they idling around all the time until the X1 takes off? This scenario seems rather unrealistic to me. They said in the presentation they currently only need 15,000 servers to run Live, so this would be a dramatic waste of resources.

I am really interested in how they pull this off. 300,000 is such a large enterprise... just huuuuuuge!

They've been at it for a while now, to the tune of $15 billion already. There's a $1 billion cloud data center in Virginia, $500 million in Chicago. They've got them all over the globe, actually: Singapore, Australia, you name it. They're shipped as pre-assembled modules, so they go up rather quickly.

But for something that's supposed to be so important to the platform's future, Microsoft has provided next to no details about it. Maybe they'll elaborate at E3, but if it really is their future-proofing solution, you would have thought they'd have shown it off a bit at their reveal.

Might have excited their base instead of adding more questions and confusion about what their plans are for gaming.
 
They've been at it for a while now, to the tune of $15 billion already. There's a $1 billion cloud data center in Virginia, $500 million in Chicago. They've got them all over the globe, actually: Singapore, Australia, you name it. They're shipped as pre-assembled modules, so they go up rather quickly.

But for something that's supposed to be so important to the platform's future, Microsoft has provided next to no details about it. Maybe they'll elaborate at E3, but if it really is their future-proofing solution, you would have thought they'd have shown it off a bit at their reveal.

Might have excited their base instead of adding more questions and confusion about what their plans are for gaming.

My opinion of the reveal, from all of the talk after the show from MS and the Turn10 guy, is that they basically showed what they could and the rest was not ready. Not showing something is not necessarily an indictment; they may just not have wanted to show something in a poor state when they could wait 20 days for E3 and show something better. They hinted that we'd see the cloud at E3. For their sake, hopefully it is something good.
 
One thing I'd really like to know is whether any developer here felt limited in terms of what they could do with AI in a game.
And if so, whether the problem was computational power specifically allocated for AI purposes, or whether they hit a wall somewhere before reaching that kind of limitation.
 
One thing I'd really like to know is whether any developer here felt limited in terms of what they could do with AI in a game.
And if so, whether the problem was computational power specifically allocated for AI purposes, or whether they hit a wall somewhere before reaching that kind of limitation.

Pretty much every game I've ever worked on could have used more cycles for AI.
Not so much the decision making (that's cheap pretty much regardless of the mechanism used), but the construction of the world view.
Having to use rays instead of volume queries being the obvious one though not the only one.
And of course cost scales linearly with character count.
Having to drop to very low fidelity simulation relatively close to the player.

Of course, I'm not sure that moving all that to the cloud is worth the added complexity, and I'm not sure it results in an obvious fidelity improvement for the player either.

Games will use as many CPU cycles as you give them.
For whatever reason people seem to assume that console games are GPU-bound, but IME many games are CPU-limited far more often than GPU-limited.
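A toy sketch of the linear-scaling point above: if each character's world view costs a fixed number of perception queries per frame, total AI cost grows directly with character count (all the numbers here are made up for illustration, not from any real engine):

```python
# Hypothetical per-frame AI perception budget: each NPC issues a fixed
# number of raycasts to build its world view, so total cost is linear
# in the number of characters.
rays_per_npc = 12        # assumed line-of-sight / obstacle probes per NPC
cost_per_ray_us = 3.0    # assumed microseconds per raycast

def ai_perception_budget_us(npc_count):
    return npc_count * rays_per_npc * cost_per_ray_us

for npcs in (10, 50, 200):
    print(npcs, ai_perception_budget_us(npcs))  # cost grows linearly
```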
 
First, I think it's critical that they simply said 300,000 "servers." Just servers, no indication of physical or virtual. If you take a respectable 30:1 virtualization ratio, you would need 10,000 physical hosts to handle those 300,000 servers (servers as in VMs, and that's very back-of-the-napkin math). 10,000 hosts distributed globally isn't unreasonable. It's not a small project by any means, but it's more than doable for a company with MS' infrastructure chops. They may also be counting "servers" differently based on some other internally standardized metric. Also, MS has modularized compute and storage into literal shipping containers (partly for their "hybrid cloud" deployments and tests, IIRC). I'm not sure how they do it for their globally hosted services (Azure, O365, BPOS, etc.) but they have a lot of knowledge and experience in this.

I believe they said they are going from 3,000 servers currently to 300,000 for XBL. So by your math, the current XBL platform runs on 100 (one hundred) physical servers. Does that make sense to you? It seems very small to me, even for what it does today.
 
Data centers are massively complicated and expensive. Giving everyone Xbox One or PS4-level processing out of the cloud would be absolutely insane. Giving them a small amount of additional processing is possible. It's an interesting debate to discover what type of computing is most reasonable. I imagine it'll be something like a job system, where games can leverage an API that sends data and a "program" to the cloud for computation. Something like the GPGPU model. Dedicating cores to each user would not make a lot of sense.
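To illustrate the job-system idea, here is a purely hypothetical sketch; none of these class or function names correspond to any announced API, and the "remote" execution is simulated locally:

```python
# Hypothetical GPGPU-style cloud job API: the game packages input data
# plus a small "program" (here, a named kernel function), ships it off,
# and later collects the result.
class CloudJobQueue:
    def __init__(self):
        self._jobs = {}
        self._next_id = 0

    def submit(self, kernel, payload):
        """Queue a compute job; returns a handle the game polls later."""
        job_id = self._next_id
        self._next_id += 1
        self._jobs[job_id] = (kernel, payload)
        return job_id

    def poll(self, job_id):
        """Run the job 'remotely' and return its result (local stand-in)."""
        kernel, payload = self._jobs.pop(job_id)
        return kernel(payload)

# Example kernel: a batch pathfinding cost estimate for a group of NPCs,
# the kind of non-latency-critical work that could be offloaded.
def path_cost_kernel(payload):
    return {npc: abs(x) + abs(y) for npc, (x, y) in payload.items()}

queue = CloudJobQueue()
handle = queue.submit(path_cost_kernel, {"npc_1": (3, 4), "npc_2": (-2, 5)})
result = queue.poll(handle)
print(result)  # {'npc_1': 7, 'npc_2': 7}
```

The key design point is asynchrony: the game fires off work it doesn't need this frame and picks up the answer whenever it arrives.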

They could do interesting things with group AI. Individual behaviour may be too time-sensitive. Think of a game like GTA where you might have factions/gangs or groups of people, and you need group decision making. You tend to see very simple group behaviour in games. They like you or they don't. They do not exhibit any kind of leadership structure or group action. That's one area where the cloud could help significantly. You could have hierarchical structures with roles. Skyrim is another good example, with warring factions that never really seem to be at war. They just wander around a bit and fight when they pass each other randomly. There isn't any kind of sophistication in how they think as a group. Just a thought that came to mind.
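As a toy illustration of the hierarchical idea (all names and rules here are invented): the faction makes one group-level decision, which isn't latency-sensitive and could in principle run server-side, then fans it out to members by role.

```python
# Toy hierarchical group AI: one cheap faction-level decision instead of
# per-NPC deliberation, distributed to members according to their role.
class Faction:
    def __init__(self, name, members):
        self.name = name
        self.members = members  # {member_name: role}

    def decide(self, rival_strength, own_strength):
        # Group-level choice, made once for the whole faction.
        return "attack" if own_strength > rival_strength else "defend"

    def issue_orders(self, decision):
        orders = {"attack": {"leader": "rally", "soldier": "advance"},
                  "defend": {"leader": "regroup", "soldier": "hold"}}
        return {m: orders[decision][role] for m, role in self.members.items()}

gang = Faction("Eastside", {"Vic": "leader", "Ray": "soldier"})
plan = gang.decide(rival_strength=5, own_strength=8)
orders = gang.issue_orders(plan)
print(plan, orders)  # attack {'Vic': 'rally', 'Ray': 'advance'}
```

Only the final, time-sensitive animation and pathing would need to stay on the console; the planning layer tolerates hundreds of milliseconds of latency.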
Yes, there are many unknowns regarding the cloud.

Microsoft are sceptical about the cloud themselves.

http://ap.ign.com/en/news/6766/microsoft-skeptical-of-cloud-hints-at-xbox-one-cross-platform-play

A cloud-based gaming console would be dysfunctional without the servers if Microsoft decided to stop running or paying for them one fine day.

This would greatly affect the software too, and those who spent their hard-earned money on games.

So to me it sounds okay as a buzzword but I am more interested in the cloud to save some of my files than games. :p
 
I believe they said they are going from 3,000 servers currently to 300,000 for XBL. So by your math, the current XBL platform runs on 100 (one hundred) physical servers. Does that make sense to you?

Not necessarily. That's assuming by servers they mean VMs, and that's assuming they're going by the same 30:1 virtualization ratio I used as an example, which cannot necessarily be assumed. Probably should not be assumed as that's more of a very rough measure to give some idea as to potential consolidation ratios (depending on numerous factors). That's definitely possible though (30:1), depending on the compute, memory, and storage IO needs of the individual VMs. Deployments of any reasonable measure should be thoroughly and properly sized based on measured or expected requirements. That's something that MS would definitely do.

EDIT

Oh, I should also mention that generic virtualization ratios ignore the servers that remain physical! That is, the physical servers needed (or used) to support the virtual infrastructure. It's quite common to see large SQL clusters remain fully physical, for example.

EDIT 2

More than anything I simply wanted to indicate that their 300,000 server number could represent a lot more processing power than people may be thinking and that trying to take that 300,000 and divide it by the number of users, or number of games doesn't make much sense and probably isn't a very good way to look at things (not to mention the dynamic load distribution and balancing you get with virtual infrastructures). The small UCS deployment I referenced is a good example of how much compute you can put in a very small space.
 
Yes, there are many unknowns regarding the cloud.

Microsoft are sceptical about the cloud themselves.

http://ap.ign.com/en/news/6766/microsoft-skeptical-of-cloud-hints-at-xbox-one-cross-platform-play

The article refers specifically to cloud rendering, which to me sounds like the OnLive/Gaikai model of 'cloud' rather than enhancing the local compute resources with cloud-based ones.

Not necessarily. That's assuming by servers they mean VMs, and that's assuming they're going by the same 30:1 virtualization ratio I used as an example, which cannot necessarily be assumed. Probably should not be assumed as that's more of a very rough measure to give some idea as to potential consolidation ratios (depending on numerous factors). That's definitely possible though (30:1), depending on the compute, memory, and storage IO needs of the individual VMs. Deployments of any reasonable measure should be thoroughly and properly sized based on measured or expected requirements. That's something that MS would definitely do.

EDIT

Oh, I should also mention that generic virtualization ratios ignore the servers that remain physical! That is, the physical servers needed (or used) to support the virtual infrastructure. It's quite common to see large SQL clusters remain fully physical, for example.

I think rather than one data center with 300k physical servers in it, it's probably more like 30 data centers with 10,000 servers each (physical boxes on which any number of virtual machines could run). One of the key aspects of OnLive performance was having proximity to a data center within 1,000 miles for optimal latency (they quoted somewhere in the 40-80 ms range). If that's the case, they will need quite a few around the globe, with some level of data synchronization as well. No small task, but that's where all of the major tech services companies are headed.
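The latency constraint is easy to sanity-check: fiber propagation alone eats a meaningful slice of that 40-80 ms budget (the fiber speed below is the usual roughly-two-thirds-of-c rule of thumb):

```python
# Round-trip propagation delay over ~1,000 miles of fiber, ignoring
# routing, queueing, and server processing time.
SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s, roughly 2/3 of c

distance_km = 1600  # ~1,000 miles
round_trip_ms = 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS
print(round_trip_ms)  # 16.0
```

Real paths are not straight lines, and encode/decode plus server time add on top, which is how you get from 16 ms of raw propagation to OnLive's quoted 40-80 ms.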
 
Yeah, server stuff is super expensive. I'm currently looking for a mini server for about 8k euros, and recently got an offer for one with 256 cores for over 35k euros.

As Gradthawn says, a server can be anything. On Amazon's elastic cloud, one EC2 CPU equivalent is basically one 1.6 GHz Opteron core. You specify how much you're going to use it: light, medium or heavy load. Only if you state heavy load is the entire core dedicated to you. In the light case they probably oversubscribe the core by a factor of 4-10.

Amazon's new instances are basically equivalent to 3 GHz Core i7 cores, with 3.25 times the performance of an EC2 CPU equivalent. That means you can have 13 EC2-equivalent "servers" running on a single quad-core i7 blade/server; oversubscribe this by a factor of 5 and a single physical server is 65 virtual ones. A single 1U HP DL120 with an E3-1240 Xeon and 32 GB RAM can do this for less than $2000.
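Spelled out, that arithmetic looks like this (the ratios are exactly the ones quoted above):

```python
# EC2-equivalent "servers" per physical quad-core i7 box.
ec2_per_i7_core = 3.25   # one ~3 GHz Core i7 core ≈ 3.25 EC2 compute units
cores = 4                # quad-core server
oversubscription = 5     # light-load instances shared 5:1

ec2_equivalents = ec2_per_i7_core * cores                  # 13.0
virtual_servers = int(ec2_equivalents * oversubscription)  # 65
print(ec2_equivalents, virtual_servers)  # 13.0 65
```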

Cheers
 
I think rather than one data center with 300k physical servers in it, it's probably more like 30 data centers with 10,000 servers each (physical boxes on which any number of virtual machines could run). One of the key aspects of OnLive performance was having proximity to a data center within 1,000 miles for optimal latency (they quoted somewhere in the 40-80 ms range). If that's the case, they will need quite a few around the globe, with some level of data synchronization as well. No small task, but that's where all of the major tech services companies are headed.

Yeah, I'm going to bet that that number, whatever metric they use to count, is probably representative of all global Live! resources. And in that manner, MS already has the facilities in place from their other hosted services. I don't know the regional distribution of them, however.
 