Server based game augmentations. The transition to cloud. Really possible?

Typical response from you. :rolleyes: If you bother to read it, there is some meaningful info in there worth digesting. For instance, it reveals how many data updates can be handled in a large-scale usage scenario. And it shows an 8-fold increase in the number of objects dynamically modeled compared to what the local box can handle alone.

You try way too hard to downplay anything related to the platform instead of simply digesting it openly and drawing informed conclusions. Read the article again. It'll do ya good.

Yeah, really cool.

8x speedup from a 3x power increase. Whoa, cloud power!

It's useless because we have zero idea what's actually happening behind the scenes with their code and what's being done. It's a typical fluff piece designed to get fanboys to cream their jeans over it when they don't understand it means nothing (it seems to have worked as well).

You also do your best to play up anything XBONE related, even when it's impossible PR fluff (dat superlinear scaling).

I'm sorry if I'm being a little too realistic for you, but this is so obviously just crap that I'm really surprised you can't see it.
 
Where's the tripe? It reads like a simple tech article. It seems like an implementation of MS' Orleans architecture.

It'd be a fine article if it were just on cloud computing, but it's championing it for the XBONE, and they are claiming things which are near impossible (an 8x speed-up with 3x faster hardware, over the internet).

The rest of it is fine.
 
Yeah, really cool.

8x speedup from a 3x power increase. Whoa, cloud power!

It's useless because we have zero idea what's actually happening behind the scenes with their code and what's being done. It's a typical fluff piece designed to get fanboys to cream their jeans over it when they don't understand it means nothing (it seems to have worked as well).

If it had shown cloud doing nothing to improve the scenario's computation you'd be jeering it and telling us how it proved cloud isn't useful. :rolleyes:

You also do your best to play up anything XBONE related, even when it's impossible PR fluff (dat superlinear scaling).

What you assert out of hand to be mere 'PR fluff', the rest of us here take seriously and have been discussing this entire thread. I linked to info about a tech demo. If you are just going to dismiss tech demos, then don't bother posting, as clearly nothing done in this world can possibly satisfy you.

I'm sorry if I'm being a little too realistic for you, but this is so obviously just crap that I'm really surprised you can't see it.

You aren't being realistic, you are being cynical for the sake of downplaying the console, as usual. If you have a case to make, then whatever inherent bias might underlie your thoughts isn't problematic, but when you just sit there sniping it's not constructive. People here wanted to hear more info about cloud applications and tech demos... here is one such tech demo.

It'd be a fine article if it were just on cloud computing, but it's championing it for the XBONE, and they are claiming things which are near impossible (an 8x speed-up with 3x faster hardware, over the internet).

Ha! Ok, so since it references a console you don't like it just has to be a fluff piece. Fascinating. You get a tech demo article and you immediately dismiss it because it paints Xbox in a pleasant light.

Btw, some dynamics applications (like the ones in the tech demo) can indeed see 'superlinear scaling' in improvements. Many large-scale dynamics simulations require certain computational thresholds to be met before the algorithms of the calculations can get meaningful results that distinguish individual objects. You see it all the time in areas like biophysics or computational chemistry, where folks apply density functional theory to recast many-body dynamics problems into forms amenable to simplified computations.

Welcome to science, Captain Dismissive.
 
If it had shown cloud doing nothing to improve the scenario's computation you'd be jeering it and telling us how it proved cloud isn't useful. :rolleyes:



What you assert out of hand to be mere 'PR fluff', the rest of us here take seriously and have been discussing this entire thread. I linked to info about a tech demo. If you are just going to dismiss tech demos, then don't bother posting, as clearly nothing done in this world can possibly satisfy you.



You aren't being realistic, you are being cynical for the sake of downplaying the console, as usual. If you have a case to make, then whatever inherent bias might underlie your thoughts isn't problematic, but when you just sit there sniping it's not constructive. People here wanted to hear more info about cloud applications and tech demos... here is one such tech demo.



Ha! Ok, so since it references a console you don't like it just has to be a fluff piece. Fascinating. You get a tech demo article and you immediately dismiss it because it paints Xbox in a pleasant light.

Btw, some dynamics applications (like the ones in the tech demo) can indeed see 'superlinear scaling' in improvements. Many large-scale dynamics simulations require certain computational thresholds to be met before the algorithms of the calculations can get meaningful results that distinguish individual objects. You see it all the time in areas like biophysics or computational chemistry, where folks apply density functional theory to recast many-body dynamics problems into forms amenable to simplified computations.

Welcome to science, Captain Dismissive.

Extraordinary claims require extraordinary evidence, and these are extraordinary claims; so far they haven't provided any evidence other than 'trust us'. I'd be saying this about any console, because I'm skeptical of such things and I know a lot of the issues behind them. Instead of just blindly accepting things, I question them.

1.5 Mbit internet connection = 192 KB/s
500,000 updates a second
a vector3 per asteroid (12 bytes = 96 bits each)
192 KB/s ÷ 500,000 updates/s = 3.145728 bits per update.
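As a quick sketch, the arithmetic above can be replayed directly (all figures are the post's own assumptions: a 1.5 Mbit/s link and 500,000 updates per second):

```python
# Replaying the post's back-of-the-envelope bandwidth check.
# All numbers are assumptions taken from the post itself.
link_bits_per_sec = 1.5 * 1024 * 1024   # 1.5 Mbit/s = 192 KB/s = 1,572,864 bits/s
updates_per_sec = 500_000
bits_per_asteroid_naive = 12 * 8        # a 12-byte vector3 position per asteroid

bits_per_update = link_bits_per_sec / updates_per_sec
print(bits_per_update)            # 3.145728 bits available per update
print(bits_per_asteroid_naive)    # 96 bits needed for one naive position update
```

The gap between the ~3 bits available and the 96 bits a naive per-asteroid position would need is the 30x shortfall the post is pointing at.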

Whilst I have no problem with them showing off their cloud, it would be nice if they included the disclaimer that it won't work at their advertised internet speeds.

As I said, I have no problem with them showing off their technology, as long as it's realistic and doesn't require 30x the speed they say is required.
 
Extraordinary claims require extraordinary evidence, and these are extraordinary claims; so far they haven't provided any evidence other than 'trust us'.

They demo'd it in front of the press. :???:

I'd be saying this about any console, because I'm skeptical of such things and I know a lot of the issues behind them. Instead of just blindly accepting things, I question them.

1.5 Mbit internet connection = 192 KB/s
500,000 updates a second
a vector3 per asteroid (12 bytes = 96 bits each)
192 KB/s ÷ 500,000 updates/s = 3.145728 bits per update.

Whilst I have no problem with them showing off their cloud, it would be nice if they included the disclaimer that it won't work at their advertised internet speeds.

As I said, I have no problem with them showing off their technology, as long as it's realistic and doesn't require 30x the speed they say is required.

You have no idea what is in those updates, so your calculation is useless. For instance, global gravitational forces don't require 3N values to describe what is happening in the tech demo. In fact, using your naive approach to representing the dynamics would actually require either 6N or 9N values anyhow, depending on the inclusion of the momenta. Alas, that is not necessary. All you need are the gravitational potential wells, which amount to a 4-dimensional mapping of the space. If all they are doing is sending a 4D acceleration field to the local box, the console can easily use that map to govern the motion of vast numbers of objects locally.
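To illustrate the idea (with purely hypothetical numbers; nothing here is from the article): if the server ships a sampled acceleration field, a "4D" array, instead of per-object state, the local box can integrate any number of objects against it, and the data crossing the wire is independent of the object count.

```python
import numpy as np

GRID = 16                                  # assumed field resolution, illustrative only
# The "4D mapping": a 3D grid of 3-vector accelerations, as if received
# from the cloud. Toy data: a uniform pull along -y.
field = np.zeros((GRID, GRID, GRID, 3))
field[..., 1] = -9.8

rng = np.random.default_rng(0)
positions = rng.random((100_000, 3)) * GRID   # many locally simulated objects
velocities = np.zeros_like(positions)

def step(dt):
    # Each object samples the acceleration of the cell it sits in, then
    # integrates locally (semi-implicit Euler). No per-object network data.
    idx = np.clip(positions.astype(int), 0, GRID - 1)
    accel = field[idx[:, 0], idx[:, 1], idx[:, 2]]
    velocities[:] += accel * dt
    positions[:] += velocities * dt

step(1 / 30)   # one frame of purely local simulation
# A full field refresh costs GRID**3 * 3 values, regardless of object count:
print(field.nbytes, "bytes per full field refresh for", len(positions), "objects")
```

The design point: per-object updates scale with N, while a field refresh scales only with the grid resolution, which is why this split favours limited links.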
 
They demo'd it in front of the press. :???:



You have no idea what is in those updates, so your calculation is useless. For instance, global gravitational forces don't require 3N values to describe what is happening in the tech demo. In fact, using your naive approach to representing the dynamics would actually require either 6N or 9N values anyhow, depending on the inclusion of the momenta. Alas, that is not necessary. All you need are the gravitational potential wells, which amount to a 4-dimensional mapping of the space. If all they are doing is sending a 4D acceleration field to the local box, the console can easily use that map to govern the motion of vast numbers of objects locally.

And we still have no details about how it works or anything like that. All I see is them (probably on their nice fast internet connection) showing something off. How many tech demos have we seen that didn't in any way, shape or form represent the final product?

Also, how big are these 4D acceleration fields? Because they'll have to be mighty small to be sent 500,000 times a second; on the order of less than a byte each, if my maths is correct.

EDIT: Maths is wrong.

They have a max of 3.145728 bits / update. There is 0 chance this is being run on anything representative of the XBONE requirements.
 

Thanks... but wow... this is not how NAT works...

NAT has 4 key points:

global inside address
global outside address
local inside address
local outside address

A table that maps those 4 allows for any permutation of NAT. What complicates NAT is when you use PAT (port address translation) and one or more of those 4 addresses becomes a wildcard/dynamic entry; it means the translation can only be initiated in one direction.
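A toy sketch of that four-address table (the addresses are made-up documentation-range examples, and ports/PAT are deliberately left out):

```python
# Toy NAT translation table along the lines described above.
# Addresses are illustrative examples, not from the post.
nat_table = [
    {
        "local_inside":   "10.0.0.5",        # host's real private address
        "local_outside":  "203.0.113.9",     # peer as seen from inside
        "global_inside":  "198.51.100.1",    # host as seen from outside
        "global_outside": "203.0.113.9",     # peer's real public address
    },
]

def translate_outbound(src, dst):
    """Rewrite an inside->outside packet's addresses via the table."""
    for entry in nat_table:
        if entry["local_inside"] == src and entry["local_outside"] == dst:
            return entry["global_inside"], entry["global_outside"]
    # No static mapping: with PAT a dynamic entry would be created here,
    # which is why such translations can only be initiated from inside.
    return None

print(translate_outbound("10.0.0.5", "203.0.113.9"))
# -> ('198.51.100.1', '203.0.113.9')
```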

What they have done there is just make up some stuff to give someone some kind of idea of how NAT might work.
 
And we still have no details about how it works or anything like that. All I see is them (probably on their nice fast internet connection) showing something off. How many tech demos have we seen that didn't in any way, shape or form represent the final product?

So you are dismissing it simply because it's a tech demo. Ooook.

Also, how big are these 4D acceleration fields? Because they'll have to be mighty small to be sent 500,000 times a second; on the order of less than a byte each, if my maths is correct.

You most likely aren't bringing in the entire vector field with each update. There'd be no reason to. Just enough to be able to view whatever is in some viewing window. You aren't going to be viewing individual asteroids at a scale where your window fits the entire 35k lightyears inside it.
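The window idea is easy to make concrete (the world and window sizes below are invented for illustration): the full field lives server-side, and each update ships only the patch covering the view.

```python
import numpy as np

WORLD = 1024    # assumed server-side field resolution (2D here, for simplicity)
WINDOW = 32     # assumed portion visible in the client's viewing window

# The full field of 3-vector accelerations stays in the cloud.
world_field = np.zeros((WORLD, WORLD, 3), dtype=np.float32)

def patch_for_window(cx, cy):
    # Only the slab covering the camera's window needs to cross the wire.
    half = WINDOW // 2
    return world_field[cx - half:cx + half, cy - half:cy + half]

patch = patch_for_window(512, 512)
print(patch.nbytes, "bytes per update vs", world_field.nbytes, "for the full field")
```

With these toy sizes, a per-window patch is roughly a thousandth of the full field, which is the whole point of only streaming what the window can see.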

EDIT: Maths is wrong.

They have a max of 3.145728 bits / update. There is 0 chance this is being run on anything representative of the XBONE requirements.

You aren't in any position to make such assertions. :rolleyes:
 
So you are dismissing it simply because it's a tech demo. Ooook.



You most likely aren't bringing in the entire vector field with each update. There'd be no reason to. Just enough to be able to view whatever is in some viewing window. You aren't going to be viewing individual asteroids at a scale where your window fits the entire 35k lightyears inside it.



You aren't in any position to make such assertions. :rolleyes:

Yes I am, unless you can show me an update method that only requires 3 bits per update, for however many thousands of asteroids. Because if you can't, then it's clear they're using something a lot faster than 1.5 Mbit.

Those 3 bits, BTW, aren't per asteroid; they're for the entire field.

But I guess you'll continue to believe there's some voodoo magic going on in the background. I've come to expect this from you (dual APUs, ray-tracing chips, TBDR).
 
Well... not that I think that makes it any less "tech demo-y", but there's maybe no need to update once a frame. Generate the field, leave a "simple" simulation running locally, and keep the "big stuff" in the cloud. Then just update the field's necessary points when needed (collisions inside the field, for example).

Still makes it... hard to imagine this use case for anything game related, though. Or to put it differently: if there's a good case for precalculating it, do it and stream it off the disc. Why waste precious real-time computing power on something that doesn't need it?
 
I mostly agree with Betanumerical on this - it's not a decent piece but a lot of fluff leaving a lot to the imagination. A representation of 330000 asteroids is being extrapolated to modelling complex worlds, which is unrealistic. That's like showing an old-school physics demo of a zillion colliding boxes and then extrapolating that a dozen characters will have perfect cloth simulations. I don't doubt the truth of what they are doing, but I do doubt the relevance. It's a selected workload providing an ideal-case, not representative of real game workloads on the whole IMO. The fact that it's dressed up with nonsense PR claims like "infinite power of the cloud" should make it pretty clear what the purpose is.

If we look at the technical accomplishments, the cloud needs to send only positional data to the client (the fact it's an XB1 is immaterial to the discussion). As the objects are free-roaming in space, that's pretty simple data that I'm sure can be compressed in impressive ways. Simulating other data that requires more info to represent it locally is going to hit the BW barrier. Any character info will need position, rotation, and animating reference, and probably behaviour too so the local console can continue the simulation in realtime. We're left with what we were saying earlier, that the cloud can run simulations of the larger world and just provide (ahead of requirement) details on the immediate locale for the console to pick up local simulation.

The article has this comment from Henshaw:

"Game developers are building games that have bigger levels than ever before. In fact, game developers can now create persistent worlds that encompass tens or hundreds of thousands of players without taxing any individual console, and those worlds that they built can be lusher and more vibrant than ever before because the cloud persists and is always there, always computing," Henshaw said.
And once again, it's talking about persistent worlds in the cloud with squillions of players, and not realtime complex data like particle physics simulations being streamed to the box (which could probably be best served sent as a video feed and composited). That's no different to server-based MMOs, except that the worlds are distributed over compute nodes instead of a single rack machine.

Just done a little research and found this about Azure at number 165 in the Top500:
http://www.top500.org/system/177982

Assuming XB1's cloud is Azure, the server setup is (as of November, so this could have changed a lot):
Cluster Platform SL230s Gen8, Xeon E5-2670 8C 2.600GHz, Infiniband QDR
Rpeak is 167 TFLOPS. So the 'infinite power of the cloud' is ~140 XB1s. MS may have increased that massively; 100x more servers would be 14,000 XB1s. Clearly the available resources for each console are going to be very limited, although that's not really the topic here. The topic is what workloads can fit. We just have to be wary about taking PR presentations as a point of reference. The real people to ask are those working in cloud computing now who don't have an agenda to sell their CE product.
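For reference, the "~140 XB1s" figure can be reproduced with a per-console estimate of roughly 1.2 TFLOPS (that per-console number is an assumption used only to reproduce the estimate, not a figure from the article):

```python
# Reproducing the rough "cloud vs console" comparison above.
azure_rpeak_tflops = 167    # the Top500 entry cited above
xb1_tflops = 1.2            # assumed per-console peak, for comparison only

consoles_equivalent = azure_rpeak_tflops / xb1_tflops
print(round(consoles_equivalent))   # ~139
```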
 
Exactly. There is no more P2P on Xbox Live. They are saying all online games will have dedicated servers. That's a pretty big deal for people who are into online play.

Amen... I'm stunned this doesn't get more attention. This might sound like an exaggeration, but this literally changes everything. Certain games that I enjoyed playing, but deliberately restricted to offline play the large majority of the time, where the experience and gameplay were far superior and to my satisfaction, will be given a whole new life and potential that they simply never had before without dedicated servers. NBA 2K and FIFA are two game franchises I have an incredible amount of fun with. But those two games with dedicated servers move into entirely new territory for me in terms of raw value.

Games like Titanfall or Destiny seem far more viable and a much bigger deal than they might otherwise have been without dedicated server support. Look at The Division, yet another game with dedicated servers (Ubisoft is setting up its own servers for both PS4 and XB1), and you start to see the potential for games that literally live online to become the dominant and possibly best-selling games this generation. Consider everything we saw from Titanfall; now imagine trying to go the P2P route with that game. It feels almost like a comeback for the popular arena titles of the past, but with far bigger things in store. Games like Destiny, Titanfall, The Division, and Watch Dogs, among others, seem especially well positioned. Not sure what cloud computing will mean, but the safe bet so far seems to be solid dedicated server support for most games, which is already a big win. After that, everything else is just gravy if it pans out.
 
I don't doubt the truth of what they are doing, but I do doubt the relevance. It's a selected workload providing an ideal-case, not representative of real game workloads on the whole IMO.

Are you guys new to tech demos or something? They always include PR jibber jabber. Those can be ignored, as always. The point is that there is good info there. We now have an idea of how many updates such a tech demo is sending in each second, even if we still dunno how much data is being represented or the connection being used. We have an idea of what kinda scaling is possible in the cloud and we have a better idea of how the workload is being split up: big workloads in cloud and streamed in gradually based on what is in viewing window.

If we look at the technical accomplishments, the cloud needs to send only positional data to the client (the fact it's an XB1 is immaterial to the discussion). As the objects are free-roaming in space, that's pretty simple data that I'm sure can be compressed in impressive ways. Simulating other data that requires more info to represent it locally is going to hit the BW barrier.
They are most likely sending nothing but updates to the local gravitational acceleration field. That's a spatial distribution only. The local box can then run the physics based on the incoming field data for the viewing window.

Any character info will need position, rotation, and animating reference, and probably behaviour too so the local console can continue the simulation in realtime. We're left with what we were saying earlier, that the cloud can run simulations of the larger world and just provide (ahead of requirement) details on the immediate locale for the console to pick up local simulation.
Right...and this is a demo of the scale that is an upper limit on what can be done with persistent physical worlds. Hence, it is useful information. Character AI and anims are totally different. This demo doesn't show those aspects. It wasn't meant to.

And once again, it's talking about persistent worlds in the cloud with squillions of players, and not realtime complex data like particle physics simulations being streamed to the box (which could probably be best served sent as a video feed and composited). That's no different to server-based MMOs, except that the worlds are distributed over compute nodes instead of a single rack machine.
Persistent worlds (not NPCs; rather, the environments) can be animated using physics. Calculations can be ongoing in the cloud, with results streamed in when the player gets close enough to actually see them. That is exactly what is going on here. Sure, you won't have a great game if all the box can do is use its computing muscle to move rocks around in space and nothing else...

Nobody is talking about real-time particle FX. Why are you bringing up aspects that nobody ever suggested were being done, as if that is somehow detrimental to the purpose of the tech demo? You're reaching.

Assuming XB1's cloud is Azure, the server setup is (as of November, so this could have changed a lot):

Rpeak is 167 TFLOPS. So the 'infinite power of the cloud' is ~140 XB1s. MS may have increased that massively; 100x more servers would be 14,000 XB1s. Clearly the available resources for each console are going to be very limited, although that's not really the topic here. The topic is what workloads can fit. We just have to be wary about taking PR presentations as a point of reference. The real people to ask are those working in cloud computing now who don't have an agenda to sell their CE product.
We already know what resources are available in the cloud: three X1 CPUs' worth, and three times the RAM allotment, per real-world console. It's stupid to conflate a tech demo with a 'PR presentation' just to dismiss it. The presentation is immaterial. What matters is the tech and what it is doing. This tech demo gives us a starting point, an upper limit for what can be computed in the cloud in terms of scaling physics-based actions to be rendered locally.




Yes I am, unless you can show me an update method that only requires 3 bits per update, for however many thousands of asteroids. Because if you can't, then it's clear they're using something a lot faster than 1.5 Mbit.

Those 3 bits, BTW, aren't per asteroid; they're for the entire field.

But I guess you'll continue to believe there's some voodoo magic going on in the background. I've come to expect this from you (dual APUs, ray-tracing chips, TBDR).

1) I NEVER said any of the bolded. Stop lying about what others are saying. If you're a compulsive liar, then lemme know ahead of time and I'll add you to the ignore list. This is now the 3rd time I've had to correct you on this.

2) You are asserting that none of the tech demo actually happened, even though it was done in front of the press. That's cute. Give me proof. Extraordinary claims require extraordinary evidence... all you've shown for evidence thus far is that you have no idea what you are talking about or what is happening in such a tech demo. Hardly grounds for a compelling argument to support your assertions and numerous accusations.

3) What is being 'updated' isn't the entire global field. It's just what is shown in the viewer. Also, the starting point for your lil 'math' there is the assumption of a 1.5 Mbit/s connection (because that's what MS said is useful). They NEVER said that was enough to do a tech demo or anything related to the cloud computing aspects of the platform. Remember, this isn't latency-sensitive stuff. It takes the user time to move through asteroid fields and traverse 35k ly's worth of space in the demo. During the time it takes the user to move, more data is streaming in.

4) It's rather arrogant to presume that since YOU don't understand how it might work off the top of your head, therefore the tech demo that devs spent however much time making must therefore be completely fake. You don't know as much as you've convinced yourself you do.
 
Are you guys new to tech demos or something?
No, and that's precisely the point. Public tech demos exist to sell a tech. They are a marketing tool, that present a best-case for your product.

The point is that there is good info there. We now have an idea of how many updates such a tech demo is sending in each second.
How is that good information? We've no idea of the size of that information, or how it's grouped. It could be, for all we know, 500 packets of 100 updates, or 500,000 individual messages. We've no details on the internet connection type, so no knowledge of what an update means. If we take a 5 megabit connection, that's 500,000 10-bit updates. What are they representing with those 10 bits (or did they have 100 bits per update to play with)? What does that tell us about cloud computing? Please explain the insight this is giving you, because I'm sure not seeing it! ;)
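To underline how much the unknown link speed matters: the per-update bit budget is just link speed divided by update rate, so the 500,000/s figure alone pins down nothing (the speeds below are arbitrary examples):

```python
# Per-update bit budget at a few assumed connection speeds.
# Without knowing the link, the update rate fixes nothing.
updates_per_sec = 500_000

for mbit in (1.5, 5, 100):
    bits_per_update = mbit * 1_000_000 / updates_per_sec
    print(f"{mbit:>5} Mbit/s -> {bits_per_update:g} bits per update")
# 1.5 Mbit/s -> 3, 5 Mbit/s -> 10, 100 Mbit/s -> 200
```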
 
No, and that's precisely the point. Public tech demos exist to sell a tech. They are a marketing tool, that present a best-case for your product.

Why is that somehow 'useless' to the purposes of the discussion this thread is premised upon? We've been trying to establish boundaries for best/worst case scenarios here to build discussion from as a complement to the PR claims and info from devs. Having a best case scenario isn't something to spin as bad news for a topic like this one.

How is that good information? We've no idea of the size of that information, or how it's grouped. It could be, for all we know, 500 packets of 100 updates, or 500,000 individual messages. We've no details on the internet connection type, so no knowledge of what an update means. If we take a 5 megabit connection, that's 500,000 10-bit updates. What are they representing with those 10 bits (or did they have 100 bits per update to play with)? What does that tell us about cloud computing? Please explain the insight this is giving you, because I'm sure not seeing it! ;)

Yes, we don't know all the details. My point is that just because we don't have all the details nor the tech demo running in front of us we shouldn't just ignore the info that we ARE given. Doing so is just dumb.

Thus far all you are doing is dismissing it simply because you don't have all of the details. That is neither rational nor conducive to the thread YOU started, Shifty. Is it really so difficult to just say, "Hmmm... not sure how they are doing this and I'd like to know more before discussing it"? Come on now.
 
Yes, we don't know all the details. My point is that just because we don't have all the details nor the tech demo running in front of us we shouldn't just ignore the info that we ARE given. Doing so is just dumb.

What info?!
All we have is just PR talk from MS.
DF itself could not find solid info about MS's cloud.

Thus far all you are doing is dismissing it simply because you don't have all of the details. That is neither rational nor conducive to the thread YOU started, Shifty. Is it really so difficult to just say, "Hmmm... not sure how they are doing this and I'd like to know more before discussing it"? Come on now.

Frankly, we have produced better arguments and ideas about how to use the cloud than MS has.
 
I've listened to some podcasts now, and basically all devs so far agree that the features microsoft offers are as useful as Live was back in the day for taking matchmaking and such out of the hands of devs. There are server side resources that can be used for anything, like hosting online matches, processing and serving data while the console/game itself is online, and anything else a crazy dev comes up with. You don't have to buy or provide these resources, they are available per player on MS's platform, and supported in the SDK by default.

Incidentally, Uncharted used Amazon for multiplayer stuff, so I take it there is at least one more player besides Microsoft and Google. And Sony may want to build this with Gaikai, but clearly they are behind in this aspect. Will be interesting to see how it plays out.
 
Yes, we don't know all the details. My point is that just because we don't have all the details nor the tech demo running in front of us we shouldn't just ignore the info that we ARE given. Doing so is just dumb.
Okay, ignore the dismissing it for PR purposes, and please explain what this is showing us. I'm happy to learn what this is showing us. As I say, I'm not seeing any useful information. Now's your chance to educate me. ;)
 
Okay, ignore the dismissing it for PR purposes, and please explain what this is showing us. I'm happy to learn what this is showing us. As I say, I'm not seeing any useful information. Now's your chance to educate me. ;)

I did so once already. We have an 8-fold improvement in the number of physics-based objects that can be simulated with 4 times the processing power (relative to the local box). While we dunno the connection speed, we DO know that it worked seamlessly according to the press reports (there are many of them, not just the one I linked to).

We also have an idea for how it was implemented, as another story on the tech demo noted that you couldn't see outside of a fixed view window, which tells us they were almost certainly streaming in that data gradually over time...that makes it possible to do everything the press says they did in the demo even without a massive pipe to send data through. It's something I noted a while back too. If you bring the data in gradually, you can do a lot since it takes the user/player time to traverse a game's world in order to even see stuff that has been streamed in.


What info?!
All we have is just PR talk from MS.
DF itself could not find solid info about MS's cloud.

Frankly, we have produced better arguments and ideas about how to use the cloud than MS has.

How do you know what devs are doing? And DF's "analysis" was worthless. It totally missed the entire point (offloading frees up local resources) and spent paragraph after paragraph trying to compare MS's approach to the challenges of Sony's, which is stupid for obvious reasons.

The new info here is that there really is a helluva lot of computational benefit to doing the heavy lifting in the cloud. Gaining an 8-fold improvement in the number of objects being simulated is quite impressive. It gives us an upper bound to work downwards from.

Or we can just ignore any and all new info and continue wandering through the topic blindly as we have thus far. Because that's a totally reasonable approach for onward discussion. :???:
 