Server-based game augmentations. The transition to the cloud. Really possible?

We have an 8-fold improvement in the number of physics-based objects that can be simulated with 4 times the processing power
:???:

Huh? How is this even possible? An 8-fold improvement with only 4 times the power? Did you write it down wrong? Is that really what it said?

The only way I know of to achieve superlinear scaling is caching effects... but then we're talking about 10, maybe 20% at most on top of linear scaling.

So what is going on?
 
Huh? How is this even possible? An 8-fold improvement with only 4 times the power? Did you write it down wrong? Is that really what it said?

Correct...it went from having the local box able to handle ~40k objects by itself to the box + cloud handling ~320k.

The only way I know of to achieve superlinear scaling is caching effects... but then we're talking about 10, maybe 20% improvement at most.

So what is going on?

Not necessarily; you can actually do it quite easily, depending on what the application is and what approach/algorithm is being used. For instance, there are ways to model the handling of physics-based objects as a collection of force fields that interact with one another. Once the local box drops tons of objects into these force fields, the output is a whole bunch of objects moving around and rendered locally. The arrangement of deterministic force fields is what gets handled in the cloud. That is something that can be exploited for a demo like this with gravitationally governed asteroids.
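
To make that concrete, here's a minimal sketch of the split, under structures I've made up myself (the `FieldSource` layout, update cadence, and all constants are hypothetical, not anything from the demo): the cloud periodically recomputes the few heavy gravitational sources, and the console just integrates each local object against the last field description it received.

```python
import math

G = 6.674e-11  # gravitational constant

class FieldSource:
    """One heavy body (sun, planet, large planetoid) dominating the field."""
    def __init__(self, x, y, mass):
        self.x, self.y, self.mass = x, y, mass

def cloud_update_fields(t):
    """Server side (expensive, low rate): recompute the deterministic
    arrangement of heavy sources and ship it down every so often."""
    # Toy example: a central star plus one orbiting planet.
    return [FieldSource(0.0, 0.0, 2e30),
            FieldSource(1.5e11 * math.cos(t), 1.5e11 * math.sin(t), 6e24)]

def local_step(objects, fields, dt):
    """Client side (cheap per object, every frame): integrate asteroids
    against the last-received field. Cost is O(N * num_fields)."""
    for obj in objects:            # obj = [x, y, vx, vy]
        ax = ay = 0.0
        for f in fields:
            dx, dy = f.x - obj[0], f.y - obj[1]
            r2 = dx * dx + dy * dy + 1e6   # softened to avoid divide-by-zero
            r = math.sqrt(r2)
            a = G * f.mass / r2
            ax += a * dx / r
            ay += a * dy / r
        obj[2] += ax * dt
        obj[3] += ay * dt
        obj[0] += obj[2] * dt
        obj[1] += obj[3] * dt

fields = cloud_update_fields(t=0.0)   # would arrive over the network
rocks = [[2.0e11, 1.0e10 * i, 0.0, 2.0e4] for i in range(5)]
local_step(rocks, fields, dt=60.0)
```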

Also, it's important to note that the number of objects in the viewing window most likely was fixed. So as the user moved around new adjustments to the local physics parameters could have been streamed in and run locally thereafter. During the time it takes the user to traverse the region of space in the demo, depending on how fast the movement was, you'd presumably have loads of time before even needing to stream in new data.

I had noted this flexibility earlier in the thread. Since it takes players time to move within interaction range of objects or events triggered in the game world, you can tolerate significant delays, which is equivalent to having severalfold "extra" processing beyond what you'd normally expect when comparing the results to a real-time, locally calculated solution.

If it takes me 10 seconds to walk across the terrain to get to a building in BF4, that building's collapse can be handled in the cloud and spread out and streamed into local memory across those 10 seconds. If I planted C4 or some sort of time-detonated bomb, same deal. Having a time window automatically means you are capable of seeing nonlinear computing results... not because the processors in the cloud are special, but because they have way more time to deal with a workload compared to a local box's real-time, "instant" handling of physics-based stuff like that.
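
As a toy illustration of that window (all chunk sizes, counts, and timings here are invented, not from any actual engine): the server dribbles out a precomputed collapse as keyframe chunks across the whole travel time, so the per-second bandwidth stays tiny and everything is resident locally before the player arrives.

```python
import queue, threading, time

def serve_collapse_chunks(out_q, chunks, window_seconds):
    """Server side: spread the precomputed result across the whole window
    instead of bursting it, keeping the per-second bandwidth tiny."""
    interval = window_seconds / len(chunks)
    for chunk in chunks:
        out_q.put(chunk)
        time.sleep(interval)

def client_prefetch(in_q, local_buffer, n_chunks):
    """Client side: buffer chunks as they trickle in; by the time the player
    has walked the 10 seconds to the building, the animation is all local."""
    for _ in range(n_chunks):
        local_buffer.append(in_q.get())

chunks = [f"collapse_keyframes_{i}" for i in range(10)]  # placeholder payloads
q, buf = queue.Queue(), []
threading.Thread(target=serve_collapse_chunks, args=(q, chunks, 10.0)).start()
client_prefetch(q, buf, len(chunks))
print(f"{len(buf)} chunks resident before the player arrives")
```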
 
DF's "analysis" was retarded. It totally missed the entire point (offloading frees up local resources) and spent paragraph after paragraph trying to compare MS's approach to the challenges of Sony's, which is stupid for obvious reasons.

The new info here is that there really are a helluva lot of computational benefits to doing the heavy lifting in the cloud. Gaining an 8-fold improvement in the number of objects being simulated is quite impressive. It gives us an upper bound to work downwards from.

Or we can just ignore any and all new info and continue wandering through the topic blindly as we have thus far. Because that's a totally reasonable approach for onward discussion. :???:

At least DF didn't claim an 8x speedup from 4x the power.

The other way to think about it is that they are probably providing more than 4x the speed from the cloud in the demonstration; I'd say they are using closer to 10x the power (and losing ~2x to inefficiencies and imperfect scaling). This is much more realistic, but it's also more than double what they are going to provide for the end product.

To be perfectly honest, as soon as these words are uttered in an article

"the cloud brings infinite additional processing power."

it's pretty damn obvious what its purpose is.


If it takes me 10 seconds to walk across the terrain to get to a building in BF4, that building's collapse can be handled in the cloud and spread out and streamed into local memory across those 10 seconds. If I planted C4 or some sort of time-detonated bomb, same deal. Having a time window automatically means you are capable of seeing nonlinear computing results... not because the processors in the cloud are special, but because they have way more time to deal with a workload compared to a local box's real-time, "instant" handling of physics-based stuff like that.

We have been over this hundreds of times with you: you cannot do physics that requires interaction with the player in the cloud, for a myriad of reasons, so for the love of god stop harping on about it.
 
If it takes me 10 seconds to walk across the terrain to get to a building in BF4, that building's collapse can be handled in the cloud and spread out and streamed into local memory across those 10 seconds. If I planted C4 or some sort of time-detonated bomb, same deal. Having a time window automatically means you are capable of seeing nonlinear computing results... not because the processors in the cloud are special, but because they have way more time to deal with a workload compared to a local box's real-time, "instant" handling of physics-based stuff like that.

What about latency? What about QoS? What happens to the building if a lot of packets are dropped? Why add all that complexity when you could, at that point, just have the game run remotely and stream the output to the console?

Doing any kind of latency-sensitive operation remotely just adds pointless engineering complexity and a greater chance of failure, ultimately leading to a worse experience for the end user.

All this cloud business is just a way to keep a storage database and run simple scripts, just like any PC MMO does. The only difference is that they will let developers lease that infrastructure for their software.

It's all marketing BS.
 
Correct...it went from having the local box able to handle ~40k objects by itself to the box + cloud handling ~320k.
Problem is, we have no clue what this program is actually doing. "Track the orbital velocity of 40,000 asteroids in space" or "take the number of asteroids from 40,000 to 330,000, and any device doing the computational math to realistically chart the orbital velocity of 330,000 asteroids in real time" tells us nothing about the computational effort involved.

As I assume the XBoxOne can't track the asteroids directly with its Kinect camera, it probably does some kind of n-body simulation. But when dealing with asteroids in a solar system, I would be surprised if a full O(N²)-scaling force calculation were done. For high particle numbers, a computationally much cheaper treecode (which scales with N*logN) or some clever hybrid between mesh- and treecode would be preferred, I think (especially as one has relatively few heavy objects [sun, planets, moons, and the larger planetoids]).

But as gravitation is quite a weak force, one can actually also pull off a real-time simulation with exact force calculations and 330,000 asteroids locally on the XBox with reasonably high precision, which is determined by the size of the time step. I wouldn't want to launch a spaceprobe worth hundreds of millions of dollars based on that calculation, but it shouldn't end up too far from the intended course, i.e. visually indistinguishable on a FullHD screen depicting the solar system (a pixel would be ~10 million kilometers :LOL:). Considering the length scales involved in the solar system, the real-time movements are quite slow after all.
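
For a sense of the numbers, here's a toy direct-sum step; it's a sketch of the textbook technique, not anything from the demo. At N = 330,000 the O(N²) version is ~10¹¹ pair interactions per step, which is exactly why a Barnes-Hut treecode (O(N*logN), replacing the inner sum with a traversal of a spatial tree that lumps distant bodies together) or a mesh/tree hybrid would be the usual choice.

```python
import numpy as np

G = 6.674e-11

def direct_sum_step(pos, vel, mass, dt, soft=1e7):
    """One naive O(N^2) gravity step. pos, vel: (N, 3); mass: (N,).
    Purely illustrative; a treecode would approximate the far field."""
    d = pos[None, :, :] - pos[:, None, :]          # pairwise displacements
    r2 = (d ** 2).sum(axis=-1) + soft ** 2         # softened squared distance
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                  # no self-interaction
    acc = G * (d * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)
    vel += acc * dt
    pos += vel * dt
    return pos, vel

# Tiny usage example (a few hundred bodies, not 330k -- the (N, N) arrays
# above would need terabytes of RAM at N = 330,000, which makes the point).
rng = np.random.default_rng(0)
N = 300
pos = rng.normal(scale=1e12, size=(N, 3))
vel = np.zeros((N, 3))
mass = rng.uniform(1e15, 1e20, size=N)
pos, vel = direct_sum_step(pos, vel, mass, dt=3600.0)
```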

Without knowing what they actually showed, it's hard to judge what it's worth. As you see, one can also argue that they just used awful code to start with and one doesn't need the cloud at all. ;)

And as others have mentioned already, the 500,000 updates per second from the cloud are highly suspicious. If those are just the coordinates of the asteroids calculated in the cloud, that's an update frequency of ~1.5 Hz (500,000 updates / 330,000 particles). One may want to interpolate between updates locally if it's supposed to be a fluid animation. Anyway, such an update would be minimally 12 bytes (position only) or even 24 bytes (position + velocity, to interpolate more correctly between updates). That amounts to 6 or 12 MB/s of needed bandwidth. The latter is basically the theoretical maximum of a 100 MBit line. Did they do a bandwidth test on a 100 MBit ethernet port? Really? :rolleyes:
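
Spelling out that arithmetic (the 12/24-byte payloads are my assumption of 3 or 6 single-precision floats, nothing official):

```python
# Back-of-envelope check of the update-rate math above.
updates_per_s = 500_000
particles = 330_000

update_hz = updates_per_s / particles      # ~1.52 Hz per asteroid
pos_only = updates_per_s * 12 / 1e6        # MB/s if an update is 3 floats
pos_vel  = updates_per_s * 24 / 1e6        # MB/s if it's position + velocity

print(f"per-asteroid update rate: {update_hz:.2f} Hz")
print(f"bandwidth: {pos_only:.0f} MB/s (pos) or {pos_vel:.0f} MB/s (pos+vel)")
print(f"a 100 MBit line tops out at {100 / 8:.1f} MB/s")
```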
 
I'm going to have to side more with astrograd on this issue.

I don't see why it is believed that a jump from 40,000 to 300,000 asteroids being tracked is impossible. What if, hypothetically, the Xbox One is using 90% of its CPU and GPU power on processing what is on screen and other local game effects? Perhaps you only have 10% of the CPU left over to process the remaining universe of asteroids. If you have the equivalent of 3 Xbox One CPUs in the cloud available just to crunch asteroid movement, then you now have way more than 3 times the compute resources available to track asteroids. This hypothetical situation gives you 30 times more!
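
Spelled out (every number here is the hypothetical above, not a measurement):

```python
# The hypothetical made explicit; all numbers are assumed, not measured.
local_for_asteroids = 0.10   # 10% of one XB1 left over after local rendering
cloud_for_asteroids = 3.0    # three XB1-equivalents dedicated to asteroids

extra = cloud_for_asteroids / local_for_asteroids
print(f"the cloud adds {extra:.0f}x the local asteroid budget")  # 30x
```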

The change in the number of asteroids tracked should not be the issue here. Also, you only need position/velocity updates for the few asteroids in your view.
 
Also, the 500,000 updates per second is clearly the updated state in the cloud. Why would all these updates need to be delivered to the Xbox One? That is not the point. The idea of the demo is to show the possibility of persistent worlds.
 
I did so once already. We have an 8-fold improvement in the number of physics-based objects that can be simulated with 4 times the processing power (relative to the local box). While we dunno the connection speed, we DO know that it worked seamlessly according to the press reports (there are many of them, not just the one I linked to).
Okay, you've explained the obvious part of the article. What does that tell us about cloud computing for gaming? That we can process more in the cloud than on the local console? (AFAICS your 8-fold improvement in complexity with 4x the cloud power comes from assuming they are using 4x XB1s in the cloud, which isn't mentioned anywhere in this demo. We've no idea how much cloud power was being used to run the simulation; the thing with the cloud is that it works like that, you just throw problems at it; it's not compartmentalised into fixed hardware allocations for clients.) That's kind of a no-brainer. The subject of this thread and of every investigation has been what actual game workloads that is going to allow us to offload.

...that makes it possible to do everything the press says they did in the demo even without a massive pipe to send data through. It's something I noted a while back too. If you bring the data in gradually, you can do a lot since it takes the user/player time to traverse a game's world in order to even see stuff that has been streamed in.
Everyone else has mentioned that, even your hated DF article. That's not new info.

The new info here is that there really are a helluva lot of computational benefits to doing the heavy lifting in the cloud.
That was never the concern. The concern is what types of workloads you can run that actually contribute to a game. We've already talked about persistent worlds and background tasks. I don't see what new info this demo is bringing. It's showing that everything we've said so far is true?
 
I don't see why it is believed that a jump from 40,000 to 300,000 asteroids being tracked is impossible.
Of course it's possible! MS are showcasing it for real. No-one's doubting the reality of the demo. It's what it means and how it relates to games that's under question, because modelling asteroids is very different to other workloads you'd want in a game, and this demo hasn't done anything to describe how the bandwidth limit is dealt with.

So, for example, for you or Astrograd to explain: how can the power of the cloud simulate 43 trees and four flags blowing in the wind (which it can certainly do) and get that animated information to the console in 40-150 kbs a frame? Or, if the cloud's not for realtime graphics like this, what new info is the asteroid demo giving us, when all along we've been saying that the cloud can do background simulation and stream the content in non-realtime to the console to use?
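
To see how tight that squeeze is, here's a back-of-envelope estimate; the skeleton and cloth sizes are invented for illustration, and I'm reading "kbs" as kilobits:

```python
# Rough feasibility check; every parameter is an assumption for illustration.
trees, flags = 43, 4
bones_per_tree = 40        # assumed skeleton size for a swaying tree
verts_per_flag = 200       # assumed cloth grid per flag
bytes_per_bone = 16        # one quaternion: 4 floats
bytes_per_vert = 12        # one position: 3 floats

needed = (trees * bones_per_tree * bytes_per_bone
          + flags * verts_per_flag * bytes_per_vert)
budget_low, budget_high = 40_000 // 8, 150_000 // 8  # kilobits -> bytes

print(f"needed: ~{needed:,} bytes/frame")                     # ~37,120
print(f"budget: {budget_low:,}-{budget_high:,} bytes/frame")  # 5,000-18,750
```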
 
Also, the 500,000 updates per second is clearly the updated state in the cloud. Why would all these updates need to be delivered to the Xbox One? That is not the point. The idea of the demo is to show the possibility of persistent worlds.
That's not what this Henshaw guy claimed: "We have about 500,000 updates per second coming from our global computing cloud down to this Xbox One".
Anyway, if they just stream some orbital data for the nearest asteroids (those in view) from a large database (it's easy to precalculate, and actually way more efficient if it's supposed to be used as game content for a lot of players), then it immediately gets quite boring.
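That's cheap because an undisturbed orbit compresses to a handful of constants. A sketch under that assumption (circular, coplanar orbits for brevity; real Keplerian elements add eccentricity, inclination, and so on): stream a few floats once per asteroid, then evaluate positions locally every frame with no further updates.

```python
import math

def position_at(t, radius, period, phase):
    """Position on a circular orbit at time t; entirely local, no streaming."""
    angle = phase + 2.0 * math.pi * t / period
    return radius * math.cos(angle), radius * math.sin(angle)

# One-time payload per asteroid: radius, period, phase = 3 floats = 12 bytes.
asteroids = [(2.8e11, 1.3e8, 0.7), (3.1e11, 1.5e8, 2.1)]  # made-up elements
for r, p, ph in asteroids:
    print(position_at(t=86_400.0, radius=r, period=p, phase=ph))
```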
 
Anyway, if they just stream some orbital data for the nearest asteroids...
I'm not sure about that:

The first tech demo Henshaw had on display showed a ridiculous number of pink dots floating on the screen. Those dots, it turned out, represented 40,000 asteroids. According to Henshaw, the Xbox team approached NASA in order to snag some information regarding the galaxy, tracking planets, asteroids, and more out to 3,500 light years from our sun. This particular demonstration had the single Xbox One handling 40k asteroids, their speed and trajectory in real time.

At the push of a button, another 290,000 dots littered the screen. Henshaw said that the Xbox One housed computing power equal to more than 10 Xbox 360s, which is how it was originally tracking 40,000 asteroids at once. Utilizing the cloud, however, those computations are dramatically increased.
Sounds like there's no object culling but every object's in view as a dot. Which does bring into question the nature of the updates. How are those 300k dots being represented in the data from the cloud? That's the info that would actually make this demo interesting!
 
I'm not sure about that:

Sounds like there's no object culling but every object's in view as a dot. Which does bring into question the nature of the updates. How are those 300k dots being represented in the data from the cloud? That's the info that would actually make this demo interesting!
From that description, it sounds again like the bandwidth test I mentioned. With that update rate, one arrives at basically the 100 Mbit/s of needed bandwidth. And I don't see an immediate connection for how this would help a game either.
 
Without any description of what connection the demo was using for the cloud, we can't hazard a guess as to what it actually shows. Is it a fancy data representation that describes 300k objects at 2 megabits a second, or is it a demo of the processing power in the cloud when you don't have to factor in realistic constraints because they have a 100 Mbps connection?
 
Not really. You still have pros and cons with dedicated servers. It's not all rosy if you think about it. There are still tradeoffs. Of course, the best solution is to have a combination of P2P and dedicated servers, so I hope Microsoft doesn't go the idiot route and require all games to run on dedicated servers.

Dedicated servers are limited by geographic location. This may be all fine if you're living in the US, for example, as dedicated servers are probably littered throughout the US, but it's a very different picture for other countries. Let me give a real-world example of when dedicated servers may not be the best choice.

I live in Taiwan, and quite frankly there are very few dedicated servers situated in Taiwan, despite a not-so-small user base.
In this area, the usual "dedicated servers" are situated in Singapore or Japan. When we play P2P with other players in Taiwan, we can achieve < 30 ms easily and enjoy very good ping times and an excellent connection. However, once we switch to dedicated servers, the Singapore and Japan servers start at 65 ms minimum, so in this case dedicated servers are actually worse for us. So unless Microsoft can really litter the entire world with dedicated servers, dedicated servers aren't a guaranteed plus. They're a good addition to P2P, but they'd better not wipe out P2P outright.

This was specifically addressed in the GiantBomb livestream interview with Respawn. I listened to the podcast version initially and then watched the video livestream archive on their twitch channel to make sure I caught the details correctly.

The way MS is exposing the cloud services to Respawn means that Respawn doesn't have to worry about provisioning servers because this happens automatically on demand.

They explained the process like this (a rough sketch in code follows the list):

  1. Player starts game session.
  2. XBOne automatically connects to closest data center (lowest ping).
  3. If there is no server instance (servers are VMs) available, either because none is running or all existing instances are full, another server is automatically created.
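
A minimal sketch of that flow as I understand it (every name and number below is hypothetical; none of this is Azure's or Xbox Live's actual API):

```python
# Hypothetical sketch of the on-demand provisioning flow described above.
DATACENTERS = {"us-west": 40, "eu-north": 55, "au-east": 70}  # fake pings (ms)

class ServerVM:
    """A game-server instance; new ones are spun up on demand."""
    def __init__(self, region):
        self.region, self.players, self.capacity = region, [], 12

    def has_room(self):
        return len(self.players) < self.capacity

running = {region: [] for region in DATACENTERS}

def join_session(player):
    # Steps 1-2: pick the datacenter with the lowest ping for this player.
    region = min(DATACENTERS, key=DATACENTERS.get)
    # Step 3: reuse an instance with a free slot, or create one automatically.
    vm = next((s for s in running[region] if s.has_room()), None)
    if vm is None:
        vm = ServerVM(region)
        running[region].append(vm)
    vm.players.append(player)
    return vm

vm = join_session("player1")
print(vm.region, len(vm.players))   # us-west 1
```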

They specifically touted how this would lead to better service for areas that traditionally have been poorly supported (they used Australia as an example).

This is the benefit of MS already having cloud services available worldwide. Azure has been a functioning commercial product since 2010 and was first announced in 2008. This represents a significant amount of capital expenditure and time investment that MS have already completed allowing them to hit the ground running with the services they will be offering to developers and XBOne owners. Sure, being software, cloud functionality can be replicated by anyone. But the infrastructure that supports it takes time and a lot of money to build out.

MS have a minimum three-year lead here, and I don't see how anyone in the console space catches up. So it makes sense for them to be pushing their cloud platform as an advantage. I just wish they'd stick to promoting it in a more realistic way, using tangible and obvious advantages (like the example above) instead of vague and questionable ones.
 
Three years head start?

Not sure about the money, but I don't doubt that Sony or Nintendo could commission Google or Amazon for servers, too... it's not magic.
 
Three years head start?

Not sure about the money, but I don't doubt that Sony or Nintendo could commission Google or Amazon for servers, too... it's not magic.

No matter how favorable the contract, I can see no way that, with your costs and buildout controlled by a third party, you can achieve parity with an organization that has it all in house.

There's a reason Gaikai won't be available at launch and will be limited to the US when it does arrive.
 
I did so once already. We have an 8-fold improvement in the number of physics-based objects that can be simulated with 4 times the processing power (relative to the local box).

But we are really just talking about data visualization here. A PS4 or a PS3 could do the same or quite similar things. Why not just do it in a browser on a PC running asm.js? http://asmjs.org/faq.html :D Google might just bypass everybody and put out a gaming Chrome OS console :LOL:

There are obvious pluses to server-side gaming stuff (matchmaking, multiplayer, MMO-type persistence), but there is nothing special being done with the XB1 that can't be done with any other gaming console, or a PC for that matter.

In Titan Fall the devs say that all of the multiplayer AI is in "the Cloud", but it's a PC game as well as a 360 game, so does that mean ONLY the XB1 will have AI? Either MS allows anybody to use the Power of The CLOUD, or the devs will keep the AI code in the game?

I am sure cross-platform issues have already been discussed in this thread so I won't bother to state the obvious issues there.
 
In regard to TitanFall's cloud-based AI, how is this any different to the way dedicated servers for PC games handle bots? Quake 3, UT, and CS:GO all have AI which runs on the server. OK, the AI might not be the most advanced, but these games are old, and the concept is the same.
 