Server-based game augmentations. The transition to the cloud: really possible?

Well, you have added energy and cooling costs, and perhaps (or perhaps not) the need for increased server capacity in some regions, because you don't want these Xbox gamers' virtual instances impacting the cloud services that really pay the bills.

So the question becomes, as an extremely rough estimate: how demanding is the physics demo? Azure appears to have overheads, so you are left with the equivalent of 1.6 GHz and 4 out of 6 Opteron 4171 HE cores.

After all the other operating costs of Xbox Live and Games with Gold, does the remaining profit cover the added energy bill?
 
It is confirmed that the Build 2014 cloud demo was early Crackdown work:

https://twitter.com/XboxP3/statuses/476155955357306880

And some things are calculated using the cloud:

"A couple of things happen when, say, a building gets destroyed in a game. You've got the physics calculation of all the pieces that something's going to break into and all of what happens to those pieces as they collide with one another. And you kind of, in the truest sense, want it to be somewhat non-deterministic, meaning that if I shot, like, say, a missile from one angle instead of a slightly different angle, that the destruction looks different based on the pure physics of the impact. So what we've been working on is this capability of actually computing [in the cloud] the physics calculation of millions and millions of particles that would fall and then just having the local box [the player's console] get the positional data and the render, so, 'Okay I need to render this piece at this particular location. I don't know why.' The local box doesn't know why it's going to be at this location or where it's going to be in the next frame. That's all held in the cloud. You have this going back and forth between the two.
"That's just an example, because it's the example that we showed. Let's run getting a pure physics model in the cloud, because we can use multiple CPUs there and then locally using the power [of the console] to render and make things look good locally.

http://kotaku.com/the-new-crackdown-will-use-the-cloud-a-lot-1589866608

And Crackdown is using the cloudgine engine:

Cloudgine are delighted to announce that they are working with Microsoft on the upcoming title, Crackdown for Xbox One!


We are immensely excited about this project and the opportunity to create an entirely new experience in the Crackdown universe, one which we truly believe will set a new bar for open-world gaming.


http://www.cloudgine.com/projects.html
 
So like, we went from "cloud physics is impossible" to "it'll be too expensive for the developers" to "it'll be too expensive for MSFT".

I'd argue it's not, otherwise why would they give it away for free?

What's next? "Cloud physics will cause [strike]global warming[/strike] climate change"?

Honestly, this slippery slope is getting way too slippery.
 
So like, we went from "cloud physics is impossible" to "it'll be too expensive for the developers" to "it'll be too expensive for MSFT".

I'd argue it's not, otherwise why would they give it away for free?

What's next? "Cloud physics will cause [strike]global warming[/strike] climate change"?

Honestly, this slippery slope is getting way too slippery.

My position is still "cloud physics is impossible" (actually it's not, but I'm buying into your false dichotomy here). We still have to see the evolution from the BUILD demo to what arrives in shipping products, which have to deal with the real-world internet, not private leased lines. Given that I still randomly have a hard time streaming 480p YouTube on a 150 Mb/s connection, this will be interesting. I hope it's real, as I would love to be able to just flatten a city; it would be the sequel to Red Faction: Guerrilla we never got :(
 
So like, we went from "cloud physics is impossible" to "it'll be too expensive for the developers" to "it'll be too expensive for MSFT".

I'd argue it's not, otherwise why would they give it away for free?

What's next? "Cloud physics will cause [strike]global warming[/strike] climate change"?

Honestly, this slippery slope is getting way too slippery.
That's not a technical discussion, and if it weren't for the reply, I'd delete this post. The arguments against cloud computing have all been laid out. Instead of taking it on faith, we're reasoning about the probabilities. That reasoning can be wrong (see 8GB PS4), but the reasoning process should be intelligent and based on factual arguments, instead of wishy-washy acceptance of PR spiel.

To date, there has not been one demonstration of complex cloud physics running over the internet. We can look at the facts of the Cloudgine engine: if they calculate physics meshes and positions and send the particle data to the console, they have to send significant amounts of data per frame. Or, at least, send mesh data once and then update rotation and position every frame. The maths for that is already in this thread, along with conclusions.

If you want to partake in the conversation, do so properly and intelligently on the technical level, instead of from an unjustifiable high horse of blind, unreasoned faith.
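To put rough numbers on "significant amounts of data per frame", here is a back-of-the-envelope sketch of the "send mesh once, then stream transforms" approach. All figures (chunk counts, field widths, frame rate) are assumptions for illustration, not anything Cloudgine has published:

```python
# Rough bandwidth estimate for "send mesh data once, then stream a
# position + rotation update per chunk, per frame". All field sizes
# are assumed values, not a known wire format.

def per_frame_bits(chunks, pos_bits=3 * 32, rot_bits=3 * 16):
    """Bits per frame if each chunk sends a position (three 32-bit
    floats) and a quantized rotation (three 16-bit values)."""
    return chunks * (pos_bits + rot_bits)

def throughput_mbps(chunks, fps=30):
    """Sustained throughput in Mbps at a given update rate."""
    return per_frame_bits(chunks) * fps / 1e6

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} chunks @ 30 fps: {throughput_mbps(n):6.1f} Mbps")
```

Even with rotations quantized, ten thousand chunks at 30 updates a second lands in the tens of Mbps, which is why the naive streaming approach doesn't fit typical home connections without further tricks.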
 
That's not a technical discussion, and if it weren't for the reply, I'd delete this post. The arguments against cloud computing have all been laid out. Instead of taking it on faith, we're reasoning about the probabilities. That reasoning can be wrong (see 8GB PS4), but the reasoning process should be intelligent and based on factual arguments, instead of wishy-washy acceptance of PR spiel.

To date, there has not been one demonstration of complex cloud physics running over the internet. We can look at the facts of the Cloudgine engine: if they calculate physics meshes and positions and send the particle data to the console, they have to send significant amounts of data per frame. Or, at least, send mesh data once and then update rotation and position every frame. The maths for that is already in this thread, along with conclusions.

If you want to partake in the conversation, do so properly and intelligently on the technical level, instead of from an unjustifiable high horse of blind, unreasoned faith.

Besides the position data sent from the server, might there be a fair amount of prediction and interpolation to help keep the network load down? Killzone did a bunch of this for its multiplayer frame rates, IIRC. Maybe even running at 15 frames per second or less, with good guesses and such, you could get decent performance. Sorry if this has already been suggested.
 
Possibly, although the quote says:
'Okay I need to render this piece at this particular location. I don't know why.' The local box doesn't know why it's going to be at this location or where it's going to be in the next frame.
If they've pulled this off, there must be a clever data-compression scheme in effect. Or the effect is going to be significantly pared back, calculating few enough pieces to fit the data stream. As mentioned in the earliest days of this discussion, the primary resource bottleneck is bandwidth, so problems either have to fit within it, or find solutions around it.
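One obvious ingredient of such a compression scheme: if both ends agree on the bounds of the destruction volume, positions can be quantized to fixed point rather than sent as full floats. A minimal sketch of the idea; the bit widths and the 200 m volume are invented for illustration, not anything from the demo:

```python
# Hypothetical quantization: map positions within an agreed bounding
# volume to 16-bit fixed point per axis, halving the cost of a 32-bit
# float per coordinate before any entropy coding is applied.

def quantize(value, lo, hi, bits=16):
    """Map value in [lo, hi] to an integer in [0, 2**bits - 1]."""
    steps = (1 << bits) - 1
    return round((value - lo) / (hi - lo) * steps)

def dequantize(q, lo, hi, bits=16):
    """Recover an approximation of the original value."""
    steps = (1 << bits) - 1
    return lo + q / steps * (hi - lo)

# Across a 200 m wide volume, 16 bits gives roughly 3 mm resolution:
resolution = 200.0 / ((1 << 16) - 1)
x = 123.456
q = quantize(x, 0.0, 200.0)
assert abs(dequantize(q, 0.0, 200.0) - x) <= resolution
```

Millimetre-level error is invisible for tumbling debris, which is why this kind of lossy scheme is a plausible fit for the bandwidth budget being discussed.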
 
Ah yes, reading would help on my part :p A compression scheme, and/or maybe a subset of polygons and particles, a certain range of sizes, or some other shortcut or "templates" that maximize the effect of every packet from the server. I mean, that's compression as well, but more of a "parametric" version. Sorry about the quoted jargon, but I'm on the move and my brain is "interpolating" words on its own :)
 
So I find it interesting that Cloudgine will be part of UE4, and I wonder if and how other MS studios will use that.
 
If you want to partake in the conversation, do so properly and intelligently on the technical level, instead of from an unjustifiable high horse of blind, unreasoned faith.

blind faith?

I think I commented early in the thread on what I think the challenges of cloud physics are; it's pretty much exactly what you have just repeated here.

I was ridiculing those that turned this discussion from a technical aspect to a cost debate, not sure why it's so hard to see:

According to that, 40 servers for 1 hour would cost $10; there's your profit from your game gone.
Interesting point; so even if the transition was possible, it will never be cost effective?
okay, but now, think about a first party title

And I don't even understand what that last one is trying to say. Those are the ones that should partake properly and intelligently.
 
blind faith?
Your post wasn't a technical argument but an appeal to faith it'll work.

I was ridiculing those that turned this discussion from a technical aspect to a cost debate, not sure why it's so hard to see:
You're right, the cost debate doesn't belong here. It's been pruned back a number of times IIRC.
 
My position is still "cloud physics is impossible" (actually it's not, but I'm buying into your false dichotomy here). We still have to see the evolution from the BUILD demo to what arrives in shipping products, which have to deal with the real-world internet, not private leased lines. Given that I still randomly have a hard time streaming 480p YouTube on a 150 Mb/s connection, this will be interesting. I hope it's real, as I would love to be able to just flatten a city; it would be the sequel to Red Faction: Guerrilla we never got :(

My expectation is that Crackdown 3 won't ship before 2017 and by then the cloud-based destruction will have been scaled back dramatically.
 
I figure cloud computing would be best used for running game worlds a la what we already have these days.

What would be more compelling to me is perhaps a rebirth of the LAN party, hooking up multiple Xboxes to parallel-process game worlds for MP, or, in Halo's case, a highly advanced SP scene. Sure, you wouldn't have the fine-tuned bandwidth and latencies of an actual server rack, but it would be a breeze compared to expecting a clean, low-latency, not to mention fast, internet connection. Depending on the game, it might even convince players to buy multiple consoles for single-player scenarios, where you get the "full experience" when played with two or three Xbox Ones flying in formation.
 
Seen this on Reddit & thought it might spur some good technical conversation...

/u/JonnyLH said:
I just calculated an estimate of the data rate for the Crackdown demo shown at Build. Obviously there are a couple more variables involved, for example how the building breaks and the shape of the chunks. Would they derive from the local box, which then sends them up to Azure? Presumably a server application would have the collision meshes of the map so it can sync up with the local box; it'd first receive the variables around the explosion, like size, direction, radius, etc.

Data Rate
UPDATED: Rather than calculating every chunk in real time, 32 times a second, /u/caffeinatedrob recommended drawing paths, which I've substituted into the calculations.
32 bits * 6 - Float
9 bits * 2 - 9 Bit Integer
Compression Ratio: 85%
Chunks: 10,000
Total Bits per Chunk: 210 bits
Total Bits for Chunks: 2,100,000
Total Bits Compressed: 315,000
Typical Ethernet MTU = 1500 bytes = 12000 bits
Data Frames per Initial Explosion of 10,000 Chunks: 27
Typical UDP Overhead = 224 bits
Total Overhead per Explosion = 6048 bits
Total Bits Needing to Be Sent Per Explosion: 321,048
Throughput Needed Per Initial Explosion: 313Kbps
All Chunks Collide in 4 Seconds: 2500 chunks re-drawn every second
2500*210 = 525000
Compressed: 78750 bits
Data Frames per second needed for re-draw: 7
UDP Overhead = 1568 bits
Total Bits Needed per re-draw: 80318 bits
Throughput Needed per re-draw: 78kbps
Overall throughput needed in the first second: 391kbps
Every second after initial explosion would be: 78kbps

For the data, I've used float values for the X,Y,Z starting co-ordinates and the same for the finishing co-ordinates of the path on the map. I've assigned 9 bit integers for the rotation values on the path and the radius of the arc of the path.

The compression used is a given in this scenario. With the data being compressed consisting purely of floats/ints, the compression ratio is very high, around the 80% mark, which I've substituted in.

To compare this to services which are used daily: Netflix, for example, uses 7Mbps for a Super HD stream, which is pretty much standard these days; both next-gen and previous-gen consoles support Super HD.

Latency
Average RTT (Round Trip Time) to Azure: 40ms
Calculation Time at Server: 32ms (For 32FPS)
Total RTT = 72ms
In Seconds = 0.072 Seconds

That means it takes 0.072 seconds from the beginning of the explosion for it to come back and start happening on your screen. Once the first load has occurred, you only have to receive data if the chunks collide with anything, which would result in the re-drawing of paths. The latency on that would be the calculation time, call it 16ms, which is generous considering that only a few paths may have to be re-drawn. Then add the half-trip time of 20ms, which results in waiting 36ms, or 0.036 seconds, before the re-drawn path gets updated on-screen.

Packet Loss
In regards to packet loss: in 2014, you simply don't have any. ISPs these days tend to be Tier 3 and Tier 2, with peering often directly to the large services which make up a lot of the bandwidth, including Google, Facebook, Twitter, Netflix, etc. Honestly, unless you have a poor wireless signal inside your house, which often causes some slight packet loss, you're not going to get any. Even if you drop a couple of packets, you'd lose a handful of chunks for that one frame, and in terms of gameplay it's not really going to be noticeable.

Conclusion
After taking suggestions on board and drawing paths rather than calculating chunks in real time, the data rates needed are significantly lower, and the requirements for the internet connection are perfectly acceptable, with only 391kbps needed during transmission.
If anyone's got any suggestions on how to increase accuracy, or anything else, let me know.

The OLD solution which requires 5.8Mbps is documented here:

http://pastebin.com/vQQs5ffZ

TL;DR: Cloud computing is definitely feasible on normal ISP connections. Would require 391kbps when the explosion starts.

http://www.reddit.com/r/xboxone/comments/27yczf/i_just_calculated_an_estimate_of_the_internet/

I reformatted the quote to make it easier to read.

Tommy McClain
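The arithmetic in the quoted estimate can be checked directly. This sketch reproduces the poster's own figures; the inputs (210 bits per chunk, 85% compression, 1500-byte MTU, 224 bits of per-datagram overhead) are their assumptions, not measurements:

```python
# Re-running the Reddit estimate's arithmetic. All constants are the
# original poster's assumptions, not measured values.
import math

BITS_PER_CHUNK = 32 * 6 + 9 * 2   # six floats + two 9-bit ints = 210
MTU_BITS = 1500 * 8               # typical Ethernet MTU = 12,000 bits
UDP_OVERHEAD_BITS = 224           # assumed overhead per datagram

def explosion_bits(chunks, compression_pct=85):
    """Bits on the wire for one burst of chunk paths."""
    raw = chunks * BITS_PER_CHUNK
    compressed = raw * (100 - compression_pct) // 100
    frames = math.ceil(compressed / MTU_BITS)  # datagrams needed
    return compressed + frames * UDP_OVERHEAD_BITS

initial = explosion_bits(10_000)  # 321,048 bits, matching the post
redraw = explosion_bits(2_500)    # 80,318 bits per second of re-draws
print(f"first second: {(initial + redraw) / 1024:.0f} Kbps")
```

The totals come out to 321,048 and 80,318 bits, agreeing with the quoted numbers, so the headline ~391kbps figure follows from the stated assumptions; whether those assumptions (especially the 85% compression ratio on float data) hold in practice is the open question.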
 
Seen this on Reddit & thought it might spur some good technical conversation...

http://www.reddit.com/r/xboxone/comments/27yczf/i_just_calculated_an_estimate_of_the_internet/

I reformatted the quote to make it easier to read.

Tommy McClain

The reddit author mentions drawing paths to reduce the amount of data, but I was reminded that Spencer said the local machine didn't need to know anything about the future states of the debris.

'Okay I need to render this piece at this particular location. I don't know why.' The local box doesn't know why it's going to be at this location or where it's going to be in the next frame.

So it almost looks more like a streaming-data solution with no predictive part to it, per se, if Spencer is being quoted correctly.

There seems to be an update that the original author seems to think helps:

http://www.reddit.com/r/xboxone/com...alculated_an_estimate_of_the_internet/ci63fph

... and even here he/she makes assumptions about how the XBO can have "knowledge" of future events, at least for part of the transactions.

For the ones that will be making collisions, at least those that can be predicted and known to happen with complete certainty, the system can detail which objects will have a collision, the variables that need adjusting, and even when to apply those changes. So if the server can tell that some object is going to hit something after a while, and there is some spare bandwidth, it can tell the XBO that when a certain object reaches a certain point, or at a certain time, or meets some other criterion, it should make a specific set of adjustments; the XBO will then be primed and ready to make that change when it needs to.

Still interesting though :D
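That "primed and ready" idea amounts to shipping deferred events: the server sends a trigger condition plus the precomputed outcome ahead of time, and the console applies it locally with no extra round trip. A minimal sketch of the scheme; every message field and class name here is invented for illustration:

```python
# Hypothetical "deferred event" scheme: the server predicts a collision
# and ships the trigger plus the outcome in advance, so the console can
# apply the change locally the moment the condition is met.
from dataclasses import dataclass

@dataclass
class DeferredEvent:
    chunk_id: int
    trigger_time: float      # game-clock time at which to apply it
    new_velocity: tuple      # state change the server precomputed
    applied: bool = False

class EventQueue:
    def __init__(self):
        self.pending = []

    def schedule(self, event):
        """Called when a server packet arrives, possibly well early."""
        self.pending.append(event)

    def tick(self, now):
        """Apply every event whose trigger time has passed."""
        fired = [e for e in self.pending if e.trigger_time <= now]
        for e in fired:
            e.applied = True
            self.pending.remove(e)
        return fired

q = EventQueue()
q.schedule(DeferredEvent(chunk_id=42, trigger_time=1.5,
                         new_velocity=(0.0, -9.8, 0.0)))
assert q.tick(1.0) == []             # too early, nothing fires yet
assert q.tick(2.0)[0].chunk_id == 42  # fires once its time arrives
```

The appeal is that spare bandwidth during quiet moments buys latency headroom later: the console never waits on the network for a change the server already knew was coming.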
 