Server-based game augmentations. The transition to cloud. Really possible?

Dead Island 2 for the Xbox One will support 8 players in Co-op mode thanks to the cloud.

http://news.softpedia.com/news/Xbox...land-2-to-Support-8-Player-Co-Op-458549.shtml

Here's the link to the original story, which seems to suggest the same thing. http://www.totalxbox.com/81068/yage...adcount-after-seeing-xbox-ones-cloud-support/

They do state the PS4 version will run the same number of players, which makes me wonder whether the PS4 version's servers are also running on Azure. It's strongly implied the Xbox One version's will be. It might sound counter-intuitive, but taking server business for PS4 games away from competing platforms like Amazon could actually be a smart move. It doesn't look like a win from an Xbox point of view, but as an overall business move for Microsoft it makes sense.
 
Why would 8-player co-op need the cloud? Surely this is doable with 1990s technology. Did Resistance 2 need anything special, or was it just P2P?
 
Time to bring this thread back alive. Kampfield, who posted on NeoGAF, was actually hired onto the Crackdown team.
His latest post is mainly asking for feedback about what people feel Crackdown 3 should be, but the discussion eventually goes down the "the cloud is not a thing" rabbit hole.

His posts on the topic of cloud:
http://www.neogaf.com/forum/showpost.php?p=148930358&postcount=270
That stupid cloud talk again ;) I know you cannot hear it anymore. What I want to say though is: your local gaming machine does not have enough calculation power to run our game engine. I think we always communicated it that way and there is no reason not to be honest here. If you want to deliver something that has never been done before at that kind of scale, then you cannot make compromises. We totally know that there are huge risks involved. But this is the route we will take. We are super confident here and can't wait to get out of the dark.

http://www.neogaf.com/forum/showpost.php?p=148933838&postcount=307
People want to see what cloud computing can be used for in gaming. So at least give us a chance to show this. I think that would be fair.
 
Can a server run a full persistent model of a given level, world or area of a game? Day/night cycle, weather, destruction, everything. Each console would tap into that model for updates and physics calculations based on the player's deeds and progress saved on their hard drive. The server updates would seem to be roughly in line with their asteroid demo. This may have been impossible in the past, but maybe they are bringing something new.
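Purely as a thought experiment, here's roughly what that kind of persistent, server-owned world model might look like, with each console pulling deltas for its region. All names and structure are invented for illustration, not from any real engine:

```python
# Hypothetical sketch of a persistent server-side world model that consoles
# poll for deltas. Everything here is illustrative.
from dataclasses import dataclass, field

@dataclass
class RegionState:
    tick: int = 0
    time_of_day: float = 0.0            # hours, 0-24
    weather: str = "clear"
    destroyed_objects: set = field(default_factory=set)

class WorldModel:
    def __init__(self, regions):
        self.regions = {name: RegionState() for name in regions}

    def step(self, region, dt_hours, newly_destroyed=()):
        """Advance one simulation tick for a region."""
        st = self.regions[region]
        st.tick += 1
        st.time_of_day = (st.time_of_day + dt_hours) % 24.0
        st.destroyed_objects.update(newly_destroyed)

    def delta_since(self, region, client_tick):
        """What a console needs to catch up from its last known tick."""
        st = self.regions[region]
        if client_tick >= st.tick:
            return None
        return {"tick": st.tick,
                "time_of_day": st.time_of_day,
                "weather": st.weather,
                "destroyed_objects": sorted(st.destroyed_objects)}

world = WorldModel(["downtown"])
world.step("downtown", dt_hours=0.01, newly_destroyed={"lamp_post_17"})
print(world.delta_since("downtown", client_tick=0))
```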
 
I think so, yes. You could, I suppose; I'm not sure what the most efficient way to do it would be. I imagine that knowing things before they happen would allow the server to perform a lot of calculations well before you require them.

For instance, firing a rocket at a building: travel time is X seconds for it to reach its target, but since you know the final trajectory of the rocket, you could immediately start calculating its destruction effects on the building ahead of time, wait for the point of no return where impact is certain, and begin streaming the physics data down.

There are better, more knowledgeable people on this subject than I, but with my limited knowledge that is how I would attempt both AI and physics remotely.
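A trivial sketch of that point-of-no-return check, with all the numbers made up:

```python
# Rough sketch of the "point of no return" idea: as long as the rocket's
# remaining flight time exceeds the round trip to the server plus the server's
# compute time, the destruction result can be ready before impact.
def can_precompute(distance_to_target_m, rocket_speed_ms,
                   round_trip_s, server_compute_s):
    time_to_impact = distance_to_target_m / rocket_speed_ms
    # Results must be back on the console before the rocket lands.
    return time_to_impact > round_trip_s + server_compute_s

# e.g. 120 m away at 60 m/s = 2 s of flight; 80 ms RTT + 0.5 s compute fits easily.
print(can_precompute(120, 60, 0.080, 0.5))  # True
```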
 
For instance, firing a rocket at a building: travel time is X seconds for it to reach its target, but since you know the final trajectory of the rocket, you could immediately start calculating its destruction effects on the building ahead of time, wait for the point of no return where impact is certain, and begin streaming the physics data down.
And even sudden explosions causing destructive breakup can be faked with imprecise local physics calculations for enough frames to wait for the remote server to provide the 'canonical' data. But I don't think these types of things are that stressful for current consoles. You'd have to be simulating tens or hundreds of thousands of individually reacting elements, which is just as likely to cause slowdown in the graphics engine even if the physics are offloaded.
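Something along these lines, purely illustrative: fake the debris locally with cheap ballistics, then ease toward the server's canonical state once it shows up.

```python
# Sketch of faking debris locally for a few frames and then blending toward the
# server's 'canonical' result when it arrives. Purely illustrative.
class DebrisPiece:
    def __init__(self, pos, vel):
        self.pos = list(pos)
        self.vel = list(vel)
        self.canonical = None            # filled in when the server replies

    def local_step(self, dt, gravity=-9.81):
        # Cheap approximation: straight ballistic motion, no collisions.
        self.vel[1] += gravity * dt
        for i in range(3):
            self.pos[i] += self.vel[i] * dt

    def apply_server_state(self, pos, vel):
        self.canonical = (pos, vel)

    def step(self, dt, blend=0.25):
        self.local_step(dt)
        if self.canonical:
            # Ease toward the authoritative position instead of snapping.
            for i in range(3):
                self.pos[i] += (self.canonical[0][i] - self.pos[i]) * blend
```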

I'm curious to see what Crackdown 3 ends up offloading to the cloud. Hopefully there'll be a GDC or Develop seminar.
 
For instance, firing a rocket at a building: travel time is X seconds for it to reach its target, but since you know the final trajectory of the rocket, you could immediately start calculating its destruction effects on the building ahead of time, wait for the point of no return where impact is certain, and begin streaming the physics data down.
That example has been raised in this thread already. What if someone jumps in front of the rocket? Then the damage needs to be applied in real time exactly where the rocket detonates.

The main purpose of cloud computing is stupid amounts of processing power. There should be enough to calculate damage immediately and start streaming the results. Pre-caching of results shouldn't be necessary. You only need that if the bandwidth required to send the changes exceeds the connection speed. Then you'd need to start downloading ahead of when the data is needed.
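To put rough numbers on when pre-caching actually buys you anything (payload size and bandwidth here are just illustrative):

```python
# Back-of-the-envelope: if the destruction payload can't fit through the pipe
# in the time available, the download has to start early by the difference.
def lead_time_needed_s(payload_bytes, bandwidth_bytes_per_s, time_available_s):
    transfer_time = payload_bytes / bandwidth_bytes_per_s
    return max(0.0, transfer_time - time_available_s)

# 1 MB of physics results over 2 MB/s with 100 ms until impact:
print(lead_time_needed_s(1_000_000, 2_000_000, 0.100))  # 0.4 s of head start needed
```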
 
That example has been raised in this thread already. What if someone jumps in front of the rocket? Then the damage needs to be applied in real time exactly where the rocket detonates.

I wonder if they could do something like calculate a number of results based on the rocket's trajectory, and pick the one closest to where the impact occurs. That could get a little crazy in a multiplayer game, where there could be many rockets firing all over the place, some even impacting at close to the same location.

Just spitballing. Really, it's up to the Crackdown guys to deliver. A server could also have a database of precomputed results. Maybe they'd have a hybrid realtime-precomputed system.
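Picking the nearest precomputed result could be as dumb as this (toy sketch, made-up data):

```python
# Sketch of picking the precomputed result nearest to the actual impact point.
# The candidate results along the trajectory are hypothetical server output.
import math

def nearest_result(impact_point, candidates):
    """candidates: list of (point, result) tuples computed along the trajectory."""
    return min(candidates, key=lambda c: math.dist(c[0], impact_point))[1]

candidates = [((0, 0, 10), "result_A"), ((0, 0, 20), "result_B"), ((0, 0, 30), "result_C")]
print(nearest_result((0, 1, 22), candidates))  # result_B
```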
 
If you're going with precomputed results, you don't need cloud computing - only cloud storage.

Well, say you had the trajectory of a rocket that travelled down a narrow alleyway. At the end of the alleyway is a street, and across the street is a building. You might be able to compute the damage to the building at impact, and then decide whether you want to play a pre-calculated result for the entire side of the building coming down, if the building has taken enough damage. You could also calculate results at intervals along the trajectory. Maybe there are trash bins or parked cars that would have to move if the rocket for some reason detonated early. You could compute damage to the walls in the alley, and also decide whether there are any pre-computed results you want to trigger. Just an idea.

You could also constrain the calculations by the distance the rocket can travel before the next netcode tick.

And when I say could, I mean theoretically, not that I think they could necessarily do any of this.
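For what it's worth, the netcode-tick constraint is easy to put numbers on (all values invented):

```python
# How far the rocket can get before the next netcode tick, which bounds where
# the next batch of candidate results is worth computing.
rocket_speed_ms = 60.0      # m/s
tick_rate_hz = 30.0         # server netcode ticks per second
max_travel_per_tick = rocket_speed_ms / tick_rate_hz
print(f"{max_travel_per_tick:.1f} m per tick")  # 2.0 m
```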
 
That example has been raised in this thread already. What if someone jumps in front of the rocket? Then the damage needs to be applied in real time exactly where the rocket detonates.

The main purpose of cloud computing is stupid amounts of processing power. There should be enough to calculate damage immediately and start streaming the results. Pre-caching of results shouldn't be necessary. You only need that if the bandwidth required to send the changes exceeds the connection speed. Then you'd need to start downloading ahead of when the data is needed.
You might be right; I'm just thinking out loud here. I don't know how fast cloud compute is; I'm just thinking about how you get around that 33 ms latency. I mean, if you are sending the vectors of everything to the server, you should be able to determine a point of no return where the rocket becomes committed to impacting something, 2-3 frames before impact, and start the calculations then, or something like that. But if it could happen on the frame of impact, that would be impressively fast.
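Putting rough numbers on that lead time (the latency and compute figures are just assumptions):

```python
# At 30 fps a frame is ~33 ms, so round trip plus server compute fixes how many
# frames before impact the rocket has to be 'committed'.
import math

frame_ms = 1000.0 / 30.0                 # ~33.3 ms per frame
round_trip_ms = 80.0                     # assumed console<->server RTT
server_compute_ms = 20.0                 # assumed time to produce the result
lead_frames = math.ceil((round_trip_ms + server_compute_ms) / frame_ms)
print(lead_frames)  # 3 frames of commitment before impact
```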
 
I did bring this up before. It just seems that people actually working on these things keep saying they are able to do things not seen before. Assuming they aren't just lying through their teeth, something has changed in the way these things are done and we aren't allowed to see it yet.

The asteroid demo and the Build(?) demo couldn't have been out-and-out lies. Maybe they have overestimated how much of this can actually be made to happen, but they must at least have something.

But how much information really needs to be sent in a scenario like that? The console could simply send a firing solution to the server once a missile is fired. It could send a report when a player changes weapons, so the server already has part of the equation. The server may only send updates to the world, but it may still hold all of the information from the game, like a dedicated server. I know latency is the main issue, but I thought Orleans was about helping with that - Orleans and the design of the Xbox One.
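To illustrate how small a firing solution could be on the wire (field names and values invented for the example):

```python
# Origin, direction, speed and a timestamp are enough for a server to
# reconstruct the whole trajectory.
import json, struct

firing_solution = {
    "weapon_id": 7,
    "origin":    (412.5, 33.0, -127.25),
    "direction": (0.71, 0.0, 0.71),       # unit vector
    "speed":     60.0,                     # m/s
    "fired_at_tick": 48211,
}

as_json = json.dumps(firing_solution).encode()
# A packed binary form: u16 weapon id, 3+3+1 floats, u32 tick = 34 bytes.
as_binary = struct.pack("<H7fI", firing_solution["weapon_id"],
                        *firing_solution["origin"], *firing_solution["direction"],
                        firing_solution["speed"], firing_solution["fired_at_tick"])
print(len(as_json), "bytes as JSON vs", len(as_binary), "bytes packed")
```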
 
I suggest you read through this thread. All the ideas have been discussed, including possible applications and theoretical solutions to tackle the bandwidth limits. For example, none of the demos shown so far have had the details of their connection made public, leaving reason to doubt they've shown anything working over a typical internet connection. The devs don't need to be outright lying to be untruthful; much PR in every sector is based in fact but doesn't deliver the promised benefits. This wouldn't be the first time a promised revolutionary tech failed to deliver (though that doesn't disprove it - it only disproves the assertion that it must be working because the devs say it is). Peter Molyneux, for example, is the poster child for undelivered promises. ;)

Also, latency isn't the major issue. Bandwidth is. Games can operate with 100 ms latency as found in DF analyses (button press to screen update), and for big events like building destruction, 100+ ms is probably not noticeable, especially with a flash and cloud occluding events at first. At 2 MB/s internet bandwidth (16 Mbps, upper end of average internet speeds), 100 ms affords you 200 kilobytes of data. Getting decent amounts of information into that tiny package is going to be the magic. And the economies and complexities of cloud computing are going to impact how much devs want to invest into researching solutions for these problems.
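For reference, that 200 kilobyte budget is just this arithmetic:

```python
# Bytes deliverable inside the latency window at a given connection speed.
link_mbps = 16.0                        # ~2 MB/s, upper end of average connections
window_ms = 100.0
budget_bytes = (link_mbps * 1_000_000 / 8) * (window_ms / 1000)
print(f"{budget_bytes / 1000:.0f} kilobytes")  # 200 kilobytes
```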
 
I did bring this up before. It just seems that people actually working on these things keep saying they are able to do things not seen before. Assuming they aren't just lying through their teeth, something has changed in the way these things are done and we aren't allowed to see it yet.

The asteroid demo and the Build(?) demo couldn't have been out-and-out lies. Maybe they have overestimated how much of this can actually be made to happen, but they must at least have something.

The asteroid demo and a later one that showed blowing up a building with a missile launcher both seemed to use the "cloud" as an accurate-physics value add. Meaning there are many cases where shortcuts are taken on local machines that look good enough (not too many variables, simpler equations, or something like that) versus a more nuanced and accurate portrayal of, say, falling debris. That could be considered something not seen before, but it is not a product.

There are many firsts with Kinect 2.0 in terms of capabilities, so saying that Kinect creates new functionality not seen before would not be a lie, but it doesn't mean you have a product either.

We have a decent idea of what happens when you are essentially playing a video stream in terms of performance (PS Now), and we know how single-player games are made now without huge amounts of data being tossed around (AI stuff), but how things are done in between those extremes is not an easy thing to envision, assuming folks expect things to look the same as they do now in terms of frame rate and image quality.
 
The asteroid demo and a later one that showed blowing up a building with a missile launcher both seemed to use the "cloud" as an accurate-physics value add. Meaning there are many cases where shortcuts are taken on local machines that look good enough (not too many variables, simpler equations, or something like that) versus a more nuanced and accurate portrayal of, say, falling debris. That could be considered something not seen before, but it is not a product.

The building explosion demo wasn't about simplified versus robust calculations, but about the scale of what can be calculated on a single machine (which has to run everything else in addition to the explosion and debris physics calculations) versus "the cloud" (which only has to deal with the explosion and debris physics calculations and nothing else) with significantly more processing power.

And consider that the single machine they used likely had significantly more computational power than the XBO and it gets easier to see why the Crackdown team would like to leverage the cloud, if possible.

Regards,
SB
 
Also, latency isn't the major issue. Bandwidth is. Games can operate with 100 ms latency as found in DF analyses (button press to screen update), and for big events like building destruction, 100+ ms is probably not noticeable, especially with a flash and cloud occluding events at first. At 2 MB/s internet bandwidth (16 Mbps, upper end of average internet speeds), 100 ms affords you 200 kilobytes of data. Getting decent amounts of information into that tiny package is going to be the magic. And the economies and complexities of cloud computing are going to impact how much devs want to invest into researching solutions for these problems.

A bandwidth-over-time calculation can be nowhere near that simple, especially on ADSL2; there is significant variable serialization and fragment-interleaving delay. Anyone who plays BF2/3/4 a lot is well aware of what server-side computation is like: on servers where I'm at around 10 ms of latency I still get shot around corners, press the button on C4 only to die without the C4 going off, etc. At 50 ms those events happen significantly more often. So will it be an issue? I'm guessing that will come down to the game itself.
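For a sense of scale (uplink rate and interleave depth are assumptions, not measurements):

```python
# Pushing one full-size packet onto a slow ADSL2 uplink takes noticeable time
# before any propagation delay, and interleaving adds more on top.
packet_bytes = 1500
uplink_bps = 1_000_000                  # assumed ~1 Mbps ADSL2 uplink
serialization_ms = packet_bytes * 8 / uplink_bps * 1000
interleave_ms = 16                      # assumed ISP interleaving penalty
print(f"~{serialization_ms:.0f} ms serialization + ~{interleave_ms} ms interleaving per packet")
```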
 
A bandwidth-over-time calculation can be nowhere near that simple, especially on ADSL2; there is significant variable serialization and fragment-interleaving delay. Anyone who plays BF2/3/4 a lot is well aware of what server-side computation is like: on servers where I'm at around 10 ms of latency I still get shot around corners, press the button on C4 only to die without the C4 going off, etc. At 50 ms those events happen significantly more often. So will it be an issue? I'm guessing that will come down to the game itself.

BF 2 and 3 used dedicated servers, not what we now consider "the cloud." Not sure about BF4. Back in the day I used to play on 120+ ms ping ADSL without those issues on dedicated servers for various MP shooters. Even with 250-350+ ms ping dial-up I didn't experience those issues. That just seems to indicate that BF must suffer from extremely bad netcode, which I find surprising.

Regards,
SB
 