Server-based game augmentations. The transition to cloud: really possible?

Link: http://www.vg247.com/2013/05/28/xbo...f-three-consoles-on-the-cloud-says-microsoft/

And it's 300% 'more' actually. 400% of 8 = 300% 'more than' 8 ;)
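
Spelled out, with 8 standing in for whatever local resource you like:

```latex
% 4x the original resource is 400% *of* it, but only 300% *more than* it:
\[
  \underbrace{4 \times 8}_{400\%\ \text{of}\ 8}
  \;=\; 8 + \underbrace{3 \times 8}_{300\%\ \text{more than}\ 8}
  \;=\; 32
\]
```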

“We’re provisioning for developers for every physical Xbox One we build, we’re provisioning the CPU and storage equivalent of three Xbox Ones on the cloud,” he said. “We’re doing that flat out so that any game developer can assume that there’s roughly three times the resources immediately available to their game, so they can build bigger, persistent levels that are more inclusive for players.

“They can do that out of the gate.”

Right, it doesn't rule out anything else technically. And the "out of the gate" comment could imply more to come.

Anyway, CPU power is a very different beast from GPU power; it doesn't come down to flops. Having 4x the CPU is a whole lot more important than just adding 300 GPU GFLOPS would be. It takes one of the two or three most fundamental components of the system and quadruples that resource.
 
Cloud-enhanced games would make sense if the game as a whole were streamed from the cloud. When you run some parts here and other parts there, you have to keep them in sync. What is currently implied requires an extraordinary amount of work from developers: synchronizing data and making sure the *variable* (not even predictable) latency doesn't break the game or the visuals. That's too much work, and time better spent optimizing your local code and game design.
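
As a rough illustration of the problem (just a sketch: `cloud_simulate` and `local_fallback` are made-up stand-ins, not anything Microsoft has described), every frame the client has to decide whether a remote result arrived in time or whether to fall back to something it computed itself:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

FRAME_BUDGET_S = 0.016  # ~60 fps; anything slower has to be hidden or dropped

def cloud_simulate(world_state):
    """Stand-in for a remote call: variable, unpredictable round-trip latency."""
    time.sleep(random.uniform(0.005, 0.250))  # 5 ms on a good day, 250 ms on a bad one
    return {"tick": world_state["tick"], "result": "expensive cloud result"}

def local_fallback(world_state):
    """Cheaper approximation the console can always compute on its own."""
    return {"tick": world_state["tick"], "result": "coarse local result"}

with ThreadPoolExecutor(max_workers=1) as pool:
    for tick in range(5):
        state = {"tick": tick}
        future = pool.submit(cloud_simulate, state)
        try:
            # Only wait as long as one frame allows; the game loop can't stall.
            answer = future.result(timeout=FRAME_BUDGET_S)
        except TimeoutError:
            # The cloud result missed the frame: fall back locally, and either
            # discard the late answer or blend it in on some later tick.
            answer = local_fallback(state)
        print(tick, answer["result"])
```

The ugly part is everything the sketch leaves out: deciding which late results are still valid when they finally arrive, and keeping the local and remote copies of the game state from drifting apart.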

All the possible uses I've read so far that make at least a little sense (I didn't read the entire thread here, but I'm also reading stuff elsewhere) depend on static worlds.

Pre-baking lighting so that you can have different times of day, for example. If your game has different times of day, you may as well go with an appropriate realtime lighting solution; you'll also get the hefty bonus of dynamic meshes in your world. And if all those different "times of day" are too big to put on disc, they are too big to download, and for what? Static worlds. Also, if your world is static, there are precomputed solutions for ambient occlusion and even radiance transfer, which you can store once and use for very different lighting situations.
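
For the "store once, relight under many conditions" point, here is a toy sketch of precomputed radiance transfer with third-order spherical harmonics (the arrays are random placeholders standing in for a real offline bake):

```python
import numpy as np

NUM_SH_COEFFS = 9            # third-order spherical harmonics (3 bands)
num_vertices = 10_000

# Baked offline, once, because the geometry never moves: each row encodes how
# that vertex responds to light arriving from every direction (occlusion,
# interreflection, etc.). Random numbers here stand in for the real bake.
transfer = np.random.rand(num_vertices, NUM_SH_COEFFS)

# Each lighting condition is tiny by comparison: one 9-coefficient environment
# vector per "time of day", not a whole extra set of lightmaps on the disc.
env_noon   = np.random.rand(NUM_SH_COEFFS)
env_sunset = np.random.rand(NUM_SH_COEFFS)

# Runtime relighting is just a dot product per vertex, cheap enough to do locally.
radiance_noon   = transfer @ env_noon     # shape: (num_vertices,)
radiance_sunset = transfer @ env_sunset

print(radiance_noon[:3])
print(radiance_sunset[:3])
```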

Anything AI-related: not every decision has to be made every frame, I get that. But that's precisely why it is convenient to do it locally. There is no need to sync the current game state, the physics results, and player interaction with any remote location. It also means a copy of the game has to be running remotely for those decisions to be made: a waste of computing cycles if you ask me, and too much work to sync both game simulations.
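
A minimal sketch of the "not every frame" point (`Agent` and `decide_strategy` are made-up names for illustration): amortise the expensive decisions locally over several frames, and nothing ever needs to leave the box.

```python
import itertools

class Agent:
    def __init__(self, ident):
        self.ident = ident
        self.plan = "idle"

    def decide_strategy(self, world):
        # The expensive part: pathfinding, planning, target evaluation, etc.
        self.plan = f"plan made on tick {world['tick']}"

    def act(self):
        # The cheap part, run every frame, reusing the last cached plan.
        return f"agent {self.ident}: {self.plan}"

agents = [Agent(i) for i in range(100)]
DECISIONS_PER_FRAME = 10          # each agent re-plans once every 10 frames
cursor = itertools.cycle(range(len(agents)))

for tick in range(30):            # stand-in for the game loop
    world = {"tick": tick}        # never has to be serialised for a remote machine
    # Only a slice of the agents re-think this frame; the rest reuse their plan.
    for _ in range(DECISIONS_PER_FRAME):
        agents[next(cursor)].decide_strategy(world)
    frame_actions = [agent.act() for agent in agents]

print(frame_actions[:3])
```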

But I think the way Microsoft is touting this will pay off. The "infinite power" of the cloud will entice a lot of people.
 
Link: http://www.vg247.com/2013/05/28/xbo...f-three-consoles-on-the-cloud-says-microsoft/

And it's 300% 'more' actually. 400% of 8 = 300% 'more than' 8 ;)

And it's slow CPU power compared to the onboard CPU, for all the reasons mentioned a million times. Cloud computing is very good at handling an enormous number of users at the same time, but it does so slowly compared to local resources. The good thing, however, is that they only have to allocate what amounts to a single i5 for every Xbox One, so maybe they can keep it fast enough, though the memory access is likely to be slow as "hell" compared to the console's. 20 GB of RAM in the cloud is not likely to be local.

The worst thing is that no matter how much CPU power you have in the cloud, the quality of your graphics will still rely on your GPU. The Wii U will still have to display the graphics with its GPU. Granted, you could do the Cell trick and use some of the CPU power to lift a weak GPU, like in the Wii U and Xbox One, but it's not easy.

As I read this thread, I am pretty confident that the number of games that will use this will be limited to first-party games from Nintendo, Sony or Microsoft. The technology developed along the way could be used in the future (if deemed feasible). But as I have said before, if this were easy to do and something that could harvest great gains, it's strange that others didn't do it before. Other software developers have been involved in online games far longer than Sony, Nintendo or Microsoft. Google has the biggest cloud out there and we haven't seen anything from them... maybe they are getting old :)

I think we still lack a well-written example of cloud computing lifting work from a console game in a way that makes it worth the effort. Maybe some of the technical post-mortems on Killzone, Halo, etc. could be dissected and injected with cloud computing.
 
Cloud-enhanced games would make sense if the game as a whole were streamed from the cloud. When you run some parts here and other parts there, you have to keep them in sync. What is currently implied requires an extraordinary amount of work from developers: synchronizing data and making sure the *variable* (not even predictable) latency doesn't break the game or the visuals. That's too much work, and time better spent optimizing your local code and game design.

All the possible uses I've read so far that make at least a little sense (I didn't read the entire thread here, but I'm also reading stuff elsewhere) depend on static worlds.

Pre-baking lighting so that you can have different times of day, for example. If your game has different times of day, you may as well go with an appropriate realtime lighting solution; you'll also get the hefty bonus of dynamic meshes in your world. And if all those different "times of day" are too big to put on disc, they are too big to download, and for what? Static worlds. Also, if your world is static, there are precomputed solutions for ambient occlusion and even radiance transfer, which you can store once and use for very different lighting situations.

Anything AI-related: not every decision has to be made every frame, I get that. But that's precisely why it is convenient to do it locally. There is no need to sync the current game state, the physics results, and player interaction with any remote location. It also means a copy of the game has to be running remotely for those decisions to be made: a waste of computing cycles if you ask me, and too much work to sync both game simulations.

But I think the way Microsoft is touting this will pay off. The "infinite power" of the cloud will entice a lot of people.

Finally someone speaks with reason.
 
Well, that's what DF even said: as what's possible in the cloud improves thanks to better internet speeds, what's possible locally also improves as hardware improves.
This isn't just about technology; it's about economics, resources, and time management.
A developer has to ask themselves: "If I can come up with an easier-to-implement local solution I can count on every time, even if it is only 80% as good as the cloud solution, but it's 60% easier and cheaper to implement and solves a problem that most people won't obviously notice anyway, which solution do I choose?"
 
Cloud-enhanced games would make sense if the game as a whole were streamed from the cloud. When you run some parts here and other parts there, you have to keep them in sync. What is currently implied requires an extraordinary amount of work from developers: synchronizing data and making sure the *variable* (not even predictable) latency doesn't break the game or the visuals. That's too much work, and time better spent optimizing your local code and game design.

What do you mean?
 
What do you mean?

When the whole game runs 100% locally or 100% through the cloud, the dev is not worried about syncing aspects of the game. When a game runs partly locally and partly in the cloud, the dev has to select what to render in the cloud and make sure that what is rendered in the cloud and what is rendered locally stay in sync.
 
I got that part.
What I don't understand is the first sentence, the one about cloud enhancing making sense if the game is streamed from the cloud.

Wouldn't cloud enhancing be redundant since the game is already running/streaming 100% on/from the cloud?
 
If there aren't very many relevant use cases, then why would Microsoft mention it? Why would they invest so much into something which flat-out isn't going to work for many uses? The only thing I can think of is if they put some ridiculously high I.Q. on display in some E3 demos*.

*Only works with the cloud.
 
I got that part.
What I don't understand is the first sentence, the one about cloud enhancing making sense if the game is streamed from the cloud.

Wouldn't enhancing be redundant since the game is already running/streaming 100% on/from the cloud?

I chose the wording because of the title of the thread, and the redundancy of the whole thing is apparent, at least in my eyes. Just a little irony, too concealed to work in text, probably.

Having said that, the "enhancing" may still be applicable, as you could, in theory, increase the capacity of the cloud and the games could make use of it.

If there aren't very many relevant use cases, then why would Microsoft mention it? Why would they invest so much into something which flat-out isn't going to work for many uses?

Marketing the perception that you won't have to upgrade your hardware, since they can give you more power through the cloud. It doesn't really have to work; they can hold a carrot in front of us while we're on the treadmill, believing their promises will come true if we give them enough time.
 
Oh boy, you just love that, don't you :rolleyes:

Nowhere did MS give a flops number, or say that there wouldn't be GPU resources in the cloud, or anything. They simply said in one interview that three Xbones' worth of CPU + storage would be provisioned as a minimum, but they never ruled out anything else. You sticking to it as some super-hard, set-in-stone rule is funny, but not unlike you.

In fact, in the other interview the guy clearly talked about GPU effects like lighting and SSAO being done in the cloud.

They also said Xbone without cloud = 10x 360 and Xbone with cloud = 40x 360. So if I'm being super literal like you, I deem they have 3.9 teraflops in the cloud for every Xbone :rolleyes: They could not have been talking about CPU flops, because the 360 CPU is ~100 GFLOPS and the Xbone CPU is ~100 GFLOPS; therefore, if we're only talking CPU flops as you are, the Xbone cannot be 10x the 360... what now, mr super literal?
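
Running that multiplier arithmetic with rough, unofficial ballpark figures (none of these are confirmed specs):

```python
# All figures below are rough, unofficial ballpark numbers used only to follow
# the argument in the quoted post, not real specs.
xb360_cpu_gflops = 100       # Xenon, roughly
xbone_cpu_gflops = 100       # eight Jaguar cores, roughly the same ballpark
xbone_gpu_tflops = 1.3       # commonly cited figure for the Xbox One GPU

# "Xbone = 10x 360" and "Xbone + cloud = 40x 360" implies the cloud portion is
# (40 - 10) / 10 = 3x the local box, by whatever metric "x" is measured in.
cloud_multiple_of_local = (40 - 10) / 10                                  # 3.0
cloud_tflops_if_gpu_metric = cloud_multiple_of_local * xbone_gpu_tflops   # ~3.9

# It can't be a CPU-flops metric: both CPUs sit near the same ~100 GFLOPS,
# so "Xbone = 10x 360" would already fail on that measure.
cpu_ratio = xbone_cpu_gflops / xb360_cpu_gflops                           # ~1x

print(cloud_tflops_if_gpu_metric, cpu_ratio)
```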

What CPU do you think is in the box :LOL:? Some super-duper 4x-FLOPS Jaguar? Honest question, because I was just extrapolating from the most likely data that we have. We all know most of the numbers are PR fluff anyway, so why should we take them literally?

If you really think Microsoft is going to invest enough money for petaflop-level performance just for the XBONE, go ahead and believe it, but it's not likely; the cost would probably be in the hundreds of millions.

Also, why are you jumping on me just for posting more data than Microsoft most likely ever will? We won't receive actual figures from them because it would still look bad; Sony could then say "even with the cloud, Microsoft still doesn't match us". It's in Microsoft's best interest to keep as much of the general public as possible in the dark about the specifications; that way the super-duper rumours can continue.

If they ever release the total specs (flops, clock rate, etc.), I would be very surprised.
 
The fact that the destruction physics had to interact with every player on the server as fast as possible. You cannot have it lagging for a couple of seconds and then have someone suddenly appear inside it and die because the destruction lagged.

...

The only thing I'd like to point out is the "couple seconds" remark. Are we talking a couple of seconds or a couple of hundred milliseconds? If you were to do some physics in the cloud, I'd expect the computations themselves to happen within milliseconds, and the only delay to the client would be the latency of the network. There may be some tolerance to physics latency, depending on what is being done. If you were playing Battlefield and used a tank to blow a hole in the side of a building, would you notice if the wall exploded 100 ms after the shell was supposed to have hit? I don't know the answer to that; 200 ms would definitely be perceptible. There could be some dynamic, player-influenced physics where the latency wouldn't be that noticeable. The buildings falling over in Battlefield are probably a case where you wouldn't notice latency.
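
A tiny sketch of that question, purely to put numbers on it (`simulate_collapse_in_cloud` is a stand-in with random latency, not a real service): the client plays its local impact effects immediately, applies the cloud-computed collapse whenever it arrives, and checks whether the delay crossed the ~200 ms "definitely noticeable" mark.

```python
import random
import time

PERCEPTIBLE_DELAY_S = 0.200   # the guess above: ~200 ms is definitely noticeable

def simulate_collapse_in_cloud(impact):
    """Stand-in for remote compute time plus a variable network round trip."""
    time.sleep(random.uniform(0.050, 0.250))
    return {"impact": impact, "debris": "cloud-computed collapse"}

for shot in range(5):
    fired_at = time.monotonic()
    # 1. Local, instant feedback: muzzle flash, decal, sound. No waiting on anyone.
    # 2. Ask the "cloud" for the expensive structural collapse.
    collapse = simulate_collapse_in_cloud(impact=shot)
    delay = time.monotonic() - fired_at
    verdict = "probably noticeable" if delay > PERCEPTIBLE_DELAY_S else "probably hidden"
    print(f"shot {shot}: collapse arrived {delay * 1000:.0f} ms after the hit ({verdict})")
```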
 
...
But to get back on topic: there is only so much power in the cloud, don't forget, that being ~300 GFLOPS from 24 cores of 1.6 GHz Jaguar CPU. A lot of the stuff that would be done on it is probably also possible on GPGPU, so I don't think we will see it making a great deal of difference to what you can see on screen. After all, if it's so latency-insensitive, what's to stop you dedicating some GPU time to it each tick and working on it in parts?
..

The whole point of the theory of using the cloud to aid in computation is that you can process more than what you could process locally. I'm sure most of the things could be processed locally, either on the GPU or the CPU, but you don't have infinite power locally (or in the cloud). You wouldn't always be able to find spare cycles to fit in processing for more things. The cloud gives you a little extra (or as their PR claims, a lot).
 
There are already latency issues with destruction in Battlefield, so I can't accept this as a viable use case. In BF the buildings issue a 'groan' sound to warn of impending collapse, and I have run into many silent buildings only to insta-die because that building had been destroyed on another player's client.

Can we really isolate the physics of collapse from the physics of projectile hits? I've had quite a few flukey heli and vehicle kills because they came between me and my target.
 
The whole point of the theory of using the cloud to aid in computation is that you can process more than what you could process locally. I'm sure most of the things could be processed locally, either on the GPU or the CPU, but you don't have infinite power locally (or in the cloud). You wouldn't always be able to find spare cycles to fit in processing for more things. The cloud gives you a little extra (or as their PR claims, a lot).

This is true. But I can't help thinking that a bunch of the things being suggested could be done more easily and quickly on the local machine, a lot quicker if they were done well. It seems like the cloud would be useful for things the player does not directly interact with, or things that require a large amount of data (market simulations? player interactions), but I swear the same things are being suggested over and over again, and nearly all of them seem to involve player interaction. I cannot see this working due to latency issues, something that no one has addressed yet and something I do not think will improve anywhere near the rate that bandwidth will.
 
And it's slow CPU power compared to the onboard CPU, for all the reasons mentioned a million times. Cloud computing is very good at handling an enormous number of users at the same time, but it does so slowly compared to local resources. The good thing, however, is that they only have to allocate what amounts to a single i5 for every Xbox One, so maybe they can keep it fast enough, though the memory access is likely to be slow as "hell" compared to the console's. 20 GB of RAM in the cloud is not likely to be local.

The worst thing is that no matter how much CPU power you have in the cloud, the quality of your graphics will still rely on your GPU. The Wii U will still have to display the graphics with its GPU. Granted, you could do the Cell trick and use some of the CPU power to lift a weak GPU, like in the Wii U and Xbox One, but it's not easy.

As I read this thread, I am pretty confident that the number of games that will use this will be limited to first-party games from Nintendo, Sony or Microsoft. The technology developed along the way could be used in the future (if deemed feasible). But as I have said before, if this were easy to do and something that could harvest great gains, it's strange that others didn't do it before. Other software developers have been involved in online games far longer than Sony, Nintendo or Microsoft. Google has the biggest cloud out there and we haven't seen anything from them... maybe they are getting old :)

I think we still lack a well-written example of cloud computing lifting work from a console game in a way that makes it worth the effort. Maybe some of the technical post-mortems on Killzone, Halo, etc. could be dissected and injected with cloud computing.

Why is the cloud CPU slow? I missed that part. I wouldn't think memory bandwidth in the cloud would be a big issue because they don't have to worry about feeding a GPU.

I'll agree that this is going to be the territory of exclusives, at least for the first couple of years. *IF* it takes off, maybe multiplatforms will look at it down the road.

As for it not having been done before, who had the infrastructure to even try? Maybe Google or maybe Amazon, but when have either of them ever been invested in gaming? Wouldn't surprise me if Sony, Nintendo and Microsoft all had this running in labs for quite a while. The thing is, the barrier to entry is building enormous data centers around the world, and there are very few companies that can do that. In fact, I can only think of 3.
 
There are already latency issues with destruction in Battlefield, so I can't accept this as a viable use case. In BF the buildings issue a 'groan' sound to warn of impending collapse, and I have run into many silent buildings only to insta-die because that building had been destroyed on another player's client.

Can we really isolate the physics of collapse from the physics of projectile hits? I've had quite a few flukey heli and vehicle kills because they came between me and my target.

I don't think Battlefield will make use of Microsoft's cloud plans at all, so it doesn't really matter; it was just an example that seemed appropriate. Would you really notice if you blew up the wall of a building and the building started to fall down 100 ms late? 200 ms late? For syncing projectiles vs. destruction, there could be issues. That would actually be one benefit of pushing all of the destruction physics to the cloud: you can sync all of the clients faster.
 