Server-based game augmentations. The transition to cloud. Really possible?

Do you really think they'd try 10,000 blades of grass in a 10cm x 10cm square?
Probably not, as I've explained it'll be more expensive doing it on a server than on the local machine.
The 10,000 blades of grass in 10cm x 10cm is what they're doing above in the gif.

I explained why it's a bad concept (it requires massive amounts of internet bandwidth). If you're gonna demo a proof of concept, show something that's actually useful. Do you remember all the Kinect 1 & 2 proofs of concept, and how well those worked out?

Here's a proof of concept I did on CPU.
Sorry about the terrible YouTube quality (back in the old days when YouTube only had crap quality).

The above gif is worse than what today's GPUs can easily do without breaking a sweat. If they want to showcase something that today's GPUs can't do, then do e.g. 1000s of golf balls rolling on a rugby field (@ 10,000 blades per 10cm x 10cm).
Could a cloud of CPUs/GPUs do this? Yes.
Could it send the result over the internet 12x a sec?
Nope (too much data).
The only way it can be done is if they send the final screen result.
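To put rough numbers on the "too much data" point, here's a back-of-envelope sketch (the blade density comes from the gif discussion above; the field size, bytes per blade and update rate are my assumptions, not figures from any demo):

```python
# Rough bandwidth estimate for streaming raw per-blade physics state.
# All figures are illustrative assumptions, not numbers from the demo.

RUGBY_FIELD_M2 = 100 * 70          # ~100m x 70m playing area
BLADES_PER_M2 = 10_000 * 100       # 10,000 blades per 10cm x 10cm patch
BYTES_PER_BLADE = 6                # e.g. a quantised bend direction + amount
UPDATES_PER_SEC = 12

blades = RUGBY_FIELD_M2 * BLADES_PER_M2
bits_per_sec = blades * BYTES_PER_BLADE * UPDATES_PER_SEC * 8

print(f"{blades:,} blades -> {bits_per_sec / 1e12:.1f} Tbit/s of raw state")
```

Even heroic per-blade compression doesn't rescue a number like that, which is why the only workable answers are sending coarse results or the final rendered frame.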
 
Interesting example of how cloud processing can help.

Grass is a good use case because if cloud->client data packets get lost, delayed, etc then grass can still be approximated client side with whatever data it does receive from the cloud servers. It's one of those cases where a little error here and there won't matter unlike with stuff like animations where fingers and bones can get contorted into unnatural positions if the data isn't just right.
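A minimal sketch of that kind of graceful degradation, assuming a hypothetical per-patch "bend" value streamed from the cloud; when an update is late or lost, the client just eases the grass back toward its rest pose:

```python
# Hypothetical client-side fallback for a cloud-driven grass patch.
# 'server_bend' is the latest bend amount received for this patch
# (or None if the packet was lost/late).

def update_patch_bend(current_bend: float,
                      server_bend: float | None,
                      dt: float,
                      blend_rate: float = 8.0,
                      decay_rate: float = 2.0) -> float:
    if server_bend is not None:
        # Move toward the authoritative value from the cloud.
        return current_bend + (server_bend - current_bend) * min(1.0, blend_rate * dt)
    # No fresh data: approximate by easing back to rest. A small error here
    # is invisible, unlike a mis-posed skeleton.
    return current_bend * max(0.0, 1.0 - decay_rate * dt)
```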
 
Shifty, if it's doing it on the cloud, I assume they're doing some decent collision... who here's got a ~1TB/sec internet connection!
Exactly! The fact that the data requirements for using exact data are completely unrealistic shows that's not what they're doing (unless they're complete idiots and, when they release the tech, are surprised to find no-one has a fast enough connection. Is that what you're anticipating?). Ergo, they have to be doing something else that works within the limits of a typical internet connection. Now imagine a field of grass and a load of bodies walking across it. Moving the physics to the cloud saves the local console from having to spend any resources solving that, while a texture or similar could be sent to the console to tweak the rendering parameters and draw the grass in the correct orientation. Now expand that to things like footstep and tire tread calculation, sending back pieces of computed deformation maps for the local console to apply to the geometry, or whatever.
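As one hedged sketch of what "a texture or similar" might look like in practice (region size, cell size and encoding are all my assumptions, not anything confirmed by the demo): the server bakes the solved grass state into a coarse bend map that the client's grass shader samples.

```python
import numpy as np

# Hypothetical server-side step: bake solved grass bend vectors into a
# coarse 2-channel map (x/y bend per 10cm cell) covering a 32m x 32m region.
CELLS = 320                              # 10cm cells over 32m
bend_map = np.zeros((CELLS, CELLS, 2), dtype=np.float32)

# ... physics solve fills bend_map with values in [-1, 1] ...

# Quantise to 8 bits per channel before sending; the client's vertex shader
# samples this like any other texture to orient its locally generated blades.
quantised = np.clip((bend_map * 0.5 + 0.5) * 255, 0, 255).astype(np.uint8)
payload = quantised.tobytes()
print(len(payload))                      # 320*320*2 = 204800 bytes raw
```

Even at this coarse resolution the raw map is ~200 KB, far too much to push 12 times a second over a ~1 Mbps line, so in practice you'd send only changed cells or a heavily compressed delta.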

This thread has done a great job of identifying the limits in raw data holding back remote compute. Its future is to watch what devs actually do with it, whether as cloud augmentations or server-based computing, and how they solve these limits and find ways to provide useful results from remote computing over very thin pipes. Seeing a few examples that devs are presenting, though without the details needed to be confident in their work, shows we should be looking at solving the issues ourselves rather than just repeating the poo-pooing of cloud computing.
 
what do people consider a reasonable "stream" bandwidth in cloud physics application in terms of customer? Netflix streams at 1.5Mbps and up.
 
what do people consider a reasonable "stream" bandwidth in cloud physics application in terms of customer? Netflix streams at 1.5Mbps and up.

I would think you wouldn't want to go too high. 1 - 1.5 Mbit maybe. Something like a good quality SD stream.
 
what do people consider a reasonable "stream" bandwidth in cloud physics application in terms of customer? Netflix streams at 1.5Mbps and up.
Netflix buffers and video is latency tolerant. Certainly in the UK and Europe ADSL is still very popular and you have modest downrates but terrible uprates. My connection can barely muster 100kb/sec which for a 60fps game is less than 2kb/frame.

Not everything that people may want to offload to cloud servers will be constrained to up/down rates on a frame-by-frame basis but it's still not good. BTW, in ADSL terms, I have a "fast" line :yep2: To go faster I need fibre, which I still can't get. :cry:
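For anyone who wants to poke at the per-frame arithmetic, a trivial sketch (the ~100KB/s figure is from the post above; the frame rate and protocol overhead are my assumptions):

```python
# Per-frame upstream budget for a connection like the one quoted above.
up_bytes_per_sec = 100 * 1024    # ~100 KB/s upstream
fps = 60
overhead = 0.85                  # assume ~15% lost to UDP/IP headers etc.

per_frame = up_bytes_per_sec * overhead / fps
print(f"~{per_frame:.0f} bytes of player/world state per frame upstream")
```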
 
Netflix buffers and video is latency tolerant. Certainly in the UK and Europe ADSL is still very popular and you have modest downrates but terrible uprates. My connection can barely muster 100kb/sec which for a 60fps game is less than 2kb/frame.

Not everything that people may want to offload to cloud servers will be constrained to up/down rates on a frame-by-frame basis but it's still not good. BTW, in ADSL terms, I have a "fast" line :yep2: To go faster I need fibre, which I still can't get. :cry:

The physics demo above looks like it's running at 12Hz on the client. At 1 Mbps you'd get roughly 10 KB per update (125 KB/s split over 12 updates). The simulation on the cloud would probably run at something like 120Hz or 240Hz. The server would also know exactly which physics objects need to be updated on the client side, so you'd be able to limit the data to only the necessary objects. Some LOD systems based on draw distance etc could also limit the amount of data that would need to be sent with each update to the client.
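A rough sketch of that kind of per-update filtering, assuming a ~10 KB budget (1 Mbps at 12 Hz) and a made-up 16-byte object record; the server sends only objects that changed, nearest first as a crude LOD:

```python
from dataclasses import dataclass

@dataclass
class PhysObject:
    obj_id: int
    dist_to_camera: float   # metres, reported by the client
    dirty: bool             # moved since the last acknowledged update

BYTES_PER_RECORD = 16       # assumed: id + quantised position/orientation
UPDATE_BUDGET = 10_000      # ~10 KB per update at ~1 Mbps / 12 Hz

def select_for_update(objects: list[PhysObject]) -> list[PhysObject]:
    # Only objects that actually changed, nearest first, until the
    # per-update byte budget is exhausted.
    dirty = sorted((o for o in objects if o.dirty), key=lambda o: o.dist_to_camera)
    max_records = UPDATE_BUDGET // BYTES_PER_RECORD
    return dirty[:max_records]
```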
 
The server would also know exactly which physics objects need to be updated on the client side, so you'd be able to limit the data to only the necessary objects. Some LOD systems based on draw distance etc could also limit the amount of data that would need to be sent with each update to the client.
If you can reduce, or eliminate, the amount of data that goes from the console to the server, that'd increase its real-world usability in a lot of places using ADSL.

I don't know if the LOD approach would work vis-à-vis cloud compute. I imagine that games can only offload what is strictly required by the game - at least without paying Microsoft a dime. I'm sure Microsoft probably have this mandated in their Xbox / Azure T&Cs.
 
If you can reduce, or eliminate, the amount of data that goes from the console to the server, that'd increase its real-world usability in a lot of places using ADSL.

I don't know if the LOD approach would work vis-à-vis cloud compute. I imagine that games can only offload what is strictly required by the game - at least without paying Microsoft a dime. I'm sure Microsoft probably have this mandated in their Xbox / Azure T&Cs.

I wouldn't think you'd be sending too much data from the client to the server. The physics simulation would run in the cloud. Most of the data would go the other way. With something like grass moving based on collisions with another object, you could play with the effect off if you don't have a network connection. It's not something that impacts the gameplay.

In any case, I'd expect this to be used more for multiplayer games.
 
I wouldn't think you'd be sending too much data from the client to the server. The physics simulation would run in the cloud. Most of the data would go the other way. With something like grass moving based on collisions with another object, you could play with the effect off if you don't have a network connection. It's not something that impacts the gameplay.
Perhaps. If both console and server start off with a fixed known state of the world, physics calculated on the server can simply be sent to the console and both remain in sync, with perhaps a small permissible deviation that won't affect gameplay.

Now what if I save the game and turn off the console for two weeks? The server then needs to know about the state of my world from 2 weeks ago, which could be a big upload, depending on exactly what you're off-loading to the server and how much data needs to be uploaded before console and server are in sync.

There remain a lot of unknowns and problems to solve which is why outside of MMOs, massive cloud-based processing in games really isn't a thing.
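One common way that resync problem gets attacked in networked games (a hedged sketch, not anything confirmed for Xbox/Azure) is to persist a deterministic seed plus a compact log of gameplay-relevant events, so coming back online means uploading or replaying a small log rather than the whole world state:

```python
import hashlib, json

# Hypothetical save format: a deterministic world seed plus the
# player-driven events since the last sync, instead of the full
# simulated world.
save = {
    "world_seed": 123456789,
    "last_synced_tick": 40_210_551,
    "events_since_sync": [           # compact, gameplay-relevant changes only
        {"tick": 40_210_600, "type": "tree_felled", "id": 8812},
        {"tick": 40_211_302, "type": "crater", "pos": [104.2, 0.0, 88.7]},
    ],
}

# Server and client compare a hash of the agreed baseline before deciding
# whether a cheap event replay or a full re-upload is needed.
baseline = json.dumps({"seed": save["world_seed"],
                       "tick": save["last_synced_tick"]}, sort_keys=True)
print(hashlib.sha256(baseline.encode()).hexdigest()[:16])
```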
 
Netflix buffers and video is latency tolerant. Certainly in the UK and Europe ADSL is still very popular and you have modest downrates but terrible uprates. My connection can barely muster 100kb/sec which for a 60fps game is less than 2kb/frame.

Not everything that people may want to offload to cloud servers will be constrained to up/down rates on a frame-by-frame basis but it's still not good. BTW, in ADSL terms, I have a "fast" line :yep2: To go faster I need fibre, which I still can't get. :cry:

Sorry but 100kb/s in London? Where in London?? I had around 15Mb with my old ADSL...
 
Sorry but 100kb/s in London? Where in London??
SE16. Go look up our fibre deployment status on BT's website. It's been stuck at 'AO' (Accepting Orders) for over a year.

I have a 20Mbit DSL connection; in reality that is 18Mbit down (1.9MB/sec in real terms) and 1Mbit up (100KB/sec in real terms). And BT are the provider offering the best rates in my area.
 
SE16. Go look up our fibre deployment status on BT's website. It's been stuck at 'AO' (Accepting Orders) for over a year.

I have a 20Mbit DSL connection; in reality that is 18Mbit down (1.9MB/sec in real terms) and 1Mbit up (100KB/sec in real terms). And BT are the provider offering the best rates in my area.

Ah Kb vs KB, I get it now. I live 50 yards from the exchange and I was also waiting for years. Fiber will come to you eventually :D
 
what do people consider a reasonable "stream" bandwidth in cloud physics application in terms of customer? Netflix streams at 1.5Mbps and up.

Well, Microsoft were originally recommending a 1.5 Mbps connection when the console was going to require an online connection. So I'd imagine 1.5 Mbps would be the absolute maximum anyone would try to use for a mass market game. Where I live (small city, pop. of less than 350k), it is impossible to get less than 1.5 Mbps.

Regards,
SB
 
what do people consider a reasonable "stream" bandwidth in cloud physics application in terms of customer? Netflix streams at 1.5Mbps and up.
You could go by market averages. I don't think 4 Mbps is unreasonable. But it doesn't really matter - the bandwidths are so small as to make the difference between 1 Mbps and 10 Mbps pretty irrelevant when it comes to designing data formats that'll work in realtime. It'll be like network gaming on 28 kbps all over again: condensing the data down to the bare minimum and working with whatever you can squeeze down the phone lines.
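In the spirit of those 28 kbps days, the usual answer is aggressive quantisation and bit-packing; a toy sketch (field size and precisions are arbitrary assumptions):

```python
import struct

# Toy quantisation: pack a ball position on an assumed ~100m x 70m field
# into 5 bytes instead of 12 bytes of raw floats.
def pack_position(x: float, y: float, z: float) -> bytes:
    qx = max(0, min(65535, round(x / 100.0 * 65535)))   # ~1.5mm steps over 100m
    qy = max(0, min(65535, round(y / 70.0 * 65535)))
    qz = max(0, min(255, round(z / 2.55 * 255)))        # heights clamped to 2.55m
    return struct.pack("<HHB", qx, qy, qz)

def unpack_position(data: bytes) -> tuple[float, float, float]:
    qx, qy, qz = struct.unpack("<HHB", data)
    return qx / 65535 * 100.0, qy / 65535 * 70.0, qz / 255 * 2.55

packet = pack_position(52.3, 10.0, 0.12)
print(len(packet), unpack_position(packet))   # 5 bytes, ~(52.3, 10.0, 0.12)
```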
 
Grass is a good use case because if cloud->client data packets get lost, delayed, etc then grass can still be approximated client side with whatever data it does receive from the cloud servers. It's one of those cases where a little error here and there won't matter unlike with stuff like animations where fingers and bones can get contorted into unnatural positions if the data isn't just right.

Raises another question: if it's something where it doesn't really matter whether the data is exact or perfect, then what is gained compared to a much simpler simulation?

Still not convinced :)
 
Raises another question: if it's something where it doesn't really matter whether the data is exact or perfect, then what is gained compared to a much simpler simulation?

A question that can be asked about the entire field of 3D graphics ... :eek:

(You already know the answer!).
 
Raises another question: if it's something where it doesn't really matter whether the data is exact or perfect, then what is gained compared to a much simpler simulation?

Still not convinced :)

Very little impact (potentially approaching zero) on the host system, regardless of whether you implement the same amount of "stuff" or more "stuff", versus spending dedicated CPU and/or GPU compute time on that "stuff" locally.

That means you can then allocate more time (resources) to other "stuff" on the host system that is directly impacted by a player's movement, actions, or controls.

So in other words, by moving the "flavor" or ambient "stuff" to the cloud, you increase your ability to implement game-affecting "stuff" better or in greater volume (more of it).

At least that is the potential, and why it is so attractive to some game developers.

Regards,
SB
 
DeLorean: Using Speculation to Enable Low-Latency Continuous Interaction for Cloud Gaming
http://research.microsoft.com/apps/pubs/default.aspx?id=226843

A paper on using 'speculative' prediction to reduce perceived latency in cloud gaming. In this case 'cloud gaming' is the OnLive/PSNow style of remote gaming where the local client is essentially capturing input and decoding video and audio from the remote session. Not sure how viable this is for all titles, neither Doom 3 nor Fable 3 strike me as being that 'twitch', but it could offer a lot for certain titles.

Things that jump out at me are their idea of 'Misprediction Compensation', which seems like it would increase the load on the host server by a non-trivial amount per user. 'State Space Subsampling and Time Shifting' sounds fascinating but flies right over my head (so many arctans et al. :D )
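For a rough feel of the idea (my own toy paraphrase, not DeLorean's actual implementation): the server simulates a frame per plausible input, the client shows whichever branch matches the real input, and a miss falls back to a corrective round trip, which is where the extra per-user server load comes from.

```python
# Toy illustration of input speculation for cloud gaming.

POSSIBLE_INPUTS = ["none", "left", "right", "fire"]

def server_speculate(world_state, simulate):
    # One speculative frame per candidate input (this is the extra
    # server-side work per user).
    return {inp: simulate(world_state, inp) for inp in POSSIBLE_INPUTS}

def client_resolve(speculated_frames, actual_input, request_correction):
    if actual_input in speculated_frames:
        return speculated_frames[actual_input]    # hit: zero perceived latency
    return request_correction(actual_input)       # miss: pay the round trip

# Usage sketch with a stand-in simulation function:
frames = server_speculate({"tick": 0}, lambda s, i: {"tick": s["tick"] + 1, "input": i})
frame = client_resolve(frames, "left", lambda i: {"tick": 1, "input": i, "corrected": True})
print(frame)
```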

I saw this on NeoGAF posted by user Kayant
 