Server-based game augmentations. The transition to cloud: really possible?

This post is very interesting, with some intriguing ideas about how the cloud may or may not help in games.
I wonder: would it be possible to populate a squad-based game with bots based on your friends' playing styles, from data collected about how they play?

Yes/no, and it doesn't necessarily have much to do with the cloud.

AFAICT the options are:
- record the player moving through a level, and replay exactly that. This is great for a 'ghost car', but obviously the AI is fixed; if the environment changes (e.g. you shoot someone/crash into them) then it immediately breaks.

- try to work out the "meaning" behind the player's actions and create an AI based on that. AFAIK this would need to be hand-crafted for each game, and sounds rather impractical (too much like 'step 1: create a hard AI').

- measure 'some aspects' of a player's gameplay and then match it to a predefined AI with custom parameters. For example 'sniper', 'shooting: 8', 'in cover: 6', 'preferred weapon A, B, C', "doesn't use grenades, always uses skill 3, forgets he has medpacks, prefers flag #1".

The last one sounds entirely doable. In practice I think a local set of counters would create enough data for that.
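A minimal sketch of what those local counters might look like (all event names and the 0-10 scales are hypothetical, just to show the counters-to-parameters idea):

```python
from collections import Counter

class StyleProfiler:
    """Accumulate simple gameplay counters and map them to bot parameters."""

    def __init__(self):
        self.counters = Counter()

    def record(self, event):
        # e.g. "shot_fired", "shot_hit", "grenade_used",
        #      "tick_in_cover", "tick_total"
        self.counters[event] += 1

    def to_bot_params(self):
        c = self.counters
        accuracy = c["shot_hit"] / max(1, c["shot_fired"])
        cover_pref = c["tick_in_cover"] / max(1, c["tick_total"])
        return {
            "shooting": round(accuracy * 10),   # 0-10 scale, like 'shooting: 8'
            "in_cover": round(cover_pref * 10),
            "uses_grenades": c["grenade_used"] > 0,
        }
```

The resulting dictionary is tiny, so it could be synced anywhere (cloud or not) and fed into a predefined AI archetype.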
 
As above, what's the data format for your grass and buildings that you send to the console? Let's take a hypothetical 8-vertex blade of grass, a textured ribbon. It requires 8 32-bit values (preferably 64-bit, but we can lose accuracy to save BW) to represent the vertex positions. That's 32 bytes. One thousand blades of grass would require 32 kB of info per frame. A field would require crazy amounts of BW.
You would not want to send foliage/particle positions over the internet connection. That would be a huge waste of bandwidth.
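For what it's worth, the arithmetic above holds if you treat each vertex as a single packed 32-bit value; a full 3 x float32 position per vertex would roughly triple these numbers:

```python
# Back-of-envelope check of the grass bandwidth math above.
# Assumption: one packed 32-bit value per vertex, as in the post.
VERTS_PER_BLADE = 8
BYTES_PER_VERT = 4          # one 32-bit value
BLADES = 1000
FPS = 60

bytes_per_blade = VERTS_PER_BLADE * BYTES_PER_VERT   # 32 bytes per blade
bytes_per_frame = bytes_per_blade * BLADES           # 32,000 bytes = ~32 kB
bytes_per_sec = bytes_per_frame * FPS                # ~1.9 MB/s, one small patch
```

At 60 fps that's nearly 2 MB/s for a single thousand-blade patch, which is why sending raw geometry is a non-starter.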

Let's say we want to have a high quality wind simulation (simulate air movement with fluid dynamics), and we want to move our foliage and particles (smoke etc.) according to the wind. You wouldn't want to send any information about your particles / foliage to the cloud in this scenario. You would want to offload the air flow fluid dynamics calculation to the cloud, generate a low resolution vector field on the server, compress that field, and send it to the client. Now the client can animate the foliage / particles according to the vector field (preferably using a GPU vertex/compute shader). This is cheap for the client, and the heaviest part of the calculation (the wind simulation) is offloaded to the cloud. This kind of method also combines very well with local modifiers (the player moving through a particle cloud, the player stomping on the animating grass). If you do the whole animation in the cloud, you will have latency issues (the animation reacts too slowly to player actions).
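A rough sketch of the client side of that idea (grid size, world size and function names are all made up): the server streams a low-resolution vector field, and the client samples it per vertex to sway foliage. In a real game this sampling would live in a GPU vertex/compute shader, but the logic is the same:

```python
import math

GRID_N = 8        # low-res wind field: 8 x 8 x 8 cells, 3 floats per cell
WORLD = 64.0      # world units covered by the field

# Pretend this arrived (decompressed) from the server this tick:
# a uniform gentle eastward gust.
wind_field = [[[(0.5, 0.0, 0.1) for _ in range(GRID_N)]
               for _ in range(GRID_N)] for _ in range(GRID_N)]

def sample_wind(x, y, z):
    # Nearest-cell lookup; trilinear filtering would be smoother.
    def cell(c):
        return min(GRID_N - 1, max(0, int(c / WORLD * GRID_N)))
    return wind_field[cell(x)][cell(y)][cell(z)]

def sway(vx, vy, vz, t, stiffness=0.2):
    # Displace a foliage vertex by the local wind, damped by stiffness.
    wx, wy, wz = sample_wind(vx, vy, vz)
    s = math.sin(t) * stiffness
    return (vx + wx * s, vy + wy * s, vz + wz * s)
```

Note the bandwidth: an 8x8x8 field at 3 half-floats per cell is about 3 kB per update, versus the 32 kB per frame of raw vertex data discussed above, and local modifiers can simply be added on top of the sampled wind.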
 
Is wind simulation something that's modeled with any kind of complexity in games, outside flight sims?
Not in most current games. High quality simulation costs too much. But isn't this thread about finding possible use cases for high performance cloud servers :)
 
You would want to offload the air flow fluid dynamics calculation to the cloud, generate a low resolution vector field in the server, compress that field, and send that field to the client.
Thanks. That's the kind of alternative data representation that's at the core of this thread. What can be represented in BW-efficient forms? In basic terms, I suppose anything that can be represented as a 'texture', 'volume', or 'sound' and lossily compressed. Anything that modifies meshes seems a candidate, but object creation would be the preserve of ahead-of-time computation. So, for example, you could have a mission to set charges in a building and, after escaping, watch the whole thing blow according to where you placed the charges. That info could be sent to the cloud, calculated, and the results sent to the console as you leave the building, downloading over the next few minutes while the level plays out your escape to a safe distance.
 
Yes/no, and it doesn't necessarily have much to do with the cloud.

AFAICT the options are:
- record the player moving through a level, and replay exactly that. This is great for a 'ghost car', but obviously the AI is fixed; if the environment changes (e.g. you shoot someone/crash into them) then it immediately breaks.

- try to work out the "meaning" behind the player's actions and create an AI based on that. AFAIK this would need to be hand-crafted for each game, and sounds rather impractical (too much like 'step 1: create a hard AI').

- measure 'some aspects' of a player's gameplay and then match it to a predefined AI with custom parameters. For example 'sniper', 'shooting: 8', 'in cover: 6', 'preferred weapon A, B, C', "doesn't use grenades, always uses skill 3, forgets he has medpacks, prefers flag #1".

The last one sounds entirely doable. In practice I think a local set of counters would create enough data for that.

Shame, as I'd love to play with a bot with a friend's playing style while they were offline... help their bot beat their score for that level, then mercilessly wind them up with the line 'even your bot's a better player than you' lol
 
Exactly!

The biggest gain of cloud is dynamic load balancing. Previously developers had to rent/buy huge server farms dedicated for single games (or multiple games by a single publisher). This is expensive and inflexible.
EA, at least recently, has been doing this.

Recently the most discussed issue has been the server capacity problems during big game launches. We all remember Diablo and SimCity. Large scale cloud infrastructure could solve these kinds of problems cost effectively.
Is Amazon's cloud service large-scale enough?
That is what EA is using for SimCity. It doesn't matter how large-scale the cloud is if the cloud portion is constructed with some nasty scalability problems.

The cloud portion should be under some kind of review by Microsoft, to keep developers from wasting their Azure cycles on crap; that increases the validation requirements for such a game.

This could likely mean that game developers / publishers keep their game servers running for much longer time, as the financial burden scales down as the game popularity decreases over time. It's a pure win-win situation.
While potentially true, I would point to EA as a company that does use the cloud in known instances, but whose service shutdown schedule you could practically set a clock by.
 
Most of the EA games that have been shut down were sports titles that rented space from Rackspace (?), I believe. My understanding was that they used a more traditional "rent a physical server" model, rather than scalable VMs like in the cloud model. I could be totally wrong.

I'm trying to remember a post where someone had done some packet sniffing on the NHL games and found they were using dedicated servers. It wasn't hosted by Amazon, so Rackspace may be wrong.

As far as I know, Amazon's cloud service should be able to provide the type of scalability that Microsoft's Azure does. Same with Google.

It all comes down to how Microsoft charges for cloud resources. I'm assuming EA would have to pay for X number of VMs of a defined size. Anything beyond that would be outside of their contract. So scaling up to a limit. Who knows how the Microsoft model works. My theory is that they are subsidizing the cost for developers with Xbox Live Gold fees.
 
Not in most current games. High quality simulation costs too much. But isn't this thread about finding possible use cases for high performance cloud servers :)

This is a physical phenomenon similar to my earlier posted case of a cloud-simulated background fountain.

It's a continuous simulation that has a smoothable output that a human player would be hard-pressed to notice smoothing. Much of what it simulates is effectively timeless relative to everything in the game world, with any additional melding done locally.
It's also invisible, so bonus points there.
 
So the background pauses/fades out/all the taxis disappear?

So I think the way they explained it in the Kotaku article is kind of yes... they said that with the cloud connection they could view 300K asteroids. When the connection was pulled... you'd go back to 30K... which isn't exactly correct math.

The system limited itself to 30K asteroids because those were near-field and potentially within the FoV of the player... you could get away with shifting the bulk of those resources into the FoV instead... to minimize the disruption. All that matters is what's in view of the player's camera.
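The fallback described above could be as simple as ranking bodies by how close they sit to the camera's view axis and spending the local budget on the best-ranked ones. A hypothetical sketch (not how the actual demo works):

```python
import math

def in_view_score(cam_pos, cam_forward, pos):
    # Higher score = closer to the camera's view direction.
    dx = [p - c for p, c in zip(pos, cam_pos)]
    dist = math.sqrt(sum(d * d for d in dx)) or 1e-6
    # cos(angle to the view axis): 1.0 = dead ahead, -1.0 = behind.
    return sum(d * f for d, f in zip(dx, cam_forward)) / dist

def pick_local_budget(asteroids, cam_pos, cam_forward, budget):
    # Keep only the `budget` bodies most likely to be on screen.
    ranked = sorted(asteroids,
                    key=lambda a: in_view_score(cam_pos, cam_forward, a),
                    reverse=True)
    return ranked[:budget]
```

Everything off-screen can then fall back to a much cheaper (or frozen) local simulation without the player noticing.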
 
Just dropping everything from the cloud doesn't seem like a good solution for most cases. Discontinuity like that is something our brains should pick up on quite readily, especially if there's a dodgy history function that lets the simulation flash in and out.

I'm assuming the abrupt drop is just to make the difference more visible. The writer would have less to talk about if the actual result was that over a hundred thousand asteroids subtly shifted their modeling to a fallback method that becomes increasingly divergent from the cloud simulation over the course of seconds or minutes.
 
Much of what it simulates is effectively timeless relative to everything in the game world, with any additional melding done locally.
It is timeless only in simplest cases, where you have a static environment, and static environment conditions (wind direction, wind force, etc). If the use case is that simple, it's better to bake the vector field during production (potentially using cloud *) and store it to the disc. Changing wind conditions are relatively smooth (change takes tens of seconds) and can be interpolated without any noticeable error (not latency sensitive).

(*) Cloud computing is not a new thing for game production tools. Cloud can boost up productivity by generating baked lighting (light maps, light probes) faster than any single workstation. Faster = less waiting = more iterations possible = more polished content.

If you need to regenerate your baked lighting on fly (react to destruction or day cycle changes), cloud servers could potentially be used for this. Generation of high quality light probes costs a lot of GPU/CPU cycles. Light probes stored as spherical harmonics do not take much memory (or bandwidth to transfer). This kind of methods could potentially be used inside the game (and not only in the game creation tools), assuming the cloud has enough computational capacity. Bandwidth and latency are not problems.
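As a rough size check on the "light probes don't take much memory" claim: 2nd-order spherical harmonics is 9 coefficients per color channel, so even a generous probe count stays well under a megabyte (the byte sizes here are my assumption):

```python
# Rough size estimate for SH light probes.
SH_COEFFS = 9              # 2nd-order spherical harmonics
CHANNELS = 3               # RGB
BYTES_PER_COEFF = 2        # float16 is plenty for irradiance (assumption)

bytes_per_probe = SH_COEFFS * CHANNELS * BYTES_PER_COEFF   # 54 bytes
probes = 10_000                                            # a big level
total_kb = probes * bytes_per_probe / 1024                 # ~527 kB
```

So regenerating probes in the cloud and streaming them down is cheap on the wire; the expensive part is the GPU/CPU time to generate them, which is exactly what you'd offload.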
 
It is timeless only in simplest cases, where you have a static environment, and static environment conditions (wind direction, wind force, etc). If the use case is that simple, it's better to bake the vector field during production (potentially using cloud *) and store it to the disc.
Even if the wind shifts, it's doing so on its own terms and on its own time, unless player control of the weather is a game mechanic.
Would the player or other entities in the game world care if eastward wind gust 75 is added to the client simulation 5 to 10 seconds late? Any specific time step in the remote simulation has a lot of leeway as to which local time steps it applies to.
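That "leeway" sketched as code (hypothetical): the client just queues timestamped remote updates and applies whatever is due each local tick, with a few seconds of slack, so late or out-of-order arrivals don't matter for ambient effects:

```python
import heapq

class RemoteEffectQueue:
    """Buffer remote simulation updates and apply them on local time."""

    def __init__(self):
        self.pending = []   # min-heap of (remote_time, payload)

    def receive(self, remote_time, payload):
        heapq.heappush(self.pending, (remote_time, payload))

    def apply_due(self, local_time, slack=5.0):
        # Apply every update whose remote timestamp is at least `slack`
        # seconds old; exact ordering inside that window is unimportant.
        due = []
        while self.pending and self.pending[0][0] <= local_time - slack:
            due.append(heapq.heappop(self.pending)[1])
        return due
```

Gust 75 arriving a few seconds late simply lands on a later local tick; nothing in the world state depends on which tick that is.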
 
Even if the wind shifts, it's doing so on its own terms and on its own time, unless player control of the weather is a game mechanic.
Would the player or other entities in the game world care if eastward wind gust 75 is added to the client simulation 5 to 10 seconds late? Any specific time step in the remote simulation has a lot of leeway as to which local time steps it applies to.
Yes of course.

Even exceptional cases such as the player blowing up a big building shouldn't cause many problems. I don't see noticeable visible problems even if the wind vector field took 10 seconds to update (as the affected area is filled with heavy rubble and dust flying around).
 
What does today's announcement that being online is no longer a requirement mean in relation to 'the cloud' improving gameplay?
 
Regarding this topic, nothing, as this thread is supposed to be a platform agnostic look at what cloud computing can offer as a technology.
 
Exactly!

The biggest gain of cloud is dynamic load balancing. Previously developers had to rent/buy huge server farms dedicated for single games (or multiple games by a single publisher). This is expensive and inflexible.

No they didn't; if they did, they had been doing the internet wrong...

I find this whole 'new cloud thing' funny... hmm, what has Akamai been doing for 15 years then?
 
If you're stepping on it and interacting directly, it becomes latency sensitive and no longer the periphery.

My thought on this: can't it be faked? I mean the background stuff that would look more correct if you did the computation on it, but doesn't really impact the graphics and gameplay?
Even the precomputed stuff can usually be baked into the game itself.
When I think about this whole cloud computing business, it ends up being about managing player interaction (between the players and the world). Other uses that could potentially increase graphics quality can be faked, because they must be latency insensitive, thus not critical, thus fakeable.
What I don't want to see is someone making an SP game that relies heavily on the cloud, where if you have a not-so-good internet connection the experience is broken... like sometimes the physics are good but other times bad... or the graphics quality changes depending on your network connection... or it brings lag like in MP games to SP games.

There are many ways to do it. I don't think we can be exhaustive here. I think the take home point is developers can use the cloud for latency insensitive work. Online games have been doing some tasks on the server for *years* now.

Whether it's used for the periphery may be a red herring. As long as you can hack the task in a latency insensitive way (e.g., precompute multiple possible solutions), you can use it for periphery stuff too (just pick one of the precomputed solutions as an approximation).

The real question is "Whether it's worth it" (i.e., at what cost).
And yes you can precompute some at compile time too, or just do a cheaper/free version locally.
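The "precompute multiple possible solutions" hack in miniature (the solver and sample inputs are placeholders): compute a handful of representative cases up front, then snap the live input to the nearest precomputed one:

```python
def precompute(solver, sample_inputs):
    """Run the expensive solver (in the cloud or at build time)
    for a handful of representative inputs."""
    return {x: solver(x) for x in sample_inputs}

def approximate(table, live_input):
    """Latency-free lookup: return the result for the precomputed
    input closest to the live one."""
    nearest = min(table, key=lambda x: abs(x - live_input))
    return table[nearest]
```

The approximation error is bounded by how densely you sample the input space, which is exactly the cost trade-off being discussed: more precomputed cases, better periphery, more cloud cycles.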

Exactly!

The biggest gain of cloud is dynamic load balancing. Previously developers had to rent/buy huge server farms dedicated for single games (or multiple games by a single publisher). This is expensive and inflexible.

Recently the most discussed issue has been the server capacity problems during big game launches. We all remember Diablo and SimCity. Large scale cloud infrastructure could solve these kinds of problems cost effectively.

Another problem that cloud could solve is the server maintenance cost of aging games. Active player count drops drastically a few months after a game's release. The huge majority of players just play through the main campaign once or twice (with extreme difficulty or a different alignment/class), and then stop playing the game intensively. Depending on the game, around half a year (up to one year) after launch only around 10% of players are still actively playing it, and only the most hardcore players are still playing games that are 3+ years old. Cloud can automatically scale down the server capacity based on need, and this of course also scales down the maintenance cost. If you have only 10% of players playing, the cloud only needs to allocate 10% of the server capacity (a 10x cost reduction). This could well mean that game developers / publishers keep their game servers running for a much longer time, as the financial burden scales down as the game's popularity decreases over time. It's a pure win-win situation.

Hopefully cloud allows more games to use dedicated servers, because developers don't need a huge up front cost to setup their own server farms. Maintenance cost scales nicely with the game popularity (as explained above). Cloud seems to be a perfect way to reduce dedicated server financial costs and risks. And this could be a big boon for console online gaming (compared to the currently most popular peer-to-peer, player hosted multiplayer lobbies).

Yes, cloud is invented to solve this problem. It's not common to gaming. Online stores also use cloud platforms to handle Black Friday and other holiday sales. Some also use it in the backend to analyze their customer data in near real-time.

No they didn't; if they did, they had been doing the internet wrong...

I find this whole 'new cloud thing' funny... hmm, what has Akamai been doing for 15 years then?

Akamai has evolved over the years, together with cloud technologies.
 
Akamai has evolved over the years, together with cloud technologies.

That's just hand-waving; it's nothing other than incremental changes. The fundamental technologies that enable these solutions (global traffic services / anycast routing / etc.) have been around for a very long time, and they haven't changed much.

Cloud is just the current flavor of the day.
 