Server-based game augmentations. The transition to the cloud. Really possible?

Cost is exactly the reason very few developers will utilize 'the cloud' outside of multiplayer... even if you did find a good use for compute resources that are hundreds of ms away for most customers, you have to pay for those resources, and keep paying for as long as you want to maintain them for the people who bought your game.


And what do you honestly gain? Somewhat dynamic, prebaked lightmaps is the only example? We've been playing multiplayer games on dedicated servers for decades, with and without AI, so Drivatars aren't really new either... what are devs doing with 'the cloud'?
 
Can you elaborate?

I mean, if the host client's CPU is loaded with, let's say, 5% more compute due to multiplayer management, I suppose that, in order to work, there must be a 5% "reservation" on the CPU for that game, no matter whether you are the host or not in that particular match.
Because the only alternative to this situation is that the host runs a downgraded version of the game (with, let's say, 5% less CPU power). But this latter possibility seems quite odd to me.
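
A minimal sketch of that budgeting idea, with purely hypothetical numbers (the 5% figure is taken from the post above, not from any real title): if a slice of the frame has to be reserved for host duties, the gameplay workload ends up being sized against the reduced budget for every player, host or not.

// Minimal sketch of the "reservation" idea described above (hypothetical numbers and names).
// If ~5% of the CPU frame budget must be held back for host duties in a P2P match,
// the gameplay workload has to be sized as if that 5% is never available,
// whether or not this particular console ends up being the host.
#include <cstdio>

int main() {
    const double frame_budget_ms   = 33.3;   // CPU frame budget at 30 fps
    const double host_overhead_pct = 0.05;   // assumed cost of running the match as host
    const double host_reserve_ms   = frame_budget_ms * host_overhead_pct;

    // Gameplay/simulation work is budgeted against the reduced figure for everyone,
    // so the game behaves the same whether you host or not.
    const double gameplay_budget_ms = frame_budget_ms - host_reserve_ms;

    std::printf("Reserved for host duties: %.2f ms\n", host_reserve_ms);
    std::printf("Budget left for gameplay: %.2f ms\n", gameplay_budget_ms);
    return 0;
}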


And what do you honestly gain? Somewhat dynamic, prebaked lightmaps is the only example? We've been playing multiplayer games on dedicated servers for decades, with and without AI, so Drivatars aren't really new either... what are devs doing with 'the cloud'?

I am not considering the extras (like AI, physics, or other things... if they turn out to be real). I am not interested in that aspect right now.

I would just like to understand the CPU impact of a Titanfall-like client-server game.
In other words, what is the gain in CPU power for a game that uses dedicated servers (even without considering the improved gameplay performance)?
What would the impact on the CPU be, in percent?
 
I would just like to understand the CPU impact of a Titanfall-like client-server game.
In other words, what is the gain in CPU power for a game that uses dedicated servers (even without considering the improved gameplay performance)?
What would the impact on the CPU be, in percent?

Well, not all things are equal: the computation load of a P2P setup and that of a client-server setup are very likely to be distributed quite differently, and even the total load will differ, because the devs now have a dedicated environment onto which they would most likely offload more computation (versus a P2P setup).

The other thing to look at is bandwidth consumption: you just can't realistically host 64 players on your home connection and hope for a good gameplay experience.
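
For a sense of scale, here is a back-of-envelope estimate of the upstream bandwidth a home host would need to serve 64 clients; the tick rate, snapshot size, and overhead factor are all assumptions for illustration, not figures from any actual game.

// Rough upstream bandwidth estimate for a home-hosted 64-player match
// (hypothetical packet sizes and tick rate, just to illustrate the scale of the problem).
#include <cstdio>

int main() {
    const int    players          = 64;
    const int    tick_rate_hz     = 30;    // state updates per second sent to each client
    const int    bytes_per_update = 600;   // assumed snapshot size per client per tick
    const double overhead_factor  = 1.2;   // rough allowance for UDP/IP headers

    const double bytes_per_second = players * tick_rate_hz * bytes_per_update * overhead_factor;
    const double megabits_per_sec = bytes_per_second * 8.0 / 1e6;

    // Roughly 11 Mbit/s of sustained upload -- well beyond most residential upstream links.
    std::printf("Upstream needed by the host: ~%.1f Mbit/s\n", megabits_per_sec);
    return 0;
}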
 
A small update tidbit via a tweet by Phil Spencer

[Image: screenshot of the tweet]


Not sure what to make of it; I don't necessarily assume "things" means games, personally.
 
New Gamesindustry article on X1 cloud:

http://www.gamesindustry.biz/articles/2013-10-15-the-difference-engine

Evidently it will be free to developers.

Also, it seems like they are now strictly talking about CPU boosting.

Also, Mr Penello says on GAF:

Also to be clear. One of the benefits of publishing games on Xbox One – ALL game developers get Dedicated Servers, Cloud Processing, and “storage” (for save games) free.

If you want to do dedicated servers on other platforms, you have to prop them up yourself. But on Xbox One, while developers can choose to use their own methods, we make it available to everyone.

There should be no confusion on this point. We do not charge developers for Dedicated Servers.

Pretty darn big deal imo.
 
And another one. I guess it's cloud day at Microsoft...

http://news.xbox.com/2013/10/xbox-one-cloud


Higher fidelity game experiences – As I mentioned before, cloud compute can enable developers to offload computations for all sorts of environmental elements. In a typical game development scenario, the game creator needs to balance resource allocation across each area – world management, rendering, controls, networking, lighting, physics, AI, as well as multiplayer. Balancing the local computing resources for all of these elements often results in developers making tradeoffs that result in more focus on core gameplay, and less on environments, NPCs and other elements of world fidelity. However, when cloud compute is available to support the various computationally-intensive elements of the game, these kinds of tradeoffs become much easier for developers to make. Games can afford to provide higher fidelity worlds and deeply intelligent NPC AI all at the same time. These experiences could only be accomplished by leveraging the resources of servers.

Improved multiplayer game experiences – This is perhaps the most obvious example of what is possible with Xbox Live Compute – dedicated servers! If you have played a lot of multiplayer games, you know that playing on dedicated game servers has advantages over peer-to-peer gameplay. With server-based multiplayer gaming, not only can more players play the game (think hundreds of players simultaneously), the gameplay will be much more reliable for the players. No more host migration interruptions, suboptimal experiences for the host, home network NAT constraints, or player cheating! Additionally, Xbox Live Compute can be utilized to persist game state so that your squad can live to fight another day without losing any progress. A great example of a game that is using Xbox Live Compute for their dedicated server multiplayer experience is “Titanfall.”

Adaptive & evolving game play – Imagine the game you play every day improving each time you log in. Imagine joining a match in your favorite first person shooter to find new maps and game modes even though you never downloaded a game update. Imagine playing with your friend even when he/she is not online. When games are powered by Xbox Live Compute, they can be dynamically updated, tuned, changed, and improved continuously. Games will evolve and live on for greater periods of time, continually providing fresh content and new experiences. The flagship example of this application of cloud computing can be found with "Forza Motorsport 5" and its Drivatar system.

On-demand compute improves game availability – With all of the potentially interesting things that can be accomplished with Xbox Live Compute, one of the most important things is that the resources (e.g. servers) are available when gamers need them most. It is the geographic availability of this service, and its elastic scalability, that enables gamers to connect to an available server and play without experiencing busy or unavailable servers. This ensures that games meet the changing demands of their player communities for compute, and gamers experience optimal connectivity based upon their geographic location. Additionally, it means that game creators can be assured that the server capacity they need, in the appropriate geographies, will be there when they need it.

A YouTube video too.

 
I think the cloud compute message still needs some improvement. I can see how the dedicated servers would be super nice, and background content push is also nice, but I'm not sure what this "cloud compute" does; it sounds just like on-demand dedicated servers.
 
I think the cloud compute message still needs some improvement. I can see how the dedicated servers would be super nice, and background content push is also nice, but I'm not sure what this "cloud compute" does; it sounds just like on-demand dedicated servers.

I think it's obviously a work in progress: actual processing of game stuff in the cloud, on the fly.

But it has interesting potential at the least. Seems it will be limited to CPU stuff for now, but that's OK.
 
I think the cloud compute message still needs some improvement. I can see how the dedicated servers would be super nice, and background content push is also nice, but I'm not sure what this "cloud compute" does; it sounds just like on-demand dedicated servers.

It can also offload compute tasks, which could result in significant graphical improvements by removing bottlenecks.
 
It can also offload compute tasks, which could result in significant graphical improvements by removing bottlenecks.

I have a hard time imagining how this would work; it's either going to have high latency and/or need a lot of bandwidth. Maybe you can bake some environment maps in the cloud? But what would be the benefit, when this is something you can do offline in the asset pipeline?
 
I have a hard time imagining how this would work; it's either going to have high latency and/or need a lot of bandwidth. Maybe you can bake some environment maps in the cloud? But what would be the benefit, when this is something you can do offline in the asset pipeline?


Is it really that hard to imagine? Let's say a game is significantly compute bound, like, say, the BF series. Of the compute tasks required for the game, 15% can tolerate some latency. That 15% can happen in the cloud. The 15% decrease in local compute may lead to a 5 to 30% improvement in graphical performance, depending on how bound the game was.
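
As a rough illustration of that arithmetic, assuming a fully CPU-bound title and treating the 15% figure above as the latency-tolerant share (both numbers hypothetical, taken from the post rather than from any measured game):

// Rough arithmetic behind the "offload 15% of CPU work" argument (all numbers hypothetical).
// In the fully CPU-bound case, shaving the latency-tolerant share off the local frame time
// translates more or less directly into frame rate (or frees time for other work).
#include <cstdio>

int main() {
    const double cpu_frame_ms      = 33.3;   // current CPU cost per frame (CPU bound)
    const double offloadable_share = 0.15;   // portion assumed to tolerate a server round trip

    const double local_frame_ms = cpu_frame_ms * (1.0 - offloadable_share);
    const double fps_before     = 1000.0 / cpu_frame_ms;
    const double fps_after      = 1000.0 / local_frame_ms;

    std::printf("CPU frame time: %.1f ms -> %.1f ms\n", cpu_frame_ms, local_frame_ms);
    std::printf("FPS if fully CPU bound: %.1f -> %.1f (~%.0f%% gain)\n",
                fps_before, fps_after, (fps_after / fps_before - 1.0) * 100.0);
    return 0;
}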
 
Is it really that hard to imagine? Let's say a game is significantly compute bound, like, say, the BF series. Of the compute tasks required for the game, 15% can tolerate some latency. That 15% can happen in the cloud. The 15% decrease in local compute may lead to a 5 to 30% improvement in graphical performance, depending on how bound the game was.

Sure... but how is this different from, say, any kind of dedicated server in the past?
 
Is it really that hard to imagine? Let's say a game is significantly compute bound, like, say, the BF series. Of the compute tasks required for the game, 15% can tolerate some latency. That 15% can happen in the cloud. The 15% decrease in local compute may lead to a 5 to 30% improvement in graphical performance, depending on how bound the game was.

Sounds like science fiction.
 
Sure... but how is this different from, say, any kind of dedicated server in the past?

Because it's offered as part of the platform. Developers can use it without having to worry about maintaining a dedicated server forever.
 
Sure... but how is this different from, say, any kind of dedicated server in the past?

It's different because he is not talking about hosting a multiplayer match, but about the server doing compute for tasks that are not latency sensitive.

But I'm not a believer in that tech in that regard (for realtime graphics applications, I should emphasize), for the reasons mentioned earlier in this thread. To sum up: they need a backup method for the cases where you can't reach the server at all, or not in time, which means two code paths for everything handled in the cloud so the game doesn't break. That creates a lot of extra work, and someone has to pay for what could otherwise be so conveniently computed on the client side; for a single-player affair, that doesn't make sense to me, or to some others on this forum.

Also, since you need backup methods, the things you offload to the cloud can't have gameplay implications, so that when the cloud isn't at your disposal in time, the game doesn't play differently. That reduces the value for me, but others may say any extra detail can help with immersion and that not everything beneficial needs to affect gameplay, which is a good enough reason (a valid point, but it still doesn't explain why someone would need to pay for that compute).
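
A sketch of what that "two paths" requirement can look like in practice, using a cosmetic crowd simulation as the offloaded task. All function and type names here are hypothetical, not any platform's actual API; the point is only that a cheap local fallback has to exist so the game plays identically when the cloud result is late or missing.

// Sketch of the "two paths" problem described above: anything offloaded to a server
// needs a local fallback so the game still works when the result is late or missing.
// All names are hypothetical; this is not any platform's actual API.
#include <chrono>
#include <future>
#include <vector>

struct CrowdSim { std::vector<float> positions; };

CrowdSim simulateCrowdLocally_cheap() { return CrowdSim{}; }   // low-detail local path
CrowdSim simulateCrowdOnServer_rich() { return CrowdSim{}; }   // stand-in for the remote call

class CrowdUpdater {
    std::future<CrowdSim> remote_;
public:
    CrowdSim update() {
        // Kick off (or re-issue) the cloud request; it is cosmetic only,
        // so gameplay must never depend on it arriving.
        if (!remote_.valid())
            remote_ = std::async(std::launch::async, simulateCrowdOnServer_rich);

        // Use the rich result if it has already arrived; otherwise fall back to the
        // cheap local path -- the game plays identically either way, just with less detail.
        if (remote_.wait_for(std::chrono::milliseconds(0)) == std::future_status::ready)
            return remote_.get();
        return simulateCrowdLocally_cheap();
    }
};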
 
Sounds like science fiction.

What are you talking about? All of those statements are completely reasonable and, if anything, underselling the proposition. Given the relative weakness of the CPUs, do you not believe that many games could be CPU bound?
 
What are you talking about? All of those statements are completely reasonable and, if anything, underselling the proposition. Given the relative weakness of the CPUs, do you not believe that many games could be CPU bound?

It's perfectly reasonable that games could be CPU bound; I'm just not 100% convinced that having 3x the power in the cloud, given the latency and bandwidth constraints involved (high and low respectively), is a good model for a large number of problems.
 