Server-based game augmentations. The transition to the cloud. Really possible?

Except D3's economy actually requires other players to participate. Items in the auction house don't just appear out of nowhere.



No, by my logic a game which has an economy (with human players, since that wasn't obvious to you) is not a single-player game.

Furthermore, I never said D3 is an MMO, but rather, a lite MMO - meaning that it has elements of an MMO. Did you know D3 was originally developed to be an MMO?

P.S. D3 lets your friend drop into your game, just like Borderlands.

I didn't know that, since it wasn't designed as an MMO even in the pre-planning stages.

And sure, D3 has "MMO-lite" features, just like Kingdoms of Amalur - although Kingdoms of Amalur had far more "MMO" features than D3.

Either way, this side discussion has gone on long enough. :) You obviously have your opinion, even if it isn't shared by the vast majority of people, who consider D3 nothing but a single-player game.

Regards,
SB
 
So essentially, for now, it's still basically cheaper dedicated servers for all developers. That's a good thing, quite useful - but not rainbows and magic.

The "full potential of the cloud" is just marketing-speak for the reality that everyone gets less expensive access to dedicated server farms.

Overall, dedicated servers for everyone is a good thing, but it's not the magic MS was touting. According to others here who know more, it will definitely be difficult to use for anything real-time.

But since I, like many people, enjoy the persistent elements of many games, it can only mean good things.
 

yea... no... much more than "dedicated servers"

There are going to be some very cool things coming down the pike in a while...

Basically, "Cloud deniers" will not admit it until it is up in their grill, and even then they will just say... oh well, coulda, woulda, shoulda been done by anyone, nothing special. :LOL:
 
Yes, the power of the offline cloud!

Xbox 1, The Cloud Processing Machine with offline Cloud capabilities. :LOL:

It's a good idea for MS to leverage their already existing and growing Cloud infrastructure as a selling point for the new console. It doesn't matter how the XB1 does; MS is going to be doing as much business in the Cloud as they can. I wonder what kind of Cloud Power will be used for a potential Google gaming machine? There is a company that is filthy with Cloudiness.

So whatever part of the rendering gets "clouded" is, by definition, not terribly problematic from a system-resources point of view (stuff that doesn't change much and is normally rendered using otherwise unused processor time). When the XB1 or PS4 are scrambling to find cycles later in their lifetimes I can see where this will come in handy, but wouldn't the overhead of creating a reasonably attractive failover system for the game, in case online resources are degraded or overused, add to the cost of development? Adding cost for something that isn't going to be a problem for a few years? I wonder if developer resources for handling problematic online connections are going to be proportional to the amount of "cloud transistors" used to render the game?
 

They just need a system for cloud-based resources to gracefully degrade depending on the speed, or absence, of the connection. In the case of server-side calculated global physics you just degrade to canned or very simplistic models. For NPC AI, you could either reduce the number of AI entities or go to simpler/dumber AI models.

Basically, when no online connection is available, it comes down to either not doing it, using canned/prebaked data, or greatly simplifying the calculations since they now have to be done locally (although in that case it'd be quite likely to directly impact game performance negatively).
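
Something like this, roughly (a hypothetical sketch, my own names and thresholds, purely to show the tiers):

```cpp
#include <chrono>

// Hypothetical quality tiers for a cloud-assisted simulation, ordered from
// best (full cloud) down to worst (canned/prebaked data only).
enum class SimTier { CloudFull, CloudReduced, LocalSimplified, Canned };

// Pick a tier from the measured round-trip time; no connection means
// falling all the way back to canned data.
SimTier pickTier(bool connected, std::chrono::milliseconds rtt)
{
    using namespace std::chrono;
    if (!connected)  return SimTier::Canned;        // offline: prebaked only
    if (rtt < 100ms) return SimTier::CloudFull;     // healthy link: full cloud sim
    if (rtt < 250ms) return SimTier::CloudReduced;  // degraded: fewer AI entities,
                                                    // coarser physics updates
    return SimTier::LocalSimplified;                // too slow: dumber local model
}
```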

Regards,
SB
 
They just need a system for cloud-based resources to gracefully degrade depending on the speed, or absence, of the connection.
I think that 'just' is a gross simplification. ;) You're talking about extra work to develop two solutions to the same problem. If the cloud means you don't need any light baking, then also paying artists to create baked assets for when the cloud (which you're paying for to render the lighting) fails is just adding cost for something you hope gamers won't ever even need.
 
The kind of stuff you can do with cloud computing and limited bandwidth should scale really easily. You just ship some general instance of whatever you'll be doing remotely on the disc; you're left a bit less dynamic/realistic when disconnected.
 
I think that 'just' is a gross simplification. ;) You're talking about extra work to develop two solutions to the same problem. If the cloud means you don't need any light baking, then also paying artists to create baked assets for when the cloud (which you're paying for to render the lighting) fails is just adding cost for something you hope gamers won't ever even need.

The diagnostics tooling in VS2013 and the advances in the WinRT stack in 8.1 pretty much make it easy to build apps/games that can target cloud services when the user's system has the resources, or gracefully revert to 100% local services when resources don't allow it.

I don't know what the fuss is. I'm a developer and I can easily see the advantages of offloading computation to the cloud, and it's totally up to me to make that call.

If the user is given a great experience, I'm all for that. They should never have to worry about it, only that the app/game works and looks great.

MS are making it very easy for me to build this!! I'm grateful for that!!
 

Pardon me for asking, but what kind of services and what kind of response times are we dealing with here? I am not talking about classical server-side gaming stuff (matchmaking, bots, etc.) but about rendering parts of an image that need to be updated every 16.7 milliseconds (assuming 60 fps is the NEW 30 fps :devilish: ). Is that level of "failover" baked into 8.1? Again, I understand that MS is going to help out here to one extent or another, but I "feel" that there is going to be a bit of overhead when it comes to rendering scenes while gracefully dealing with network issues.
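
For concreteness, here's a rough sketch (purely hypothetical, my own names) of the only shape I can imagine that per-frame failover taking: never block the frame on the network, use the last cloud result if it's fresh, otherwise fall back locally:

```cpp
#include <atomic>

// Hypothetical frame-level failover: the render loop never blocks on the
// network. A network thread publishes results whenever they arrive; each
// frame either consumes a fresh one or uses the local fallback.
// (Synchronization of the payload itself is elided for brevity.)
struct LightingData { /* coefficients, probe values, ... */ };

std::atomic<bool> cloudResultFresh{false};
LightingData      latestCloudResult;  // written by the network thread
LightingData      localFallback;      // simplified/prebaked stand-in

const LightingData& lightingForThisFrame()
{
    // exchange() consumes the freshness flag so a stale result isn't reused
    if (cloudResultFresh.exchange(false))
        return latestCloudResult;
    return localFallback;             // miss the window: degrade, don't stall
}
```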
 
No rendering. Physics and lighting, IIRC. This very thread contains all of the references, interviews, and statements about which kinds of compute loads make sense to offload. No point in trying to paint an inaccurate picture of the capabilities MS and the developers have clearly stated were not only possible but likely.
 
Well, I was just asking what services liquidboy was talking about. Maybe it's something new or not yet discussed. So, back to the stuff that doesn't take much time to do or can be pre-baked and placed on the hard drive. MP makes lots of sense, and it will be nice to actually get some dedicated servers for the Gold price instead of P2P gaming (still have to pay twice for Netflix ;) ). Single-player games... I'd like to see how many CPU cycles are saved on that kind of thing. Like I said, years down the road when games on these machines are fighting for every cycle it's probably a no-brainer, but right now... Lightmaps for large destructible gaming areas, maybe, or so a Respawn guy said. Sounds good, but then again, how little do we think devs can do with this new hardware?? I would rather they invest their time making use of HSA in both systems right now and less time on non-interacting physics stuff or lightmaps generated in Der Cloud.
 
Pardon me for asking, but what kind of services and what kind of response times are we dealing with here? I am not talking about classical server-side gaming stuff (matchmaking, bots, etc.) but about rendering parts of an image that need to be updated every 16.7 milliseconds (assuming 60 fps is the NEW 30 fps :devilish: ). Is that level of "failover" baked into 8.1? Again, I understand that MS is going to help out here to one extent or another, but I "feel" that there is going to be a bit of overhead when it comes to rendering scenes while gracefully dealing with network issues.

As BlakJedi has already mentioned, numerous folks in this thread have already discussed the various scenarios that would qualify to be offloaded to the cloud.

I will add this, though:

DX 11.2 and the WinRT APIs in Windows 8.1 give us a greater arsenal of APIs to control workloads based on latency (one-frame vs. several-frame workloads, etc.).

DX11.2 APIs (http://msdn.microsoft.com/en-us/library/windows/apps/bg182880.aspx), e.g. the Direct3D low-latency presentation API, the DXGI Trim API and map default buffer, multithreading with SurfaceImageSource, etc.

And the \\build\ sessions on Channel9, particularly those around DX, go into more detail on DX11.2/WinRT vNext in 8.1. There are many discussions around the latency of workloads and the tools devs can use to make the right decisions on where to offload work to gain the best experience. http://channel9.msdn.com/Events/Build/2013?sort=sequential&direction=desc&term=&t=directx

\\Build\ was an eye-opener for me. I am a XAML/DX/web developer, and what I got from the conference is that MS is giving developers more tooling to determine the cost of the code they write - CPU, memory, power, etc. They are giving us the tools to determine what our code will cost latency-wise if we leave it running locally, or if we move it to remote processing. They have made it very easy for us developers to see this via VS2013.
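
As a rough illustration of the one-frame vs. several-frame distinction (hypothetical names, not any actual API), the decision those tools help you make boils down to something like:

```cpp
#include <string>

// Hypothetical classification of workloads by latency tolerance, in the
// spirit of the one-frame vs. several-frame distinction above.
struct Workload {
    std::string name;
    int latencyBudgetFrames;   // how many frames late a result may arrive
};

bool canOffload(const Workload& w, double roundTripMs, double frameMs = 16.7)
{
    // Only workloads whose budget comfortably covers the round trip are
    // candidates for remote execution; everything else stays on the box.
    const double safetyMargin = 1.5;
    return w.latencyBudgetFrames * frameMs > roundTripMs * safetyMargin;
}

// e.g. {"hit detection", 1} stays local on a 60 ms link, while
// {"ambient lightmap refresh", 30} is a fine cloud candidate.
```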
 
Great stuff! Thanks for the answer.
 
The diagnostics tooling in VS2013 and the advances in the WinRT stack in 8.1 pretty much make it easy to build apps/games that can target cloud services when the user's system has the resources, or gracefully revert to 100% local services when resources don't allow it.
Take Elder Scrolls, for example. When targeting the cloud, you move the environment and people simulation there and do fancy stuff unattainable on the local console. That takes your devs xxx time to make, at xxx cost. If you then want the game to run offline, you have to design an offline world-simulation system at additional cost and time.

The only way the latter doesn't add to the cost is if your system is completely scalable. That'd mean either running the same world-simulation code on the local machine and, when available, in the cloud at enhanced quality, or moving workloads to the cloud to free up local resources that can be applied to more scalable aspects like image quality. Neither of those really sounds like moving technology forwards to me, though. For the cloud to be an advance, it has to be doing something not possible on the local machine, and that means a second set of algorithms for cloud on top of local computing.
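
To illustrate the 'completely scalable' case with a minimal hypothetical sketch: the same update code runs everywhere and only the budgets differ, which is exactly why it can never do anything the local box couldn't do at lower fidelity:

```cpp
// Hypothetical "one codebase, two budgets" world simulation: the same
// update function runs locally and in the cloud, differing only in the
// knobs it is handed. Fully scalable, but by construction it never does
// anything the local machine couldn't do at reduced fidelity.
struct SimBudget {
    int   maxNpcs;         // how many agents to simulate
    int   pathfindDepth;   // AI planning search depth
    float physicsHz;       // physics tick rate
};

constexpr SimBudget kLocalBudget { 50,  4,   30.0f };
constexpr SimBudget kCloudBudget { 500, 16, 120.0f };

void updateWorld(const SimBudget& budget, float dt)
{
    // ... simulate up to budget.maxNpcs agents, plan to budget.pathfindDepth,
    //     step physics at budget.physicsHz over elapsed time dt ...
    (void)budget; (void)dt;
}
```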
 
I have a feeling that a lot of AAA devs will be struggling to learn how to code exactly that kind of scalability for games next gen, regardless of whether they're using the cloud or not. A lot of publishers will want to update their engines to better support that overall.

I see a lot of long nights for engineers ahead.
 
For the cloud to be an advance, it has to be doing something not possible on the local machine, and that means a second set of algorithms for cloud on top of local computing.

Not necessarily. Look at this (old) article:
http://www.asmcommunity.net/forums/topic/game-ai-neural-networks-how-to/

Locally you could run a (pre-trained) network with only your own gameplay as input. In the cloud (with the same logic) you could use the input of all players. It is only a matter of the data source.
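
A minimal sketch of that point (a hypothetical toy network, nothing from the article): the forward pass is identical; only where the weights came from differs:

```cpp
#include <cmath>
#include <vector>

// Toy single-layer network: the forward pass is the same code regardless
// of whether the weights were trained on one player's sessions (shipped on
// disc) or retrained in the cloud on every player's data and downloaded.
struct Layer {
    std::vector<std::vector<float>> w;  // weights[out][in]
    std::vector<float>              b;  // biases, one per output
};

std::vector<float> forward(const Layer& l, const std::vector<float>& x)
{
    std::vector<float> y(l.b);          // start from the biases
    for (size_t o = 0; o < y.size(); ++o) {
        for (size_t i = 0; i < x.size(); ++i)
            y[o] += l.w[o][i] * x[i];
        y[o] = std::tanh(y[o]);         // squashing activation
    }
    return y;
}
```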
 
If your goal was unbeatable AI, that would make a lot of sense. If your goal is fun AI, that probably isn't a silver bullet.
 
The author is wrong if he thinks bullets will stop warping around corners... ever since they stopped making games purely server-authoritative with a single, simple causal timeline, bullets have been warping around corners. All the developers want to accommodate the HPBs (high-ping bastards), through client-authoritative hit detection and server-side rewinding, instead of the LPB master race.
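
For the unfamiliar, a toy sketch of the server-side rewind idea (hypothetical names, 2D positions for brevity), which is exactly the mechanism that produces shots landing "around corners":

```cpp
#include <deque>

// Toy server-side rewind: the server keeps a short history of each player's
// authoritative positions and, when a shot arrives, tests the hit against
// where the target was as the *shooter* saw it, i.e. in the past.
struct Snapshot { double time; float x, y; };

struct PlayerHistory {
    std::deque<Snapshot> snaps;         // assumed non-empty, oldest first

    // Nearest snapshot at or before the rewound time (interpolation elided).
    Snapshot at(double rewindTime) const {
        for (auto it = snaps.rbegin(); it != snaps.rend(); ++it)
            if (it->time <= rewindTime)
                return *it;
        return snaps.front();
    }
};

bool hitTest(const PlayerHistory& target, double now, double shooterLatency,
             float shotX, float shotY, float radius)
{
    const Snapshot s = target.at(now - shooterLatency);  // rewind the world
    const float dx = s.x - shotX, dy = s.y - shotY;
    return dx * dx + dy * dy <= radius * radius;         // a hit in the past
}
```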
 