Server-based game augmentations. The transition to cloud. Really possible?

No, because to send data from machine A to machine B and get a response, you end up with 2x the number of machine-to-backbone hops, and that is where almost all the latency is. Try doing a traceroute to google.com: almost all the latency is between your house and the ISP.
Connections on the backbone are, by comparison, very fast.
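To make the point above concrete, here's a toy latency-budget sketch. All the numbers are invented for illustration (real last-mile and backbone latencies vary wildly); the shape of the arithmetic is what matters: a client/cloud round trip crosses the last mile on both ends, in both directions, and that dominates the total.

```python
# Rough latency-budget sketch (illustrative numbers, not measurements).
LAST_MILE_MS = 15.0   # assumed one-way home <-> ISP latency
BACKBONE_MS = 2.0     # assumed one-way per-hop backbone latency
BACKBONE_HOPS = 8     # assumed hops between the two endpoints' ISPs

def round_trip_ms(last_mile_ms, backbone_ms, hops):
    # One way: client -> its ISP -> backbone -> server's ISP -> server.
    # The last mile is crossed twice per direction (client side + server side).
    one_way = 2 * last_mile_ms + hops * backbone_ms
    return 2 * one_way

rtt = round_trip_ms(LAST_MILE_MS, BACKBONE_MS, BACKBONE_HOPS)
last_mile_share = (4 * LAST_MILE_MS) / rtt
print(f"RTT: {rtt:.0f} ms, last-mile share: {last_mile_share:.0%}")
```

With these made-up numbers the last mile accounts for roughly two-thirds of the round trip, which is why fast backbone links don't save a cloud-assisted game from a slow home connection.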

Also, in a 1v1 game you'd want to provide a consistent experience for both players, and if one machine is acting as a "server," that player would have less latency and most likely an advantage.

Yep, it will be interesting to see how pricing changes with dedicated-server gaming as opposed to P2P gaming this gen. It will also be interesting to see how Sony reacts to dedicated-server gaming this time.
 
To be honest, I think this is about the gulf between what it might do (latency-insensitive tasks) and the reality of our modern internet infrastructure. Even the most latency-insensitive task will need to resync with the console at some point, and if that data is of any significant size then bandwidth is a concern.

The LZ stuff was a point I hadn't considered, as most game data is presumably relatively incompressible, unlike audio/visual data, which is a problem given the asymmetric nature of consumer net connections. Of course, if you had a complete copy of the game in the cloud too, then surely you could just pass references to certain objects to cut that down somewhat (i.e. Console->Server: blow up obj_a; Server->Console: send fancy_explodey_mesh).
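The reference-passing idea above can be sketched in a few lines. This is purely hypothetical plumbing (the message format, the `console_request`/`server_response` helpers, and the asset store are all made up); the point is that the upstream message names an object the server already has, so only the computed result travels back down.

```python
import json

def console_request(object_id):
    # Console -> Server: "blow up obj_a" -- a few dozen bytes, not the asset.
    return json.dumps({"event": "explode", "object": object_id}).encode()

def server_response(request_bytes, asset_store):
    # The server holds its own copy of the game data, looks up the named
    # object, does the expensive work, and returns only the result.
    req = json.loads(request_bytes)
    return asset_store[req["object"]] + b"_exploded"  # stand-in for real work

store = {"obj_a": b"fancy_explodey_mesh"}
req = console_request("obj_a")
print(len(req), "bytes upstream")   # tiny, so an asymmetric uplink is fine
print(server_response(req, store))  # only the result comes back downstream
```

The asymmetry of the connection then matters much less: the skinny uplink carries references, and the fatter downlink carries results.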

I want demos of this stuff. To get away from MS vs Sony: does anyone know of other research or demos of this tech? Or of a related real-time distributed system run over the internet (AFAIK most use bespoke networks or private LANs)?
 
> Yap, it will be interesting to see how pricing change with dedicated server gaming as opposed to P2P gaming this gen. It will also be interesting to see how Sony react to dedicated server gaming this time.

If Sony doesn't move to dedicated servers, I can imagine them losing potential customers. I have been waiting for dedicated servers for multiplayer since Tribes on PS2, and I ended up moving to Xbox for the more fleshed out online. If Sony only matches the 360 online play experience I will probably get the X1.
 
> If Sony doesn't move to dedicated servers, I can imagine them losing potential customers. I have been waiting for dedicated servers for multiplayer since Tribes on PS2, and I ended up moving to Xbox for the more fleshed out online. If Sony only matches the 360 online play experience I will probably get the X1.


Yeah, dedicated servers for every game is a pretty sweet deal in all this.
 
I can see one potential problem with the cloud even if it works under ideal circumstances:
Misleading marketing. Let's say a dev really knocks it out of the park and develops a game that makes extensive use of the cloud, but the results you see are under the absolute best conditions and don't represent what the average, or even most, gamers will experience. Could this type of demo be the new bullshot?
Anyway, probably not the correct place to discuss that aspect. Not sure where it would fit, though.
 
> If Sony doesn't move to dedicated servers, I can imagine them losing potential customers. I have been waiting for dedicated servers for multiplayer since Tribes on PS2, and I ended up moving to Xbox for the more fleshed out online. If Sony only matches the 360 online play experience I will probably get the X1.

Well, they can have their own dedicated servers for first-party titles and just hope that third parties will host their own in an effort to keep parity across platforms.
 
> I can see one potential problem with cloud even if it works under ideal circumstances.
> Misleading marketing. Let's say a dev really knocks it out the park and really develops a game that makes extensive use of the cloud, but the results you see are under the absolutely best conditions and do not represent what the average or even most gamers will experience. Could this type of demo be the new bullshot?
> Anyway probably not the correct place to discuss that aspect. Not sure where it would fit though.

I'd presume devs would only want to build thorough cloud-based experiences around a guaranteed connection speed, and assume nothing slower will work. That's if you are referring to bandwidth considerations. There will surely be marketing hyperbole, just like for every game ever made, ever.
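The "build around a guaranteed connection speed" approach above amounts to feature-gating on a measured link. A minimal sketch, where the tier names and Mbps thresholds are entirely invented for illustration:

```python
# Hypothetical cloud-feature gating: enable only the tiers whose assumed
# bandwidth requirement the measured connection meets.
FEATURE_TIERS = [
    ("cloud lighting",     1.5),  # required Mbps (assumed)
    ("cloud ai",           0.5),
    ("cloud physics sync", 3.0),
]

def enabled_features(measured_mbps):
    return [name for name, need in FEATURE_TIERS if measured_mbps >= need]

print(enabled_features(2.0))  # a 2 Mbps link gets lighting + ai, not physics
```

A dev targeting a guaranteed floor would simply design the baseline game around `enabled_features(floor)` and treat anything above it as a bonus.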
 
I've got a friend at EA who reports that the servers they run to support 360 games have to be running Windows, either to get access to an SDK put out by Microsoft or as a matter of licensing.

That gives them a strong incentive to do the same for the servers supporting their PS3 games, to avoid maintaining two disparate operating-system clusters, etc.

It'll be interesting to see how multiplatform games get supported if Xbox One games are to some extent obligated to depend on Azure.
 
> I've got a friend at EA who reports that the servers they have to support 360 games have to be running on Windows, either to get access to an SDK put out by Microsoft or as a matter of licensing.
>
> That gives them a strong incentive to do the same for their servers that are supporting PS3 games, to avoid maintaining two disparate operating system clusters and etc.
>
> It'll be interesting to see how multiplatform games get supported if the XBox One games are to some extent obligated to depend on Azure.

Would be interesting if Microsoft ends up licensing Azure to Sony.
 
I use my PS3 almost to the exclusion of my 360. I plan on buying an Xbox One because I liked the multimedia aspects of the presentation. I'm also highly intrigued by the implementation of VMs. That said, I don't believe in the cloud-power aspect. Not in the least bit...

IMHO they should have waited until E3 to present us with a game that showcases the cloud power working, and had a side-by-side reveal showing the same game without cloud power. *shrug*
 
Yes, baseless claim. You try to rip on one of the more well-connected journalistic sources as 'missing the point' and insinuate they did a shoddy job.

You continue to argue across multiple threads that everybody else is some sort of MS hater just trying to make MS look bad, while pushing ridiculous theories about secret sauce.

Yeah, Digital Foundry is biased against MS, as is all of B3D. You're just the lone white knight, sticking up for MS. Without actual facts, btw.

The fact of the matter is, their article was well thought out and fair based on the hand-wavium MS has promised with regards to this nebulous 'cloud service' that will somehow make their console 'more powerful' than the sum of its parts.

Point out, using factual statements backed up by legitimate sources, where DF's article has been unfair/biased towards MS.

Or, people don't like the tidy narrative that the PS4 is gonna pwn the Xbox because it's so much more powerful suddenly being threatened by something they never expected? (And for the record, I'm not sold on the cloud, but unlike the rest I kinda doubt MS is just lying and being evil to Sony's white knight like everybody wants to portray.)

I agree they need to show something. Evidently Forza 5 is already being touted as using the cloud, so there's a concrete start. http://news.softpedia.com/news/Forz...and-Its-quot-Infinite-Power-quot-356329.shtml

Forza Motorsport 5, the next installment in the racing series and one of the launch day games for the Xbox One, will take advantage of the "infinite power" of the next-gen console's cloud system to deliver a bold experience.

I doubt MS doesn't have fairly concrete plans for how to implement this tech already.
 
The more I read of DF's article, the more it seems misguided. They focus on bandwidth an awful lot early in the article, as if MS were trying to replace internal buses with your internet connection. That's incredibly misleading, especially at the forefront of an article like that.

It takes them forever to even get to stuff MS *actually* talked about. It's also odd that they try to suggest the X1's hardware wasn't designed for the cloud just because it "only" has LZ and JPEG compress/decompress in the DMEs. Err... what more should it be expected to have in hardware for a software-based concept where the entire point is to send hardware-based computations outside the box? Hardware to help send data out and in, and... what else is needed?

Probably half the article is written in a dismissive tone in the context of discussing the limitations of cloud streaming, which has little if anything to do with what MS is talking about. That's Sony/Gaikai's problem... not MS's. When they do talk about the claims MS actually made, they frame them as if MS were "admitting" to something, which is silly. When they talk about the specifics of latency-insensitive cloud candidates they are also dismissive in tone, lumping a bunch of pretty heavy computation tasks into vague categories and acting like they are highly limited in nature without establishing why.

They also talk about the examples MS gave and then suggest MS is somehow light on specific ideas, when those examples came during interviews with other publications. MS isn't going to give you exhaustive descriptions of every damn thing that is latency insensitive in a modern videogame. That's like accusing Bethesda of being 'light on ideas' simply because they don't talk up every single aspect of Skyrim in the very first interview on the game.

The most problematic part is that they seem to have missed the ENTIRE point, which is to use the cloud to free up local resources. DF doesn't just ignore this; they dismiss stuff like the lighting possibilities Matt Booty from MS mentioned simply because it could be done locally. Ummmm... that's the point! Moving stuff that is latency insensitive to the cloud frees up resources and gets out of the way of the latency-sensitive stuff. *sigh*

You do realize that DF actually says the opposite? It's less bandwidth-intensive to go the OnLive/Gaikai route because it's much easier and more efficient to compress an image.

Raw data you can't. So it is precisely MS's problem.

And DF doesn't miss it. The only resources it can free up are premade effects, effects that can EASILY be done on a loading screen. So besides shortening loading times, it won't be much use.
 
> you do realize that DF actually says opposite. Its bandwidth less intensive to go Onlive/Gaikai method cause it is much easier/effective/efficient to compress image.
>
> Raw data you cant. So it is precisely MS's problem.
>
> And DF doesnt miss it. The only resource it cant free up is premade effects. Effects that can EASILY be done on loading screen. So besides shortening loading time. Wont be much use.

Raw data can be compressed, I guess.
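Whether raw data compresses well depends entirely on its entropy, which you can see with a quick sketch using Python's built-in `zlib` (an LZ-family codec, like the hardware the X1's DMEs reportedly have). The "game state" payload below is made up for illustration: repetitive, structured data squeezes dramatically, while high-entropy bytes (standing in for an already-compressed image) barely shrink.

```python
import os
import zlib

# Repetitive, structured "game state": repeated entity IDs and grid-snapped
# positions -- the kind of raw data LZ compression eats for breakfast.
structured = b"".join(b"entity:%04d pos:100,200,300 " % (i % 32) for i in range(1000))

# High-entropy bytes stand in for an already-compressed image payload.
high_entropy = os.urandom(len(structured))

for label, blob in [("structured", structured), ("high-entropy", high_entropy)]:
    packed = zlib.compress(blob, 9)
    print(f"{label}: {len(blob)} -> {len(packed)} bytes "
          f"({len(packed) / len(blob):.0%})")
```

So both sides of the argument have a point: a video frame re-compresses poorly but starts from a codec built for lossy images, while "raw" game data can be anywhere from nearly free to nearly incompressible depending on its structure.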
 
> The more I read of DF's article the more it seems misguided. They focus on bandwidth an awful lot early in the article as if MS is trying to replace internal buses with your internet connection. That's incredibly misleading, especially to make such suggestions at the forefront of an article like that. [...]
>
> The most problematic part is that they seem to have missed the ENTIRE point, which is to use the cloud to free up local resources. DF doesn't just ignore this, they dismiss stuff like the lighting possibilities Matt Booty from MS mentioned simply because it could be done locally. Ummmm...that's the point! Moving stuff that is latency insensitive to the cloud to free up resources and get outta the way of the latency sensitive stuff. *sigh*

Another thing: I suggest you re-read that article. You are factually incorrect in how you describe DF's article. Not to mention heavily Xbox-biased...
 
> It will depend on the raw data.

Um... of course. Even images in the form of raw data can tolerate losses in this case. And obviously, if the cloud is generating textures it can allow losses; in many applications loss is not important, and in some it is.

Your point? I don't feel like going over every case scenario, I know. I like staying in context and simplifying. Regardless, do you disagree that it won't be as big of an issue with OnLive and Gaikai?
 
> you do realize that DF actually says opposite. Its bandwidth less intensive to go Onlive/Gaikai method cause it is much easier/effective/efficient to compress image.

Did you read my post for you on TXB on the issue? If not, I recommend it. I know what DF said about bandwidth, but as I pointed out, the first several thousand words on the issue were totally misleading in the context of MS's plans, as they dealt with concerns about game streaming (which is totally different from MS's plans). You can see it being posted on GAF already, where everyone totally misunderstands what DF actually concludes because they didn't read the whole thing. That's unfortunate on DF's part.

> Raw data you cant. So it is precisely MS's problem.

The data being sent would most likely be extremely small in the first place. We aren't talking about HD textures or huge batches of geometry.

> And DF doesnt miss it. The only resource it cant free up is premade effects. Effects that can EASILY be done on loading screen. So besides shortening loading time. Wont be much use.

This paragraph literally doesn't make sense. *ANY* resources that would normally be used for latency-insensitive computations can (in theory) be moved to the cloud and out of the way, letting the local hardware be vastly more efficient than it otherwise would have been with those operations clogging up the processing.

That is THE point with the cloud. That is the mechanism by which MS contends the local box becomes effectively more powerful over time as devs get more progressive in utilizing those remote resources. And DF didn't even really touch on that.
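The offloading argument above boils down to a simple scheduling split: tag each per-frame task as latency-sensitive or not, keep the sensitive ones local, and ship the rest to the cloud. The task names and millisecond costs below are entirely made up for illustration; the point is just how the local frame budget opens up once the insensitive work leaves the box.

```python
LOCAL_BUDGET_MS = 16.6  # one frame at 60 fps

# (name, assumed cost in ms, latency-sensitive?) -- all values invented
tasks = [
    ("input/physics",    4.0, True),   # must finish this frame
    ("rendering",        8.0, True),
    ("global lighting",  6.0, False),  # tolerates seconds of delay
    ("ai pathfinding",   3.0, False),
    ("world simulation", 5.0, False),
]

local = [(n, c) for n, c, sensitive in tasks if sensitive]
cloud = [(n, c) for n, c, sensitive in tasks if not sensitive]

local_ms = sum(c for _, c in local)
print(f"local frame cost: {local_ms} ms of {LOCAL_BUDGET_MS} ms budget")
print("offloaded to cloud:", [n for n, _ in cloud])
```

With these invented numbers the frame wouldn't fit locally at all (26 ms of work), but after the split the box only does 12 ms and has headroom left, which is exactly the "effectively more powerful" claim, dependent, of course, on those offloaded tasks genuinely tolerating network latency.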
 
> The data being sent would already be extremely small in the first place most likely. We aren't talking about HD textures or huge batches of geometry most likely.

What kind of data are we talking about, for what operations, and what's the average size that would be sent?
 