Server-based game augmentations. The transition to cloud. Really possible?

What it says is that cloud services are limited to Gold users, because it's essentially multiplayer games that take advantage of the cloud. Which makes me wonder: was there any word on how Forza 5 would work without Xbox Live Gold? Do they just disable the cloud services/features for gamers without Gold?

I'm not sure that a "drivatar" system actually requires cloud compute:
- a console measures a player's driving on a test track, analyses the driving and uploads the resulting "AI profile" to cloud storage.
- other players download that profile, use it, and upload the result.
- the original console downloads the owner's profile/results, updates the profile and re-uploads it.

I don't think there's any requirement for cloud compute for Forza, just cloud storage (and cloud storage is free).
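The storage-only loop described above can be sketched in a few lines. This is a toy illustration, not Forza's actual system: the "cloud" is just an in-memory dict standing in for blob storage, and the profile format and function names are invented.

```python
# Minimal sketch of a storage-only drivatar loop: no server-side compute,
# just upload/download of profiles and race results.

cloud = {}  # stand-in for cloud blob storage: player id -> profile + results

def upload_profile(player_id, profile):
    """Console analyses local laps, then pushes the resulting AI profile."""
    cloud[player_id] = {"profile": profile, "results": []}

def race_against(player_id, lap_time):
    """Another console downloads the profile, races it, uploads the result."""
    entry = cloud[player_id]
    entry["results"].append(lap_time)
    return entry["profile"]

def sync_owner(player_id):
    """Owner's console pulls results, updates the profile, re-uploads."""
    entry = cloud[player_id]
    if entry["results"]:
        # toy "update": nudge aggression toward a 90s target lap time
        avg = sum(entry["results"]) / len(entry["results"])
        entry["profile"]["aggression"] += 0.01 * (90.0 - avg)
    entry["results"] = []
    return entry["profile"]

upload_profile("alice", {"aggression": 0.5, "line": "tight"})
race_against("alice", 88.2)
race_against("alice", 91.0)
print(sync_owner("alice"))
```

All three steps are pure reads and writes against storage; the only compute happens on whichever console is doing the analysis or the racing at the time.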
 
I'm not sure that a "drivatar" system actually requires cloud compute:
- a console measures a player's driving on a test track, analyses the driving and uploads the resulting "AI profile" to cloud storage.
- other players download that profile, use it, and upload the result.
- the original console downloads the owner's profile/results, updates the profile and re-uploads it.

I don't think there's any requirement for cloud compute for Forza, just cloud storage (and cloud storage is free).

Seems pretty obvious to me. Replace "a console" and the computing resources that would be required on that device with "cloud compute" and cloud resources.
 
I saw this nice video from Unite about cluster rendering.


Pete Moss, Unity's field engineer in simulation and visualization, talks about cluster rendering. Cluster rendering is a way to sync multiple players on a network. The networking uses a client/server architecture and leverages work on open source code.


I know there have been two or three racing games on Xbox 360 & PS3 that let you hook up multiple consoles to run your game on multiple monitors this generation, but I hope it's used more next gen, and maybe even lets a PC or extra hardware help with rendering.
 
I know there have been two or three racing games on Xbox 360 & PS3 that let you hook up multiple consoles to run your game on multiple monitors this generation, but I hope it's used more next gen, and maybe even lets a PC or extra hardware help with rendering.

Really? I thought there was only that bespoke Sony demo for GT5 which never shipped to consumers.
 
How can an upscale algorithm perform AA?

I can only think of ppAA as a step after upscaling. On the other hand, I can remember that some PS3 games with a low-resolution transparency buffer used MSAA to smooth it out a bit (KZ2, iirc).

My question is: is it smarter to a) upscale+AA or b) AA + upscaling afterwards?

Would it have an effect, and be smart, if one used AA + upscale + AA?

Say, use MSAA at the low resolution for max quality (and cheaper due to the lower number of pixels) -> upscale -> ppAA at the end for a maximally de-jaggied end result.
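A rough sense of why MSAA at the render resolution is cheaper can come from just counting samples. This is only back-of-the-envelope arithmetic (real costs depend heavily on the GPU and the AA technique), with "samples" as a crude proxy for shading/resolve work:

```python
# Compare sample counts for the suggested pipeline (MSAA at low res,
# upscale, then a post-process AA pass at output res) against doing
# MSAA at native resolution.

def msaa_samples(width, height, samples):
    # resolve cost scales roughly with the per-pixel sample count
    return width * height * samples

render = (1280, 720)    # low-res render target
output = (1920, 1080)   # display resolution

low_res_4x = msaa_samples(*render, 4)   # 4xMSAA at 720p
native_4x = msaa_samples(*output, 4)    # 4xMSAA at 1080p
ppaa_pass = output[0] * output[1]       # one full-screen ppAA pass

print(low_res_4x + ppaa_pass)  # MSAA@720p + upscale + ppAA@1080p
print(native_4x)               # 4xMSAA at native 1080p
```

Even after adding a full-screen ppAA pass at 1080p, the low-res route touches fewer samples than native 4xMSAA, which is the intuition behind doing the expensive AA before the upscale.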
 
Really? I thought there was only that bespoke Sony demo for GT5 which never shipped to consumers.

No, it's for all GT5 owners; just take a look in the menu.

[attached image: GT5 menu screenshot]




 
I thought that originally, but if you read the article, the 'analysis' task is actually rather trivial.

http://research.microsoft.com/en-us/projects/drivatar/forza.aspx

They certainly *could* use cloud compute, but it doesn't seem to be necessary.

The analysis is anything but trivial. If I recall, they are using a forest-of-trees analysis, which is a form of machine learning. Those are quite computationally demanding on the back-end. However, the output could be a very lightweight AI algorithm for each drivatar. The great thing about a forest approach is that you could have a different AI for each car and race mode for every drivatar, if you had enough data.
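To make the "demanding back-end, lightweight output" point concrete, here is a toy ensemble of decision stumps in the spirit of a forest of trees. Everything here is invented for illustration (the feature, the data, the training rule); the real Drivatar pipeline is far more involved:

```python
import random

# Toy "forest": train depth-1 decision trees (stumps) on bootstrapped
# telemetry samples and combine them by majority vote. Training touches
# the data many times, but the trained model is just a list of thresholds.

def train_stump(data):
    """Pick the threshold on the single feature that best splits the labels."""
    best = (data[0][0], -1)
    for x, _ in data:
        thr = x
        correct = sum((xx >= thr) == lbl for xx, lbl in data)
        if correct > best[1]:
            best = (thr, correct)
    return best[0]

def train_forest(data, n_trees=15, rng=None):
    rng = rng or random.Random(0)
    forest = []
    for _ in range(n_trees):
        boot = [rng.choice(data) for _ in data]  # bootstrap sample
        forest.append(train_stump(boot))
    return forest

def predict(forest, x):
    votes = sum(x >= thr for thr in forest)
    return votes * 2 > len(forest)  # majority vote

# fabricated data: corner-entry speed -> does this driver brake late?
data = [(0.2, False), (0.3, False), (0.4, False),
        (0.7, True), (0.8, True), (0.9, True)]
forest = train_forest(data)
print(predict(forest, 0.85), predict(forest, 0.25))
```

The asymmetry is the point: training loops over bootstrapped data per tree (the server-side cost), while the shipped "AI" is just a handful of thresholds a console can evaluate per frame.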
 
Really? I thought there was only that bespoke Sony demo for GT5 which never shipped to consumers.

This feature was already in GT3, which used the system's FireWire port for networking back then. That was pretty far ahead of its time, I think. I actually think GT5 didn't have it included day one, but it was patched in later. Forza started supporting it in the last release too, I think.
 
No it's for all GT5 owners just take a look in the menu.

This feature was already in GT3, which used the system's FireWire port for networking back then. That was pretty far ahead of its time, I think. I actually think GT5 didn't have it included day one, but it was patched in later. Forza started supporting it in the last release too, I think.

RAGE! How did I miss this? Hmmm, whether to cancel my PS4 preorder and just buy a few PS3s + TVs... decisions, decisions. :D
 
Moving locally hosted multiplayer to servers, whether distributed across servers in a cloud configuration or with dedicated boxes per game, is online hosting, not cloud computing augmenting consoles.

Feel free to start a discussion on what dedicated-server and online-multiplayer advantages/changes there are; I consider that a discrete topic. Everything you have talked about can be achieved with static online servers, so it isn't unique to cloud computing. Cloud computing can offer a cost advantage for such online servers, but that's a topic of server economics and not cloud compute.

Cloud computing isn't limited to single-player games. You could have, for example, a server computing GI lighting for a dynamically deformed world and sending that lighting info to each player. However, if you're doing that server side, one may as well shift the whole game computation to the server, at which point it becomes server-based gaming and not cloud computation. ;)
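The GI-on-server idea can be sketched very crudely: the server holds the authoritative deformed world, recomputes a light grid when geometry changes, and ships only the light values to clients. The world format and the single-bounce "GI" here are invented stand-ins for illustration:

```python
# Server-side lighting sketch: a 1D grid of cells, where each open cell
# picks up one diffuse "bounce" from its open neighbours, and occluded
# cells block light. Clients receive only the resulting light values.

def compute_lighting(emissive, occluder):
    """One bounce: each open cell adds half the average of open neighbours."""
    n = len(emissive)
    lit = emissive[:]
    for i in range(n):
        if occluder[i]:
            lit[i] = 0.0
            continue
        neighbours = [emissive[j] for j in (i - 1, i + 1)
                      if 0 <= j < n and not occluder[j]]
        if neighbours:
            lit[i] += 0.5 * sum(neighbours) / len(neighbours)
    return lit

def on_world_deformed(emissive, occluder, clients):
    """Server hook: recompute lighting, then broadcast it to every client."""
    lightmap = compute_lighting(emissive, occluder)
    for client in clients:
        client.append(lightmap)  # stand-in for a network send
    return lightmap

clients = [[], []]
lightmap = on_world_deformed([1.0, 0.0, 0.0, 0.0],
                             [False, False, True, False], clients)
print(lightmap)
```

The appeal is the bandwidth shape: only the lightmap crosses the wire, while the expensive recompute stays on the server and runs once per deformation rather than once per player.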

If the game industry really wants to tackle CPU-bound issues using remote servers, a cheaper and more pervasive solution is to tap our own PCs and Macs on the same LAN. Plus, I reckon the performance may be better than a remote solution, given the same CPU spec, so they don't have to shoot for high-end CPUs.
 
Interesting proposition, although the variety of local compute resources, plus the fact that someone else might be using the PC, probably makes that infeasible. Remote computing on servers works precisely because you know it's there and available.
 
If they make an effort to market and plan for it, it should be OK. The users naturally need to be aware of the dependency. Security may be a bigger concern, though. Start with something small.

Remote servers need to be planned out too. Initially, they may not be available in all countries. Scalability will also be an issue.
 
If they make an effort to market and plan for it, it should be OK. The users naturally need to be aware of the dependency. Security may be a bigger concern, though. Start with something small.

Remote servers need to be planned out too. Initially, they may not be available in all countries. Scalability will also be an issue.

More than ever in this era, the second system in a console-owning home is likely to be running a really old OS (XP or Vista) and be pretty modest in resources. General consumer interest in PCs has fallen off a cliff now that simpler tablets or phones give them what they want with less overhead (generally internet + social media). For all of the cloud's challenges with latency, they pale in comparison with the issues on most home LANs.
 
What about a Mac? Don't PC and Mac vendors want to increase their sales, especially of gaming PCs, or even Steamboxes, daughter cards, or NAS boxes?

If the developers are going to use a remote compute server, they will already need to handle network and server failures (or absence in some countries).

It may not be straightforward but it sounds like we will have more and more compute devices all over the place moving forward.
 
What about a Mac? Don't PC and Mac vendors want to increase their sales, especially of gaming PCs, or even Steamboxes, daughter cards, or NAS boxes?

If the developers are going to use a remote compute server, they will already need to handle network and server failures (or absence in some countries).

It may not be straightforward but it sounds like we will have more and more compute devices all over the place moving forward.


Oh, they want to increase their sales all right, but the dark truth of why so many PCs suck at gaming is that they were bought for the internet and because you have to put iTunes on something. That's why, when you go to a relative's and are horrified by the crappy performance and 600 search bars, they give that baffled look that says "I thought it was meant to be this way". Most consumers have come to associate PCs with hassle and slowness rather than with the awesome, flexible computing beasts they are. When the relatively crappy ARM platforms came along with their clean UIs and ease of use, these consumers fled in droves; just look at year-on-year consumer notebook and desktop numbers.
 
If the game industry really want to tackle CPU bound issues using remote servers, a cheaper and more pervasive solution is to tap on our own PCs and Macs on the same LAN. Plus I reckon the performance may be better than a remote solution, given the same CPU spec. So they don't have to shoot for high end CPU.

I dunno, people are scared to even use Steam on PC for games, something which is relatively simple and automatic. Do you really think they would want to deal with installing software to have LAN compute support? I just find that very hard to believe. Further, do you think companies would really want to support that end-user nightmare? It's a far better idea to just deal with a single cloud vendor; it keeps the variables down and support far simpler.


What about a Mac?

The analytics of my websites tell me that while iOS keeps increasing, OS X is relatively dead at a mere 6%. I don't think developers would bother supporting Macs with such low numbers.
 
I dunno, people are scared to even use Steam on PC for games, something which is relatively simple and automatic. Do you really think they would want to deal with installing software to have LAN compute support? I just find that very hard to believe. Further, do you think companies would really want to support that end-user nightmare? It's a far better idea to just deal with a single cloud vendor; it keeps the variables down and support far simpler.

Yeah, given the range of computers out there, the number of family PCs that are clogged up with CPU-hogging crap, and the fact that someone in the house could finish looking for shit on eBay and then turn the PC off, it would quickly turn into a nightmare of epic proportions.

Best way to make use of PC compute is probably just to game on PC ...

The analytics of my websites tell me that while iOS keeps increasing, OS X is relatively dead at a mere 6%. I don't think developers would bother supporting Macs with such low numbers.

Yeah but your figures may not be representative. You deal in soft porn right? Mac owners are more likely to be masturbating over pictures of themselves taken at Starbucks ... [rimshot]
 