How effective would an Internet server be for a complex game?

Shifty Geezer

The idea of offloading processor-intensive tasks like physics to a central server, and then communicating the results to a device, sounds nice, but how effective could it really be? Say for example there's an FPS with 20 players, 100 AI bots and full physics running on the PSP, with a server processing all the gameplay. Surely the bandwidth needed for large scene updates is going to be above that currently available to most homes, and isn't lag going to mess things up? If someone's connection is 40 ms slower than another player's, won't they be getting the updates at a delay? And won't there anyway be a lag between input and output, as the actions of the user have to be uploaded to the server, processed, and downloaded?

How much real work can be taken away from the consoles and moved to the server without things getting messed up?
 
I can't see it working at all, to be honest. The latency would kill it for interactive purposes. If you shoot a barrel in a game similar to HL2, for instance, you'd have to wait for the roundtrip+calculation time before you saw it start to move... unless you had significant client-side prediction, which would make offloading the physics to a remote server completely pointless, since you're calculating them anyway...
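To put that in concrete terms, here's a toy sketch of what client-side prediction looks like (all the names and timings here are made up for illustration, not from any real engine). The point is that hiding the roundtrip means running the same physics step locally, so the server isn't relieving the client of anything:

Code:
# assumed numbers, not measurements
ROUNDTRIP_MS = 100        # shot -> server physics -> result back
TICK_MS = 16              # ~60 Hz client tick

def simulate_step(state):
    # stand-in physics: the barrel integrates its velocity
    pos, vel = state
    return (pos + vel * (TICK_MS / 1000.0), vel)

def client_frame(state, server_result):
    # predict locally so the barrel moves the instant it's shot...
    state = simulate_step(state)
    # ...then snap to the authoritative answer when it arrives,
    # ROUNDTRIP_MS / TICK_MS (~6) ticks later
    return server_result if server_result is not None else state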

It might work for slow-paced games (an MMORPG, for example, might work ok - they already do a fairly large amount of server-side processing anyway, to reduce the amount of cheating that's possible), or especially turn-based games, but anything real-time and demanding fast reflexes just wouldn't work, because of latency issues.
 
I think it'll actually be more realistic, as characters run on a remote server won't look superhuman with the sub-1/60-second reactions often seen in FPS games and fighting games. You know how a player character in Unreal Tournament skates around an arena and jumps ten times in succession without slowdown. A real human doesn't move like that.

It'll be like asynchronously watching, from your client, a realtime movie happening on a remote server where all the physics are resolved, occasionally clicking your controller to give your characters orders. It may even be a smoother experience, since it requires no synchronization or synthesis of client-side processed results on the server side, except for key inputs.
 
On topic: It might work in situations where the latency to the server is less than, shall we say, 25 ms? Twice that and you will notice that there is something wrong and that the world is lagging. This really limits the uses that server-side offload would have, not to mention that it would be expensive to keep such servers, since they would need a lot of processing power. By the time this was implemented in servers, most people would be using dual-core processors and probably wouldn't benefit from offloaded physics over the internet. As arhra stated, it may work for slower-paced games, but would it really be worth the effort? And the latencies across the Internet are unlikely to change, since it mostly runs on fiber connections and there's currently no technology that can surpass the speed of light (that I know of... I know some are doing experiments, though).
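For a rough sense of the numbers (all of these figures are assumed for illustration, apart from the 25 ms threshold above):

Code:
ONE_WAY_MS = 20           # client -> server, assumed
SERVER_STEP_MS = 5        # physics step on the server, assumed
FRAME_MS = 1000 / 60      # one frame at 60 Hz

perceived = ONE_WAY_MS + SERVER_STEP_MS + ONE_WAY_MS
print(f"{perceived} ms, ~{perceived / FRAME_MS:.1f} frames behind")
# -> 45 ms, ~2.7 frames behind, before any jitter or queuing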

Off topic:
one said:
I think it'll actually be more realistic, as characters run on a remote server won't look superhuman with the sub-1/60-second reactions often seen in FPS games and fighting games. You know how a player character in Unreal Tournament skates around an arena and jumps ten times in succession without slowdown. A real human doesn't move like that.

It'll be like asynchronously watching, from your client, a realtime movie happening on a remote server where all the physics are resolved, occasionally clicking your controller to give your characters orders. It may even be a smoother experience, since it requires no synchronization or synthesis of client-side processed results on the server side, except for key inputs.
You really need to get a look at good players. I've seen plenty of sick jumps in UT, and in the game I mainly play, Unreal 2 XMP, you are constantly jumping, since you move a lot faster that way. Here's a page with a couple of trick videos; the xmptv clips are from clan battles, but please don't abuse their bandwidth, since the clips are quite large, >50 MB.

OR you meant that current games are too fast and that people don't move that fast in real life. Well, then you've got plenty of games (even first-person shooters) that are slow enough to let 70-year-olds compete with teenagers. However, the slower the game, the more 'dead time' there is where nothing interesting is happening, and many people will find that somewhat boring, since you may spend two minutes getting to the action only to have it last 5 seconds (realistically, you often die from one or two shots) until you get shot, and then you have to respawn and do it all over again. I personally prefer to have action almost nonstop and enjoy slugging it out with people every 30 seconds instead, with an amount of action that often equals, or at least comes close to equalling, the time spent getting into the action. So it's just a matter of personal preference. If a game seems too fast for you, it's often just a matter of getting familiar with the speed; even though you may not have the twitch reflexes of teenagers, thinking ahead often makes you equal to people that rely only on fast reflexes.
 
I think the solution is to make it mandatory to be drunk while playing. Latency problems don't matter so much when the player is 'lagging' as well!
 
One of the biggest problems in PC graphics and computation is latency and bandwidth, right?

Now imagine that your silicon is MILES away down a thin cable. See the problem? Like a five-mile PCI-Express bus. That's a huge step backwards, not forwards.

I realize you're not talking about a GPU, but still...
 
I don't think the lag issue is so clear cut. All of the scenarios presented so far make the assumption that maximum processing is demanded at the onset of an "event" (thus the lag effect would make the use of networked resources impractical). This [that the moment of greatest processing demand is at the onset] may or may not be true. It's just as possible that the computational flare-up ensues as the event unfolds, rather than just at the initiation. Presumably, the initial event will be handled promptly by local resources. By that time, the network resources will be coming to bear to handle the collateral and chained physics events that come as a result of the initial event.

For example, maybe the exploding barrel is something that local resources could handle easily. It's the development of a realistic fireball, and then the expanding pressure wave affecting other nearby things or the heat wave igniting other nearby things, that could be computationally accelerated by network resources. This "secondary" event is of a nature such that lag effects are maskable and manageable, while still running in a credible manner. Essentially, who's gonna notice a few missing ms here and there while beholding a peripheral visual event? ;)

In a similar manner, the use of local vs. networked resources may be determined by the nature of the event: smaller but very abrupt vs. much larger in scale but not rigidly time-dependent. The smaller event can be calculated effectively on local resources, as long as it doesn't get "too intensive". The larger event is of a grandiose scale and requires a considerable computational load to accommodate properly, hence it is more suited to networked resources. Think of the intense demands of a global weather simulation. No way in hell a single processor core will handle that in a reasonable human lifetime. However, a cluster of networked resources will handle it nicely. Given the scale of the simulation, no one is going to care if it happens a few ms earlier or later. I'm not saying a fireball simulation is the same as a global weather simulation. However, the potential for scale is very similar.
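A crude sketch of that split (everything here is invented for illustration): abrupt, gameplay-critical events stay local, while large, slow-developing ones go out to the network, where a few frames of delay are maskable:

Code:
from dataclasses import dataclass

@dataclass
class PhysicsEvent:
    name: str
    deadline_ms: float    # how soon the result must be on screen

def dispatch(event, run_local, submit_remote):
    if event.deadline_ms < 50:
        run_local(event)       # e.g. the barrel's initial impulse
    else:
        submit_remote(event)   # e.g. the fireball and pressure wave

dispatch(PhysicsEvent("barrel impulse", 16), print, print)
dispatch(PhysicsEvent("pressure wave", 250), print, print)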
 
What if the server calculated, say, 10 frames' worth of physics results? Say it also calculated every possible result from every movement that can occur in those next 10 frames. This would increase the size of the download but hide the latency. Things don't move that much in ten frames. Maybe the client only calculates the effects of changes to the simulation the server cranked out, as opposed to the entire sim. Another way is the server could only calculate physics for things the player cannot affect in the next 10 frames. There are lots of things the player will never affect, like the sky. A really complex set of data about the sky could be generated on the server to allow the client to render a more realistic sky with complex thermodynamics and weather patterns.
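As a sketch of that idea (names invented; note the catch that the branch count grows as inputs**frames, so you could only ever speculate over a tiny input set):

Code:
from itertools import product

def server_precompute(state, step, inputs, n_frames):
    # simulate every possible input sequence for the next n_frames
    branches = {}
    for seq in product(inputs, repeat=n_frames):
        s = state
        for inp in seq:
            s = step(s, inp)
        branches[seq] = s
    return branches

def client_apply(branches, actual_inputs):
    # the client hides latency by picking the branch that matches
    # what the player actually pressed
    return branches[tuple(actual_inputs)]

step = lambda s, inp: s + len(inp)                    # toy "physics"
b = server_precompute(0, step, ("idle", "jump"), 4)   # 2**4 branches
print(client_apply(b, ["jump", "idle", "idle", "jump"]))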

The problem is that the average internet user's connection barely keeps up with the requirements of today's games; as things get better, though, this could be useful.
 
I can describe it in three words: large shared databases. If you have an MMORPG that simulates an entire world of AI, it is less efficient to have each client maintain a complete copy of the entire world AND run all the AI routines for the entire world. AI offloading is definitely possible, not only because a lot of it can run quasi-real-time (latency won't matter so much), but because it's actually more network efficient to centralize it.
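A toy illustration of the network-efficiency point (all names invented): the server owns the whole NPC database and each client only receives the NPCs near it, instead of every client simulating the full world:

Code:
def tick_world(npcs, clients, radius=100.0):
    for npc in npcs:
        npc["x"] += npc["vx"]          # server runs all the AI/movement
    # interest management: ship each client only nearby NPCs
    return {cid: [n for n in npcs if abs(n["x"] - cx) < radius]
            for cid, cx in clients.items()}

npcs = [{"id": i, "x": i * 40.0, "vx": 1.0} for i in range(10)]
print(tick_world(npcs, {"alice": 0.0, "bob": 300.0}))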
 
randycat99 said:
Essentially, who's gonna notice a few missing ms here and there while beholding a peripheral visual event? ;) [...] Given the scale of the simulation, no one is going to care if it happens a few ms earlier or later.

You might as well just fake the whole thing, most people aren't going to notice or care about a global weather simulation, especially if it's not directly gameplay affecting.
 
You are taking that example too literally.

The strategy is you don't tie network resources to response-time type of events. What you can tie them to is large-scale/environmentally-affecting events that are more reliant on depiction realism, rather than response-time.
 
DemoCoder said:
I can describe it in three words: large shared databases. If you have an MMORPG that simulates an entire world of AI, it is less efficient to have each client maintain a complete copy of the entire world AND run all the AI routines for the entire world. AI offloading is definitely possible, not only because a lot of it can run quasi-real-time (latency won't matter so much), but because it's actually more network efficient to centralize it.

I agree, A.I. would be an even better candidate for server resources, especially because really good A.I. requires huge data sets: NPCs need to remember stuff, know things, and have tons of traits to be believable. I can just imagine the complexity you could create with terabytes of storage, nice read times, and tons of memory.

The bad part about A.I. is that most devs don't even use a fraction of the capabilities currently available. A.I. can be the most time-consuming and expensive part of game design. You would need to generate millions of lines of text and organize them, and text is just one of the time-consuming things you would need to do. Yet if you could generate gigabytes of useful A.I. data, it would likely provide a better user experience than what is available currently.
 
randycat99 said:
You are taking that example too literally.

The strategy is you don't tie network resources to response-time type of events. What you can tie them to is large-scale/environmentally-affecting events that are more reliant on depiction realism, rather than response-time.

The problem is large scale *slow* event simulations (weather, seasons, terrain changes, etc) don't tend to need realistic or accurate simulation for a game; and large scale *fast* event simulations (explosions, large numbers of physics objects interacting, etc) will tend to produce so much data as to overwhelm your network connection anyway.

You might get away with it if you want to disassociate the effect completely from gameplay, in which case, you might as well just fake the whole thing (just play a canned animation of the walls collapsing or the shockwave or whatever on a distant building instead of trying to contact an internet server to run the simulation).

Of course this is just IMHO. Maybe someone out there will make it work for something.
 
DemoCoder said:
I can describe it in three words: large shared databases. If you have an MMORPG that simulates an entire world of AI, it is less efficient to have each client maintain a complete copy of the entire world AND run all the AI routines for the entire world. AI offloading is definitely possible, not only because a lot of it can run quasi-real-time (latency won't matter so much), but because it's actually more network efficient to centralize it.

Agree. MMORPGs already do this today.
 
In the context of the discussion, I'm thinking more about a future of 'dumb terminal' devices that network. Imagine something like a mobile phone with negligible processing power, which is there to drive a display and IO. Then run something like HS at the army scene. All the game processing would need to be done by a server, without lag...

In randycat's example, I think lag would be noticeable. If you have lag of 50 ms (which seems low for most internet connections), that's 3 frames between the barrel exploding and the surroundings getting moved. That might go unnoticed. If lag gets up to 100 ms, that's 6 frames, 1/10th of a second, and people will definitely see that the explosion effects come after the explosion.

Sony and IBM have talked of offloading intensive tasks to a central server, but I'm sceptical. Large-scale gradual effects like weather and AI could be done, but the meat of the game, which is what both have been referring to, seems implausible to me.
 
aaaaa00 said:
You might get away with it if you want to disassociate the effect completely from gameplay, in which case, you might as well just fake the whole thing (just play a canned animation of the walls collapsing or the shockwave or whatever on a distant building instead of trying to contact an internet server to run the simulation).

Of course this is just IMHO. Maybe someone out there will make it work for something.

Yes, you could fake it, but if the wall collapses the exact same way every time, then that is something somebody will complain about as a reason for real physics over canned animation. It's exactly the same reason people would complain about scripted AI or pre-baked lightmaps to sweeten the graphics. It makes it that much more realistic if the event adapts to the specific conditions, which may be slightly (or a lot) different every time, rather than being fixed to a specific initial-conditions state. If the initial conditions of the play environment happen not to match what they were when the canned event was baked (the barrel was pushed halfway across the screen before it actually got ignited, for instance), the canned animation will definitely stand out, since the result is clearly nonsensical considering the causation.

Additionally, doing a canned animation also assumes the animator has a good grasp of how something would appear if it really happened. Sometimes they don't (in fact, a lot of the time they don't). Sometimes the event is so fictional and far removed from real-life experience that no one could really prepare a "realistic" animation of it. That is when you resort to real physical simulations to get an idea; short of that, a hand-made animation event is no more than a best guess. The end result may or may not look flaky to an observer of the gameplay, depending on their personal experience or grasp of physics and dynamic motion. Therein lies the great attraction of tying in some degree of actual numerical simulation to drive an animation event.
 
randycat99 said:
Additionally, doing a canned animation also assumes the animator has a good grasp of how something would appear if it really happened. Sometimes they don't (in fact, a lot of the time they don't). Sometimes the event is so fictional and far removed from real-life experience that no one could really prepare a "realistic" animation of it. That is when you resort to real physical simulations to get an idea; short of that, a hand-made animation event is no more than a best guess.
You'd be better off running an offline simulator and recording the object's motions as a series of keyframes.
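Something like this sketch (simulate() here is just a stand-in): bake the expensive sim once offline, keep sparse keyframes, and interpolate at runtime:

Code:
def bake_keyframes(simulate, state, n_frames, every=4):
    keys = []
    for f in range(n_frames):
        state = simulate(state)
        if f % every == 0:
            keys.append((f, state))    # store a sparse keyframe
    return keys

def playback(keys, frame):
    # linear interpolation between the surrounding keyframes
    for (f0, s0), (f1, s1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return s0 + t * (s1 - s0)
    return keys[-1][1]

keys = bake_keyframes(lambda s: s + 9.81 / 60, 0.0, 60)
print(playback(keys, 10))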
 
What I find interesting is that you've got Evolution Studios putting Motorstorm online (8 players per track). This is a game that is supposed to really rely on physics for gameplay while pumping out some massive visuals. I wonder how well this will mesh with online gameplay...
 
BlueTsunami said:
What I find interesting is that you've got Evolution Studios putting Motorstorm online (8 players per track). This is a game that is supposed to really rely on physics for gameplay while pumping out some massive visuals. I wonder how well this will mesh with online gameplay...

All I can see from Motorstorm - the equivalent we will get on the PS3, regardless of whether it looks as good as the CGI demo - is lots of particles and smoke... Car physics will be good, but hardly something I'd want a remote server to do instead of the PS3 I have at home...
 