Remote game services (OnLive, Gaikai, etc.)

No one has pulled it off because it’s an immensely difficult technical and practical execution challenge, involving deep knowledge and experience across many fields. To make OnLive work involved fundamental work in psychophysical science; custom chip, hardware and wireless engineering; complex real-time software — from the lowest- to highest-level, and real-time network engineering down to the sub-packet level. And, it required a deep understanding of business structure in the video game, Internet, hosting, server and consumer electronic industries. Then, finally, it required an enormous amount of just practical execution: testing the system in hundreds of homes and ironing out every wrinkle to make it operate seamlessly.

And, it’s no accident that OnLive works so well. It took many years of development, testing, and refinement to get it to work through the vast range of Internet hookups in the home, and there are a huge number of particular mechanisms we’ve had to build into the OnLive system to overcome each particular issue we’ve run into. OnLive has been tested in hundreds of homes throughout the US, through a wide range of DSL, cable modem and fiber connections, and through any manner of consumer firewalls, routers and switches.

Emphasis mine.

From this interview. A few pages of softball questions, with some interesting things. Like, when he says 'fundamental work in psychophysical science' -- what does this mean? Is this their way to get around the otherwise insurmountable latency problem? He goes on to say that this will be a platform for the price-conscious AND the high-end gamers.
 
Interested to see if they have any kind of E3 showing; it would make sense, right, seeing how it's supposed to have a public beta test this summer and be released this winter.
 
Even with the best h264 encoder around, 5mbps of 720p at 60fps is going to macroblock like hell. I notice that Burnout Paradise is on their video as one of the games you can rent. I've done a lot of encoding with that - it's one of my stress test games for h264/WMV encoding. I've NEVER been able to make it look good at anything less than 12mbps and even then it was a bit ropey in places.
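
A rough bits-per-pixel calculation shows why (treating ~0.1 bpp as a "watchable h264" floor is just a rule of thumb I'm assuming, nothing official):

```python
# Rough bits-per-pixel estimate for a 720p60 stream at the two bitrates above.
# The ~0.1 bpp "watchable" threshold is only an assumed rule of thumb.

def bits_per_pixel(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

for mbps in (5, 12):
    bpp = bits_per_pixel(mbps * 1_000_000, 1280, 720, 60)
    print(f"{mbps} Mbps @ 720p60: {bpp:.3f} bits per pixel")

# 5 Mbps  -> ~0.090 bpp (heavy macroblocking likely)
# 12 Mbps -> ~0.217 bpp (much closer to comfortable)
```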

Speaking of Burnout Paradise, I'd pay serious money to be able to play that over this service on a typical ADSL connection.

The latency issue is insurmountable no matter how much 'psychophysical' science you come up with.

Maybe, just maybe, this would work if it were directly ISP-side, similar to IPTV. But it's not doing that, is it?
 
The latency issue is insurmountable no matter how much 'psychophysical' science you come up with.

I just want to know what it means. Maybe the game plays itself -- if the user is about to get splattered because he's operating on data from 80ms ago, well, give the benefit of the doubt. :D "Whoa, how'd I dodge that truck?! I'm awesome."
 
Even with the best h264 encoder around, 5mbps of 720p at 60fps is going to macroblock like hell. I notice that Burnout Paradise is on their video as one of the games you can rent. I've done a lot of encoding with that - it's one of my stress test games for h264/WMV encoding. I've NEVER been able to make it look good at anything less than 12mbps and even then it was a bit ropey in places.

Speaking of Burnout Paradise, I'd pay serious money to be able to play that over this service on a typical ADSL connection.

The latency issue is insurmountable no matter how much 'psychophysical' science you come up with.

Maybe, just maybe, this would work if it were directly ISP-side, similar to IPTV. But it's not doing that, is it?


Does it say 60fps anywhere? I kind of assumed 30fps.

ISP side was just my suggestion as to how it could work better, but I don't think they've released any details on where the servers are located.
 
Standardising at 30fps for the video stream would make a lot of sense, but I'm not sure why running the game at 60fps server-side would be beneficial if that was the case.
 
The more I think about this, the more I can't understand how it would work without aggressive QoS policies and ISP participation. Do you think ISPs would want to get into the video game business? Comcast and Verizon might have the bandwidth for something like this, though it wouldn't be cheap.
 
Does it say 60fps anywhere? I kind of assumed 30fps.

ISP side was just my suggestion as to how it could work better, but I don't think they've released any details on where the servers are located.

That dude says it in the game trailer interview, first part.
 
I think it's a great idea for an arcade unit or something local, but I can't see video being transmitted that consistently over the internet. Even at 480p, you have to consider that there is no buffering at all.
 
That dude says it in the game trailer interview, first part.

Didn't see that when I was at work. His comments are curious.

He mentions a special low-latency data center, a low-latency controller, and unique net code with small packets of info that allow responsive user input. That's pretty much as technical as it got. The client is small and runs in a browser.

The spectating features do sound pretty cool, especially from a developer standpoint, as he mentioned. Being a developer and getting to watch people beta test your software would be great.
 
With bandwidth caps looming or already in place? But this is what Kutaragi was blabbering about when he said consoles were going to disappear in the future. As a business I don't think it'll work. They are probably hoping to get bought by some big company.
 
I'm sure it will work for at least some games now, and the number of games this will work with will continue to increase from there on. Will it be viable though? I have no idea! But it's sure an interesting way to do things, and good on them for trying. I'm thinking it will work, though I don't know yet if it can become commercially viable. Could save a lot of power though - at least if it works like similar projects for thin clients in the PC business space.
 
I'm sure it will work for at least some games now, and the number of games this will work with will continue to increase from there on. Will it be viable though? I have no idea! But it's sure an interesting way to do things, and good on them for trying. I'm thinking it will work, though I don't know yet if it can become commercially viable. Could save a lot of power though - at least if it works like similar projects for thin clients in the PC business space.

I am just thinking that the required power must be incredible. If you have 1 million people playing Halo 3 at the same time, the combined CPU+GPU power is staggering. I can't really come up with a theory that would make the CPU+GPU costs any lower just because it runs in a cloud. You would still need something that resembles the same total power. A datacenter that can house 1 million Halo 3 players?
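
Just to put a magnitude on it, a quick back-of-the-envelope (the per-session wattage and electricity price are pure guesses on my part):

```python
# Back-of-the-envelope power cost for 1M concurrent hosted game sessions.
# Every number here is an assumption for illustration, not an OnLive figure.

concurrent_players = 1_000_000
watts_per_session = 200      # assumed CPU+GPU draw per hosted game instance
price_per_kwh = 0.10         # assumed $/kWh for a data center

total_mw = concurrent_players * watts_per_session / 1e6
cost_per_hour = concurrent_players * watts_per_session / 1000 * price_per_kwh

print(f"Peak draw: {total_mw:.0f} MW")                 # ~200 MW
print(f"Electricity alone: ${cost_per_hour:,.0f}/hour")  # ~$20,000 per hour
```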
 
I'm skeptical of it working, but if it does I'm entirely behind this concept in general. Make the platform and hardware irrelevant; focus on the content and accessibility.
 
I am just thinking that the required power must be incredible. If you have 1 million people playing Halo 3 at the same time, the combined CPU+GPU power is staggering. I can't really come up with a theory that would make the CPU+GPU costs any lower just because it runs in a cloud. You would still need something that resembles the same total power. A datacenter that can house 1 million Halo 3 players?

PC World mentions that they'll have ultra-fast servers, but it's in the middle of a Terminator joke, so maybe it's not serious. It also says that they'll have 4 centers to cover the continental US, each one covering a 1000 mi. radius.
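
For what it's worth, a 1000 mi. radius at least clears the speed-of-light check; rough numbers, assuming ~2/3 c propagation in fiber and ignoring routing overhead:

```python
# Minimum propagation delay to a server 1000 miles away.
# Assumes ~2/3 the speed of light in fiber; real routes are longer and add
# switching/queuing delay, so treat this as a hard floor, not an estimate.

distance_km = 1000 * 1.609                    # 1000 miles in km
speed_km_per_ms = 300_000 * (2 / 3) / 1000    # ~200 km per ms in fiber

one_way_ms = distance_km / speed_km_per_ms
print(f"One way: {one_way_ms:.1f} ms, round trip: {2 * one_way_ms:.1f} ms")
# ~8 ms one way, ~16 ms round trip at the edge of the radius
```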
 
Putting aside the obvious issues like input lag and compression artifacts, how about other stuff -- 5Mbps for HD? That'd eat into my bandwidth cap VERY quickly, and I've got the highest cap possible with my ISP (95GB/mo total).
Partly offtopic, partly related, but how common are these bandwidth caps around the world? For example, in Finland the only connections I know of with any caps are some of the universities' connections, which are free for students; they have a cap on "outside" bandwidth but none for the university network(s) (I think it covers all the university networks in Finland, as it's "local" rather than outside).
 
Don't see how this could work nicely. Either latency will suck (from buffering a couple of frames) or the smallest hiccups on your line (which you always have on a shared line, be it cable or DSL) will "pause" the game.

Input -> transfer -> Server-PC running game -> capture and encode Video -> transfer -> decode Video -> Output
Sounds like 200ms latency or more to me.

Sounds like insanity all around; I'd find it more reasonable to be able to "lease" hardware. Still more reasonable than having tons of high-end PCs hooked up to even more PCs that can encode high-res MPEG-4 on the fly.
 
Latency.

Many games already decouple rendering from the game loop, and multi-GPU rigs in AFR often draw complaints about lag. Not to mention a lot of displays have a fair bit of latency. So let's look at how a system like this could compound the issue. 30fps is ~33ms per frame (1000ms/30). And let's just assume all the current latency is tolerable.

66ms is a pretty decent ping (round trip), so let's start with user input. The user inputs a command and, in addition to a typical setup, the input will take 33ms to get to the server--1 frame. Assuming the setup is REALLY clean and there is <1ms from input, to proprietary software layer, to game, we do our game rendering as normal. When the frame is rendered it is going to be compressed and then streamed. How long is that going to take? Then another 33ms (another frame) to send it back to the user, where the host system has to buffer and decompress the incoming media.

At 66ms ping we are already at an additional 2 frames of latency. Compression/decompression as well as buffering are going to add a non-trivial amount of latency. Assuming some really great technology, 33ms for all the compression/decompression/buffering seems more than generous (we haven't even considered the effects of inconsistent bandwidth), so 3 frames of latency, or ~100ms, is going to be really annoying in an action game. Toss in a non-gaming-friendly display (20-80ms) and some net congestion and gah!!
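
Tallying just the added steps (same guesses as above, nothing measured):

```python
# Summing the *additional* latency sketched above, on top of a normal local
# setup. All figures are the same rough assumptions: 66ms round-trip ping,
# 30fps, and a generous 33ms for the whole compression pipeline.

added_ms = {
    "input -> server (half of the 66ms ping)": 33,
    "compress + buffer + decompress (generous guess)": 33,
    "server -> client (other half of the ping)": 33,
}

total = sum(added_ms.values())
print(f"Extra latency: ~{total} ms, roughly {total / (1000 / 30):.0f} frames at 30fps")
# ~99 ms, i.e. about 3 frames on top of whatever lag the game already has
```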

I love the concept, and using data centers regionally should help. The idea of a platform that is super cheap and can last a very long time, where games progressively get better with facility upgrades, etc., could be a killer blow. Imagine if, before the PS5 and Xbox4 could launch, your "last gen" games were better than their launch games. At 5Mb/s we are only talking about ~170kbit per frame (~2,250MB/hour; 2 hours of gaming a day for every day in a month is what, about 135GB? Getting close to that Comcast limit there...), so the concept sounds feasible.
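
Quick check of those numbers, assuming the per-frame figure is kilobits at 30fps:

```python
# Verifying the stream-size figures above for a constant 5 Mb/s feed.

bitrate_bps = 5_000_000
fps = 30

kbit_per_frame = bitrate_bps / fps / 1000         # ~167 kbit (~21 KB) per frame
mb_per_hour = bitrate_bps * 3600 / 8 / 1_000_000  # ~2,250 MB per hour
gb_per_month = mb_per_hour * 2 * 30 / 1000        # 2 hours a day, 30 days

print(f"{kbit_per_frame:.0f} kbit per frame")
print(f"{mb_per_hour:,.0f} MB per hour")
print(f"{gb_per_month:.0f} GB per month at 2 hours/day")  # ~135 GB
```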

If we were talking a single frame of latency above and beyond current games, that could be addressed in other ways, but not 3. That said, some benefits:

* Cut out retailer cuts on games
* No more used market
* Better "per unit" effectiveness. e.g. of 10M consoles, at peak hours maybe 30% will be utilized (number from thin air); that means you could invest 70% less in hardware or get more bang per hardware buck
* Massively large online games become more plausible, as your only concern is server ability and inbound traffic, since outbound traffic is only the frame
* No more hacking, RROD, or even new consoles. Perpetual platform

Heck, a daring company like Sony or MS could do this with their current platform if they wanted to.

Imagine if the Xbox3/PS3 was only a $100 software package... or the Wii2 :LOL:
 