Remote game services (OnLive, Gaikai, etc.)

With the claims being made, frankly, yes.

The point here is that we are supposed to believe the following:

Firstly, that OnLive has revolutionised video encoding. Fine, OK. They've bettered the best that Sony, Panasonic and Microsoft have achieved in the field of video compression, and they're keeping this incredible innovation to themselves for the purposes of streaming video gameplay. Fine. No problem.

Secondly, that they've come up with a system of beating latency that has so far eluded the very best experts in the field for years.

And finally, that they can somehow pay for thousands upon thousands (millions, even) of high-end PCs. And not only that, but they're running proprietary video encoders in each one of these units that out-perform the $50,000 realtime H.264 encoders used for live HD broadcasts, encoding with just 1ms of latency (!!).

I want to believe in OnLive because as a gamer I think that the things it is supposed to do are hugely exciting. Plus, I work in video day in, day out, and this would be the most amazing use of video I've ever seen.

But the bottom line is that at some point you need to factor in Occam's Razor. The more you look at it, the more improbable the whole thing becomes.
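
Just to spell out why latency is the sticking point, here is a rough button-to-pixel budget. Only the 1ms encode figure comes from OnLive's own claim; the network, decode and input numbers below are purely my assumptions for illustration.

# Rough end-to-end latency budget for one streamed frame, in milliseconds.
# Only the 1 ms encode figure is OnLive's claim; every other number here is
# an assumed, illustrative value, not a measurement.
budget_ms = {
    "render (one 60fps frame)": 1000 / 60,  # ~16.7 ms on the server
    "encode (claimed)": 1.0,                # OnLive's stated figure
    "network, server -> client": 20.0,      # assumption: good broadband, ~50 miles
    "decode + display": 10.0,               # assumption: client decode and vsync
    "input, client -> server": 20.0,        # assumption: return trip for controls
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:27s} {ms:6.1f} ms")
print(f"{'total, button press to pixel':27s} {total:6.1f} ms")

Even taking the 1ms encode at face value, the assumed network legs dominate the total, which is why the claim about beating latency matters at least as much as the encoder itself.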

But if you look above you can see that it apparently works, unless it's a giant mockup. My wild guess would be that they are encoding each frame individually instead. I played around with JPEG and I can get an ugly, blocky picture that takes up 12KB; they have 7.5KB or so per frame.
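
As a sanity check on the per-frame figure, here is the budget implied by the bitrates being quoted, assuming 60 frames per second (the 2 and 5mbps numbers come from the conference; the arithmetic is mine):

# Per-frame byte budget at the quoted bitrates, assuming 60 frames per second.
def kb_per_frame(mbps, fps=60):
    # bitrate -> bits per frame -> bytes -> kilobytes
    return mbps * 1_000_000 / fps / 8 / 1024

for mbps in (2, 5):
    print(f"{mbps}mbps at 60fps -> {kb_per_frame(mbps):.1f} KB per frame")
# 2mbps -> ~4.1 KB per frame, 5mbps -> ~10.2 KB per frame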
 
But if you look above you can see that it apparently works, unless it's a giant mockup. My wild guess would be that they are encoding each frame individually instead. I played around with JPEG and I can get an ugly, blocky picture that takes up 12KB; they have 7.5KB or so per frame.

I don't think there's much doubt that it works on a superficial level; it's more about the quality of the experience and scalability. One guy running off a server 50 miles away, supposedly on an ideal connection, mentions noticeably lossy video and only barely noticeable lag. What happens when it's 1,000 people playing on servers hundreds of miles away, each playing a different game and encoding a different video?
 
Okay - reality check.

This platform would have to support well over 100 games served around the nation, not just ONE game, to meet the same demand a single console meets today.

Let's be generous and say they can get acceptable quality at 5mbps for video, and that most gamers are only on for a modest 2 hours a day.

5mbps * 60sec * 60min * 2hr * 7days * 100games * 1,000,000gamers =
25,200,000,000,000 Mb per week =
3,076,171,875 GB per week.

3,600,000,000,000 Mb per day =
439,453,125 GB per day.
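
The same back-of-envelope sums as a quick script, using the assumptions above (5mbps, 2 hours a day, 7 days, 100 games, 1,000,000 gamers per game):

# Reproduces the weekly and daily totals above.
mbps, hours, days, games, gamers = 5, 2, 7, 100, 1_000_000

mb_per_week = mbps * 3600 * hours * days * games * gamers   # megabits per week
gb_per_week = mb_per_week / 8 / 1024                        # Mb -> MB -> GB
gb_per_day = gb_per_week / days

print(f"{mb_per_week:,} Mb per week")      # 25,200,000,000,000
print(f"{gb_per_week:,.0f} GB per week")   # 3,076,171,875
print(f"{gb_per_day:,.0f} GB per day")     # 439,453,125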

This doesn't account for lost/corrupted packets OR any data other than the video itself being sent to the user. Acks, user input, save files, patches (something has to play the video back)... AUDIO (5.1 or better???)... etc. are not accounted for. Daily peak usage and spikes for new releases are not yet accounted for either.

They will need INCREDIBLE infrastructure to approach making this work on the same echelon as what consoles deliver now.


Dude, I don't think your calculation makes sense.

First, it was stated that the average throughput would be 2mbps (5mbps peak).

Second, your equation goes wrong with "x 100games". Are you saying that each gamer is going to play 100 games for 14 hours each in one week? Or are you saying that each of the one hundred games will have 1,000,000 players playing it 14 hours a week? I don't get it.

Try this:

2mbps x 60sec x 60min x 2hr = 14,400 Mb per day per gamer

(14,400 Mb x 365 days) / 12 = 438,000 Mb per month per gamer

438,000 Mb x 0.125 = 54,750 MB per month per gamer

54,750 MB / 1024 = 53.467 GB per month per gamer

53.467 GB x 1,000,000 = 53,467,000 GB per month per 1 million gamers.
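
Or as a quick script, using the stated 2mbps average and 2 hours a day:

# Per-gamer monthly traffic at the stated average bitrate.
mbps, hours_per_day = 2, 2

mb_per_day = mbps * 3600 * hours_per_day   # 14,400 Mb per gamer per day
mb_per_month = mb_per_day * 365 / 12       # ~438,000 Mb per gamer per month
gb_per_month = mb_per_month / 8 / 1024     # ~53.5 GB per gamer per month

print(f"{gb_per_month:.2f} GB per month per gamer")
print(f"{gb_per_month * 1_000_000:,.0f} GB per month per million gamers")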
 
I don't buy that 2mbps is close to sufficient. I'm not a professional, but I've encoded quite a bit of video using little more than GraphEdit and ffdshow, where I wrote out the quantization settings etc. myself. 2mbps isn't likely to cut it... it's possible, but in my experience very, very unlikely.

I was being quick and dirty with that calculation.

I'm thinking that 1,000,000 players, out of the nearly one hundred million or so people who have purchased a console in North America alone, will want to play the games they purchased for a few hours in the day.

Note that this is a conservative number, given that just the players from Halo + Killzone would get you close to 1,000,000 *online* players, while the vast majority of gamers play but do not go online at all.

You could replace x100 with 2 games if you like.

I was attempting to be conservative with the numbers... it wouldn't take much to be far more fatalistic: for instance, tossing in worldwide requirements, or making a more realistic estimate of the number of individuals who play a game in a day, or accounting for hardcore gamers who play for several hours a day on a consistent basis, or including the plurality of children who are parked in front of a "console-babysitter" each and every day. In any case, my conclusions don't change much.

Some INCREDIBLE infrastructure is required to meet the same demand that a *single* console does in a day and for all intents and purposes matching the same quality is virtually impossible.
 
Isn't the 720p60 supposed to run on a 5mbps connection?


Yes, but I guess it all depends on the game. From what it sounded like, the high-end games require more than the lower-end games. The average game would need around 2mbps in 720p, with higher-end games peaking at 5mbps? That's what was said in the conference.
 
Yes, but I guess it all depends on the game. From what it sounded like, the high-end games require more than the lower-end games. The average game would need around 2mbps in 720p, with higher-end games peaking at 5mbps? That's what was said in the conference.

High end or low end won't matter. What will matter is the ratio of moving pixels to still pixels, and how well the encoder can predict changes from one state to the other, among other things of course.

A low-end game with low quality but random explosions popping off continuously in a perpetually shifting, high-contrast environment could wreak havoc, for instance.
 
Yes, but I guess it all depends on the game. From what it sounded like, the high-end games require more than the lower-end games. The average game would need around 2mbps in 720p, with higher-end games peaking at 5mbps? That's what was said in the conference.

But what constitutes a higher-end game? Anything with action? It's all pretty vague, which was my problem with the part of the presentation I saw.
 
Heh. Only they've said that the hardware encoder is an expansion card in the server PC. Which also intimates that each client has its own dedicated PC to connect to ...

Err, why?

If the encoder card can encode a frame in 1 millisecond, why couldn't it also encode frames from other instances of games running on the machine? Why do you think each encoder card is only able to encode video from one instance of one game?
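
Taking the 1ms claim at face value for a moment, simple time-slicing would in principle let one card serve a handful of 60fps streams. A toy calculation on my part, ignoring readback, memory and scheduling overhead:

# If a card really took 1 ms per frame, how many 60fps streams could it
# time-slice in principle?
frame_interval_ms = 1000 / 60   # ~16.7 ms between frames at 60fps
encode_ms = 1.0                 # the claimed per-frame encode time
streams = int(frame_interval_ms // encode_ms)
print(streams, "60fps streams per encoder card, in theory")   # -> 16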
 
Err, why?

If the encoder card can encode a frame in 1 millisecond, why couldn't it also encode frames from other instances of games running on the machine? Why do you think each encoder card is only able to encode video from one instance of one game?

Look at the games being mooted in the front-end demo. How many of those could you run at 720p60 simultaneously, even on the most high-end PC hardware? And you need to be realistic here - the encoder card isn't encoding at 1ms. Heavily parallelised MJPEG encoding done in hardware, yes, but HD at 5mbps? That's just ridiculous. It's when claims like this are being made that you have to question the whole thing.

High end or low end won't matter. What will matter is the ratio of moving pixels to still pixels, and how well the encoder can predict changes from one state to the other, among other things of course.

A low-end game with low quality but random explosions popping off continuously in a perpetually shifting, high-contrast environment could wreak havoc, for instance.

Correct. The notion of high-end or low-end games requiring different levels of bandwidth only works if your low-end game is something with a mostly static screen - for example, a poker game. In which case, why use video at all? Flash or Java would make more sense.

720p60 at 5mbps is barely watchable... 2mbps would be like FMV on the Sega CD on all but the most static of scenes.
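
For perspective, here is the bits-per-pixel budget those rates work out to at 720p60 (my arithmetic, not anything OnLive has published):

# Bits available per pixel at 1280x720, 60 frames per second.
pixels_per_second = 1280 * 720 * 60

for mbps in (2, 5):
    bpp = mbps * 1_000_000 / pixels_per_second
    print(f"{mbps}mbps -> {bpp:.3f} bits per pixel")
# 2mbps -> ~0.036 bpp, 5mbps -> ~0.090 bpp

Broadcast and Blu-ray HD generally get several times more bits per pixel to work with, which is why 5mbps looks so thin here.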
 
I am surprised some people are having a problem with the hardware involved in such a setup. One million PCs for one million simultaneous users? Why not. Console manufacturers spend far more on creating their console hardware over the course of a generation. This should be far cheaper, and gives you far more flexibility in how you design the hardware. You no longer have to have one CPU and one GPU in a single 'pleasing' enclosure.

I wonder what resolution these games are being run at. A standard-definition stream might not look so bad if they are downsampling from a 1600x1200 image, or even a 720p source image. It's basically supersampling. (Of course there are also the compression artifacts.)

I really do believe all concept of localized computing outside of computing centers will pretty much be gone within 10 years. Once you get to the point of being able to stream HD interactive content to a screen 30 times a second, you no longer need to upgrade at that end of things. You can put that computing power wherever you want. Gone will be the days of having PCs sit idle the vast majority of the time.

However, I do think that it may be a bit early still for this to really take off. 5mb/s? It just doesn't seem like enough.

It would be interesting to think about other console manufacturers going this route. Live + this? Wii playing amazing looking games? Very interesting stuff.
 
Look at the games being mooted in the front-end demo. How many of those could you run at 720p60 simultaneously, even on the most high-end PC hardware? And you need to be realistic here - the encoder card isn't encoding at 1ms. Heavily parallelised MJPEG encoding done in hardware, yes, but HD at 5mbps? That's just ridiculous. It's when claims like this are being made that you have to question the whole thing.



Correct. The notion of high-end or low-end games requiring different levels of bandwidth only works if your low-end game is something with a mostly static screen - for example, a poker game. In which case, why use video at all? Flash or Java would make more sense.

720p60 at 5mbps is barely watchable... 2mbps would be like FMV on the Sega CD on all but the most static of scenes.

Whatever they are doing, it works, and it seems to look better than I expected. Since the Wii is doing so well and people are actually watching videos on YouTube, I don't see why the same audience wouldn't be interested.
 
Does anyone know what kind of connection is used between their datacentre and the boxes they play on? Is it anywhere close to 5Mbit, with normal latencies?
 
Whatever they are doing, it works, and it seems to look better than I expected. Since the Wii is doing so well and people are actually watching videos on YouTube, I don't see why the same audience wouldn't be interested.

But that's sort of the point, isn't it? I haven't seen a full launch setup, but in the graphic they show a bunch of high-end, big-name FPS titles like Crysis and Far Cry 2. There's also Lego Batman and Prince of Persia, but the focus seems to be on the former, rather than the latter.
 
And finally, that they can somehow pay for thousands upon thousands (millions, even) of high-end PCs. And not only that, but they're running proprietary video encoders in each one of these units that out-perform the $50,000 realtime H.264 encoders used for live HD broadcasts, encoding with just 1ms of latency (!!).

Has it been claimed that they are running that many high-end PCs? Can it not be a huge resource pool? That's generally what I believe cloud computing is about - kind of like expanded virtualization that spans across resources, a la IBM's system-P or system-i, or even mainframe setups.
 
Does anyone know what kind of connection is used between their datacentre and the boxes they play on? Is it anywhere close to 5Mbit, with normal latencies?

One of the devs mentioned that at the GDC booths they are using a single 6mbit connection for 3 OnLive users.
 