Remote game services (OnLive, Gaikai, etc.)

grandmaster said:
Let's not forget that the encoder has to encode in 1ms (well let's be generous and say it's actually 16ms)
It's at least 16ms (unless you expect them to deliver >60fps games on this), so there's nothing generous about that. Simulating double-buffering, you get 16ms as long as your encoder can work roughly in the direction of the vertical scan (which isn't rocket science). Simulating a triple-buffered setup, you have a minimum of 16ms to work with, and you can work in any order.
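Rough Python sketch of what I mean by working in scan order (the render/encode/send helpers are made up, purely to show the overlap):

FRAME_TIME_MS = 1000.0 / 60.0   # ~16.7 ms per frame at 60 fps
SLICES_PER_FRAME = 8            # assumption: split each frame into 8 horizontal slices

def stream_frame(render_slice, encode_slice, send_slice):
    """Encode slices in scan order as they finish, instead of waiting for the
    whole frame and then spending another frame-time encoding it."""
    for i in range(SLICES_PER_FRAME):
        pixels = render_slice(i)          # slice i is done rendering (top to bottom)
        packet = encode_slice(i, pixels)  # encode it immediately...
        send_slice(packet)                # ...and put it on the wire
    # Added latency is roughly one slice (~2 ms here), not an extra full frame,
    # as long as the encoder keeps pace with the scan.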
 
Off-screen video of LEGO Batman, and then on the second vid, Mirror's Edge

http://www.gametrailers.com/player/47257.html?type=flv
http://www.gametrailers.com/player/47259.html?type=flv

It's the best off-screen camera footage I've seen. As you would expect, LEGO Batman looks OK - mostly dark colours, very little movement, it's like Silent Hill in that it's game video that can be compressed very easily. Mirror's Edge crashes and we only seem to get stored video streaming as the game loads and that looks very choppy.

Is there any other decent quality off-cam video?
 
I wonder also if their technology is set up to cut corners at the edge of the screen and preserve detail in the center.
 
Lag:
Any lag under ~60ms can be almost completely transparent compared to a worst-case-scenario local 30fps game, provided you can render frames fast enough.
Typically, user input is checked at the beginning of a frame. If the input arrives immediately after the check (worst-case scenario) it will be one frame before it is actually processed (30ms). Then the frame is rendered and presented at the next vsync; that's another 30ms.
So if you have a 60ms ping, assuming you can render a frame in 10ms, the scenario is: 30ms to receive the input, process it immediately instead of checking once per frame, render and encode the next frame (10ms), send it back (30ms). Total: 70ms.
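The same numbers spelled out (these are the assumptions above, not measurements):

# Worst-case local 30fps game: the input just misses the once-per-frame poll,
# waits a full frame, then the rendered frame waits for the next vsync.
# (The ~33 ms frame time is rounded down to 30 ms here, as above.)
local_worst_ms = 30 + 30          # 60 ms

# Streamed case with a 60 ms round trip: input processed on arrival,
# frame rendered and encoded in ~10 ms, then sent back.
streamed_ms = 30 + 10 + 30        # 70 ms

print(local_worst_ms, streamed_ms)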

Compression:
Compressing a game video is very different from compressing live video:

- high coherency. Games have very few camera cuts, most of the time you are walking straight ahead or turning, both extremely easy scenarios to compress.
- camera position is known. No need to retrieve motion information from the scene. Big time saving.
- Z buffer is available. Most of the image can be accurately predicted from the camera motion and the depth information. Only occlusion and animation need to be taken into account.
- hardware-assisted compression is probably an order of magnitude faster than any software solution, so comparing with h264 on a PC makes little sense.
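A rough numpy sketch of the prediction the last two points allow, assuming the depth buffer has been remapped to NDC and you have the previous and current view-projection matrices (all names here are made up):

import numpy as np

def predict_motion_vectors(depth_ndc, inv_viewproj_prev, viewproj_curr, width, height):
    """Per-pixel motion vectors from depth plus known camera matrices.

    For static geometry the encoder gets its motion field for free: unproject
    each pixel of the previous frame using its depth, reproject it with the
    current camera, and the screen-space difference is the motion vector.
    Only animated objects and disocclusions still need a real search.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    ndc = np.stack([2 * xs / width - 1,          # x in [-1, 1]
                    1 - 2 * ys / height,         # y in [-1, 1], flipped
                    depth_ndc,                   # depth remapped to NDC z
                    np.ones_like(depth_ndc)], axis=-1)
    world = ndc @ inv_viewproj_prev.T            # back to world space
    world /= world[..., 3:4]
    reproj = world @ viewproj_curr.T             # through the current camera
    reproj /= reproj[..., 3:4]
    new_x = (reproj[..., 0] + 1) * 0.5 * width
    new_y = (1 - reproj[..., 1]) * 0.5 * height
    return np.stack([new_x - xs, new_y - ys], axis=-1)   # (dx, dy) per pixel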

Not saying OnLive will work, but people saying it's impossible have very little imagination.
 
- camera position is known. No need to retrieve motion information from the scene. Big time saving.
Meh, global motion estimation is the easy part of motion estimation.
- hardware-assisted compression is probably an order of magnitude faster than any software solution, so comparing with h264 on a PC makes little sense.
Speed is really not that much of a problem even on PCs. As for cheaper, I doubt hardware assisted coding will be that. They would need to find an existing ASIC which suits their needs, or pay for a custom one (in which case it would be a long time to break even). As for FPGAs ... you pay orders of magnitude more per mm2 of silicon for a large FPGA than you do for a microprocessor.
 
Not saying OnLive will work, but people saying it's impossible have very little imagination.

I think the point is that most people are not saying that OnLive won't work, simply that it doesn't match the claims being made for it.

Perlman defines latency as the time taken from pressing a button to the screen updating to reflect that action. He has stated that it takes 35-40ms generally, with a max latency of 80ms. This doesn't tally with his own GDC presentation - you can physically count the frames between the fire button being pressed and the muzzle flash appearing on screen to give an approximate lag time. When a PS3 60fps game has a lowest possible 'real world' latency of around 50ms, the claims also sound like nonsense.

As you say, if the system worked as it is claimed to work, there would be no appreciable latency and yet every review thus far has commented on it being a factor.

Similarly with video compression, maybe you can factor back in elements of prediction using the technique you suggest, but you're still looking at streaming a set amount of video data across to the dumb terminal for decompression. Even giving h264 all of its predictive potential, 720p60 at 5Mbps still looks very rough.
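To put rough numbers on that (simple arithmetic, not OnLive's actual figures):

bitrate, fps = 5_000_000, 60
bits_per_frame = bitrate / fps       # ~83,000 bits, roughly 10 KB per frame
raw_bits = 1280 * 720 * 24           # ~22 million bits of uncompressed RGB
print(bits_per_frame, raw_bits / bits_per_frame)   # ~265:1 compression needed,
                                                   # every frame, in real time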
 
I think the point is that most people are not saying that OnLive won't work, simply that it doesn't match the claims being made for it.

It seems to me that a lot of people are saying that it's just technically not feasible.

Perlman defines latency as the time taken from pressing a button to the screen updating to reflect that action. He has stated that it takes 35-40ms generally, with a max latency of 80ms. This doesn't tally with his own GDC presentation - you can physically count the frames between the fire button being pressed and the muzzle flash appearing on screen to give an approximate lag time. When a PS3 60fps game has a lowest possible 'real world' latency of around 50ms, the claims also sound like nonsense.

As you say, if the system worked as it is claimed to work, there would be no appreciable latency and yet every review thus far has commented on it being a factor.

He is probably talking about additional lag, not total lag. From what I heard about the GDC presentation, there is a bit of a lag but it's not problematic. This is consistent with his claims.

Similarly with video compression, maybe you can factor back in elements of prediction using the technique you suggest, but you're still looking at streaming a set amount of video data across to the dumb terminal for decompression. Even giving h264 all of its predictive potential, 720p60 at 5Mbps still looks very rough.

I agree, this is the part that worries me most. It seems that the image quality is pretty bad. I wonder if they will stream higher-quality 480p at 5Mb, if the user wants. A 480p stream scaled down from a 1080p signal will probably look better than an overcompressed 720p stream.
Also, are people with more bandwidth (> 5Mb) going to get a better quality signal?
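On the 480p question, a rough bits-per-pixel comparison at the same 5Mb (assuming 854x480 for widescreen 480p):

bitrate, fps = 5_000_000, 60
print(bitrate / (fps * 1280 * 720))   # ~0.09 bits per pixel at 720p60
print(bitrate / (fps * 854 * 480))    # ~0.20 bits per pixel at 480p60, over twice the budget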
 
Is there any indication of what happens to audio and sound effects? Do they provide more than a 2-channel stereo experience?

Some people, well, Blu-ray movie freaks, consider that a true HD experience comes from the audio.
 
Is there any indication of what happens to audio and sound effects? Do they provide more than a 2-channel stereo experience?

Some people, well, Blu-ray movie freaks, consider that a true HD experience comes from the audio.

It's part of the complete package; however, we "Blu-ray freaks" wouldn't accept the OnLive video quality anyway.
 
He is probably talking about additional lag, not total lag. From what I heard about the GDC presentation, there is a bit of a lag but it's not problematic. This is consistent with his claims.

Well the BBC quote from Perlman is quite specific.

Perlman said:
"The round trip latency from pushing a button on a controller and it going up to the server and back down, and you seeing something change on screen should be less than 80 milliseconds. We usually see something between 35 and 40 milliseconds."

It is difficult to measure accurately from the 30fps video of the GDC presentation, but it's actually closer to 166ms-200ms. Bearing in mind double-buffering and triple-buffering, Perlman's claim simply isn't feasible.
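For reference, the arithmetic behind that estimate (assuming the gap spans roughly 5-6 captured frames):

frame_ms = 1000 / 30                 # each frame of a 30fps recording is ~33.3 ms
print(5 * frame_ms, 6 * frame_ms)    # ~167 ms and ~200 ms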

The BBC article also says that the games will be running from "off the shelf motherboards" which is a comment I'd missed before and is also puzzling. If that is the case, then I really would like to see ten game instances running on a single PC. Surely even a quad socket i7 with quad GPUs is going to hit a bandwidth ceiling running ten game instances?

http://news.bbc.co.uk/1/hi/technology/7976206.stm

Perlman's mouth seems to be OnLive's biggest liability.
 
Well the BBC quote from Perlman is quite specific.


It is difficult to measure accurately from the 30fps video of the GDC presentation, but it's actually closer to 166ms-200ms. Bearing in mind double-buffering and triple-buffering, Perlman's claim simply isn't feasible.

Hmm, I didn't read that quote before. Yes, that's obviously impossible.

The BBC article also says that the games will be running from "off the shelf motherboards" which is a comment I'd missed before and is also puzzling. If that is the case, then I really would like to see ten game instances running on a single PC. Surely even a quad socket i7 with quad GPUs is going to hit a bandwidth ceiling running ten game instances?

My understanding is that they will have one machine for every ten users, so only 1/10 of the users are going to play at any time. Although that quote seems to refer to 10 instances of the same game running on one server... You would need multiple graphics cards for that to work, so you might as well have a bunch of cheaper stock PCs.
If you estimate a cost of $1000 for a single node, and you assume that only 1/10th of your users will be using the service at any time, you can recoup the cost by charging each user $100. Of course they will need a tier system to avoid shortages during peak hours...
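The estimate spelled out, plus the peak-hour caveat (the subscriber count and peak ratio are made-up figures, just to illustrate):

node_cost = 1000                # assumed cost of one server node, in dollars
subscribers = 100_000           # hypothetical subscriber base

nodes_avg = subscribers // 10   # 1 in 10 users playing at a typical moment
nodes_peak = subscribers // 4   # hypothetical peak-hour concurrency

print(nodes_avg * node_cost / subscribers)    # $100 of hardware per subscriber
print(nodes_peak * node_cost / subscribers)   # $250 if you provision for peak hours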
 
The BBC article also says that the games will be running from "off the shelf motherboards" which is a comment I'd missed before and is also puzzling. If that is the case, then I really would like to see ten game instances running on a single PC. Surely even a quad socket i7 with quad GPUs is going to hit a bandwidth ceiling running ten game instances?

Looking at Core i7 memory benchmarks (different speeds, dual channel vs triple channel) you should be able to run multiple games per memory controller, especially as you'll be capped at 60fps. They'll also be able to pick and choose which combinations of games to run with each other to avoid bottlenecks.

Most of the bandwidth used on graphics cards is related to the frame buffer, and there will only be relatively low resolutions being used (relative to what the cards are capable of).

And for both CPU and GPU they'll be able to tweak game settings to avoid particular issues and limitations.
 
Hmm, I didn't read that quote before. Yes, that's obviously impossible.

It's not impossible - the 80ms figure is actually very do-able.

Latency from the tv and the game buffering inputs could cause problems, but I guess publishers will work with them on the input thing and TV side issues are considered your problem (although it's their problem if you blame them).

On a fast web connection that can transmit each frame in less than 16 ms, and a server that (while capped at 60fps) can generate frames faster, there's no reason why the OnLive box shouldn't have a new frame waiting for your tv in significantly less than 80ms.

Double buffering should be at most 16.7ms of the total latency, and triple buffering no extra.
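The claim as a rough budget; the 16 ms transmit and the one-frame buffering figures come from the argument above, the rest are guesses:

budget_ms = {
    "controller input upstream": 15,     # guess: half of a decent round trip
    "buffering (double-buffered)": 16.7, # at most one frame, as above
    "render, capped at 60 fps": 16.7,
    "encode on the server": 5,           # guess: hardware-assisted
    "transmit the frame": 16,            # "transmit each frame in less than 16 ms"
    "decode + display on the box": 10,   # guess
}
print(sum(budget_ms.values()))           # ~79 ms -- under 80 ms, with no slack for jitter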

In practice things might not work out, but in theory there's nothing wrong with what they're saying. And if things don't work as planned they can only get better with time...
 
Looking at Core i7 memory benchmarks (different speeds, dual channel vs triple channel) you should be able to run multiple games per memory controller

What? What exactly is this "speed"? Bandwidth?
Multiple instances will be fighting over latency, not bandwidth. There is no way, known to man, to decrease latency.
 
Ha! I talked about the possibility of such technology in a thread a year and a half ago. Everyone back then thought the idea was dumb. True, I only mentioned on-demand remote gaming in passing, but the technology involved is the same.

I think a system like OnLive only makes sense if there's the option to stream from a local computer. Basically, you have to get the hardcore players to pay much of the hardware cost. Otherwise it isn't viable economically. You won't find enough casual players willing to subsidize those who play all day.

Here's how I imagine it'd work:

1. You see a new game and decide to give it a try. The game instantly starts streaming to your set-top box.
2. The video is low-res and jerky, and the lag is annoying, but you like the game enough to want an enhanced experience.
3. You select the option to buy the game. The software checks your computer's hardware to see if it's powerful enough. Then it starts the downloading process.
4. You put up with the laggy slideshow for a bit longer. After a couple hours, the game is downloaded and can be rendered locally instead.

Getting back to my original point, I think the only platform flexible enough to do this is the PC. Software-wise, it makes sense. On the server side, it's most economical to go the x86 route. On the client side, the PC is omnipresent and gamers have already demonstrated their willingness to pay a lot for extra quality.
 
What? What exactly is this "speed"? Bandwidth?
Multiple instances will be battling for latency and not bandwidth. There is no way, known to men, to decrease latency.

How does running multiple instances affect latency? As long as you can render all frames within 30ms, there is no latency issue.
 
I think a system like OnLive only makes sense if there's the option to stream from a local computer. Basically, you have to get the hardcore players to pay much of the hardware cost. Otherwise it isn't viable economically. You won't find enough casual players willing to subsidize those who play all day.

But that's sorta the main problem with the idea. Why would 'hardcore' gamers pay so much for an experience that is doubtless going to be inferior to running the game on their PC (or console)? The 'good enough for most' angle works for all but the hardcore. But the only people I can see paying a monthly fee for gaming are the hardcore.

Your idea is interesting, but it's not really what they're proposing, considering part of the set-up is the tiny set-top box.
 
But that's sorta the main problem with the idea. Why would 'hardcore' gamers pay so much for an experience that is doubtless going to be inferior to running the game on their PC (or console)? The 'good enough for most' angle works for all but the hardcore. But the only people I can see paying a monthly fee for gaming are the hardcore.

Your idea is interesting, but it's not really what they're proposing, considering part of the set-up is the tiny set-top box.

Well, it depends on what the fee would be, what would be offered, etc. I mean, WoW now has, what, over 11 (or was it over 12 already?) million players, of which ~5-6 million are from Asian countries that don't pay monthly fees. This still leaves ~5-6 million people paying monthly fees for a single MMO game, and you can be damn sure they're not hardcore players, not by a long shot.
 
Well, it depends on what the fee would be, what would be offered, etc. I mean, WoW now has, what, over 11 (or was it over 12 already?) million players, of which ~5-6 million are from Asian countries that don't pay monthly fees. This still leaves ~5-6 million people paying monthly fees for a single MMO game, and you can be damn sure they're not hardcore players, not by a long shot.

Well, okay. I was imagining a larger fee and a much more focused array of games. It's hard to generalize from WoW since no one has been able to replicate that, but point well taken. Still, I have some problem with this concept. The array of games has to be a lot more general than what they're promising, but they still have to keep the quality of the games up, otherwise there's little advantage to using this system over, say, Kongregate.
 