Remote game services (OnLive, Gaikai, etc.)

And those who wear glasses have an additional 5ms of latency, 7 if they're sunglasses. :)

All in all, a great presentation. I really want to see a game that is built from the ground up for OnLive. Something massive and spectacular that no single desktop machine could render.
 
It could have some potential as an Indie games server, or maybe even a mobile games server for smartphones, but those games are usually not too graphically demanding.
 
I was very impressed by the presentation. I would have liked to have been there, to ask some more critical questions.
I do have a fairly large concern about OnLive, after watching this video...

Hardware is expensive. Massive farms of high end PCs for high end graphics is very expensive. Updating them every 6 months... Even more expensive.

That expense must get passed on to the publisher in some form, be it a higher royalty, etc.
My concern is simple: if this is the case, and logically it is, will publishers scale back their games to run on cheaper hardware? It makes economic sense.
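
Just to put some made-up numbers on that concern (every figure below is a guess of mine, nothing OnLive has stated), here's a back-of-envelope Python sketch of what the hardware alone might cost per user-hour:

[code]
# Back-of-envelope hardware cost per user-hour.
# All figures are hypothetical placeholders, not OnLive's numbers.

server_cost = 1500.0    # USD per high-end gaming PC (assumed)
refresh_months = 6      # the 6-month refresh cycle mentioned above
hours_per_month = 730
utilisation = 0.25      # fraction of the day a box is rented out (assumed)

monthly_hw_cost = server_cost / refresh_months
cost_per_user_hour = monthly_hw_cost / (hours_per_month * utilisation)
print(f"hardware cost per user-hour: ${cost_per_user_hour:.2f}")  # ~$1.37
[/code]

Even before bandwidth and staffing, that's not a trivial per-hour figure, so somebody in the chain has to absorb it.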

If you watch Burnout on the monitor, the game appears to be running with significantly reduced graphics settings (even compared to the PC version). I saw no shadows and few visible post-processing effects (then again, it's a tiny monitor). It makes me wonder...

I also noticed the multiplayer pings in Crysis were quite high (100-200); considering the way the system is set up, I assumed they would all be practically zero.
Funny he mentioned he could 'feel' 30ms of lag from the wireless console controllers too. :p
 
I'm assuming that this is basically like owning a cable box. You don't own the content but are free to use a range of software offered.

Aside from the possible technical hurdles already addressed, I don't see the appeal. I like owning software, be it physical or digital, and this seems like a nice option for demos or rentals at best.
 
And that won't make enough money to justify the streaming costs...
I guess you would need to keep up a subscription to pay for the bandwidth, and you would get another charge for every game that you "buy" or rent, which most likely depends on how graphically demanding the game is. If the total cost to the consumer is higher than buying the games at retail, then it may not be worth it for heavy PC gamers. They've shown that the technology can basically work, so now they just need to make it cheap.
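
For the consumer side, a similarly hand-wavy Python comparison (the subscription fee, "buy" price and retail price are all my assumptions, not published prices):

[code]
# Streaming subscription + per-game charges versus boxed retail.
# Every number here is an assumption, not a published price.

monthly_sub = 15.0        # hypothetical subscription fee
stream_buy_price = 45.0   # hypothetical "buy" price on the service
retail_price = 50.0       # typical boxed PC price in 2009

for games in (4, 12, 24):
    streaming = 12 * monthly_sub + games * stream_buy_price
    retail = games * retail_price
    print(f"{games:2d} games/yr -> streaming ${streaming:.0f} vs retail ${retail:.0f}")
[/code]

With numbers like these the subscription overhead never quite pays for itself, which is exactly the heavy-gamer worry.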
 
I do have a fairly large concern about OnLive, after watching this video...

Hardware is expensive. Massive farms of high end PCs for high end graphics is very expensive. Updating them every 6 months... Even more expensive.

Most PC games don't require high-end hardware for good graphical settings anymore, mainly because of consoles. A three-year-old nVidia 8800 still runs games pretty well, especially at resolutions like 720p. PC upgrading is not what it once was. There will be bigger upgrade costs during the console generation change, but that's only every 5-7 years. As I understood it, they are not planning to own the computing clouds, at least in the beginning.
 
Aside from the possible technical hurdles already addressed, I don't see the appeal.

Playing the same game on any number of screens (PC, laptop, iPod, TV) has some pretty obvious appeal imo.

Ownership is a bit of an issue, but you can still purchase a game as I understand it. You'd have to have a bit of faith in OnLive continuing to exist as a company, but my typical active playtime for a game is never greater than a year, and that's only for sports games, so I don't think it would be a big issue for me.

Would probably beat buying the PS4 for $500 and waiting in the cold to get one, when I could just buy the 2 best games on day 1. Kinda kills the fun factor though :p

The only thing that worries me is the compression algorithm; it sounds a little too good to be true. How's it really gonna look on high-resolution monitors...
 
That expense must get passed on to the publisher in some form, be it a higher royalty, etc.
My concern is simple: if this is the case, and logically it is, will publishers scale back their games to run on cheaper hardware? It makes economic sense.

I would think it would have the opposite effect.

It would vastly increase the number of potential users with adequate hardware to view the upper tier of graphical settings. You'd probably see a couple of orders of magnitude increase in the number of users able to play a new game at its absolute best settings. Surely this would increase the incentive to aim for the highest possible quality, since publishers would view the investment as more worthwhile.
 
They say it takes 1 millisecond for them to compress the video, which altogether gives the player about 80ms latency, which is pretty acceptable.

I'm not very knowledgeable about video compression, but that 1ms figure definitely raises my eyebrow. Is that actually realistic for a 1280x720 video stream?
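
For what it's worth, here's one hypothetical way that ~80ms could break down; only the 1ms encode figure is their claim, the rest of the split is my guesswork:

[code]
# Hypothetical decomposition of the ~80ms quoted above.
# Only the 1ms encode is OnLive's claim; the rest are guesses.

budget_ms = {
    "input to server (network)":  20,
    "game simulation + render":   17,  # roughly one frame at 60fps
    "video encode (their claim)":  1,
    "server to client (network)": 20,
    "decode + display":           22,
}
for stage, ms in budget_ms.items():
    print(f"{stage:28s} {ms:3d} ms")
print(f"{'total':28s} {sum(budget_ms.values()):3d} ms")  # -> 80 ms
[/code]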
 
I'm not very knowledgeable about video compression, but that 1ms figure definitely raises my eyebrow. Is that actually realistic for a 1280x720 video stream?

Apparently they have some new video tech, with custom hardware for compression.
 
I would think it would have the opposite effect.

It would vastly increase the number of potential users with adequate hardware to view the upper tier of graphical settings. You'd probably see a couple of orders of magnitude increase in the number of users able to play a new game at its absolute best settings. Surely this would increase the incentive to aim for the highest possible quality, since publishers would view the investment as more worthwhile.

Encode Crysis at max settings at 5mbps at 720p60 and it most certainly wouldn't be "the highest possible quality", no matter how much fairy dust your codec uses. As it is, OnLive must almost certainly be using something along the lines of an h264 iteration of Microsoft's Smooth Streaming.

I'm not very knowledgeable about video compression, but that 1ms figure definitely raises my eyebrow. Is that actually realistic for a 1280x720 video stream?

Only if you're using MJPEG or some other ultra-destructive, not particularly bandwidth-efficient encoder. Bear in mind that $50,000 h264 encoders don't do this, and OnLive's encoder is apparently a plug-in PCI Express board. I doubt even a CUDA-driven GTX295 could encode a 720p frame in 1ms in any kind of bandwidth-efficient manner. We are, after all, talking about an encoder that can apparently encode 720p at 1,000 frames per second! To put that into perspective, FixStars' h264 encoder that uses the PS3 CELL manages about 40 frames per second.
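
To make the gap concrete, a quick Python calculation of the pixel throughput each figure implies:

[code]
# Pixel throughput implied by a 1ms 720p encode versus the
# FixStars CELL encoder's ~40fps (figures from the post above).

pixels = 1280 * 720        # 921,600 pixels per frame

claimed_fps = 1000         # 1ms per frame -> 1,000 frames per second
cell_fps = 40              # FixStars h264 on the PS3 CELL

print(f"claimed: {pixels * claimed_fps / 1e6:6.0f} Mpixel/s")
print(f"CELL:    {pixels * cell_fps / 1e6:6.0f} Mpixel/s")
print(f"ratio:   {claimed_fps / cell_fps:.0f}x")   # -> 25x
[/code]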

Apparently they have some new video tech, with custom hardware for compression.

Also bear in mind that they had a MacBook Air doing the decoding at GDC without using hardware acceleration... you have to ask yourself what kind of codec they must be using to decode 720p60 on a 1.8GHz Core 2 Duo. CoreAVC would probably just about do it.
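
A quick back-of-envelope in Python for the decode side (assuming perfect use of both cores, which is generous):

[code]
# Cycle budget for 720p60 decode on a 1.8GHz Core 2 Duo,
# i.e. the MacBook Air mentioned above. Rough estimate only.

clock_hz = 1.8e9
cores = 2
fps = 60
pixels = 1280 * 720

cycles_per_frame = clock_hz * cores / fps      # ~60M cycles/frame
cycles_per_pixel = cycles_per_frame / pixels   # ~65 cycles/pixel
print(f"{cycles_per_frame / 1e6:.0f}M cycles/frame, "
      f"{cycles_per_pixel:.0f} cycles/pixel")
[/code]

That's a tight but not impossible budget for a heavily optimised decoder, which is why CoreAVC is a plausible comparison.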
 
Also bear in mind that they had a MacBook Air doing the decoding at GDC without using hardware acceleration... you have to ask yourself what kind of codec they must be using to decode 720p60 on a 1.8GHz Core 2 Duo. CoreAVC would probably just about do it.

Are they really aiming for 60fps? I would assume they would have a lower framerate...
 
What transmission bandwidth target have they got?

"SD" which hasn't been defined in terms of resolution or frame rate is at 1.5mbps, and HD which has been defined as 720p60 at 5mbps peak, but often lower!

Are they really aiming for 60fps? I would assume they would have a lower framerate...

60 frames is the stated frame-rate, yes.

I do buy into streaming gameplay. There's not much about Gaikai that I can really take issue with. There are no ker-azy claims there. You see the demo, and you see how plausible it is.

But it's stuff like 720p60 at 5mbps and 1,000fps encoders that calls OnLive's claims completely into question. 720p30 at 5mbps, even with the ultra-small buffers they must use for low latency, is entirely believable.

OnLive's target market isn't even going to be conditioned to 60fps gameplay anyway - they will be casuals and console migrants - neither of whom will need 60fps.

Latency? Well, once I get access to the system I'll be happy to measure it, but if you go back to the original GDC 09 presentation where they're playing Crysis, the latency between Mike McGarvey's mouse movements and the on-screen action, as well as Steve Perlman's trigger presses, suggests that it is not negligible.
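
For when I do get access: the simplest measurement I can think of is filming the controller and the screen together with a 60fps camera and counting frames between press and reaction. A trivial Python helper (the frame numbers in the example are made up):

[code]
# Convert a camera frame gap into milliseconds of latency.
# Resolution is limited to one camera frame (~16.7ms at 60fps).

def latency_ms(press_frame: int, reaction_frame: int,
               camera_fps: float = 60.0) -> float:
    return (reaction_frame - press_frame) * 1000.0 / camera_fps

# e.g. the game reacts 9 camera frames after the button press:
print(f"{latency_ms(120, 129):.0f} ms")   # -> 150 ms
[/code]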
 
"SD" which hasn't been defined in terms of resolution or frame rate is at 1.5mbps, and HD which has been defined as 720p60 at 5mbps peak, but often lower!
So 1ms to compress a 720p video feed to 5mbps, where existing methods take 30x that? And with so little time to spend on the problem, zero optimisations for quality can be applied.

What's your well-educated estimate of what image quality is going to be like? I've seen 5mbps 720p films and they've looked okay, although not pristine - certainly comparable with MPEG2 DVDs. I'm guessing blocking and posterisation will be significant, and perhaps in dark scenes things could be very hard to see, but video compression is a black art to me and I have very little worthwhile experience on which to base guesses!
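
FWIW, here's the bits-per-pixel arithmetic against a couple of familiar reference points (the DVD and Blu-ray figures are my rough approximations, not official numbers):

[code]
# Bits available per pixel at the stated 5mbps / 720p60 target.

bitrate = 5e6                       # 5mbps peak, per the post above
pixels_per_sec = 1280 * 720 * 60
print(f"OnLive HD: {bitrate / pixels_per_sec:.3f} bits/pixel")  # ~0.090

# Rough comparisons (my approximations):
#   DVD MPEG2:   ~6mbps  / (720*480*30)    -> ~0.58 bits/pixel
#   Blu-ray AVC: ~25mbps / (1920*1080*24)  -> ~0.50 bits/pixel
[/code]

So even granting h264 a healthy efficiency edge over MPEG2, the pixel budget is far below what those formats get.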
 
Doesn't matter if it's MPEG2, MJPEG or h264, there are some basic truisms about compression. Firstly, if the screen is mostly static or has slow motion, you can get away with absurdly low bandwidth. The more flat colour and the less detail there is, the easier and more bandwidth-efficient the encoding becomes.

LEGO Batman is one of the OnLive demo games and I reckon you *could* get 720p60 at 5mbps looking fairly OK with that, and almost pristine at 30fps.

A colourful game moving at fast speeds like Burnout Paradise (another OnLive demo) is the complete opposite - a total nightmare to encode. 5mbps at 30fps will still look pretty ropey. Crysis is a nightmare to compress too.

My guess is that there will be different profiles for each game to make them look as good as possible at the stated bandwidth levels. It wouldn't surprise me if they went sub-HD either, depending on the game. I mean, if your detail level is going to be destroyed by the encoding anyway, there's no point maintaining 720p to begin with.
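
As a toy illustration of the motion point: pan a smooth synthetic frame by different amounts and look at the residual an inter-frame codec would have to encode. (Real encoders motion-compensate a clean pan cheaply, so treat this only as a proxy for scenes where prediction struggles.)

[code]
# Residual energy grows with how far the image moves per frame.
# Synthetic smooth frame; numpy only.

import numpy as np

x = np.linspace(0, 4 * np.pi, 1280)
y = np.linspace(0, 4 * np.pi, 720)
frame = 128 + 100 * np.outer(np.sin(y), np.sin(x))

def residual(shift_px: int) -> float:
    """Mean absolute difference after a horizontal pan of shift_px."""
    return float(np.mean(np.abs(np.roll(frame, shift_px, axis=1) - frame)))

for shift in (0, 2, 16):   # static scene, slow pan, fast pan
    print(f"pan {shift:2d}px -> mean residual {residual(shift):6.2f}")
[/code]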
 