Remote game services (OnLive, Gaikai, etc.)

I'm thinking that playing in HD or during peak hours will cost extra. Maybe they'll require a subscription plan or even a lottery system. And I'm sure the more graphically impressive games will cost more to rent or buy than the less demanding ones.

Did they say when the beta comes out?
 
As I mentioned earlier, if a company like MS or Sony can get this to work (maybe by populating dozens of data centers in the US alone?), they would essentially be delivering a platform that evolves along with a habitual userbase. It could offer a compelling upsell to games, as every couple of years the available resources would grow. And in terms of hardware usage, even if peak load puts 50% of users online at one time, the same investment could deliver better per-user performance. It could be multifaceted as well: during slow gaming periods (working hours, night time) the resource pool could be leased out to businesses. As hardware gets faster and bandwidth improves, the service would see continued quality improvements. The idea of cutting out the retailer's cut and cutting into piracy is also big.
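A quick back-of-the-envelope sketch of that peak-load point, in Python (every number is a made-up assumption, purely for illustration):

```python
# The poster's point: if only a fraction of users are ever online at
# once, the same total hardware spend buys each active user
# proportionally more performance. All figures are illustrative.

total_users = 1_000_000      # hypothetical subscriber base
peak_fraction = 0.50         # assume peak load = 50% of users online
budget_per_user = 400.0      # assumed hardware dollars per user (console model)

# Console model: every user gets their own box.
console_spend = total_users * budget_per_user

# Datacenter model: provision only for peak concurrent users, so the
# same total spend concentrates into fewer, beefier machines.
peak_users = total_users * peak_fraction
spend_per_active_user = console_spend / peak_users

print(f"Hardware dollars per active user: ${spend_per_active_user:.0f}")
# -> $800: same total investment, roughly double the per-session horsepower.
```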

The concept is compelling, but the technical hurdles are truly significant. If a company were able to put a compelling solution together before the next generation of console hardware, it isn't unthinkable that it could literally deploy its new platform on its existing hardware. If you are MS and you can directly market to your 17M Live users, "Hey, for $xx you can play the Xbox 3 right on your Xbox 360," I think you have to consider that.
 
If you are MS and you can directly market to your 17M Live users, "Hey, for $xx you can play the Xbox 3 right on your Xbox 360," I think you have to consider that.

If this technology became standard, there wouldn't be a need for Xbox or PS consoles. Software developers could deliver games without worrying about the platform; in fact, they could provide the service themselves.
 
It is quite interesting to wonder how OnLive will deal with online multiplayer, though, with friends lists, clans, etc. If it takes off, I would assume they'll also integrate players' buddies from IM chat services (Gmail, Yahoo, AOL) with OnLive, like Xbox Live does, since presumably most hardware that can run those chat clients can run OnLive.
 
More about OnLive, and even a few interesting new details:

http://news.bbc.co.uk/2/hi/technology/7976206.stm

The algorithm was developed on dual quad-core Xeon processors, which cost thousands of pounds, but OnLive have said they have distilled it down so it can run on a custom chip which costs "under 20 bucks to make".
Mr Perlman said the chip was "high performance for video compression", running at less than 100MHz clock speed and drawing about two watts of power.
"We can make millions of these things. Because of the economy there is plenty of excess capacity in fabrication plants."
 
Something I don't really get about this whole affair: when they mention Perlman, they mention him as a founder of WebTV. Since when is this something to get excited about? Am I missing something about WebTV? Did it do anything interesting besides being sold to Microsoft?

From the BBC article:
Every time you present new material to it, you will see something it does not compress so well. We note those and correct the algorithm.

Mr Perlman said it had taken "tens of thousands" of man-hours to develop the algorithm.

"We are not doing video encoding in the conventional sense," explained Mr Perlman, dismissing an article in gaming website Eurogamer that said the service was unworkable.

"It's a very ignorant article," said Mr Perlman, who said Eurogamer had conflated issues of frame rate and latency.

"They are independent factors," he said.

Honestly, there should be an injunction prohibiting reporters from reporting on stories they don't understand. I like the BBC, but this was essentially them letting Perlman use the website as his blog.
 
Honestly, there should be an injunction prohibiting reporters from reporting on stories they don't understand.

Would this apply to Eurogamer too?

That BBC article said:
The company has calculated that each server will be dealing with about 10 different gamers, because of the varying demands games have on hardware.

Confirmation that it's not one physical PC per connected user.
 
Confirmation that it's not one physical PC per connected user.

I get the impression that they have no real idea yet how they are going to build a real-world multi-million-user setup. Their current system seems to work with standard PCs plus an "add-on" card that simply compresses the video signal. Nothing actually hints at anything smart just yet. In other words, if this is built around an elaborate TCP/IP KVM extender with hardware video compression, it's really bad.
 
The point of the Eurogamer feature was to attempt to understand how they might be doing it, and to address head-on the technical issues that everyone else was skirting around for some reason. Perlman asked for scepticism. In fact, he demanded it, so he got it.

I do note that he has not responded to the issues presented with any kind of technical explanation whatsoever, even when asked directly.

The BBC news story is interesting in that he quotes 80ms as the maximum lag, defined as the time from button press to movement on-screen (which would be my definition too). Yet even approximate analysis of the GameSpot feed pointed out earlier in this thread shows that it is much more than that. That's in his own demo.
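For what it's worth, that kind of feed analysis is just frame counting: in a 30fps capture each frame covers about 33ms, so the number of frames between a button press and the on-screen response gives a lag estimate. A minimal sketch (the frame count below is a hypothetical example, not the thread's actual measurement):

```python
# Estimating lag from a captured video feed by counting frames.
# The frame count here is a made-up example, not a real measurement.

capture_fps = 30                 # frame rate of the captured feed
frames_between = 5               # hypothetical frames from press to response

lag_ms = frames_between * 1000 / capture_fps
print(f"Estimated lag: {lag_ms:.0f} ms")  # -> 167 ms, well over 80 ms
```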

More interesting still is this Joystiq interview. Check out the top of page 2 and see what you make of his explanation of the codec. Apparently 'walking' in a game causes contouring and they 'need to work on that.'

Confirmation that it's not one physical PC per connected user.

I'm more interested in how many CPUs and GPUs are being used per instance and also how the bandwidth ceiling inherent in PC architecture is being managed in order to run 10 game instances simultaneously. I am assuming that the current APIs and the use of video RAM wouldn't take kindly to constantly switching between different game instances. Multiple micro-systems on a single motherboard? Surely one GPU per instance?
 
Would this apply to Eurogamer too?

Far be it from me to actually defend Eurogamer, but these are different situations altogether. That article was going on what we know so far about OnLive and the laws of the universe. OnLive is going 'we have a bunch of really smart guys who've been working for a really long time, that's why it'll work' (aka the Google approach). And the BBC is just nodding blankly and letting them speak.
 
I'm more interested in how many CPUs and GPUs are being used per instance and also how the bandwidth ceiling inherent in PC architecture is being managed in order to run 10 game instances simultaneously. I am assuming that the current APIs and the use of video RAM wouldn't take kindly to constantly switching between different game instances. Multiple micro-systems on a single motherboard? Surely one GPU per instance?

They mention (and were mocked for) 'custom silicon'. What if the intention is simply virtualization, with specialized hardware that mostly pretends to be a 'video card', performing the functions that can't be done efficiently in software? It's not as if they want to show the frame on the server; it's not even likely that the server will be connected to any sort of display.

But still, even if I'm right, 10 users per server isn't a lot, and these servers clearly aren't commodity PCs. We're still back to the situation where each user will have the equivalent of $1000+ of hardware dedicated to them.
 
Custom silicon makes perfect sense IMHO, and as I mentioned earlier, these servers will be multi-purpose and have quite a long shelf life. Co-opt the right partners and the cost falls significantly. Bearing in mind that Sony was willing to subsidise $200 to $300 per PS3 sold, it's not beyond the realms of possibility, especially when 'obsolete' servers could go to another partner and perform very well doing less strenuous tasks. But 10 users per server obviously introduces an element of scale to these datacenters that is somewhat frightening... though OnLive would know that.
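To put a rough number on that scale (the per-server figure is the BBC article's; everything else below is an assumed value for illustration):

```python
# Ballpark of the datacenter scale implied by "~10 gamers per server".
# Only the per-server figure comes from the BBC article; the rest are guesses.

concurrent_players = 1_000_000   # hypothetical peak concurrency
gamers_per_server = 10           # figure quoted by the BBC
servers_per_rack = 20            # assumed rack density
racks_per_datacenter = 500       # assumed facility size

servers = concurrent_players / gamers_per_server
racks = servers / servers_per_rack
datacenters = racks / racks_per_datacenter

print(f"{servers:,.0f} servers -> {racks:,.0f} racks -> "
      f"{datacenters:,.0f} datacenters")
# -> 100,000 servers for a million concurrent players: frightening indeed.
```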
 
Custom silicon makes perfect sense IMHO, and as I mentioned earlier, these servers will be multi-purpose and have quite a long shelf life. Co-opt the right partners and the cost falls significantly. Bearing in mind that Sony was willing to subsidise $200 to $300 per PS3 sold, it's not beyond the realms of possibility, especially when 'obsolete' servers could go to another partner and perform very well doing less strenuous tasks. But 10 users per server obviously introduces an element of scale to these datacenters that is somewhat frightening... though OnLive would know that.

Sony lost a lot of money on that bet and survived mainly because they're Sony. Will OnLive have the cash to do the same, even if the system works?
 
They won't have the cash but investors will. They are already extremely heavily funded. I'm not kidding about the Space Race comments earlier.
 
Can this really get off the ground, though? I really don't see it penetrating the market, or living up to its various over-the-top technical promises...
 
They won't have the cash but investors will. They are already extremely heavily funded. I'm not kidding about the Space Race comments earlier.

Yeah, but I'm a lot more skeptical. We're reaching some awfully hard limits and in all likelihood there's nothing we can do about it. How many success stories do we have for loss-leaders? PS1 and PS2 were successful. Anyone else?
 
The BBC news story is interesting in that he quotes 80ms as the maximum lag, defined as the time from button press to movement on-screen (which would be my definition too). Yet even approximate analysis of the GameSpot feed pointed out earlier in this thread shows that it is much more than that. That's in his own demo.
Is my math sound when I say that in a locally played, triple-buffered game running at 30FPS, the lag from input to showing up on screen is around 1000/30*3 = 100ms, and half that at 60FPS? For double buffering it would be 67ms and 33ms respectively. So they would need to get frame data to the client screen in around 50-65ms. Would client-side double buffering be required as well, leaving even less time for actually moving frame data?
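A minimal sketch of that arithmetic (assuming, as above, that frame buffering is the only source of lag):

```python
# Input-to-display latency from frame buffering alone: a game that
# buffers N frames shows your input roughly N frame-times later.

def buffer_latency_ms(fps: float, buffered_frames: int) -> float:
    """Worst-case latency contributed by N buffered frames at a given fps."""
    return 1000.0 / fps * buffered_frames

for fps in (30, 60):
    for frames in (2, 3):        # double vs. triple buffering
        print(f"{fps} fps, {frames} buffered frames: "
              f"{buffer_latency_ms(fps, frames):.0f} ms")
# 30 fps triple-buffered -> 100 ms, 60 fps -> 50 ms; double-buffered
# -> 67 ms and 33 ms, matching the figures above.
```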
 
That 80ms just sounds plain infeasible to me. Warhawk's official servers have a ping of 60+ms, and that's fairly good from what (little) I've seen of servers across games. The round trip from input, to the server executing the game and rendering the screen, to compressing the image, to sending it back, to the console rendering it on screen (which will need to be double buffered, though I think that can hide some of the latency) cannot be made in 80ms IMO. If that figure is true at all, it must be for LAN distribution, ignoring internet issues. Internet lag on consoles requires clever code to hide the latencies, which some games manage better than others. There's no option for that with live, streamed video content. The client shows what it's received, and if there's a delay in the next frame, it's gonna glitch. Having said that, I'm used to PC games glitching frames all over the shop, so maybe for the majority of gamers without decent gaming rigs it'll be acceptable :p
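As a rough illustration of why that budget looks so tight, here's a hypothetical breakdown of an 80ms target against a 60ms ping (every component figure below is an assumption for argument's sake, not a measured value):

```python
# Hypothetical end-to-end budget for the claimed 80 ms button-to-screen lag.

budget_ms = 80.0
network_rtt_ms = 60.0          # e.g. a Warhawk-style server ping
render_frame_ms = 1000 / 60    # server renders one 60 fps frame (~16.7 ms)
encode_ms = 5.0                # assumed hardware video compression
decode_display_ms = 10.0       # assumed client decode + display

total_ms = network_rtt_ms + render_frame_ms + encode_ms + decode_display_ms
print(f"Estimated total: {total_ms:.1f} ms (budget: {budget_ms:.0f} ms)")
# -> ~91.7 ms before any client-side buffering, already over budget,
# which is why 80 ms looks infeasible on a 60 ms ping.
```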
 