ShadowRunner
Veteran
80ms lag on top of the game latency, presumably.
As well as display latency.
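For a rough sense of where that time goes, here is a back-of-envelope budget. Every per-stage figure below is an assumption for illustration; the thread only gives the claimed ~80 ms overall and the 1 ms compression step.

    # Back-of-envelope latency budget for cloud game streaming.
    # All per-stage figures are assumptions for illustration; OnLive only
    # claims ~80 ms of added latency overall and 1 ms for compression.

    budget_ms = {
        "input capture + uplink": 30,    # assumed client-to-server transit
        "video encode": 1,               # OnLive's claimed compression time
        "downlink + jitter buffer": 40,  # assumed server-to-client transit
        "video decode": 9,               # assumed client-side decode cost
    }

    overhead = sum(budget_ms.values())
    print(f"streaming overhead: ~{overhead} ms")  # 80 ms in this sketch
    # The display's own latency (an LCD can add tens of ms) and the game's
    # frame time come on top of this, as the posts above point out.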
I guess you would need to keep up a subscription to pay for the bandwidth, and you would get another charge for every game that you "buy" or rent, which most likely depends on how graphically demanding the game is. If the total cost to the consumer is higher than buying the games at retail, then it may not be worth it for heavy PC gamers. They've shown that the technology can basically work, so now they just need to make it cheap.

And won't make enough money to justify the streaming costs...
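To make that comparison concrete, here is a toy calculation. Every price in it is hypothetical, since no OnLive pricing is given in the thread.

    # Hypothetical cost comparison: streaming vs. buying at retail.
    # Every number here is made up for illustration; OnLive's pricing
    # was not announced in this thread.

    sub_per_month = 15.0    # assumed base subscription (bandwidth costs)
    rental_per_game = 30.0  # assumed per-title charge on top of that
    retail_price = 50.0     # assumed typical boxed PC game price

    def streaming_cost(months, games):
        return sub_per_month * months + rental_per_game * games

    def retail_cost(games):
        return retail_price * games

    # A heavy gamer over a year: 12 months, 10 titles.
    print(streaming_cost(12, 10))  # 480.0
    print(retail_cost(10))         # 500.0, plus the cost of a gaming PC

Whether streaming wins depends entirely on where those per-title and per-month charges land, which is exactly the open question in the post above.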
I do have a fairly large concern about OnLive, after watching this video...
Hardware is expensive. Massive farms of high-end PCs for high-end graphics are very expensive. Updating them every 6 months... Even more expensive.
Aside from the possible technical hurdles already addressed, I don't see the appeal.
That expense must get passed on to the publisher in some form, be it a higher royalty, etc.
My concern is simple: If this is the case, and logically it is, will publishers scale back the games to run on cheaper hardware? It makes economic sense.
They say it takes 1 millisecond for them to compress the video, which altogether gives the player about 80 ms of latency. That's pretty acceptable.
I'm not very knowledgeable about video compression, but that 1ms figure definitely raises my eyebrow. Is that actually realistic for a 1280x720 video stream?
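Some quick arithmetic shows why that claim is startling. It uses only the stated resolution and the 1 ms figure, with no assumptions about codec internals.

    # Throughput implied by the "1 ms to compress a frame" claim.

    width, height = 1280, 720
    pixels_per_frame = width * height            # 921,600 pixels

    encode_time_s = 0.001                        # the claimed 1 ms
    pixels_per_second = pixels_per_frame / encode_time_s
    print(f"{pixels_per_second / 1e9:.2f} Gpixel/s")  # ~0.92 Gpixel/s

    # For comparison, at 60 fps a real-time encoder only *needs* to
    # sustain ~55 Mpixel/s, so the claim implies processing each frame
    # roughly 16x faster than real time requires.
    print(f"{pixels_per_frame * 60 / 1e6:.1f} Mpixel/s needed for 720p60")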
I would think it would have the opposite effect.
It would vastly increase the number of potential users with adequate hardware to view the upper tier of graphical settings. You'd probably see a couple of orders of magnitude increase in the number of users able to play a new game at its absolute best settings. Surely this would increase the incentive to aim for the highest possible quality, since publishers would view the investment as more worthwhile.
I'm not very knowledgeable about video compression, but that 1ms figure definitely raises my eyebrow. Is that actually realistic for a 1280x720 video stream?
Apparently they have some new video tech, with custom hardware for compression.
Also bear in mind that they had a MacBook Air doing the decoding at GDC without using hardware acceleration... you have to ask yourself what kind of codec they must be using to decode 720p60 on a 1.8 GHz Core 2 Duo. CoreAVC would probably just about do it.
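A quick sanity check on that, using nothing but the stated clock speed and resolution:

    # Software-decoding 720p60 on a 1.8 GHz Core 2 Duo (the GDC MacBook
    # Air). Pure arithmetic; real decoder performance varies widely.

    clock_hz = 1.8e9
    cores = 2
    pixels_per_second = 1280 * 720 * 60          # ~55.3 Mpixel/s

    cycles_per_pixel = (clock_hz * cores) / pixels_per_second
    print(f"~{cycles_per_pixel:.0f} CPU cycles available per pixel")  # ~65

    # H.264 software decode is commonly quoted at tens of cycles per
    # pixel with SIMD, so 720p60 is plausible but leaves little headroom,
    # consistent with "CoreAVC would probably just about do it".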
What transmission bandwidth target have they got?
Are they really aiming for 60 fps? I would assume they would have a lower framerate...
So 1 ms to compress a 720p video feed to 5 Mbps, where existing methods take 30x that? And with so little time to spend on the problem, zero optimisations for quality can be applied. "SD", which hasn't been defined in terms of resolution or frame rate, is at 1.5 Mbps, and HD, which has been defined as 720p60, is at 5 Mbps peak, but often lower!
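For scale, here is what those bitrates imply as compression ratios, assuming 24-bit raw frames (the thread doesn't specify the pixel format):

    # Compression ratios implied by the quoted bitrates, assuming
    # 24-bit RGB source frames; the actual pipeline is not specified.

    def raw_bps(w, h, fps, bits_per_pixel=24):
        return w * h * fps * bits_per_pixel

    raw_720p60 = raw_bps(1280, 720, 60)          # ~1.33 Gbit/s raw
    print(f"raw 720p60: {raw_720p60 / 1e9:.2f} Gbit/s")

    hd_target = 5e6                              # the 5 Mbps HD peak
    print(f"HD ratio: ~{raw_720p60 / hd_target:.0f}:1")  # ~265:1

    # "SD" at 1.5 Mbps: resolution and frame rate are undefined in the
    # thread, so that ratio can't be computed without guessing them.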