CES 2014: JANUARY 7-10, 2014

Gaikai and OnLive were around for years to try, so none of what Sony is doing is science fiction. I tried both, and both worked fine from the Bay Area. The user has to be within a certain distance of a data center for the service to work well, and data centers on both coasts can serve most of the US population.

Most games are not twitch games, and the twitch games are the ones you wouldn't want to use cloud rendering for anyway.
 
It's kind of interesting that if you do the math, streaming games at this point may be at a bandwidth disadvantage, depending on the game and how much you play it. There are still a lot of games that are 25GB and under, and if you're using about 2GB per hour for HD streaming, it doesn't take long to use up the same bandwidth you would have used downloading the game. For smaller games, you'll blow past the download's bandwidth even faster. The advantage is really just not having to buy the hardware, and probably a very affordable price for playing older games.
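A rough back-of-the-envelope check of that break-even point, using only the ballpark figures above (the 2GB/hour and 25GB numbers are assumptions, not measurements):

```python
# Break-even point: hours of streaming before total data transferred
# exceeds a one-time game download. Both figures are the ballpark
# numbers from the post above, not measured values.
STREAM_GB_PER_HOUR = 2.0   # assumed HD stream, roughly 4.5 Mbps average
DOWNLOAD_GB = 25.0         # assumed game download size

break_even_hours = DOWNLOAD_GB / STREAM_GB_PER_HOUR
print(f"Streaming passes the download after ~{break_even_hours:.1f} hours")
# ~12.5 hours; most people put far more time than that into a single game.
```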
 
OnLive didn't have enough servers. I was told they didn't build out their server infrastructure aggressively, while Gaikai did.
 
What's the absolute best-case scenario for such a service if there were magically fiber everywhere tomorrow? How far down can the latency go? Basically, I'm asking if physics actually supports this use case for all games (including shooters), because if there are "no more consoles", then it needs to support that case too. You've got to think about the whole loop: button press, latency sending the input to the server, the game responding to that action, rendering time, then encoding time, then transit time back over the Internet, decoding, and displaying on screen... which may be what, 500ms? 1s?

Then realize the world isn't full of fiber to people's homes, and maybe double or triple that for the average person. Sigh.
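For what it's worth, here is that loop summed up with some plausible per-stage numbers. Every figure below is an assumption for illustration, not a measurement:

```python
# Illustrative input-to-photon budget for a streamed game running at
# 60 Hz. All stage timings are assumed ballpark values.
stages_ms = {
    "controller poll + uplink to server": 15.0,
    "game simulation (one 60 Hz frame)": 16.7,
    "rendering (one 60 Hz frame)": 16.7,
    "video encode": 5.0,
    "transit back over the internet": 15.0,
    "video decode": 5.0,
    "display buffering / TV processing": 20.0,
}
total_ms = sum(stages_ms.values())
print(f"Total: ~{total_ms:.0f} ms")  # ~93 ms on these assumptions
```

On a good connection the loop lands closer to ~100 ms than to 500ms-1s; the two transit terms are the ones that blow up on a bad one.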
 
The problem I found with OnLive is that latency isn't constant. If it were always 30ms you could get used to it and deal with it, and it would be a bit like playing without a game mode on your TV. If Gaikai has found a way around that issue I'd love to see it; otherwise, for me it's probably unplayable.
 
Doubtful, since there is no way to plausibly guarantee latency across the internet, especially when you are talking about OTT services like this.
 
If I'm right, Gaikai had a different business plan for how the server infrastructure was built. They licensed out a streaming technology, and a publisher (EA or another big player) would install the libraries in a server farm of their own. The publisher would run time-limited games on a webpage (a Flash or Java plugin playing an h264 game stream) and let users buy a Gaikai license or an offline Origin installation; Gaikai was a way of promoting games and providing demo versions. OnLive had dedicated server farms where they operated everything, so purchases would always just be OnLive versions of each game.

Is it possible Sony could be using h.265? Can PS3 decode that at 720p?
Or the WebM codecs (VP9 video, Opus audio), as a few 2014 Sony Bravia TVs and other new Sony media devices support the VP9 video codec. Well, OK, I guess odds are it's still h264 + AAC or MP3 tiled streaming.
VP9: one of its goals is to reduce the bit rate by 50% compared to VP8.
Opus audio: particularly suitable for interactive real-time applications over the Internet.
 
Have you not followed the existing methods? This isn't an unproven, theoretical service. OnLive and Remote Play already work without 500-1000 ms of lag. Online gaming already runs at 30-50 ms, depending on circumstances.

For real use, the major issues are the reliability of the internet connection (does bandwidth wobble all over the place, making for poor video?) and latency. Latency is largely a matter of how many hops the connection passes through. As technology improves, it's certainly possible to get that down low, so the physics does actually support the use case for all games. It's scientifically possible to get remote gaming with, say, 20 ms of additional latency. Whether it's economically possible is the debate, given the economics of creating an internet that supports it.
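One way to sanity-check that 20 ms figure: light in standard fiber covers roughly 200 km per millisecond, so a round-trip budget translates directly into a maximum server distance. A minimal sketch, assuming propagation is the only cost:

```python
# Maximum data-center distance for a given round-trip latency budget,
# counting fiber propagation only. Encoding, queueing, and last-mile
# overheads (ignored here) eat into this budget in practice.
FIBER_KM_PER_MS = 200.0  # ~c / 1.5, the refractive index of silica

def max_server_distance_km(rtt_budget_ms: float) -> float:
    return rtt_budget_ms / 2 * FIBER_KM_PER_MS

for budget_ms in (5, 10, 20):
    print(f"{budget_ms} ms RTT -> server within ~{max_server_distance_km(budget_ms):,.0f} km")
# 20 ms -> ~2,000 km: data centers on both coasts really can cover most of the US.
```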
 
Well maybe, hopefully, one day we will all use this beauty.

I think that deploying this across the US would cost 40 billion dollars, or something more.

3.5 dB/km is quite high. As the article says, it might not be a thing for long hauls, but it's probably fine for most last-mile access links, depending on your design (rough numbers in the sketch below). And even if latency is reduced on this cable, the photon/electricity converters might not be able to keep up.

As a rule of thumb, you can never go faster than the speed of light, so that is the starting point for calculating the minimum latency that will always be there.
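Two quick numbers on this cable, taking the ~3.5 dB/km attenuation and ~99.7%-of-c figures from the article at face value; the 28 dB power budget and the example distance are assumptions for illustration:

```python
# Back-of-the-envelope reach and latency for an air-core fibre vs
# standard SMF. Attenuation and speed figures come from the discussion
# above; the power budget and distance are illustrative assumptions.
C_KM_PER_MS = 300.0  # speed of light in vacuum, km per millisecond

def max_reach_km(atten_db_per_km: float, power_budget_db: float = 28.0) -> float:
    """Unamplified reach: distance at which the optical power budget is spent."""
    return power_budget_db / atten_db_per_km

def one_way_delay_ms(distance_km: float, fraction_of_c: float) -> float:
    """Propagation delay only; no device or queueing delay."""
    return distance_km / (C_KM_PER_MS * fraction_of_c)

# Reach: why 3.5 dB/km rules out long haul but still works for last mile.
print(f"standard SMF (~0.2 dB/km): ~{max_reach_km(0.2):.0f} km")  # ~140 km
print(f"air core    (~3.5 dB/km): ~{max_reach_km(3.5):.0f} km")   # ~8 km

# Latency: glass carries light at ~0.68c, air core at ~0.997c, so the
# best case is roughly a one-third cut in pure propagation delay.
dist_km = 4000.0  # roughly US coast-to-coast, as an example
print(f"SMF:      {one_way_delay_ms(dist_km, 0.68):.1f} ms one way")   # ~19.6 ms
print(f"air core: {one_way_delay_ms(dist_km, 0.997):.1f} ms one way")  # ~13.4 ms
```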
 
It's just not going to work: the second you hit a router full of copper, you're going to fill the ingress buffers if you're above ~70% of link throughput.
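That ~70% rule of thumb matches the textbook M/M/1 queueing result, where waiting time blows up as utilization approaches 1. A simplified model, not a claim about any particular router:

```python
# Mean per-hop delay vs link utilization under an M/M/1 queue, to show
# why latency degrades well before a link is "full". Real router queues
# differ in detail but show the same hockey-stick shape.
SERVICE_TIME_MS = 0.012  # assumed: one 1500-byte packet at 1 Gbps (12 us)

for rho in (0.5, 0.7, 0.9, 0.99):
    # M/M/1 mean time in system: T = service_time / (1 - utilization)
    delay_ms = SERVICE_TIME_MS / (1 - rho)
    print(f"utilization {rho:.2f}: mean per-hop delay ~{delay_ms:.3f} ms")
# Delay is 2x the service time at 50% load, 10x at 90%, 100x at 99%.
```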
 
I doubt they created it knowing that it's never going to work :rolleyes:

So how do you make electrons go faster inside all the copper wiring inside the smart bits? You would need an end-to-end optical system; they would have to add inter-frame spacing or something like that with existing systems. The other point is that 8 µm SMF as it stands doesn't have a throughput limit, so unfortunately the ROI on replacing ~3 million km of cabling just to drop your latency by 1/2 on average (processing/switching delay across all the devices in a path is significant) doesn't seem like it's ever going to get off the ground.
 
Wasn't Sony in discussions with internet providers around the time of E3 last year? Is it possible that those discussions were in part about how to guarantee, or work towards, certain tolerances that would make something like PS Now more viable?
 
For PlayStation Now, probably not a viable business case. But for certain niche markets, for instance high-frequency trading, every ms of advantage you get is worth tons of money. So it might see the light of day in those types of settings, because latency is key there.
 
No, I'm not talking about PlayStation Now, I'm talking about magic new fibre that runs at ~99% of c. The rest of your equipment can't move electrons that fast, and at high throughputs you only have ~30-100ms worth of buffers (however you get data into those at that speed). But the other point is that there is a lot more "device" delay in networking than people realise. Even on a path like Canberra (AU) to San Jose, which is a round trip of ~24,000km with very few devices considering the distance, you're looking at just as much of the RTT being device-related as fibre speed. I have 15 "hops" to battle.net west, so I have likely gone via 100 layer 2/3 devices each way on that path.
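A toy model of that path makes the split visible. The 24,000 km round trip and device count are from the post; the per-device figure is purely an assumed average, since switching and queueing delay vary wildly:

```python
# Split a round trip into fibre propagation vs device delay. Distance
# and device count are from the post above; per-device delay is an
# assumed ballpark for illustration.
FIBER_KM_PER_MS = 200.0  # light in standard fibre, ~c / 1.5

rtt_distance_km = 24_000   # Canberra <-> San Jose, round trip
devices_traversed = 200    # ~100 layer 2/3 devices each way
per_device_ms = 0.5        # assumed average switching/queueing delay

propagation_ms = rtt_distance_km / FIBER_KM_PER_MS
device_ms = devices_traversed * per_device_ms
print(f"propagation ~{propagation_ms:.0f} ms + devices ~{device_ms:.0f} ms "
      f"= ~{propagation_ms + device_ms:.0f} ms RTT")
# ~120 ms of glass + ~100 ms of boxes: faster fibre alone can't halve the RTT.
```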


Invent almost anything and I'm sure you can find a niche :LOL:. Given that trading is low-throughput, that is likely one area where you could effectively just replace the cables/transceivers and be done with it.
 

Gaikai had already set up server farms around the US before Sony bought them. I don't know how the money flows behind the scenes, but the servers should be hosted in those centers, as close to their target users as possible.

Read somewhere that OnLive had a small server infrastructure at that time.
 