OnLive has now shut down and the patents were sold to Sony, reports Ars Technica.
Welp, I was really wrong about this...
The finances are hard.
Is Azure going to be a suitable system for game streaming? The obvious advantage is it already exists and is big, but I'm not sure if it's designed for/can handle low latency. And will they hit a patent wall?

We looked at Azure about 18 months back and the VMs had rudimentary graphical capabilities. That may well have changed, but I've not read anything about Azure expanding from traditional compute (CPU) to graphics.
Sony is acquiring important parts of OnLive, and their plans don't include a continuation of the Desktop service.
Edit: obviously Azure is a commercial service; they'll go where the market dictates, because you can be sure Amazon will.
Azure as a platform is more than capable of streaming data, but the data you want to stream is rendered graphics, and you need to render those graphics using hardware (a modern GPU) or a software renderer. This is the big unknown with Azure. When we looked at Azure, the focus was on general compute (CPUs, not GPUs) and the graphics capabilities of individual VMs were rudimentary. Microsoft may now have a software renderer, but mimicking GPU functions on a CPU seems an inefficient use of resources; then again, if they have a lot of spare capacity, it's likely cheaper than deploying, maintaining and upgrading GPU hardware.

There is Thunderhead, and its hardware is an unknown at this point. But given the leaked MS data about its future plans, low latency streaming is something MS has been working on for a long time.
I can see its uses for mobile gaming, or unifying your PS, Xbox, etc. experiences, where you can play a game on the train, metro, whatever... but other than that I don't like that technology at all. I am the biggest advocate of physical hardware, because using a service like that I feel like I am playing a video...

Just tried my first taste of game streaming, and I'm impressed! Running nVidia's Grid on a Shield Tablet connected to a TV over HDMI with a DS3 attached. I fired up PixelJunk Shooter and Borderlands 2 as games I've played on PS3. PJS lacked the smoothness of 60 fps, and was a bit laggy, but definitely playable. B2 looked better than on PS3, with much better IQ, and the compression artefacts were pretty minimal. There's also the option to render in 720p, 900p or 1080p for better framerate/resolution trade-offs. Again, some lag, but TBH I can't say it felt noticeably worse than the PS3 version. That's on a stable, low latency 13 Mbps connection.
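As a rough back-of-the-envelope on those resolution options, here's what a fixed 13 Mbps stream works out to in bits per pixel (Python; the frame rates are assumed, and real encoders vary the bitrate):

```python
# Rough bits-per-pixel budget for a fixed 13 Mbps stream at the
# resolution options mentioned above (assumed 30 and 60 fps targets).
BITRATE_BPS = 13_000_000  # the 13 Mbps connection from the post

RESOLUTIONS = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

for name, (w, h) in RESOLUTIONS.items():
    for fps in (30, 60):
        pixels_per_second = w * h * fps
        bits_per_pixel = BITRATE_BPS / pixels_per_second
        print(f"{name} @ {fps} fps: {bits_per_pixel:.2f} bits/pixel")
```

Fewer pixels per second at the same bitrate means more bits per pixel, which is the basic trade-off behind offering 720p or 900p as alternatives to 1080p on the same connection.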
If the internet develops to reduce latency, this seems a very feasible alternative indeed. It removes the need for different builds, as you only need one version of the game running in the cloud. The only significant advantage I see for consoles is fast-paced, low latency games, plus maybe 60 fps, although better internet also solves that issue. I suppose pricing of the subscription is a concern, as the service can't be cheap to run. But then if the hardware needed to use it is cheap, that helps defray the cost. What if PlayStation becomes PSNow, run on any device, even cheapo Android TV boxes? PS5 then doesn't cost $400+ to join plus $60 (?) p.a. subscription + $70 a game, but $40 per month or whatever. Could be better value to subscribe. All depends on the price.
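As a quick sanity check on that value argument, here's a small Python sketch comparing spend over a generation using the hypothetical prices floated above (the $400 console, $60/year online sub, $70 games and $40/month streaming fee are the post's guesses, not real pricing):

```python
# Hypothetical ownership-vs-streaming spend over one console generation,
# using the rough prices floated in the post above.
YEARS = 6                 # assumed length of a console generation
GAMES_PER_YEAR = 4        # assumed purchase rate

console_total = (
    400                            # console up front
    + 60 * YEARS                   # online subscription per annum
    + 70 * GAMES_PER_YEAR * YEARS  # full-priced games
)
streaming_total = 40 * 12 * YEARS  # flat monthly streaming subscription

print(f"Console ownership over {YEARS} years: ${console_total}")
print(f"Streaming subscription over {YEARS} years: ${streaming_total}")
```

With those particular numbers, and four full-priced games a year, the flat subscription actually comes out slightly dearer over six years, so it really does all depend on the price and on how many games you'd actually buy.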
I dare say that feeling of "playing a video" is psychosomatic. In a blind test, apart from compression artefacts, I don't think you'd respond any differently to streamed games versus native.
Another issue I see with that technology, video artefacts aside, is competitive multiplayer gaming, where the video is streaming down and you need low latency to play online.

It's not so good for low latency games, but multiplayer is typically high latency with lots of cover-ups in the net code. Round trip latency on nVidia Grid is 150 ms including time to display, comparable with plenty of action games on console. Network packets can hit bumps of 300+ ms and that's just disguised by the game so you don't notice. Actual latency on input is no different on Grid, and latency on output is possibly not much different. It's probably no good for the hard-core twitch gamer with their 144 Hz displays and 3000 dpi mice, but I think it's perfectly adequate for the typical console gamer, especially factoring in latency improvements over time.
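For a sense of how a ~150 ms round trip can add up, here's a rough latency budget in Python; the per-stage numbers are illustrative guesses, not measured Grid figures:

```python
# Illustrative end-to-end latency budget for a cloud-streamed frame.
# All component times are assumptions for the sake of the example.
budget_ms = {
    "controller input + client send":  10,
    "network uplink (one way)":        20,
    "server game simulation + render": 33,  # roughly 2 frames at 60 fps
    "video encode":                    10,
    "network downlink (one way)":      20,
    "video decode":                    10,
    "display/TV processing":           40,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:34s} {ms:3d} ms")
print(f"{'total (button press to photons)':34s} {total:3d} ms")
```

Note how much of the total is rendering and display processing that a local console pays as well; the streaming-specific cost is mainly the two network legs plus encode/decode.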
The problem I think is that depending on an internet connection is a huge deal breaker for me, and another, equally huge deal breaker is having a video decoder instead of a console/PC to play games. I could never stand playing games like that.
In games like Diablo 3, where latency can be easily hidden, that's quite understandable, and maybe you could play a co-op game fairly smoothly.
Do you stream videos?
I wonder what you prefer...

I don't care any more than I care about having a box to play DVDs instead of streaming videos from the net. I just want the content/experience, and don't care for the hardware. The days of interesting hardware are long gone.
Every networked game hides the latency, not just games like Diablo.
What I wonder, though, is how it could handle games like Battlefield with its 64 players, and similar games, and how much of a disadvantage a streaming user would have against other players.

If they're streaming, none whatsoever. If they're not, the game could certainly compensate on their end. If it doesn't, you might have higher latency, but then again if your TV is lower latency and theirs is higher, or they've got an inferior connection to you, they can still be disadvantaged. Online gaming is something of an illusion. You'll never have player equality, and all the interactions are prone to late instructions and best guesses and predicted outcomes and emergency corrections. Most of these are small enough and handled subtly enough that you don't notice, but it's quite possible that your shot's TCP/IP message is delayed 150 ms longer than your rival's and the system has to adapt to this.
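One common form those best guesses and corrections take is server-side lag compensation: the server keeps a short history of where everyone was and rewinds by the shooter's latency before judging a hit. A minimal Python sketch of the idea (the timings, movement and hit radius are all made up for illustration):

```python
import bisect

# Minimal server-side lag compensation sketch: the server keeps a short
# history of where each player was, and rewinds by the shooter's latency
# before checking whether a shot landed.
HISTORY_SECONDS = 1.0  # how far back the server keeps snapshots (assumed)

class PositionHistory:
    def __init__(self):
        self.times = []      # snapshot timestamps, ascending
        self.positions = []  # (x, y) at each timestamp

    def record(self, t, pos):
        self.times.append(t)
        self.positions.append(pos)
        # Drop snapshots older than the history window.
        while self.times and self.times[0] < t - HISTORY_SECONDS:
            self.times.pop(0)
            self.positions.pop(0)

    def position_at(self, t):
        # Return the most recent snapshot at or before time t.
        i = bisect.bisect_right(self.times, t) - 1
        return self.positions[max(i, 0)]

def shot_hits(target_history, shot_time, shooter_latency, aim_pos, radius=1.0):
    # Rewind the target to where the shooter actually saw them.
    rewound = target_history.position_at(shot_time - shooter_latency)
    dx, dy = aim_pos[0] - rewound[0], aim_pos[1] - rewound[1]
    return (dx * dx + dy * dy) ** 0.5 <= radius

# Toy usage: a target moving right at 10 units/s, recorded at 60 Hz.
history = PositionHistory()
for tick in range(60):
    t = tick / 60.0
    history.record(t, (10.0 * t, 0.0))

# At shot_time=1.0 the target is near x=10, but a shooter 150 ms behind
# saw it near x=8.5; the server judges the shot from that rewound view.
print(shot_hits(history, shot_time=1.0, shooter_latency=0.15, aim_pos=(8.5, 0.0)))
```

Whether the hit registers is decided from the shooter's slightly stale view of the world, which is exactly the kind of quiet correction described above.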
Local wired intranet should add far less to latency, next to nothing I'd have thought.

Try it - you may be surprised. It's a real test of your network controllers and router.
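If you want to put a number on your own LAN, a tiny UDP echo test will do it; this is a generic Python sketch (the port and usage are arbitrary choices), nothing specific to Grid or any particular streaming service:

```python
import socket
import sys
import time

# Tiny UDP echo round-trip timer for measuring latency across a LAN.
# Run "python rtt.py server" on one machine and "python rtt.py client <ip>"
# on another. The port number is arbitrary.
PORT = 9999

def server():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    while True:
        data, addr = sock.recvfrom(64)
        sock.sendto(data, addr)  # echo straight back

def client(host, count=20):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    samples = []
    for i in range(count):
        start = time.perf_counter()
        sock.sendto(str(i).encode(), (host, PORT))
        try:
            sock.recvfrom(64)
        except socket.timeout:
            continue  # treat a lost packet as no sample
        samples.append((time.perf_counter() - start) * 1000.0)
    if samples:
        print(f"min {min(samples):.2f} ms, avg {sum(samples)/len(samples):.2f} ms, "
              f"max {max(samples):.2f} ms over {len(samples)} replies")
    else:
        print("no replies received")

if __name__ == "__main__":
    if sys.argv[1:2] == ["server"]:
        server()
    else:
        client(sys.argv[2])
```

Wired gigabit links typically come back well under a millisecond, but a busy Wi-Fi segment or a struggling router can surprise you.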
If you are streaming a network MP game, your setup will be like this:
Input machine -> internet -> streamer machine -> internet -> MP game server -> internet -> streamer machine -> internet -> display machine.
That will lead to more lag (vs the standard setup) unless you combine your streamer machine and MP game server somehow (like Square is trying to do).
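To put rough numbers on that, here's a quick Python sketch adding up per-hop latencies for the chain above, a standard console setup, and a combined streamer + MP server; every figure is an assumed illustration, not a measurement:

```python
# Rough round-trip lag comparison for the three topologies discussed above.
# All per-hop latencies are illustrative assumptions.
HOME_TO_DATACENTRE_MS = 20    # one way, home <-> streaming datacentre
DATACENTRE_TO_SERVER_MS = 15  # one way, streaming datacentre <-> MP game server
HOME_TO_SERVER_MS = 30        # one way, home <-> MP game server (standard setup)
ENCODE_DECODE_MS = 20         # video encode + decode for the streamed picture

# Standard setup: input goes home -> MP server -> home.
standard = 2 * HOME_TO_SERVER_MS

# Streamed setup from the chain above: home -> streamer -> MP server ->
# streamer -> home, plus the video encode/decode on the way back.
streamed = 2 * HOME_TO_DATACENTRE_MS + 2 * DATACENTRE_TO_SERVER_MS + ENCODE_DECODE_MS

# Combined streamer + MP game server in the same datacentre.
combined = 2 * HOME_TO_DATACENTRE_MS + ENCODE_DECODE_MS

print(f"standard console online play : ~{standard} ms round trip")
print(f"streamed, separate MP server : ~{streamed} ms round trip")
print(f"streamed, co-located server  : ~{combined} ms round trip")
```

With these made-up figures the extra internet legs are what hurt, and co-locating the streamer and the game server removes them, which is exactly why combining the two machines (or running servers on site, as mentioned below) helps.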
OnLive used to run on-site dedicated servers to minimize that latency for multiplayer.

Huh? If consoles can do peer-to-peer multiplayer, then why can't streaming hardware? Shouldn't things like host advantage be somewhat nullified if all the hardware is locally networked? And cheating and security are less of an issue, since users lack direct access to the hardware.
Is a client-server setup even necessary?