Game Streaming Platforms and Technology (xCloud, PSNow, GeForce Now, Luna) (RIP: Stadia)

That all depends on the ISP and things like latency, dropped packets, bad routing, and lag. I thought it was mostly a non-issue, but then I've been reading through the DSLReports forums.

The game loop (A: accept input, B: update game world, C: render, D: output/display) picks up latency at the following points:
1. Accepting input adds network latency from your controller to the server over the internet.
2. Encoding the rendered frame takes time, adding latency.
3. Transmitting the encoded frame to the client adds network latency (plus the transmit time of a non-trivial number of frame bits).
4. Decoding the frame takes time and adds latency.

Steps 1 and 3 above are directly impacted by network latency; step 3 is also impacted by the bandwidth of your internet connection.

If you're on cable or fibre, you can probably get the aggregate latency of steps 1 and 3 down to 25-40 ms. DSL will be 40-55 ms. If you're on a 4G internet connection, you will probably see 80+ ms of latency. The latency of encoding depends on how much hardware is thrown at it and how the work is pipelined/diced into parallel jobs; a lower bound is probably around 16 ms (for 60 Hz throughput). Decoding a frame should be fast, with 16 ms being an upper bound.

That amounts to an additional best-case game-loop latency of 55-100 ms. Worst case is..... worse.
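The budget above can be sketched as quick back-of-the-envelope arithmetic. The figures are the rough assumptions from this post (16 ms encode/decode, i.e. about one frame at 60 Hz), not measurements:

```python
# Back-of-the-envelope streaming latency overhead (milliseconds).
# Figures are the rough assumptions from the post above, not measurements.

def streaming_overhead_ms(network_ms: float,
                          encode_ms: float = 16.0,   # ~1 frame at 60 Hz
                          decode_ms: float = 16.0) -> float:
    """Extra latency streaming adds on top of the local game loop."""
    return network_ms + encode_ms + decode_ms

# Aggregate network latency (steps 1 + 3) per connection type.
for link, net in {"cable/fibre": 25, "DSL": 40, "4G": 80}.items():
    print(f"{link}: +{streaming_overhead_ms(net):.0f} ms on top of the game loop")
```

With these numbers, best-case cable/fibre comes out at 57 ms of added latency, consistent with the 55-100 ms range above.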

Cheers
 
They were at the Game Developers Conference talking to game developers. How could they talk about their service to potential content creators without giving them an idea of the level of performance they'd be able to work with? The TF figure was pretty necessary as part of the announcement, not a nice little marketing aside to tempt Joe Gamer.
That’s a fair point, and maybe OnLive and others’ failures are partially rooted in not involving developers more. I think my point about TF being the new MHz/core-wars talking point is still valid, though.
 
Visual acuity is also a no-go for streaming games. I can pixel-hunt bushes in War Thunder tank battles, but I sure can't tell bush from tank on YouTube. Features under 3-4 pixels can be hard to tell apart, if they're similar in nature, after they're mangled by YouTube's "red channel? Never heard of him" compression.
 

But how did Joe Gamer know about Stadia (now)? Because of the GDC announcement! There was a U.K. article showing a high percentage (32-37%, actually eclipsing Xbox's next system) of surveyed folks who were interested in Stadia because of the GDC event. Joe Gamer isn't necessarily living in a bubble these days.

Though in the long run the average casual gamer may not care about specs or console performance metrics like teraflop throughput or ray-tracing capability, what they do 100% care about is that the next generation of systems offers them a better-enough experience than the prior generation, and that their money is well spent on hardware capable of making them move onward.
 
This could actually move to the publishers, though. EA could run their own cloud servers for games. Basically, every networked, slow-paced game could just be run from servers anyway. Add a streaming protocol to the servers and you could have local agents connect and render locally, and streamed agents connect and get a video feed. In that world, Google's at no great advantage. These servers could choose to run on Azure or Sonynet or Amazon Cloud or Google Cloud or Apple Cloud, and those services would have to compete with each other on the economics of operating these games.
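The hybrid model described here can be sketched in a few lines: one authoritative server-side simulation, with each connected agent choosing its delivery mode. All names below are illustrative, not any real service's API:

```python
# Sketch of a hybrid publisher server: one authoritative simulation,
# two delivery modes per connected agent. Names are illustrative only.

from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    mode: str  # "local": gets state deltas and renders client-side
               # "stream": gets server-rendered, encoded video frames

def payload_for(agent: Agent, state_delta: dict, frame: bytes):
    """Pick what this agent should receive each tick."""
    if agent.mode == "local":
        return state_delta   # thin update; the client does the rendering
    return frame             # the server does rendering + encoding

delta = {"tick": 1042, "entities_moved": 8}
frame = b"\x00" * 64         # stand-in for an encoded video frame

for agent in (Agent("console-client", "local"), Agent("stream-client", "stream")):
    print(agent.name, "->", type(payload_for(agent, delta, frame)).__name__)
```

The point of the design is that the simulation code is identical either way; only the last-mile delivery differs, which is why the underlying cloud provider becomes interchangeable.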

Low-latency streaming services need specific hardware and setup, so they can't be run just anywhere, but those games are going to be moderately sucky on cloud versus console/PC.
Let's hope cloud latency doesn't begin to influence game design.
 
Had there been no teraflop figure, Joe Gamer would have still heard about it and been interested. They aren't interested simply because it's 10 teraflops, and that figure wasn't meant for them. Google said, "look how many teraflops XB1 and PS4 have and what you can do with that. We offer 10 TFs so you can do more." And devs thought, "okay, given where we're wanting to take our game, and what we're expecting of next-gen, 10 TFs should be a good fit."

If Google hadn't put up any figure, devs would have been sat there thinking, "is this a next-gen platform, or current gen? Is there any point thinking about moving our new games to it? It's enough faff porting to Linux; are we expected to take our 8+ TF target platform and squeeze that into a...what, 3, 4 TF cloud budget?" While Joe Gamer would have still thought, "ooo, games on my TV through my mobile and stuff. Sounds interesting."

Where Joe Gamer will be interested in TFs (the subsection who talk semi-knowingly about console hardware to their friends and peers) is the next hardware announcements and machine recommendations. An 8 TF PS5 vs a 10 TF XB2 will be compared, as they'll know the XB2 will have better graphics and potentially be more responsive (better framerate stability). Whatever the cloud does won't matter; that's a subset of gaming. So even if photorealistic, it'll be streamed games for what the cloud brings, latency and compression artefacts and all, and console games with as many TFs as possible for local, responsive, pretty games.
 

I was answering your comment about Joe Gamer being oblivious of GDC, not TF performance. Hence my comment that followed.

 
They are two different products and the TF figures don't matter. A puny 8 TF console will still be the fastest console ever made.

This. Stadia won't have much of an impact on how Sony designs their next console.

It doesn’t matter what the point is, it’s relevant to marketing regardless.

The average Joe won't care how many TFs the next PlayStation or Xbox will have. Only tech nerds will panic if Stadia offers more TFs than the PS5.
 
IMO very few people will care about Stadia or any other pure streaming service because the experience will be terrible for a lot of people.

So Google could offer 100 TFLOPs worth of GPU power to every user and it still wouldn't be the preferred choice for playing games, by far.

AFAIK almost all streaming video tops out around 60 Mbps of HEVC, and even if the codec is super efficient it still won't hold a candle to the ~12,500 Mbit/s an HDMI cable provides. And then there's all the (in many places unpredictable) latency you'll get almost everywhere.

Maybe in 10-15 years some developed countries will be using much faster infrastructures that actually make the latency decent for real-time gaming, but nowadays it's just not a good experience.

I've been using Steam In-Home Streaming for a while, and the latency there, using a fairly recent router, makes it impossible for some games. And the image quality with "unlimited bandwidth" is nice enough for my 12" Surface Pro, but if I stream from my office to my 55" 4K TV through an HTPC, the quality is pretty bad. I'd rather render locally on the HTPC with an RX 480 using lower settings/resolution than stream from my office maxed out with a Vega 64.


So in the end, nah... the next-gen makers probably aren't worried about the 10.7 TFLOPs from Stadia.
Stadia customers are probably people who don't even use consoles. Besides, it being limited to Linux/Vulkan will make the game selection very scarce. Just look at SteamOS.
 
Over a perfect connection, the speed of light is <200 miles per millisecond, so 10 ms would allow round-trip communication between a player and a datacentre 1,000 miles apart. With each hop along the way, though, a number of ms of latency gets added. I doubt we could get lower than 50 ms of added latency, on top of display latency, any time soon. OnLive managed 150 ms? PSNow added a 60+ ms overhead when DF tested it, taking games with 100 ms of local latency up to 160+ ms.
https://www.eurogamer.net/articles/...laystation-nows-ps4-game-performance-analysed

A game that's okay with 200 ms of lag will be fine. That's actually where more GPU would be better: something like a Quantic Dream game streamed as if a movie, fancily raytraced in the cloud, would be a good fit.
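The propagation bound quoted above can be sanity-checked quickly. Note the <200 miles/ms figure is vacuum light speed; light in optical fibre travels at roughly two-thirds of that (a textbook approximation), and real routes add router hops on top:

```python
# Quick check of the propagation-delay bound. c is ~186 miles/ms in
# vacuum; the 2/3 fibre factor is a textbook approximation. Real
# routes are longer and add per-hop queueing/processing delay.

C_MILES_PER_MS = 186.0
FIBRE_FACTOR = 2 / 3

def max_one_way_miles(rtt_ms: float, in_fibre: bool = True) -> float:
    """Farthest a datacentre can be for a given round-trip budget."""
    speed = C_MILES_PER_MS * (FIBRE_FACTOR if in_fibre else 1.0)
    return speed * rtt_ms / 2  # the round trip covers the distance twice

print(round(max_one_way_miles(10, in_fibre=False)))  # 930 miles, vacuum
print(round(max_one_way_miles(10)))                  # 620 miles, in fibre
```

So the ~1,000-mile figure is the theoretical ceiling; through fibre, a 10 ms round trip only reaches a datacentre about 620 miles away, before any routing overhead.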
 
Ubisoft porting AC:O to Vulkan - Stadia.

Seems as though Stadia's encoding is 60 fps; it's AC:O that was 30 fps. The reason seems to be that the console version was 30, so they didn't want to risk / spend time getting it to run at 60.

PC ultra settings

Tools to convert from DX12, as expected, sound pretty immature at this time.
 
IMO very few people will care about Stadia or any other pure streaming service because the experience will be terrible for a lot of people.

hm... Have they discussed launch regions/areas?

Ubisoft porting AC:O to Vulkan - Stadia.
...
Tools to convert from DX12 as expected sound pretty immature at the time.

Kinda off-topic, but I wonder if MS is seeing a threat from developers (behind the scenes) to simply go with Vulkan since there are more platforms.

We've just seen MS support Blizzard on DX12 for Win7: https://devblogs.microsoft.com/directx/world-of-warcraft-uses-directx-12-running-on-windows-7/

ahem.

/AlFoil hat
 
Seems as though Stadia's encoding is 60 fps; it's AC:O that was 30 fps. The reason seems to be that the console version was 30, so they didn't want to risk / spend time getting it to run at 60.
Surely it'd be a doddle to hit 60 fps, just by changing the settings. Is AC:O incapable of running > 30fps on Windows?
 
Surely it'd be a doddle to hit 60 fps, just by changing the settings. Is AC:O incapable of running > 30fps on Windows?
https://www.techpowerup.com/reviews/Performance_Analysis/Assassins_Creed_Odyssey/4.html

Symptom of a crazy complex engine largely built around a history of only needing to target 30fps?

hm... yeah, dunno about what options they have on PC.

edit:
https://segmentnext.com/2018/10/02/assassins-creed-oddessey-tweaks/

https://gearnuke.com/assassins-creed-odyssey-pc-settings-guide-removing-stuttering-60fps/

:s
 
Kinda off-topic, but I wonder if MS is seeing a threat from developers (behind the scenes) to simply go with Vulkan since there are more platforms.
Not just you; I was thinking about this.
Could even see a point where MS starts to support Vulkan.

Surely it'd be a doddle to hit 60 fps, just by changing the settings. Is AC:O incapable of running > 30fps on Windows?
Have to admit this is why I assumed it was the encoder. I'm pretty sure Google mentioned that they were working on it too, so I just assumed it was all encoder-related.
 
hm... Have they discussed launch regions/areas?

Launch regions are very limited, according to this map taken today, 2019-03-29 @ 10:00 AM EST.


[image: map of Stadia launch regions]

Compare that to MS Azure to put things into perspective (* Two Azure Government Secret region locations undisclosed) :

[image: map of Azure regions]
 
Compare that to MS Azure to put things into perspective
That's not the same as xCloud though.

"Scaling and building out Project xCloud is a multi-year journey for us. We’ll begin public trials in 2019 so we can learn and scale with different volumes and locations." - https://blogs.microsoft.com/blog/2018/10/08/project-xcloud-gaming-with-you-at-the-center/

And indeed, we don't know where Stadia will be available, as it isn't just running on Google cloud services but on specific hardware (10 TF GPUs). What's the map of actual availability for xCloud, Stadia, PSNow, nVidia's thing, OnLive (still going?)?
 
Not sure, haven't seen a map available for any. We do have an absolute upper bound on where Stadia or xCloud could be today.
 