Stadia, Google's game streaming platform [2019-2021]

Relevant snippets from The Verge review:

With Destiny 2, it’s even more obvious that the game isn’t running at the highest settings. On a Chromecast Ultra, a “4K” stream looked closer to 1080p, and my colleague Tom Warren and I swore that the 1080p streams we were getting in the Chrome web browser looked more like 720p.

Initially, Google told us that it was using the highest-resolution, highest-fidelity build of Destiny 2 available. But Bungie later confirmed that our eyes weren’t deceiving us. “When streaming at 4K, we render at a native 1080p and then upsample and apply a variety of techniques to increase the overall quality of effect,” a Bungie rep said, adding that D2 runs at the PC equivalent of medium settings. That explains why the Xbox One X build, which runs at a native 4K and with higher-res assets, looks so much better than Stadia.

...

I can’t truly tell you whether Google Stadia will work for you with as much fidelity as you see above because I live in Silicon Valley, a mere 45-minute drive from Google’s headquarters, with a fairly good 150 Mbps Comcast internet connection and an excellent Wi-Fi router at my disposal. I’m likely close enough to the company’s West Coast data centers that I’m probably akin to a best-case Stadia user.
Oh, this is disappointing, even if it is early days.

Have they done any marketing? I'm not including the small corner of enthusiasts that already knew about it.

Maybe the actual launch will be next year
 
Games like Just Dance and Guitar Hero will work well after calibration!
You just need to tap the screen when you see/hear the cue, and the game should be able to determine the latency difference for scoring.
The games would have to be written to account for that, though, which means a fair degree of rewriting rather than a simple port to Linux. That is, the game will expect a perfect hit at time t, but it will receive that hit at time t+x ms based on the round-trip time, where x could be anything from 40 ms to 200 ms and will vary from packet to packet; the game needs to know which delays are caused by the user and which by network lag.

At first glance, it does not look to me like an easy problem to solve without client-side processing.
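A minimal sketch of the problem, with entirely made-up numbers (the 40-200 ms range is just the figure mentioned above, not a measured Stadia value): a single calibration offset handles fixed TV/AVR latency fine, but not per-packet jitter.

```python
import random

# Hypothetical timing-window scoring, illustrating the point above.
# All numbers are made up for illustration; nothing here is taken from Stadia
# or from any real rhythm game.

HIT_WINDOW_MS = 50          # a hit within +/- 50 ms of the beat counts as on time

def score_hit(beat_time_ms, received_time_ms, calibration_offset_ms):
    """Score an input after subtracting one fixed calibration offset."""
    effective = received_time_ms - calibration_offset_ms
    return abs(effective - beat_time_ms) <= HIT_WINDOW_MS

# Local play: display/AVR latency is constant, so a single measured offset works.
print(score_hit(1000, 1000 + 80, calibration_offset_ms=80))       # True

# Streaming: the round trip varies per packet (the 40-200 ms range mentioned
# above), so an offset calibrated at ~120 ms mis-scores many perfect inputs.
random.seed(1)
misses = sum(
    not score_hit(1000, 1000 + random.uniform(40, 200), calibration_offset_ms=120)
    for _ in range(1000)
)
print(f"{misses / 10:.1f}% of perfectly timed hits mis-scored")    # roughly 35-40%
```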
 
Playing Stadia for one minute at 1080p uses over 100MB
Google previously informed users that an hour of Stadia could use between 4.5GB and 20GB of data. VentureBeat got their hands on Stadia and decided to test this out for themselves. Their findings not only confirm this, but suggest it could perhaps even exceed 20GB.

Playing 13 minutes of Red Dead Redemption 2 on Google Stadia at 1080p60 used 1.55GB of data, which translates to 119MB per minute, or 7.14GB per hour. Google promises up to 4K60 on Stadia, which is four times the resolution of 1080p60, meaning that in theory, Stadia could use almost 30GB per hour.
...
VentureBeat then looked at the average time to beat the game (47 hours), to estimate that playing the entirety of Red Dead Redemption 2 on Stadia could use up to 335GB of data – and that’s just for 1080p.

While data caps are becoming rarer these days, they're still prevalent in many cities and countries. Even for those with unlimited data, internet providers often introduce soft data caps, severely limiting speeds past a certain usage threshold.
https://www.kitguru.net/tech-news/m...adia-for-one-minute-at-1080p-uses-over-100mb/
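Running those numbers through quickly as a sanity check (decimal units assumed, i.e. 1 GB = 1000 MB, which is what the article's figures imply; the 4K estimate just scales linearly with pixel count, which a real encoder won't do exactly):

```python
# Back-of-envelope check of the figures quoted above.
observed_gb = 1.55           # data used in the 13-minute RDR2 session
observed_minutes = 13

mb_per_minute = observed_gb * 1000 / observed_minutes
gb_per_hour_1080p = mb_per_minute * 60 / 1000
print(f"{mb_per_minute:.0f} MB/min, {gb_per_hour_1080p:.2f} GB/hour at 1080p60")
# -> ~119 MB/min, ~7.15 GB/hour

# Naive 4K estimate: 4x the pixels of 1080p, assuming bitrate scales linearly
# with pixel count (a real encoder won't scale exactly like this).
print(f"~{gb_per_hour_1080p * 4:.0f} GB/hour at 4K60")              # ~29 GB/hour

# A full Red Dead Redemption 2 playthrough (~47 hours to beat) at 1080p.
print(f"~{gb_per_hour_1080p * 47:.0f} GB total")                     # ~336 GB
```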
 
https://help.netflix.com/en/node/87

Watching TV shows or movies on Netflix uses about 1 GB of data per hour for each stream of standard definition video, and up to 3 GB per hour for each stream of HD video.

Netflix offers four data usage settings to choose from:
  • Low - 0.3 GB per hour per device
  • Medium - SD: 0.7 GB per hour per device
  • High - Best video quality, up to 3 GB per hour per device for HD, and 7 GB per hour per device for Ultra HD
  • Auto - Adjusts automatically to deliver the highest possible quality, based on your current internet connection speed
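For comparison with the streaming bitrates discussed elsewhere in the thread, those per-hour figures convert to roughly these average bitrates (assuming decimal GB and continuous playback):

```python
# Convert Netflix's quoted GB/hour settings into rough average bitrates.
# Assumes decimal units (1 GB = 1000 MB = 8000 Mb) and continuous streaming.
def gb_per_hour_to_mbps(gb_per_hour):
    return gb_per_hour * 8000 / 3600

for label, gb in [("Low", 0.3), ("Medium (SD)", 0.7), ("High (HD)", 3.0), ("High (UHD)", 7.0)]:
    print(f"{label:12s} ~{gb_per_hour_to_mbps(gb):.1f} Mbps")
# Low ~0.7, Medium ~1.6, High HD ~6.7, High UHD ~15.6 Mbps
```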
 
Exactly. Rhythm games have latency compensation that is a fixed amount to cater for TV/AVR latency, not variable per-packet latency.

Isn't this part of Google's tech? I thought it was, which is why the launch lineup is small, i.e. it requires some fundamental rework. That's also why the service-quality bar for using Stadia is higher than for other services: it doesn't expect too much variance.
 
Curious how this compares to Netflix, Hulu, Disney+.
They can get away with lower quality than video games, which really need to be pristine, plus their compression can be far higher quality as it's not real-time. A video game is pushing something like 12 Gbps to the screen for 4K60; the HDMI 2.0 spec goes up to 18 Gbps. Let's call it 1 GB per second (8 Gbps with 4:2:2 chroma subsampling). That's 3.6 terabytes of data per hour, so 36 GB per hour is 1% of that amount. No wonder the compression is pretty harsh.

I hadn't thought about it like that, but given the simplicity of the numbers, it's obvious quality is going to suck along with bandwidth usage and there's nothing Google can do about it unless they invent some incredible new AI compression system that can imagine missing data perfectly.
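Roughly the same arithmetic in code (nominal 8-bit figures; HDMI protocol overhead ignored, and the 36 GB/hour figure is just the rough 4K estimate from earlier in the thread):

```python
# Uncompressed-video arithmetic behind the "1%" comparison above.
width, height, fps = 3840, 2160, 60
bits_per_pixel = 16                      # 8-bit 4:2:2 (24 bits for full 4:4:4 gives ~12 Gbps)

raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"~{raw_gbps:.1f} Gbps uncompressed 4K60 at 4:2:2")            # ~8.0 Gbps

raw_gb_per_hour = raw_gbps / 8 * 3600                                # Gbps -> GB/s -> GB/hour
print(f"~{raw_gb_per_hour:.0f} GB/hour uncompressed")                # ~3583 GB (~3.6 TB)

stream_gb_per_hour = 36                  # rough 4K60 Stadia estimate from earlier in the thread
print(f"stream is ~{stream_gb_per_hour / raw_gb_per_hour * 100:.1f}% of the raw signal")  # ~1.0%
```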
 
unless they invent some incredible new AI compression system that can imagine missing data perfectly.
Taking this seriously, would it still run on low-power devices, or would it be a high-end client feature, which defeats the whole idea?

Found something about energy consumption: https://www.newscientist.com/articl...at-for-gamers-but-bad-for-energy-consumption/

Electricity demand from gaming in California alone could grow from 5 terawatt hours in 2011 to as much as 11 terawatt hours by 2021, the same as Sri Lanka’s entire consumption, due to a combination of cloud gaming and more gaming on PCs, according to a recent report commissioned by the California Energy Commission.

No solid data, but at least we know Google did not use energy savings as a marketing argument.
However, with everyone aiming towards the cloud, likely nobody will use it against them.
So we have to keep an eye on it ourselves. *raising index finger* :)
 
If cloud compute can't reduce energy consumption, it should be hit on the head until we have something like Fusion power working. There's only so much green energy we can make and we need to use it efficiently.

As to AI upscaling, I assume it'd need a reasonable degree of power, although perhaps some ASIC could do it efficiently, but I don't really know what would be possible or plausible. DLSS is the closest we've got, and that's a million miles from what would be needed to restore original data to a heavily compressed video stream. Such an ASIC would see a market in video players too, though, so it's a huge market worth someone creating a product for if it's possible. I.e., you could stick a box between the streamer and the TV that upscales the video to better quality, until devices fit the ASIC internally.
 
Curious how this compares to Netflix, Hulu, Disney+.

At 1080p, Netflix's bandwidth on a device that supports HEVC is at most 5Mbps IIRC.
A minute is 60 seconds, so 300Mb = 37.5MB.
So Stadia consumes around 3 times as much bandwidth, but I'm guessing it's 1080p 60FPS whereas Netflix is only 30FPS.
So in the end, we're looking at around 60% more bandwidth per frame in Stadia than in Netflix.
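Working it through with the thread's own figures (Netflix assumed at ~5 Mbps and 30 fps, Stadia at the observed ~119 MB per minute at 60 fps):

```python
# Per-minute and per-frame comparison using the figures quoted in this thread.
netflix_mbps = 5                          # assumed 1080p HEVC ceiling (see above)
netflix_fps = 30
stadia_mb_per_min = 119                   # observed RDR2 figure from the KitGuru post
stadia_fps = 60

netflix_mb_per_min = netflix_mbps * 60 / 8            # 37.5 MB per minute
per_minute_ratio = stadia_mb_per_min / netflix_mb_per_min
per_frame_ratio = per_minute_ratio / (stadia_fps / netflix_fps)

print(f"per minute: {per_minute_ratio:.1f}x")          # ~3.2x
print(f"per frame:  {per_frame_ratio:.1f}x")           # ~1.6x, i.e. ~60% more data per frame
```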
 
Okay, maybe the better question is, how does Stadia compare to PS Now? Since PS Now currently doesn't do PS4 games, I'm guessing they have a lower resolution.

Also, one thing Google could do, if possible, is resolution-scale for mobile screens, much like DS does.
 
All PS4 and PS2 games can be downloaded.

PSNow is built from PS3 and base PS4 hardware in server racks. xCloud uses base Xbox One consoles [mentioned at initial reveal].
 
I hadn't thought about it like that, but given the simplicity of the numbers, it's obvious quality is going to suck along with bandwidth usage and there's nothing Google can do about it unless they invent some incredible new AI compression system that can imagine missing data perfectly.

I think this is actually one of the major areas of interest for Sony and Microsoft when they signed that Memorandum of Understanding about working together in streaming. A combined effort to create a real-time compression scheme optimized for game content makes a lot of sense to me.
 
After seeing Richard speculate in the Digital Foundry video on Stadia that the platform has 8 CPU cores, I finally looked up the figures and did the math. Google states each Stadia instance has 9.5 MB of L2+L3 cache. The only way that works out is if they are using 4 Skylake or Cascade Lake cores, which have 1 MB/core of L2 and 1.375 MB/core of L3. Multiply that (2.375 MB per core) by 4 and you get exactly 9.5 MB.
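The cache arithmetic, for reference (the per-core figures are the published Skylake-SP/Cascade Lake values of 1 MB L2 plus 1.375 MB L3 per core):

```python
# If Google's quoted 9.5 MB of L2+L3 per Stadia instance comes from Skylake/Cascade
# Lake server cores (1 MB L2 + 1.375 MB L3 each), the core count falls out directly.
l2_per_core_mb = 1.0
l3_per_core_mb = 1.375
stadia_cache_mb = 9.5

cores = stadia_cache_mb / (l2_per_core_mb + l3_per_core_mb)
print(cores)      # 4.0 -> consistent with 4 cores rather than the speculated 8
```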
 
https://www.eurogamer.net/articles/2019-11-21-stadia-is-in-desperate-need-of-cross-play

Stadia is in desperate need of cross-play
Matchmaking...

Google's video game streaming service launched on Tuesday night with 22 games, and I've spent the past few days trying out competitive multiplayer with those that have it.

I had wondered whether it would be feasible to play these parts of the games, given what from the outside looking in seems very much like a soft launch. I've found that, depending on what time of day you're playing, significant parts of some Stadia titles may as well not exist, simply because there are not enough people for matchmaking.

...

And so, like with Destiny 2, Mortal Kombat 11 on Stadia is, for me anyway, a no-go. It feels like a waste of time.

I have so far failed to get an online match in Samurai Shodown - and that's another free game with the Stadia Pro subscription.

...

Now, it is worth noting that it's early days for Stadia, and I expect more people will come online in the coming days and over its first weekend. But the fact remains this would not be happening if we were talking about the launch of the PlayStation 5 and next Xbox. When those consoles release, from the second they are live their launch lineups will light up with players.
 