Stadia, Google Game Streaming platform [2019-2021]

Where do they mention VRAM? They only say RAM on the slide.
I'd say there is 8GB of HBM2 for the GPU and 8GB of DDR4 for the CPU.

The Ars Technica blurb used VRAM, but it wasn't the actual slide or a typical stats listing, just a paragraph around it, so maybe it's their interpretation of it.

After the event, Google provided a fact sheet to Ars Technica confirming more stats about the hardware included in the Google Stadia stacks. These include custom-built AMD GPUs with 56 compute units and integrated HBM2 memory; "custom, hyperthreaded x86" CPUs (no manufacturer listed) that run at 2.7GHz "with AVX2 SIMD"; and "a total of 16GB combined VRAM and system RAM" clocked at "up to 484 GB/s."
 
The Ars Technica blurb used VRAM, but it wasn't the actual slide or a typical stats listing, just a paragraph around it, so maybe it's their interpretation of it.
Where does Stadia using AMD and the next-gen consoles using AMD leave Nvidia's future? Streaming and backward compatibility will make changing GPU vendor infeasible in the future.
 
I'm skeptical that this can be a profitable business. Those specs are pretty impressive, but doesn't that mean there is around $1000 or more worth of hardware to service every user currently playing? Will they be able to sell this service for more than $20 per month? I doubt it, and that means you need subscribers to stay customers for a very long time just to recoup the hardware cost. Will you buy games, or will this be more like Netflix? It's impressive that the tech has evolved to the point where this is viable, but that doesn't mean there is a good business model that can turn this into a profit center. Yes, it's possible for Google to take massive losses for a long time trying to force their way into the market, but even then, how do you turn this into a profitable business?
Wouldn't the hardware be shared across multiple games depending on client requirements? If 3 users are trying to play a fairly simple game at 1080p, wouldn't they be all running on the same hardware? I assumed the GPU specs would be the maximum for a single instance. I thought the concept of a data center-based GaaS would be shared resources like other data center virtualized instances.
 
& ported to Vulkan
DX12 to Vulkan should be straightforward; however, it's possible they are virtualizing the clients like the recent ChromeOS systems. In that case, an Xbox/Windows VM may avoid the need to port the code.

Longer term, developing all games to run within a Linux (or comparable) VM would be ideal.

I'm skeptical that this can be a profitable business. Those specs are pretty impressive, but doesn't that mean there is around $1000 or more worth of hardware to service every user currently playing?
Across timezones as well so the hardware could run 24/7 or be repurposed for compute tasks during off hours. Throw mobiles into that mix and there is a huge potential market.

"a total of 16GB combined VRAM and system RAM clocked at "up to 484 GB/s."
Curious if that setup is an APU similar to KabyG or a portioned out Epyc with many accelerators. The 8GB per CPU seems a bit weak if not sharing data between instances.
 

Wouldn't the hardware be shared across multiple games depending on client requirements? If 3 users are trying to play a fairly simple game at 1080p, wouldn't they be all running on the same hardware? I assumed the GPU specs would be the maximum for a single instance. I thought the concept of a data center-based GaaS would be shared resources like other data center virtualized instances.

They claim it's 10.7 Tflops per instance. The resolution and settings of Project Stream didn't look like they were using anywhere near that much power, so they could definitely be profiling GPU use per game and dividing up GPUs with AMD's virtual GPU tech for less per user where they want. I think they could have run as many as 8 instances of AC:O at 1080p on a single one of those dual VEGA cards. I think the CPUs are almost certainly shared as well, with each getting either 3 or 6 cores virtualized on a manycore Xeon.
 
I'm skeptical that this can be a profitable business. Those specs are pretty impressive, but doesn't that mean there is around $1000 or more worth of hardware to service every user currently playing? Will they be able to sell this service for more than $20 per month?
Not every customer uses every server all the time. If the average gamer plays 6 hours a week, a 24/7 server could hypothetically serve 24*7/6 = 28 users each week. At $20 each per month, that'd be $560 per month from a $1000 server. Obviously the real rate will be far lower, since within a region gaming is popular at some times more than others, but you can clearly see that it's nothing like one $20 sub per server needed to cover expenses. Probably more like ten users per server, so $200 per month, hardware paid off in about half a year, and plenty of profit after that.
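That math can be sketched as a quick back-of-envelope calculation; all the inputs here are the post's assumptions ($1000 of hardware per concurrent instance, $20/month subscriptions, 6 hours of play per week), not official figures:

```python
# Back-of-envelope server economics using the assumptions above:
# $1000 of hardware per concurrent instance, $20/month subscriptions,
# and an average of 6 hours of play per subscriber per week.

HOURS_PER_WEEK = 24 * 7            # a 24/7 server streams 168 hours a week
AVG_PLAY_HOURS = 6                 # assumed weekly play time per subscriber

ideal_users_per_server = HOURS_PER_WEEK / AVG_PLAY_HOURS   # 28.0

# Demand clusters in the evenings, so assume much lower real utilization:
realistic_users = 10
monthly_revenue = realistic_users * 20                     # $200/month
months_to_pay_off = 1000 / monthly_revenue                 # 5 months

print(ideal_users_per_server, monthly_revenue, months_to_pay_off)
```

The interesting knob is `realistic_users`: even at a third of the theoretical 28 concurrent slots, the hardware pays for itself well within a year.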
 
They claim it's 10.7 Tflops per instance. The resolution and settings of Project Stream didn't look like they were using anywhere near that much power, so they could definitely be profiling GPU use per game and dividing up GPUs with AMD's virtual GPU tech for less per user where they want. I think they could have run as many as 8 instances of AC:O at 1080p on a single one of those dual VEGA cards. I think the CPUs are almost certainly shared as well, with each getting either 3 or 6 cores virtualized on a manycore Xeon.

In today's presentation, they said how native 4K 60 FPS signal is sent to their YouTube video servers (for stream watching later) so they're not running at 1080p. So maybe only 2 instances of AC:O instead of 8.
 
Where does Stadia using AMD and the next-gen consoles using AMD leave Nvidia's future? Streaming and backward compatibility will make changing GPU vendor infeasible in the future.

Wait for Amazon or Netflix to enter the fray of game streaming services. Or Nvidia could simply concentrate [more] on their GeForce NOW cloud gaming service.
 

So the sh** talking has already started. Google console warriors in the waiting... somewhere.

But seriously, fun times ahead... :yep2:


Why do they even need PS4P or XOX specs if they're going to do all the processing in the cloud and stream the results?

Are they going to have some hybrid model where you can download and run locally?
 
Why do they even need PS4P or XOX specs if they're going to do all the processing in the cloud and stream the results?

Are they going to have some hybrid model where you can download and run locally?
No, this was simply PR bullcrap to show off.
 
Marketing. Just so they can say look at how powerful and how much better the pretties will be with our service because moar power.

Of course, no mention of latency or bitrate. Steam In-Home Streaming does about 35 Mbps at 1080p60, and it looks good and input lag feels low on my 42" TV, though even then I do feel and see a slight difference compared to hooking my PC up directly to the TV.

I don't think Google is going to output 70+ Mbps streams, and as for latency, is there any reason to think it won't be noticeable unless you happen to live right next to a Google DC?
 
[Attachment: input lag comparison data]

Could someone explain to me how this data makes sense? (Source: Digital Foundry article on Stadia)

1) Why is there no display lag on Stadia?
2) Why is PC lag at 60fps 33ms less, instead of 16ms?
3) Why is the lag so big on PCs and consoles?
it's the new generation of gamers, the laggers

They lag but don't have input lag. They are also called the searchers, stadiers or yetiers
 
In today's presentation, they said how native 4K 60 FPS signal is sent to their YouTube video servers (for stream watching later) so they're not running at 1080p. So maybe only 2 instances of AC:O instead of 8.
I will play it at 240p
 
I can't see Google investing significant enough money to make enough of an impact before they inevitably kill the initiative. They have a long history of canning projects that don't break out in the first 18-24 months, many don't last that long. Then there are projects like Glass which go quiet for years.

Whatever it is, I can't see it being aimed at the type of person who is buying a mainstream console, because Google's business is collecting information in order to serve more personalised ads, and mainstream gaming does not look well suited to that.
This is sadly the case; Google understands hardware support about as well as Facebook understands ethics. Irrespective of what Sony and Microsoft do, I'm not going down the Google-made-hardware road again any time soon.
 
[Attachment: input lag comparison data]

Could someone explain to me how this data makes sense? (Source: Digital Foundry article on Stadia)

1) Why is there no display lag on Stadia?
2) Why is PC lag at 60fps 33ms less, instead of 16ms?
3) Why is the lag so big on PCs and consoles?

Because reality does not work the way many think it does.

A 60fps game requires a frame to be rendered in 16ms. Cool. The player pushes a button and a chain of events starts. The GPU finishes the frame it is currently rendering [up to 16ms]. The GPU renders the frame with the new button press [16ms]. That frame is placed into a framebuffer, where it waits for the display to take and show the previous frame. The frame is moved from the framebuffer to the screen when the display requests it [to fit the native refresh rate of the screen].

From the button press to that being seen on screen, a lot of time can pass.

And there could be a few more things that introduce even more delays [like triple buffering]. 30fps RDR2 has over 200ms of input delay, while 30fps Driveclub has veeeery low lag.
 
this is under ideal conditions

Some of this feels generational. My kid games on phones and tablets with screen controls and no issues, I hate that experience and won't tolerate it.

We'll probably have a similar disconnect when I'm stanning over a V8 motor guzzling down gas and rumbling under the hood and she's wondering what the fuss is about considering her electric beats it.

Tldr; kids aren't human
 
Some of this feels generational. My kid games on phones and tablets with screen controls and no issues, I hate that experience and won't tolerate it.

We'll probably have a similar disconnect when I'm stanning over a V8 motor guzzling down gas and rumbling under the hood and she's wondering what the fuss is about considering her electric beats it.

Tldr; kids aren't human
If games play like that, this would be too much even for a kid. I mean, the guy ends up disconnecting the controller out of desperation, which is pathetic.

What do you mean by stanning?

Now that you mention it, kids are born with YouTube. YouTube gaming videos have very detailed stats and info about the games included in a video, a la Wikipedia, which is great.

Imagine that you are watching a trailer or a streamer: you just click a button and you are in the game. Many people have YouTube accounts already...

The streaming services are the most important rivals of consoles. PC gaming won't be affected, 'cos when you buy a PC, productivity aside, you know you get it to have the best experience, which streaming can't provide.
 
The cloud-only model opens the door to removing the anti-cheat slow-downs that plague normal multiplayer games. And also DRM slow-downs, but that's another matter.

Distributed client-server computing in the gaming world is steered by the untrustworthiness of the clients. Most clients are in fact very dumb local clients, generating the local video stream and collecting user input. Smoke and mirrors help keep the perceived latency in check (mouselook can be applied locally). But the server still needs to check and re-check all client input and claimed state: no, his inventory didn't suddenly increase by 100 gold for no good reason, and that biplane did not accelerate to Mach 2 as claimed.
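As a minimal sketch of that server-side re-checking (the speed limit and tick rate here are made-up values for illustration, not from any real game):

```python
# Minimal server-side sanity check: re-validate client-claimed movement
# instead of trusting it. MAX_SPEED and the tick rate are illustrative.

MAX_SPEED = 10.0        # units per second the simulation allows
TICK_SECONDS = 1 / 30   # one simulation tick

def validate_move(old_pos, claimed_pos):
    """Reject a claimed position the player could not legally reach."""
    dx = claimed_pos[0] - old_pos[0]
    dy = claimed_pos[1] - old_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return distance <= MAX_SPEED * TICK_SECONDS

print(validate_move((0, 0), (0.2, 0.1)))   # plausible step -> True
print(validate_move((0, 0), (50, 0)))      # "Mach 2 biplane" -> False
```

With trusted cloud clients, checks like this (and the equivalents for inventory, hit registration, and so on) could in principle be skipped or pushed onto the client instance itself.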

A trusted simulation environment enables the client to be a more active participant in the shared world simulation and reduces the burden on the central servers. It might enable increased player counts or reduce the server workload when running similar worlds.

Local latency (under 5ms) might make server-side interpolation redundant, further reducing the server load. Still, people might want to play on a far-away server that is offering their favorite setup (e.g. "metro 24/7" is only available in Sweden, not in France), making low local latency an optional optimization.

Unique cloud games might appear, who knows.

All those points apply to other cloud gaming providers, not just Stadia.
 
If games play like that, this would be too much even for a kid. I mean, the guy ends up disconnecting the controller out of desperation, which is pathetic.

What do you mean by stanning?

Now that you mention it, kids are born with YouTube. YouTube gaming videos have very detailed stats and info about the games included in a video, a la Wikipedia, which is great.

Imagine that you are watching a trailer or a streamer: you just click a button and you are in the game. Many people have YouTube accounts already...

The streaming services are the most important rivals of consoles. PC gaming won't be affected, 'cos when you buy a PC, productivity aside, you know you get it to have the best experience, which streaming can't provide.
Stanning means to be obsessed with something.

Regarding kids, they tolerate a lot more than I would regarding gaming. Ads, micro transactions, poor controls, shallow game play, kids endure it. I've seen kids pick up the phone before a 3DS or Vita which just amazes me.
 