Microsoft Project xCloud (Game Streaming), now offering Fortnite free without GPU membership

AzBat

Agent of the Bat
Legend
With Game Pass my kids are playing almost all games on the Cloud. Why wait to install games anymore? That's so last century. How quaint? LOL

There have been a few times they liked a game enough to install it, but those are very few. The biggest offenders have been couch co-op games where the Cloud version doesn't support a 2nd local player. 😕

Other than that I'm almost completely sold on Cloud gaming. Just need a wireless router upgrade. 😁

Tommy McClain
 

iroboto

Daft Funk
Legend
Subscriber
With Game Pass my kids are playing almost all games on the Cloud. Why wait to install games anymore? That's so last century. How quaint? LOL

There have been a few times they liked a game enough to install it, but those are very few. The biggest offenders have been couch co-op games where the Cloud version doesn't support a 2nd local player. 😕

Other than that I'm almost completely sold on Cloud gaming. Just need a wireless router upgrade. 😁

Tommy McClain
Yeah, I was going to get into how moving to the cloud removes any need for local hardware. That also means playing anywhere, and not needing to pore over reviews of the best hardware configurations or versions. You just play the games as a service.

I think at the very least it will change the total addressable market and the time to transition. Typically we don’t get a huge console population until there are slim and discount models; this would remove that.
 

cheapchips

Veteran
The KB+M support for xCloud is a really nice friction remover for PC play. There are points where I can't play on the Xbox but choose to play something local on the PC because I can't be arsed to go on the hunt for a controller and USB cable.
 

DSoup

Series Soup
Legend
Subscriber
So perhaps more apt to say; will change game development as we know it today.

That is definitely more apt, and I don't know that this would necessarily be a good thing for developers. If the technology platform changes constantly, you are in a never-ending learning cycle. Whilst people are - generally - educated to embrace change and to adapt and learn throughout their lives, if the changes are significant then change - new technology - can become a barrier, not an advantage.

Source: wrote server code for years, then managed a server farm. We changed hardware when it was necessary; it's not like slotting a new graphics card into your PC every year. Changing server hardware, the OS, the rest of the software stack and all of the in-flight VMs is a massive, massive task.
 

iroboto

Daft Funk
Legend
Subscriber
That is definitely more apt, and I don't know that this would necessarily be a good thing for developers. If the technology platform changes constantly, you are in a never-ending learning cycle. Whilst people are - generally - educated to embrace change and to adapt and learn throughout their lives, if the changes are significant then change - new technology - can become a barrier, not an advantage.

Source: wrote server code for years, then managed a server farm. We changed hardware when it was necessary; it's not like slotting a new graphics card into your PC every year. Changing server hardware, the OS, the rest of the software stack and all of the in-flight VMs is a massive, massive task.
There’s no requirement for new generations of hardware to be replaced every year. You’re still on a 5+ year generation and like any development kit they are always on evolution; the project picks one until delivery.

I don’t think it will make things any harder for developers; they no longer have to worry about population numbers when a new generation launches to determine what features they can use.
 

DSoup

Series Soup
Legend
Subscriber
There’s no requirement for new generations of hardware to be replaced every year. You’re still on a 5+ year generation and like any development kit they are always on evolution; the project picks one until delivery.

Do you really think it's commercially viable to change server hardware every year, or even every two years? How are these service cloud providers managing to upgrade all the servers farms near all the gamers on a regular basis? How is that paid for exactly, and what kind of disruption is there to the rest of the farm during upgrades? How long do you envisage this taking?

I don’t think it will make things any harder for developers; they no longer have to worry about population numbers when a new generation launches to determine what features they can use.

Developers do have to worry about whether there are servers of the right type and volume capacity in the locality where gamers want to play. They have to worry about contracting with cloud providers to reserve capacity and server resources without knowing if there is even anybody near the server who wants to play. They have to pay very different licensing for a bunch of tech for server deployment compared to local machines or consoles. Your problems don't disappear when you centralise functions; you just swap one set of problems for another.
 

DSoup

Series Soup
Legend
Subscriber
Not to mention that if everything is cloud based you only need less than 20% of the hardware that is out there now to deliver the same experience.
This is a server industry fallacy. Whilst, overall, you have less concurrent server hardware running, you do have to put servers anywhere and everywhere that your customers may want to play. In practice you often have to pay for and deploy more hardware to get good server availability and coverage, because latency is a much bigger issue than for most server functions. E.g. the latency of your average Netflix server would make any game unplayable.
 

iroboto

Daft Funk
Legend
Subscriber
Not to mention that if everything is cloud-based you need less than 20% of the hardware that is out there now to deliver the same experience.
Easier to distribute, easier to ship, no accessories etc. and yes, you can oversubscribe because most people aren’t all playing simultaneously.
Do you really think it's commercially viable to change server hardware every year, or even every two years? How are these service cloud providers managing to upgrade all the servers farms near all the gamers on a regular basis? How is that paid for exactly, and what kind of disruption is there to the rest of the farm during upgrades? How long do you envisage this taking?



Developers do have to worry about whether there are servers of the right type and volume capacity in the locality where gamers want to play. They have to worry about contracting with cloud providers to reserve capacity and server resources without knowing if there is even anybody near the server who wants to play. They have to pay very different licensing for a bunch of tech for server deployment compared to local machines or consoles. Your problems don't disappear when you centralise functions; you just swap one set of problems for another.
It will still follow the standard generations of 5-7 years; I don’t think that will change. We just don’t need to experience several years of waiting for the population to buy the hardware before developers think it’s a reasonable time to overhaul the engine for the new features.
 

DSoup

Series Soup
Legend
Subscriber
It will still follow the standard generations of 5-7 years; I don’t think that will change. We just don’t need to experience several years of waiting for the population to buy the hardware before developers think it’s a reasonable time to overhaul the engine for the new features.
This is not how the economics of the server industry work. You need very specific hardware for running games, it's not just another web/php script serving the web, email, e-commerce or other services, nor streaming existing content - where when hardware isn't used for one particular task, it's easily and readily deployable for something else. Servers for gaming are specific to that task. You could obviously run non-gaming applications but a remote xblade is not going to integrate well into a massive web or e-commerce server arrangement, or a distributed platform designed to solve massive problems.

This is why cloud-served games have limited availability geographically. Their design is specific, and that same design makes them a poor choice for traditional server demands.
 

iroboto

Daft Funk
Legend
Subscriber
This is not how the economics of the server industry work. You need very specific hardware for running games, it's not just another web/php script serving the web, email, e-commerce or other services, nor streaming existing content - where when hardware isn't used for one particular task, it's easily and readily deployable for something else. Servers for gaming are specific to that task. You could obviously run non-gaming applications but a remote xblade is not going to integrate well into a massive web or e-commerce server arrangement, or a distributed platform designed to solve massive problems.

This is why cloud-served games have limited availability geographically. Their design is specific, and that same design makes them a poor choice for traditional server demands.
It took 2 years for MS to deploy Series X hardware to 22 countries, and that's approximately 50% of their total supported regions. This is their first foray into the service, so I don't consider that a long time when they were also using that silicon for regular console chips (plus COVID shortages), and clearly not all markets are the same size. MS is definitely prioritizing their largest markets first, both where they have data centers and where they can net the most customers.

And I don't think it's necessarily true that non-gaming applications cannot be run on an xblade. I'm not implying it can be used for typical cloud compute, though. There are a great many weaker IoT devices that could use xCloud to render graphics and stream them on demand (quick spin-up and shut-down). Ultimately, they would be specific applications serving specific application clients, but saying they have no purpose outside pure gaming would probably be false. Today they are used for streaming games, but there's no doubt that eventually, as the next generation of console hardware arrives and the next set of upgrades arrive, they will need to repurpose these blades as traffic moves to the newer hardware.
 

DSoup

Series Soup
Legend
Subscriber
@iroboto your position was it would change gaming, then gaming development. On the latter point, it will, but as I said, for each problem server-side gaming solves, it introduces another. I'm talking about cost, which you're completely ignoring. Server-based solutions are fantastic if you don't have to worry about cost, which is the dominant factor in providing server-based services.

Server economics are predicated on using all of the capacity of your servers all of the time, as in somebody is paying for that server to run, along with the people it takes to keep it running. The vast majority of server hardware is very flexible and readily redeployable in terms of switching VM environments. To be clear, not micro-hypervisors like you'll find in consoles, but server OS VMs, which have two, sometimes three, layers of virtualisation and allow realtime redeployment within a server cluster. With a few exceptions, servers are designed to be adaptable for many common tasks like messaging, web, and e-commerce. But gaming is one of the exceptions where the hardware has a specific purpose and where it doesn't make economic sense to over-engineer the hardware.

So you have a design (and cost) choice. You can include gaming hardware in all server hardware designs, which would mean 99.999% of that hardware would be unused most of the time, or you can have specific servers designed for running games. Your choice there is to put the required hardware for the performance you want, e.g. the equivalent of six-to-twelve Xboxes, onto each server blade. But for that blade to be remotely useful when not running games, and to be able to seamlessly integrate into common server infrastructure, you need all that extra - expensive - server architecture: the ECC RAM, vastly more cache, and interconnects between the individual devices. There are going to be very few, if any, ad hoc demands for an ephemeral remote server with the configuration of an enhanced Xbox Series X. The most common server demands are low-volume but 24/7/365 running, or high-volume processing which is very often distributed across multiple VMs on multiple pieces of server hardware all working in unison.

xCloud is available in 28 countries, which is one country more than the number of countries in the European Union. This is the biggest problem: it's very expensive to put server hardware everywhere that people may want to game from. Sony launched the PS4 in November 2013 and had it in 48 countries within two months - nearly twice as many countries.

By the time Microsoft covers all of the countries where gamers are, they'll be starting over, upgrading the earliest-installed servers, because upgrading a server farm is a slow, laborious task.
 

iroboto

Daft Funk
Legend
Subscriber
@iroboto your position was it would change gaming, then gaming development. On the latter point, it will, but as I said, for each problem server-side gaming solves, it introduces another. I'm talking about cost, which you're completely ignoring. Server-based solutions are fantastic if you don't have to worry about cost, which is the dominant factor in providing server-based services.
I don't disagree with the fact that costs are a big part of server-based builds. I've been working in a telco for the last 15 years; I know very well the cost of rolling out new wireless technologies, data centers, etc. I'm well aware of these factors. But so is MS. They spend nearly $1B USD per month on data center builds; no one understands the cost of deploying xCloud more than they do. The idea that the xCloud blades need to do double duty as non-gaming devices in order to be profitable is unlikely; they never would have launched the product if that were the case. xCloud was deployed on the grounds that the hardware servicing it would itself be profitable. They did the math. They also put ECC checking into each Series X chip so that they wouldn't need ECC memory; for what reason is beyond me, but it's present.

Respectfully, I appreciate the discussion around the cost to deploy, but the reality is MS has crunched the revenue and expenses for this project. They have a J-curve and they have an expected return date. Really, either the project floats or sinks; there's not really an in-between here for them. I'm assuming it's going to succeed, because frankly, I don't see this going any other way. At some point, 10-12 years from now, we're going to need a cheap $399-499 console that is at least 8x more powerful than today's consoles within the same size, cooling and power budget. I'm not sure that's possible, but having something that is 8x more powerful in the cloud? Yeah, that's possible.

PCs are massive beasts with massive GPUs that are just getting more massive; it's shocking, to be honest. But no one cares because they don't have the restrictions the mainstream space does. I'm not sure there is a gaming future without cloud gaming, so that's where I think MS is at. And if the total addressable market is good, subscriber counts are good, latency and the rest of it are good, why not move everyone to the cloud, despite how much it may cost - the revenue would make it worth it.
 

DSoup

Series Soup
Legend
Subscriber
The idea that the xCloud blades need to do double duty as non-gaming devices in order to be profitable is unlikely; they never would have launched the product if that were the case. ... ... ...

Respectfully, I appreciate the discussion around the cost to deploy, but the reality is MS has crunched the revenue and expenses for this project. They have a J curve and they have an expected return date. Really, either the project floats or sinks, there's not really an in-between here for them.

And this is the crux: the assumption that a server operation can achieve that industry-standard goal of 97% utilisation. If the xblade is not of a design that seamlessly meshes into the rest of the server infrastructure, the only way they don't have idle hardware is if they don't have enough hardware at all, i.e. demand always exceeds supply, and you cannot build a service like that.

I'm talking about the operation as it exists economically now. You're talking like there is a plan that somehow bucks server economics, which is well understood. I don't subscribe to any blind faith that because "Microsoft crunched the revenue and expenses for this project" it'll be successful/profitable, because you know what? They also crunched the numbers for a lot of projects and investments that lost them a lot of money and were later canned, including Nokia. For Microsoft, xCloud does not look like a significant experiment relative to Azure, so it's probably low risk.
 

iroboto

Daft Funk
Legend
Subscriber
And this is the crux: the assumption that a server operation can achieve that industry-standard goal of 97% utilisation. If the xblade is not of a design that seamlessly meshes into the rest of the server infrastructure, the only way they don't have idle hardware is if they don't have enough hardware at all, i.e. demand always exceeds supply, and you cannot build a service like that.

I'm talking about the operation as it exists economically now. You're talking like there is a plan that somehow bucks server economics, which is well understood. I don't subscribe to any blind faith that because "Microsoft crunched the revenue and expenses for this project" it'll be successful/profitable, because you know what? They also crunched the numbers for a lot of projects and investments that lost them a lot of money and were later canned, including Nokia. For Microsoft, xCloud does not look like a significant experiment relative to Azure, so it's probably low risk.
That doesn't make any sense to me. Most cloud and telco hardware is heavily oversubscribed. It's how we bring costs down dramatically at the consumer level, whereas the inverse (dedicated equipment) is insanely expensive for the customer that demands it.

Why would they have idle hardware? I'm not sure I understand the concept you're referring to. Demand surpasses the supply that is physically available, under the usual assumption that not everyone wants to access the hardware simultaneously. Oversubscription is precisely why we are able to get our gigabit ethernet speeds so cheaply, and it's also why Azure, GCP and AWS can be so cheap. You only get the big bill when you leave a server running that should have been shut off.

If you have 100 customers that want xcloud, you don't need more than 4 blades. Probably at most 20 of them are playing at any given time, and that would likely be very generous. I don't understand this need to have a single console for every single player all at the same time.
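The back-of-envelope capacity math above can be sketched as a quick sanity check. All of the numbers (100 subscribers, ~20% peak concurrency, ~5 streamed instances per blade) are the hypothetical figures from this thread, not real xCloud provisioning data:

```python
import math

def blades_needed(subscribers: int, peak_concurrency: float, streams_per_blade: int) -> int:
    """Blades required to cover peak concurrent streams.

    Inputs are illustrative assumptions; actual xCloud numbers are not public.
    """
    peak_streams = math.ceil(subscribers * peak_concurrency)
    return math.ceil(peak_streams / streams_per_blade)

# 100 subscribers, 20% playing at the busiest moment, ~5 instances per blade
print(blades_needed(100, 0.20, 5))  # -> 4 blades
```

The subscribers-to-concurrent-capacity ratio (here 100:20, i.e. 5:1) is the oversubscription factor this part of the thread is arguing over.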
 

DSoup

Series Soup
Legend
Subscriber
That doesn't make any sense to me. Most cloud and telco hardware is heavily oversubscribed. It's how we bring costs down dramatically at the consumer level, whereas the inverse (dedicated equipment) is insanely expensive for the customer that demands it.

I don't know who you work for but - no offence - it sounds like a shitty company. If your server infrastructure is oversubscribed, then your customers are getting a bad service, and when there is a sudden surge in demand, a very bad service. If Google, Amazon, Azure or Apple ran servers like that, the whole experience of using their services would be crap. I get that telecoms companies are a bit of a different beast and they definitely do prioritise profits over service, but that's not how the big cloud services operate at all. Nor Government for that matter, and Government is a much easier place to make the economics of servers work because you're not chasing a profit.

Why would they have idle hardware? I'm not sure I understand the concept you're referring to. Demand surpasses the supply that is physically available, under the usual assumption that not everyone wants to access the hardware simultaneously.

If any of Google's services, but particularly search or voice, were oversubscribed and responses were not near-instant, very few people would use them. If you're a telco with limited competition you can risk providing a shit service. What are people going to do? Send letters?

If you have 100 customers that want xcloud, you don't need more than 4 blades. Probably at most 20 of them are playing at any given time, and that would likely be very generous. I don't understand this need to have a single console for every single player all at the same time.

You need blades geographically near every customer. If you have customers that travel a lot, you need blades in servers near all the places they travel. Cloud-based services are hugely popular with travellers. You follow?
 
If you have 100 customers that want xcloud, you don't need more than 4 blades. Probably at most 20 of them are playing at any given time, and that would likely be very generous. I don't understand this need to have a single console for every single player all at the same time.

DSoup touched on this briefly, but unlike most use cases for distributed server processing, gaming requires the server hardware to be as close to the end user as is physically possible, due to latency being far more impactful to the end use that the server hardware is servicing. It's fine if, for example, a search request takes 100 ms to reach a user (the data transmission, not the search calculation) who may not be local to the data center servicing that request. It's not fine, however, if the result of a player's input takes 100 ms to reach their screen.

This means that you can't amortize downtime in one region by having that hardware process some gaming requests from other regions. That's one of various methods for data centers to mitigate server downtime that would be inappropriate for gaming hardware in a data center.

This means that for gaming loads, you can't oversubscribe to nearly the same degree without greatly impacting the service being provided. In any given locality, the vast majority of requests for service will happen within a relatively small window on weekdays. Weekends will have wider windows than weekdays, but it will still be some fraction (likely less than half) of the day. I.e., even in the best-case scenario (weekends), server hardware that is only being used for gaming is going to be sitting idle for long periods of time.

So, for that case, you have to be able to service peak-hour usage, which means a lot of idle hardware during off-peak hours. And unlike less latency-sensitive server workloads, one locality operating at peak load can't leverage idle servers 3+ time zones away without greatly impacting the end user experience.

Thus, the issue is, how do you use all of that gaming server infrastructure when local demand for that hardware (say midnight local time, for example) might be in the single digit percentages WRT the gaming server hardware in that datacenter?
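The peak-versus-idle argument above can be made concrete with a toy demand curve. The hourly figures below are invented purely for illustration; the point is that a farm sized for the evening peak sits mostly idle the rest of the day:

```python
# Hypothetical weekday demand for game streaming, as a fraction of the
# evening peak, one value per hour (00:00 through 23:00 local time).
hourly_demand = [
    0.05, 0.03, 0.02, 0.02, 0.02, 0.03,  # overnight
    0.05, 0.08, 0.10, 0.12, 0.15, 0.20,  # morning
    0.25, 0.25, 0.25, 0.30, 0.40, 0.60,  # afternoon
    0.85, 1.00, 0.95, 0.70, 0.40, 0.15,  # evening peak and wind-down
]

# If the farm is provisioned for the peak hour (demand == 1.0), average
# utilisation over the day is simply the mean of the curve.
avg_utilisation = sum(hourly_demand) / len(hourly_demand)
print(f"average utilisation: {avg_utilisation:.0%}")  # ~29%, far below a 97% target
```

With this (made-up) curve, hardware provisioned for the peak averages under 30% utilisation, which is the gap between gaming blades and the 97% figure quoted earlier in the thread.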

Obviously the inclusion of features that are mostly unnecessary for gaming (like robust ECC memory support) gives some hints at how MS might try to leverage the Xbox Series blades for non-gaming uses. The question is, what uses would be conducive to using that hardware without impacting its core use case (gaming)? However, that also means that any non-gaming-oriented design features of that server hardware will be "idling" when servicing the core use (gaming) for which that hardware was installed.

Regards,
SB
 

iroboto

Daft Funk
Legend
Subscriber
I don't know who you work for but - no offence - it sounds like a shitty company. If your server infrastructure is oversubscribed, then your customers are getting a bad service, and when there is a sudden surge in demand, a very bad service. If Google, Amazon, Azure or Apple ran servers like that, the whole experience of using their services would be crap. I get that telecoms companies are a bit of a different beast and they definitely do prioritise profits over service, but that's not how the big cloud services operate at all. Nor Government for that matter, and Government is a much easier place to make the economics of servers work because you're not chasing a profit.



If any of Google's services, but particularly search or voice, were oversubscribed and responses were not near-instant, very few people would use them. If you're a telco with limited competition you can risk providing a shit service. What are people going to do? Send letters?



You need blades geographically near every customer. If you have customers that travel a lot, you need blades in servers near all the places they travel. Cloud-based services are hugely popular with travellers. You follow?
Why would you oversubscribe mission-critical services? I’m not sure I understand this type of questioning. You have various levels of need at various price points. Subscribing to Game Pass Ultimate and getting xCloud for free, I would not also expect near-last-mile blade deployments for extra-low latency; that sounds all too good to be true, at least in its current iteration. Maybe there is an xCloud+ service coming which they deploy closer to the customer, but today I’m expecting them all to be situated within existing Azure data centres.

I’m also expecting it to be heavily oversubscribed. There just aren’t enough xCloud players.
 
Last edited:

see colon

All Ham & No Potatos
Veteran
And I don't think it's necessarily true that non-gaming applications cannot be run on xblade.
So in the future we can stream Netflix through a virtualized Xbox environment to our phones and control playback with an on-screen controller. And then go full circle and play Minecraft: Story Mode on Netflix, so you're streaming a game that's offline-rendered and distributed via a non-gaming application, then streamed via xCloud. But it's a game within a non-game.
 

DSoup

Series Soup
Legend
Subscriber
Why would you oversubscribe mission-critical services? I’m not sure I understand this type of questioning. You have various levels of need at various price points. Subscribing to Game Pass Ultimate and getting xCloud for free, I would not also expect near-last-mile blade deployments for extra-low latency; that sounds all too good to be true, at least in its current iteration. Maybe there is an xCloud+ service coming which they deploy closer to the customer, but today I’m expecting them all to be situated within existing Azure data centres.

I’m also expecting it to be heavily oversubscribed. There just aren’t enough xCloud players.

Firstly, gaming isn't "mission critical". Nobody dies or suffers casualties when somebody can't game on the cloud. Secondly, you said your company oversubscribes server capacity. This is not my line of questioning; you said that in this post. But you clearly work for a company that is very obviously focussing on profit first and service second (like all telecoms companies), which is not how I would expect Amazon, Google or Microsoft to operate their server platforms. They need to be profitable, but not by maximising profits at the expense of the service provided.

I am struggling to keep up with the shifting position of your posts. You started with servers changing gaming as we know it, then changing development, then the problems servers solve/create, then how quickly servers can be deployed, then their profitability, and then their maybe-future long-term profitability.

I think this is one of those conversations that has gotten away from the point, and maybe drawing a line under it would be best for my sanity ;-)
 