If you have 100 customers who want xCloud, you don't need more than 4 blades. Probably at most 20 of them are playing at any given time, and even that is likely very generous. I don't understand this need to have a single console for every single player all at the same time.
Dsoup touched on this briefly, but unlike most use cases for distributed server processing, gaming requires the server hardware to be as physically close to the end user as possible, because latency matters far more to the experience being served. It's fine if, for example, a search request takes 100 ms to reach a user (the data transmission, not the search calculation itself) who may not be local to the data center servicing that request. It's not fine, however, if the result of a player's input takes 100 ms to reach their screen.
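For a sense of scale, here's a back-of-the-envelope sketch (Python; the ~200,000 km/s figure for light in fiber is a standard rule of thumb, and the distances are just illustrative - real-world RTTs are usually worse once routing and the last mile are added):

# Best-case round-trip propagation delay to a data center, ignoring routing,
# queuing and last-mile overhead. Light in fiber covers roughly 200 km per ms.
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation time to a data center distance_km away."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (100, 500, 1500, 4000):
    print(f"{km:>5} km -> ~{round_trip_ms(km):5.1f} ms RTT (propagation only)")

# 100 km adds ~1 ms; 4,000 km adds ~40 ms. An extra 40 ms on a search result
# is invisible; added on top of a game stream's input-to-display budget
# (plus encode/decode time) it is very noticeable.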
This means you can't mask downtime in one region by having hardware in other regions pick up its gaming load. Routing requests to remote data centers is one of the standard ways to mitigate server downtime, but it's a poor fit for gaming hardware.
It also means that for gaming loads you can't oversubscribe to nearly the same degree without greatly impacting the service being provided. In any given locality, the vast majority of requests for service will happen within a relatively small window on weekdays. Weekends will have wider windows than weekdays, but it will still be some fraction (likely less than half) of the day. I.e., even in the best-case scenario (weekends), server hardware that is only used for gaming is going to sit idle for long stretches of time.
So you have to be able to service peak-hour usage, which means a lot of idle hardware during off-peak hours. And unlike less latency-sensitive server workloads, one locality operating at peak load can't lean on idle servers 3+ time zones away without greatly impacting the end-user experience.
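To put toy numbers on that (the 20% peak concurrency echoes the 100-customer example above; the sessions-per-blade figure and the hour-by-hour demand curve are pure assumptions, not xCloud data):

import math

# Toy model: provision a region's blades for its peak hour, then look at how
# much of that capacity sits idle over a weekday. All figures are assumptions.
SUBSCRIBERS = 100
PEAK_CONCURRENCY = 0.20     # 20% of subscribers playing at the busiest hour
SESSIONS_PER_BLADE = 5      # assumed concurrent streams per blade

# Assumed fraction of subscribers playing at each local hour (0-23).
hourly_demand = [0.01, 0.01, 0.01, 0.01, 0.01, 0.02, 0.03, 0.04,
                 0.05, 0.05, 0.06, 0.07, 0.08, 0.08, 0.09, 0.10,
                 0.12, 0.15, 0.18, 0.20, 0.18, 0.12, 0.06, 0.02]

peak_sessions = SUBSCRIBERS * PEAK_CONCURRENCY
blades = math.ceil(peak_sessions / SESSIONS_PER_BLADE)

avg_sessions = SUBSCRIBERS * sum(hourly_demand) / len(hourly_demand)
avg_utilization = avg_sessions / (blades * SESSIONS_PER_BLADE)
overnight_utilization = SUBSCRIBERS * min(hourly_demand) / (blades * SESSIONS_PER_BLADE)

print(f"blades provisioned for peak: {blades}")
print(f"average utilization:         {avg_utilization:.0%}")
print(f"overnight utilization:       {overnight_utilization:.0%}")
# With these assumptions: 4 blades, ~36% average utilization, and ~5% in the
# small hours; hardware bought for the peak, sitting idle most of the day.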
Thus the issue is: how do you use all of that gaming server infrastructure when local demand (at, say, midnight local time) might be in the single digits as a percentage of the gaming capacity sitting in that data center?
Obviously, the inclusion of features that are mostly unnecessary for gaming (like robust ECC memory support) gives some hints as to how MS intends to leverage the Xbox Series blades for non-gaming uses. The question is: which workloads are a good fit for that hardware without impacting its core use case (gaming)? The flip side is that any non-gaming-oriented design features of that server hardware will be "idling" whenever it's servicing the core use (gaming) for which it was installed.
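Purely as an illustration of the kind of scheduling policy that would be needed (nothing here reflects how Azure actually places work; the blade counts, headroom rule and function are made up), non-gaming batch work would have to be preemptible and only admitted when forecast gaming demand leaves headroom:

# Toy admission rule for backfilling idle gaming blades with preemptible
# batch work. Illustrative only; not how Azure/xCloud schedules anything.
TOTAL_BLADES = 4
HEADROOM = 1  # blades always held back for unexpected gaming demand

def batch_blades_available(active_gaming: int, forecast_peak_gaming: int) -> int:
    """Blades a preemptible batch job may use in the next scheduling window.

    Sized against the forecast, not just current load, so gaming sessions
    rarely have to wait for a batch job to be evicted.
    """
    reserved = max(active_gaming, forecast_peak_gaming) + HEADROOM
    return max(0, TOTAL_BLADES - reserved)

print(batch_blades_available(0, 0))  # midnight: 3 blades free for batch work
print(batch_blades_available(2, 4))  # evening ramp: 0, batch work gets evicted

The hard part, of course, is finding batch workloads that both tolerate eviction on short notice and still justify the hardware.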
Regards,
SB