Xbox Thought Experiment

I want to acknowledge up front that I know I might be wrong with this train of thought, but wanted all you smart guys to weigh in on it with your informed opinions. :)

[Btw, I'm assuming no chip shortages in this thought experiment]

My initial reaction 3 years ago when MS and Sony announced their specs for this generation was that MS should have gone in heavier. If I were running MS I would have set out to put an extra $100 of hardware in the box for the same price.

My thinking is simply this: If the goal is to sell 100 million units for MS, then taking an extra $100 loss per unit would have cost them $10 billion, which is far less than the $77 billion they're spending on Bethesda/ABK. In fact, they can probably afford to do both. Over the course of 10 years they could slowly cost-reduce the hardware so they were losing less per box, maybe to the point where they only lost $5 billion overall, but gained more than that back in software royalties or GP subscriptions, all while hurting Sony quite a bit.
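For concreteness, here's that math as a quick back-of-envelope sketch (all figures are just the assumptions above, not real BOM or sales data):

```python
# Back-of-envelope version of the subsidy math above. All figures are the
# assumptions from this post, not real BOM or sales data.
units = 100_000_000            # target install base
extra_loss_per_unit = 100      # extra hardware subsidy in USD

print(f"Worst case: ${units * extra_loss_per_unit / 1e9:.0f}B")   # $10B

# If cost-downs over ~10 years halve the average extra loss per box:
avg_extra_loss = 50
print(f"With cost-downs: ${units * avg_extra_loss / 1e9:.0f}B")   # $5B
```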

1st question: What do you think MS could have packed in the box for an extra $100, with the goal of having the box still be no bigger than the PS5 if possible?

2nd question: If this extra hardware meant that all these 30 fps UE5 games coming soon were 60 fps instead, or that the 60 fps ones had full RT as well, would consumers care enough that MS might have gained lots of market share from this move?

Right now the market is 38m PS5s to 22m X|S; could it instead be more like 32m to 28m? 6 million extra units might mean 30 million more software units sold, or maybe 3 million GP subs. That isn't really going to replace your $1 billion losses on hardware every year, but market parity outside of Japan would likely have solved their problem of Sony's advantage with 3rd-party exclusives.
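A quick sense-check of those numbers (the attach rate, royalty cut and subscription price below are illustrative assumptions, not published figures):

```python
extra_units = 6_000_000
attach_rate = 5        # assumed ~5 software units sold per extra console
royalty = 12           # hypothetical platform cut per software unit, USD

software_units = extra_units * attach_rate                        # 30M, as above
print(f"~${software_units * royalty / 1e9:.2f}B in royalties")    # ~$0.36B

gp_subs = 3_000_000
gp_per_year = 120      # assumed ~$10/month Game Pass tier
print(f"~${gp_subs * gp_per_year / 1e9:.2f}B/yr in GP revenue")   # ~$0.36B/yr
```

Either way you land around $0.36B a year, which is exactly why parity on 3rd-party exclusives, rather than the direct revenue, would have to be the real payoff.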

TLDR: Can hardware power get you enough extra market share, especially with early adopters, to make it a viable long term strategy?

PS: The only time Xbox was clearly more powerful was the original Xbox and Sony had a massive hardware/software headstart, so I don't consider this a valid example. :)
 
Historically the most powerful console didn't win out. Inferior boxes with better marketing sell better. We're still struggling to see what difference this gen makes over last, let alone what just an extra 25% of hardware would! I don't think the differences would be enough to win MS international market share.

For far less expense, if MS courted the EU as well as Sony does, they'd secure share there. Although I don't want to derail your thought experiment and lose the emphasis on hardware. ;)
 
Actually, historically the more powerful box has won out: PS2 beat Dreamcast and GameCube (MS was too late to market), PS3 beat X360 (marginally, but it still did), PS4 beat Xbox One.

Also, I don't think $100 only gets you 25% more power as most of the money can probably go into APU/Memory. But others more knowledgeable than me here might educate us on that.
 
Actually, historically the more powerful box has won out: PS2 beat Dreamcast and GameCube (MS was too late to market), PS3 beat X360 (marginally, but it still did), PS4 beat Xbox One.

Also, I don't think $100 only gets you 25% more power as most of the money can probably go into APU/Memory. But others more knowledgeable than me here might educate us on that.

PS3 and X360 had different strengths and weaknesses; you can't just use a blanket statement that one was better than the other without qualifying in what way it was better. X360 had the better general-purpose CPU, the better GPU and unified memory (making development easier), for example. The PS3 had a CPU that was a lot better at parallel processing as long as you could properly feed it and code for it (extremely difficult, as only one core was really usable for general processing while the rest were specialized for SIMD), as well as a split memory pool (no memory contention between CPU and GPU, but two more limited pools that required juggling data between them, i.e., making development more difficult relative to X360).

Over the course of the generation the PS3 was always significantly harder for developers to develop their games on, and for most multiplatform games it showed, with the X360 having the graphically better versions of those games. Do not discount ease of development when trying to conclude which is more powerful, because how easily you can develop for a hardware platform is equally or more important than how powerful the hardware is on paper. It's one of the main reasons that Sony ditched custom esoteric hardware with the PS4: the cost to develop bespoke hardware is really high, and it means software development on that hardware suffers, whereas more standardized hardware leads to easier development and, by extension, better-looking and better-performing games in general. Basically, more development time can be spent on the game rather than on trying to make the hardware work as well as the specs on paper imply it could.
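To make the "hard to feed" point concrete, here's a rough sketch of the double-buffered streaming loop an SPE job effectively forced on you (in Python rather than real Cell/SPE C, and with made-up names; each SPE could only see its small 256 KB local store and had to stream data in and out explicitly):

```python
CHUNK = 4  # toy chunk size for the demo; a real SPE local store was 256 KB total

def spe_style_stream(data, kernel):
    """Process `data` chunk by chunk with double buffering, the way a
    hand-written SPE job did. On unified-memory hardware like the X360
    this whole dance collapses to a plain `map(kernel, data)`."""
    out = []
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    pending = chunks[0]                         # 'dma_get' the first chunk
    for i in range(len(chunks)):
        current = pending
        if i + 1 < len(chunks):
            pending = chunks[i + 1]             # kick off the next 'DMA' early
        out.extend(kernel(x) for x in current)  # compute overlaps the transfer
    return out

print(spe_style_stream(list(range(10)), lambda x: x * 2))
```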

Regards,
SB
 
According to documents, the majority of the Series X shortages came from funding xCloud.

So they basically chose this period to bet on an unknown future.

I don't know if they did the right thing; I honestly think Nvidia did it right. The premium service makes sense to me because the annual fee × 8 years is roughly the cost of a 4080.

But they tied games into xCloud, which is also a good deal if you just want to play games. For those with limited time and specific tastes, though, being able to purchase a game and choose the performance level makes a lot of sense too.
 
I'd agree with Shifty on marketing winning generations + software library. Sony's been winning that since PS1, despite a few hiccups.

With regards to the two questions, I can't see how $100 more of CUs/RAM/bandwidth would have allowed for a meaningful difference this gen. It wouldn't have got you a 60fps Series X lording it over a 30fps PS5.

Also, without knowing the BOM, I suspect the XSX is already more expensive to make. Bigger chip and more storage, even if it is slower.
 
PS3 and X360 had different strengths and weaknesses; you can't just use a blanket statement that one was better than the other without qualifying in what way it was better. X360 had the better general-purpose CPU, the better GPU and unified memory (making development easier), for example. The PS3 had a CPU that was a lot better at parallel processing as long as you could properly feed it and code for it (extremely difficult, as only one core was really usable for general processing while the rest were specialized for SIMD), as well as a split memory pool (no memory contention between CPU and GPU, but two more limited pools that required juggling data between them, i.e., making development more difficult relative to X360).
Yup. PS3 had a theoretically vastly better CPU, but it relied on games having a lot of parallelisable computational requirements. In practice, PS3 vs 360 generally meant PS3 having effectively less usable and less flexible RAM (due to PS3's split XDR/GDDR3 memory arrangement), 360 having 10 MB of EDRAM with logic that made AA nearly 'free', and generally saner development tools, which resulted in a bunch of games running significantly better on 360 vs PS3 because PS3 was just batshit crazy. PS3's GPU had the old separate vertex/pixel shader architecture, versus Xenos's unified shader architecture, on top. And for those technical "privileges", PS3 also cost a lot more!
 
I think the issue is the Series S is off the mark by baby steps. The console should have been 5-6 TFLOPs with 12 gigs of RAM at $300. Even now they should be focused on getting that Series S SKU down to $250 and below for the holiday. They should also just go ahead and start buying up large swaths of cheap developers so they can provide a lot of exclusive content on their platform. I'm talking about 5-10 new developers a year.
 
Just to add, PS3 was advanced in recognising that compute was the future, but it bet architecturally poorly that the best place for compute resource was on the CPU, whereas it quickly became very evident - as borne out by subsequent generations of gaming hardware - that the best place for spare compute was in the margins of unused GPU hardware.

And that's where we still are today.
 
Just to add, PS3 was advanced in recognising that compute was the future, but it bet architecturally poorly that the best place for compute resource was on the CPU, whereas it quickly became very evident - as borne out by subsequent generations of gaming hardware - that the best place for spare compute was in the margins of unused GPU hardware.

And that's where we still are today.

GPUs just naturally made sense because they were already massively parallel architectures designed for massively parallel workloads. It's a natural fit.

CPUs are usually treated as general-purpose dispatchers WRT those types of workloads (hand it off to the GPU and let the specialized hardware handle it). Making CPUs massively parallel when most CPU tasks are inherently branchy is a huge undertaking, and it means that even if you succeed, in most tasks that massively parallel architecture would then mostly go to waste.
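A toy illustration of that difference in workload shape (nothing here is real engine code):

```python
def data_parallel(xs):
    # Same instruction stream for every element: the kind of work a wide,
    # lockstep GPU is built for.
    return [x * 0.5 + 1.0 for x in xs]

def branchy(entities):
    # Per-element control flow: bread and butter for a CPU, but on lockstep
    # SIMD hardware diverging lanes sit idle through each other's paths,
    # wasting most of the machine's width.
    out = []
    for hp, state in entities:
        if state == "idle":
            out.append("think")
        elif hp <= 0:
            out.append("despawn")
        else:
            out.append("move")
    return out

print(data_parallel([1.0, 2.0, 3.0]))
print(branchy([(10, "idle"), (0, "active"), (5, "active")]))
```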

Regards,
SB
 
This generation I feel like Microsoft is less interested in selling consoles than in selling services. They don't have to have the most powerful box; they need to monetize the content. One way to do that is to sell a console and then sell games and services. Another is to monetize the content without the console. This is why they kept Minecraft multiplatform, and why Game Pass and cloud streaming exist.

So could they make a better hardware platform and afford to take the loss? Yes. Would it be better than getting people to subscribe to Gamepass who don't own the hardware that you lose money on? Probably not.
 
My thinking is simply this: If the goal is to sell 100 million units for MS, then taking an extra $100 loss per unit would have cost them $10 billion, which is far less than the $77 billion they're spending on Bethesda/ABK.

You can't compare them like that.

The Activision purchase is effectively a tangible asset purchase. Even in an overpay scenario you would only be losing a portion of the money in any sense of the word.

Selling the consoles at a loss is an actual realized loss/cost.

I kind of wanted to post this in the main thread on it since this issue has been touched on a few times. MS and Sony have differing monetary advantages.

MS has the bigger bank in terms of being able to buy assets, hence they are targeting acquisitions to shore up their console business.

Sony has the bigger bank in terms of its actual console business, hence they can directly write off costs for exclusive contracts.
 
Sony has the bigger bank in terms of its actual console business, hence they can directly write off costs for exclusive contracts.

I doubt they'd have to write off much, if any, of that, as the revenue it brings in will be higher than the revenue that a 3rd-party exclusive would currently bring in for MS. Assuming the exclusivity terms cost the same (unlikely, since an exclusivity contract with Sony hurts the developer less than an exclusivity contract with MS due to the current install bases), it'd be significantly harder (or impossible, especially as the actual cost of exclusivity would be much higher for MS than what Sony would have to pay) for MS to recoup those costs.

So, without a doubt it makes far more financial sense for MS to buy studios than to buy exclusivity. Sure, MS as a company can afford to buy exclusive content, but that's going to be extremely difficult to justify to investors who are not interested in Xbox generating losses. Because of investor relations, it's much easier for MS to justify buying an asset (a studio), as the outgoing cash is replaced by the incoming value of the asset, with the continued potential for another revenue/profit stream.

Regards,
SB
 
I'm not sure MS would have wanted to spend the extra $100 for competent RT performance at the time if it meant compromising backwards compatibility by switching to Nvidia hardware.
 
Just to add, PS3 was advanced in recognising that compute was the future, but it bet architecturally poorly that the best place for compute resource was on the CPU, whereas it quickly became very evident - as borne out by subsequent generations of gaming hardware - that the best place for spare compute was in the margins of unused GPU hardware.

And that's where we still are today.

The PS3's problem was trying to win the media-format war by using the PS3 like they did with the PS2. That made the system late and very expensive. If it didn't have Blu-ray we might have seen 768 megs of RAM or even a gig of RAM on the platform at the $500 mark for sure.

The second issue is they thought just designing a GPU themselves would be enough. However, the industry passed them by. The PS2 was lucky: it was coming off the PS1, was a (for the time) cheap DVD player, sold 150m units or whatever, and devs were forced to wring every last drop of performance from it. But the Dreamcast, GameCube and Xbox showed that going to the 3D graphics board makers was the way of the future. The PS3 went way too far into development before they went to Nvidia, and Nvidia gave them older technology versus a custom build like ATI gave MS.
I'm not sure MS would have wanted to spend the extra $100 for competent RT performance at the time if it meant compromising backwards compatibility by switching to Nvidia hardware.

Are we talking about the Xbox Series X? If I was able to subsidize an extra $100 to get more performance out of it, I'd likely just go with more RAM. 24 gigs of RAM on the Series X and 16 gigs on the Series S, along with bumping the Series S up to 5 or 6 TFLOPs, would have made for much more compelling consoles.
 
24 gigs of RAM on the Series X and 16 gigs on the Series S, along with bumping the Series S up to 5 or 6 TFLOPs, would have made for much more compelling consoles.

Agree with regards to the S. Those extra flops and RAM would probably make it no effort for devs to reach 1080p parity with the X.

On the X side, I'm not sure what 24 gigs really buys you? The graphics features that make high-end PCs stand out vs consoles are better raytracing hardware than AMD offers, and DLSS.
 
Agree with regards to the S. Those extra flops and RAM would probably make it no effort for devs to reach 1080p parity with the X.

On the X side, I'm not sure what 24 gigs really buys you? The graphics features that make high-end PCs stand out vs consoles are better raytracing hardware than AMD offers, and DLSS.
24GB is an odd choice if it's the only thing you are changing. With Series X's memory configuration, 20GB would be the way to go. That way you'd get the extra capacity without having some of the memory at a slower speed.
 
24GB is an odd choice if it's the only thing you are changing. With Series X's memory configuration, 20GB would be the way to go. That way you'd get the extra capacity without having some of the memory at a slower speed.

Not just that, but 24 GB means you would need a wider bus and more traces on the motherboard, both of which would increase the cost beyond just the extra 4 GB over a 20 GB configuration. Going to 20 GB would only incur the cost of increasing the memory capacity of four of the chips.
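The arithmetic behind that, using the Series X's published memory setup (ten GDDR6 chips at 14 Gbps on a 320-bit bus):

```python
def bandwidth_gb_s(bus_bits, gbps_per_pin=14):
    # GDDR6 bandwidth: bus width (bits) x per-pin rate (Gbps) / 8 bits-per-byte
    return bus_bits * gbps_per_pin / 8

# Shipping 16 GB: 6x2GB + 4x1GB chips. The first 10 GB interleaves across
# all ten chips (full 320-bit); the top 6 GB lives only on the six 2 GB
# chips, so it sees just 6 x 32 = 192 bits of bus.
print(bandwidth_gb_s(320))      # 560.0 GB/s "fast" pool
print(bandwidth_gb_s(192))      # 336.0 GB/s "slow" pool

# 20 GB = ten 2 GB chips: same board, uniform 560 GB/s everywhere.
# 24 GB needs twelve chips -> a 384-bit bus, more PHY and more traces.
print(bandwidth_gb_s(384))      # 672.0 GB/s, at real extra board cost
```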

Regards,
SB
 
Agree with regards to the S. Those extra flops and RAM would probably make it no effort for devs to reach 1080p parity with the X.

On the X side, I'm not sure what 24 gigs really buys you? The graphics features that make high-end PCs stand out vs consoles are better raytracing hardware than AMD offers, and DLSS.

24GB is an odd choice if it's the only thing you are changing. With Series X's memory configuration, 20GB would be the way to go. That way you'd get the extra capacity without having some of the memory at a slower speed.

Not just that, but 24 GB means you would need a wider bus and more traces on the motherboard, both of which would increase the cost beyond just the extra 4 GB over a 20 GB configuration. Going to 20 GB would only incur the cost of increasing the memory capacity of four of the chips.

Regards,
SB
20 or 24 gigs, at the end of the day it doesn't really matter; I'd be interested in much better textures.

I do wonder if Infinity Cache would have been a good call if we were adding $100 to the cost. Or of course you could skip the APU and have a dedicated Zen 3(+) CPU and a separate RDNA 2 GPU. I think this would be the most costly of the options, but you could have a much higher CU count. You could even have a 12- or 16-core CPU.
 
20 or 24 gigs, at the end of the day it doesn't really matter; I'd be interested in much better textures.

Shouldn't everyone be using virtual texturing these days, especially now consoles are on SSDs? RAM shouldn't dictate texture resolution.

Obviously not all titles do. As a layman, it's hard to understand why, when Trials Evolution was so many years ago.
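For anyone unfamiliar, here's a minimal sketch of the virtual-texturing idea (page granularity and the LRU policy here are purely illustrative): only the pages the camera actually needs stay resident in RAM, streamed in on demand, so the total texture set is bounded by storage rather than memory.

```python
from collections import OrderedDict

class VirtualTexture:
    """Tiny LRU page cache standing in for a virtual-texture system."""
    def __init__(self, max_resident_pages):
        self.cache = OrderedDict()        # page_id -> texel data, in LRU order
        self.max_resident = max_resident_pages

    def sample(self, page_id):
        if page_id not in self.cache:     # "page fault": stream in from SSD
            self.cache[page_id] = self._load_from_ssd(page_id)
            if len(self.cache) > self.max_resident:
                self.cache.popitem(last=False)   # evict least-recently-used page
        self.cache.move_to_end(page_id)   # mark as recently used
        return self.cache[page_id]

    def _load_from_ssd(self, page_id):
        return f"texels for page {page_id}"      # stand-in for a disk read

vt = VirtualTexture(max_resident_pages=2)
for pid in [0, 1, 0, 2, 3]:               # never more than 2 pages resident
    vt.sample(pid)
```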
 