That was never the goal of any Xbox until, arguably, the Xbox One. You seem to be equating abstraction with compatibility, but abstraction is designed to solve a different set of problems. As for abstracting code to run on different processors, nobody does this. Apple's iOS platform comes closest, but this relies on recompilation of code as it's downloaded and/or updated by the App Store.
The difference in execution is tangible. Microsoft used abstraction APIs and found an affordable way to run previous-generation Xbox games on later hardware, where they not only ran better, but looked better. Sony have used close-to-the-metal APIs, which limited their hardware choices, and when they did deliver a limited form of B/C, the 'improvements' were limited to screen filters.
Another way to put it is that up until X360, MS had always (well, at least since Windows 3.0... 2.0 and 1.0 were a bit rough) looked both forwards and backwards WRT compatibility. It's a cornerstone of their approach to making computing accessible to the masses, small businesses and corporations.
IMO, even the original Xbox was based on this. It was basically just one step away from being a standard Windows box and thus easy (for them) to maintain compatibility with going forwards. However, after they learned about the various pitfalls involved in competing in the console space WRT hardware, they changed course with X360. Their mistake with the OG Xbox was not knowing that reducing the cost of manufacturing the console over the course of a generation is arguably far more important than the cost of bringing the console to market.
So basically,
- They determined that they couldn't continue with x86; Intel was certainly not interested in reducing any costs related to their CPUs, especially at that point in time.
- They found that NV was transitioning from a GPU maker that sold cheap, affordable 3D accelerators in order to establish themselves into a company that was far more interested in maximizing operating margins. I.e., they also weren't interested in reducing costs related to their GPUs.
- AMD wasn't viewed as viable (low production capacity compared to Intel) and likely wasn't interested in low-margin parts at the time (this was when their Athlon 64 X2 architecture was coming into the picture, something that would outperform Intel's chips for a time). That low manufacturing capacity hurt their ability to compete with Intel even when they had a compelling architecture. By the time they got their new fab online, Intel was competitive once again...
So, they needed to find a new GPU and CPU partner. ATi was already at least somewhat familiar with the console business and its associated costs, having acquired ArtX back in 2000, with IP from that acquisition also being used to get them competitive in the GPU space again (R300 and forwards). So that worked well.
Then there was the problem of getting a CPU. At this point, x86 was out of the question, as MS needed a partner who was willing to pass reduced manufacturing costs on to them, something neither Intel nor AMD were interested in at the time. So, in came IBM, who needed more partners.
At this point, WRT backwards compatibility, we run into the roadblock of having to make code targeting NV's proprietary IP run on ATi hardware, something NV weren't interested in enabling without large sums of money being involved.
And there we run into the next lesson that MS learned.
- While the OG Xbox had a DirectX-like API, developers could still bypass the API and code directly to the metal (and many of them did just that). That becomes a problem if you are looking at compatibility across different hardware and the features being used are patented.
- You see this crop up from time to time even on the PC side of things. S3TC, for example.
"While S3 Graphics is no longer a competitor in the graphics accelerator market, license fees have been levied and collected for the use of S3TC technology until October 2017, for example in game consoles and graphics cards. The wide use of S3TC has led to a de facto requirement for OpenGL drivers to support it, but the patent-encumbered status of S3TC presented a major obstacle to open source implementations,[4] while implementation approaches which tried to avoid the patented parts existed.[5]"
From Wikipedia.
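For anyone curious why S3TC was patentable at all, the scheme is a fixed-rate block compression: each 4x4 pixel tile is stored as two endpoint colors plus 2-bit indices into a tiny interpolated palette. Here's a minimal decoder sketch for the DXT1 variant, based on the publicly documented block layout (function names are mine, not from any particular library):

```python
import struct

def rgb565_to_rgb888(v):
    # Expand a 5:6:5-packed color to 8 bits per channel.
    r = (v >> 11) & 0x1F
    g = (v >> 5) & 0x3F
    b = v & 0x1F
    return ((r * 255) // 31, (g * 255) // 63, (b * 255) // 31)

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1 (S3TC) block into a 4x4 grid of RGB tuples."""
    c0_raw, c1_raw, indices = struct.unpack('<HHI', block)
    c0 = rgb565_to_rgb888(c0_raw)
    c1 = rgb565_to_rgb888(c1_raw)
    if c0_raw > c1_raw:
        # Four-color mode: two endpoints plus two interpolated colors.
        palette = [
            c0, c1,
            tuple((2 * a + b) // 3 for a, b in zip(c0, c1)),
            tuple((a + 2 * b) // 3 for a, b in zip(c0, c1)),
        ]
    else:
        # Three-color mode: midpoint, plus a black/transparent entry.
        palette = [
            c0, c1,
            tuple((a + b) // 2 for a, b in zip(c0, c1)),
            (0, 0, 0),
        ]
    # 16 two-bit indices; pixel (x, y) sits at bit position 2*(4*y + x).
    return [[palette[(indices >> (2 * (4 * y + x))) & 0x3] for x in range(4)]
            for y in range(4)]
```

A whole 16-pixel tile in 8 bytes, decodable with a handful of shifts, which is exactly why every GPU vendor wanted it in hardware and why the patent had such reach.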
Anyway, because of this we see Microsoft moving to more abstraction in the XBO: a hypervisor, virtual machines, pushing developers to stay within the API, etc.
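The portability payoff of that abstraction can be shown with a toy sketch (vendor names and methods here are purely hypothetical, just to illustrate the principle): game code that only ever calls the API runs unchanged on any backend, whereas to-the-metal code would be welded to one vendor's silicon.

```python
from abc import ABC, abstractmethod

class Gpu(ABC):
    """Abstract API boundary: the only surface game code may touch."""
    @abstractmethod
    def draw_triangles(self, count): ...

class VendorAGpu(Gpu):
    # Stand-in for one vendor's hardware behind the API.
    def draw_triangles(self, count):
        return f"A: drew {count} triangles"

class VendorBGpu(Gpu):
    # A different vendor; the game code never notices the swap.
    def draw_triangles(self, count):
        return f"B: drew {count} triangles"

def render_frame(gpu: Gpu):
    # "Game code": written purely against the API, no vendor-specific calls,
    # so it is portable across hardware generations by construction.
    return gpu.draw_triangles(100)
```

Swap `VendorAGpu` for `VendorBGpu` and `render_frame` is none the wiser; that's the whole compatibility argument in miniature.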
Then, for BC, they get around some of the licensing costs associated with NV IP by having games from the OG Xbox run in a virtual machine. They likely still have to pay NV some sort of licensing fee (assuming they couldn't come up with clean code to get around it), but at this point it's at the virtual machine level and not per title. That also has benefits WRT future compatibility across hardware generations while still allowing developers some freedom to bypass the API.
And now we're back full circle to a machine that is again just a step or two away from being a standard Windows machine. Except now, things are sufficiently abstracted that in theory they could move to any architecture in the future, as long as it had hardware components (CPU, GPU, etc.) fast enough to emulate a previous console in a virtual machine.
Obviously they'd like to avoid that and stick to x86 + PC-derived GPU tech, but with their work on getting Windows on ARM going (currently compatibility-focused, but at some point they're likely to start focusing on cross-vendor CPU performance) I don't think it's out of the question that at some point in the future an ARM-based console is possible for MS.
Regards,
SB