Deleted member 13524
Father of the Xbox Seamus Blackley just dropped a surprise on Twitter in the form of an apology:
https://www.techspot.com/news/91749-xbox-creator-apologizes-amd-over-last-minute-switch.html
Wow, imagine you're an engineer who put your effort into those working prototypes, sitting in the front row of the announcement event, just to find out you've been ditched during the presentation. Ouch.
So which AMD CPU was the OG Xbox going to get? It ended up with a relative of the Coppermine Celeron instead: probably the same 180 nm, 28-million-transistor Coppermine die as the Pentium IIIs, with half its L2 cache disabled but the 8-way associativity kept.
Should we guess it was going to take a Thunderbird Athlon? Thunderbird was pretty much neck and neck with Coppermine at the time, in both performance and die size.
I also guess swapping a Socket A CPU for a Socket 370 one was a relatively easy task at the time, which is why such a last-minute change was possible.
AMD acquiring ATi was still a ways off at the time, so should we assume it would still use the NV2A?
Also, could this have made any difference on the software development front? For example, could game devs have used 3DNow! instead of SSE?
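To make that last question concrete, here's a rough C sketch (mine, not from the article) of what the same 4-float scale could have looked like under each instruction set, assuming GCC-style intrinsics headers and the -m3dnow flag for the AMD path. SSE works on 128-bit XMM registers, so four floats go in one multiply; 3DNow! packs pairs of floats into the 64-bit MMX registers, so the same job takes two multiplies plus an femms to hand the register file back to x87.

```c
#include <string.h>      /* memcpy for moving floats in/out of MMX regs */
#include <xmmintrin.h>   /* SSE intrinsics (Pentium III / Coppermine)   */
#ifdef __3dNOW__
#include <mm3dnow.h>     /* 3DNow! intrinsics (K6-2 / Athlon)           */
#endif

/* SSE: one 4-wide multiply in a 128-bit XMM register. */
void scale4_sse(float *v, float s)
{
    __m128 k = _mm_set1_ps(s);
    _mm_storeu_ps(v, _mm_mul_ps(_mm_loadu_ps(v), k));
}

#ifdef __3dNOW__
/* 3DNow!: floats live in the 64-bit MMX registers, so four floats
 * take two 2-wide multiplies, then femms to restore x87 state. */
void scale4_3dnow(float *v, float s)
{
    __m64 k, lo, hi;
    float ks[2] = { s, s };
    memcpy(&k,  ks,    sizeof k);    /* { s, s }       */
    memcpy(&lo, v,     sizeof lo);   /* { v[0], v[1] } */
    memcpy(&hi, v + 2, sizeof hi);   /* { v[2], v[3] } */

    lo = _m_pfmul(lo, k);
    hi = _m_pfmul(hi, k);

    memcpy(v,     &lo, sizeof lo);
    memcpy(v + 2, &hi, sizeof hi);
    _m_femms();                      /* clear MMX state */
}
#endif
```

So beyond which CPUID flag the game checks at startup, the practical difference would mostly have been vector width and having to share the MMX/x87 register file on the AMD side.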