Xbox 360 Uncloaked eBook

Tap In

Legend
Just finished this the other night and it is very intriguing.

Dean Takahashi did a great job weaving you through the inside story on the 360, starting back before the original Xbox even launched. There is a 20-year plan for MS within the industry, and the 360 is the next step in that advancement.

He describes some of the internal struggles, each person's background, and how and why they ended up at MS (or left MS). I was amazed at how many former Nintendo execs are in the games division at MS. Evidently, having their corp. offices so close in Redmond was convenient for attracting talent, especially after the big Nintendo Exec shake-up.

It seemed to be a pretty unbiased look at MS and their approach to the industry, and it revealed just how hard MS tried to balance power and technology against long-term costs, setting themselves up for aggressive price drops over the life of the machine to get it to the mainstream as soon as possible. Contrary to popular belief, it appears as if MS execs are NOT about to throw money at the Xbox business (aside from the start-up costs). It stands on its own. Mr. Takahashi's personal opinion is that MS got lucky that Sony missed the spring launch. He also suggested that they should have thrown more money at the 360 instead of being so cost conscious.

The stories about creating the chips and choosing the vendors were my favorite part. (An OOOE processor was initially promised by IBM and planned for, but later scrapped when IBM could not execute.) Intel and Nvidia were apparently in the running up until the final decision. MS jobbed out to teams of world-class designers for everything from the box to the controllers to the logo, trying to appeal to a broader market than Xbox 1. They designed software to aid in long-term cost analysis: for any single part or cost scenario, they could just plug in the numbers and see the long-term effect. Not to mention the strength of the software tools for the dev kits and their service, which apparently is/was very appealing to the devs. They surveyed dozens of devs on what they wanted in the system and what they could do to make it better than it had been in the past. They also created software for the fabricators to use to track issues in real time and keep MS apprised at all times.
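(To make that cost-analysis idea concrete, here's a toy sketch of such a model in Python. This is purely my own illustration with made-up numbers; the book only says that software of this kind existed.)

```python
# Toy sketch (my invention, not MS's actual tool) of a
# "plug in a part, see the long-term effect" cost model.
def lifetime_cost(unit_cost, annual_cost_decline, units_per_year, years):
    """Total spend on one part across the console's life, assuming the
    part's cost falls by a fixed fraction each year."""
    total = 0.0
    cost = unit_cost
    for _ in range(years):
        total += cost * units_per_year
        cost *= (1.0 - annual_cost_decline)
    return total

# Hypothetical numbers: a $10 part dropping 20%/year, 10M consoles/year, 5 years.
print(f"${lifetime_cost(10.0, 0.20, 10_000_000, 5):,.0f}")  # ~$336,160,000
```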

The lack of quality GDDR3 memory was evidently the cause of the bottleneck in production. Besides a lack of sufficient quantity last year, they had to take the Infineon parts and test them for speed on the factory floor before installation.

Interestingly enough, the disc format decision was almost a non-issue. They decided to go DVD from day one without a second thought (based on devs' desire for fast streaming, cost, an unalterable plan to launch in 2005, and the fact that a format war loomed).


Anyway... good read... $15 here...


http://www.spiderworks.com/books/xbox360.php


edit: changed Samsung to Infineon
 
Tap In said:
I was amazed at how many former Nintendo execs are in the games division at MS. Evidently, having their corp. offices so close in Redmond was convenient for attracting talent, especially after the big Nintendo Exec shake-up.

Huh, well that explains why Nintendo's been fumbling so much ever since the release of the GameCube. If both Sega and Nintendo had given more autonomy to their US divisions, maybe they wouldn't be in the situations they are today. (OK, Sega would have been right to cut off the 32X, but both Sega of America and NOA seemed much better at determining which games would interest American markets than their Japanese counterparts did.)

Tap In said:
An OOOE processor was initially promised by IBM and planned for, but later scrapped when IBM could not execute

Why were they unable to execute? It's not like IBM doesn't have other OOOE processors to build off of; at the very high end (Power 5), their OOOE implementation may even be superior to AMD's and Intel's, or at least was at the time.
 
Fox5 said:
Why were they unable to execute? It's not like IBM doesn't have other OOOE processors to build off of; at the very high end (Power 5), their OOOE implementation may even be superior to AMD's and Intel's, or at least was at the time.

this was all he said:

IBM knew that it could make a derivative of the efficient PowerPC core that it had created for Sony without a huge redesign effort. It anticipated that it would be able to include a feature known as out-of-order execution.

Another setback was that IBM had also decided that it couldn’t do out-of-order execution. This was a modern technique that enabled microprocessors to dispatch a number of tasks in parallel. A sequence of instructions was broken into parallel paths without regard to order so they could be executed quickly, and then put back into the proper sequence upon completion.
Instead, IBM had to make the cores execute with the simpler, but more primitive, in-order execution. Out-of-order consumed more space on the chip, potentially driving up the costs and raising the risks. When Microsoft’s Jeff Andrews went to Jon Thomason and told him the news, it was like a bombshell. One by one, many of the Mountain View group’s biggest technological dreams were falling by the wayside.

I'm assuming that it is a factor of having 3 cores and trying to stay within the cost/size/heat parameters.
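(For anyone wanting to see why losing OOO hurts: here's a tiny toy model, entirely my own invention and nothing Xenon-specific, of a single-issue pipeline. With made-up latencies, it shows how an out-of-order core can hide a cache miss behind independent work while an in-order core just stalls.)

```python
# Toy single-issue pipeline: at most one instruction issues per cycle.
# In-order must issue in program order; out-of-order may issue any
# instruction whose inputs are ready. Latencies are invented.
# Instruction = (name, input regs, output reg, latency in cycles).
program = [
    ("load A", [],    "a", 10),  # e.g. a cache miss
    ("add 1",  ["a"], "b", 1),   # depends on the load
    ("load B", [],    "c", 10),  # independent of the two above
    ("add 2",  ["c"], "d", 1),
]

def run(program, out_of_order):
    ready_at = {}              # register -> cycle its value is ready
    done = set()
    cycle = finish = 0
    while len(done) < len(program):
        pending = [i for i in range(len(program)) if i not in done]
        candidates = pending if out_of_order else pending[:1]
        for i in candidates:
            _, ins, out, lat = program[i]
            if all(ready_at.get(r, 0) <= cycle for r in ins):
                ready_at[out] = cycle + lat
                finish = max(finish, cycle + lat)
                done.add(i)
                break          # single-issue: one instruction per cycle
        cycle += 1
    return finish

print("in-order:    ", run(program, out_of_order=False), "cycles")  # 22
print("out-of-order:", run(program, out_of_order=True), "cycles")   # 12
```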

Also of note was some really good info on their software (game) decisions (mostly involving Ed Fries), including acquisitions, foul-ups (missing the exclusive on GTA when they had first crack at it), Halo 2 and what it did to Bungie, etc., and the evolution of EA as a supporter.
 
Interesting.
So out-of-order execution was just cut to meet the triple-core goal, then. If they were originally planning for three 3.2GHz out-of-order cores, that would have been an incredible accomplishment at 90nm. Well, assuming cores of Pentium 4, G5/Power4, or Athlon class; three Power3s may have been doable.
 
Fox5 said:
Interesting.
So out-of-order execution was just cut to meet the triple-core goal, then. If they were originally planning for three 3.2GHz out-of-order cores, that would have been an incredible accomplishment at 90nm. Well, assuming cores of Pentium 4, G5/Power4, or Athlon class; three Power3s may have been doable.
The 3-core IBM solution was being compared to an Intel 4+GHz single-core monster, which, after the IBM deal was signed, was (according to the book) eventually scrapped in favor of slower multicore solutions.

The book mentions MS's satisfaction at having (luckily) made the correct CPU choice in that regard.
 
Fox5 said:
Interesting.
So out-of-order execution was just cut to meet the triple-core goal, then. If they were originally planning for three 3.2GHz out-of-order cores, that would have been an incredible accomplishment at 90nm. Well, assuming cores of Pentium 4, G5/Power4, or Athlon class; three Power3s may have been doable.

I wonder just how much of it finally comes down to cost. That may be an example of Mr. Takahashi's point about MS spending more money. The other factors were size and heat, though. MS was concerned about the die space, but a bigger box due to cooling needs was also *out of the question* in their plans.

I'd bet an IBM 3-core OOOE was doable from a technological standpoint. (But I'm no EE.) :)

Oh, BTW, nearly every single dev interviewed by MS preferred a FAST single-core solution. :D
 
But IBM already had dual-core PPC OOOE chips in BlueGene, so why wouldn't MS just get a version of those? 2 OOOE PPCs are likely to be faster than 3 XCPU cores.
 
DemoCoder said:
But IBM already had dual-core PPC OOOE chips in BlueGene, so why wouldn't MS just get a version of those? 2 OOOE PPCs are likely to be faster than 3 XCPU cores.

The performance per watt is bad, and the 2 cores would probably run a lot hotter than the 3 XCPU cores.

I also don't know how well they compare to other options; the CPUs that Apple was using never got over 2.5GHz.
 
Does the book say anything about why they decided to jump to 512 MB of RAM so late in the game? And does it also talk about the HDD issue?
 
256MB was the initial preferred choice, due to cost, but evidently Xenos can support up to 1GB. It seems they decided that the costs would be worth it after developers urged them to do so (including Sweeney / Epic demoing the difference in quality of UE3 with 256MB and 512MB).
 
Tap In said:
I'd bet an IBM 3-core OOOE was doable from a technological standpoint. (But I'm no EE.) :)

Don't need to be one to know that, based on the dual G5, it just isn't feasible at 3.2GHz with 3 cores on one die when considering heat dissipation issues. :p

(link is to an Ars Technica article showing the dual 90nm G5's heatsink)

Closer proximity of the cores (much greater thermal density), a 28% increase in clockspeed per core, and an additional core would be stretching things for the size of the overall unit, not to mention the weight! :oops:
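(That 28% is just the jump from the dual G5's 2.5GHz to the 360's 3.2GHz per core: 3.2 / 2.5 = 1.28.)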

Any mention of future plans? ;)
 
thanks Alstrong
ok, well I'm lazy. ;)


as for the future, I don't think it was mentioned, but I do know that another part of their original 360 hopes was to use a chip with 8 or 16 cores and 64 unified shader pipelines instead of the 48 on Xenos. :devilish:


also:

“In the end, IBM had a really powerful PowerPC core that was small and efficient,” said Greg Gibson, the system designer. “It was the only one that could have given us three cores in one CPU in a small form factor, low cost, low power with the integrated cache memory.”
But Spillinger said that the changes to the instruction set that Microsoft wanted – which meant changes in the cores that IBM had already created – meant that the entire design had to go through a complete verification from the ground up. Some of the instructions were not compliant with the PowerPC architecture and were owned by Microsoft itself. But IBM had the rights to the pieces of the chip that it was designing for Microsoft, and it had the right to use those again if it wanted.
as to what Dave said about the Sweeney/Epic demo comparing 256MB vs. 512MB of RAM, it was interesting because after that, Robbie Bach had the final decision, and it was a one-time 900 million dollar decision to go to 512. :oops:

Does the book say anything about backward compatibility?
oh yea.

Going with Intel/Nvidia would have (obviously) made that easier.

They had found that a tiny fraction of PS2 users ended up actually using BC, and they initially were not going to include it at all. Ed Fries and some others insisted, and the concern over servicing customers who consistently used Xbox LIVE (with titles like Halo 2 that were not on 360) became an important issue. They actually decided to pursue it forcefully after the chip decisions were made.

Eventually, the team assigned Drew Solomon, a hardcore graphics expert and low-level operating system guru, to head a group of top-notch programmers and engineers. Their mission was to research backward compatibility and assess whether it would be technically possible. Allard referred to this group as the “ninjas,” after elite Japanese assassins. :LOL: No one would hear a definitive answer from them for a long time. The team had to pursue several different paths to get to their goal of making backward compatibility work through a software emulator.
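(For the curious: "a software emulator" here just means interpreting the old machine's instructions in software on the new CPU. Here's a deliberately tiny sketch of the general technique, using an invented toy ISA that obviously has nothing to do with the real Xbox hardware or the ninjas' actual emulator.)

```python
# Minimal interpreter-style emulator: fetch, decode, and execute the
# "old" machine's instructions one at a time on the host machine.
# Toy 3-instruction ISA, invented for illustration only.
def emulate(program):
    regs = [0] * 4           # the emulated machine's register file
    pc = 0                   # emulated program counter
    while pc < len(program):
        op, a, b = program[pc]
        if op == "li":       # load immediate: regs[a] = b
            regs[a] = b
        elif op == "add":    # regs[a] += regs[b]
            regs[a] += regs[b]
        elif op == "print":  # print regs[a]
            print(regs[a])
        pc += 1
    return regs

emulate([("li", 0, 2), ("li", 1, 3), ("add", 0, 1), ("print", 0, 0)])  # prints 5
```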
 
So no OOO, that is interesting.

On this topic: the other day I was wondering why they went with such a CPU, as a dual-core 970 at 2.2GHz would probably have been a better performer (relatively low latency, 5 ops/cycle, OOO, a well-known CPU, and good tools). It would probably have needed some extra FLOPS power (dual VMX or something like that), but it would still have had about the same power consumption, fewer transistors and, I think, a smaller die size.

Now we got the answer, nice, thanks.
 
Tap In said:
as to what Dave said about the Sweeney/Epic demo comparing 256MB vs. 512MB of RAM, it was interesting because after that, Robbie Bach had the final decision, and it was a one-time 900 million dollar decision to go to 512. :oops:

Wow.:oops: About a billion dollars. What the hell? I wonder why it cost so much?
 
mckmas8808 said:
Wow.:oops: About a billion dollars. What the hell? I wonder why it cost so much?

Well, I imagine they were basing it on the life of the machine and the estimated sell-through of units, so it was speculative, but as far as the decision-making was concerned it was a real number.
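(Purely hypothetical numbers to show the scale, not figures from the book: if the extra 256MB of GDDR3 added about $18 to each console, then across 50 million lifetime units that's 50,000,000 × $18 = $900 million, all committed in one decision.)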

Of course those costs started immediately and eventually led to the shortage of 360 units, with the RAM being the bottleneck issue.

They had to have it though, 256 would have been ridiculously limiting to the devs IMO. I wanted 1 gig. :devilish: ;)
 
Tap In said:
as for the future, I don't think it was mentioned, but I do know that another part of their original 360 hopes was to use a chip with 8 or 16 cores and 64 unified shader pipelines instead of the 48 on Xenos. :devilish:

Xenos has 64 ALUs, but 1 array of 16 is disabled. I actually wonder if they were hoping yields would be good enough for 64. Probably not (as redundancy is important on such large chips being manufactured for such a high-volume market), but the batch size (64 instructions) seems slightly mismatched with the number of shader ALUs. The X1800XT and X1900XT have batch sizes equal to their shader ALU counts. But this could be wrong conjecturing on my part.
 
Acert93 said:
Xenos has 64 ALUs, but 1 array of 16 is disabled. I actually wonder if they were hoping yields would be good enough for 64. Probably not (as redundancy is important on such large chips being manufactured for such a high-volume market), but the batch size (64 instructions) seems slightly mismatched with the number of shader ALUs. The X1800XT and X1900XT have batch sizes equal to their shader ALU counts. But this could be wrong conjecturing on my part.
Oh yea, I remember vaguely reading here about the 64 and the yields. As for the batch size (64), I don't know nearly enough to understand what effect that would have. :p

they did say that they "hoped" for 64 and "expected" 32 ALUs.

BTW, both chips went from signed contracts to working chips in just 14 months, although they both started preliminary work prior to the official signing.
 
Batch size of Xenos is 64, R580/RV530 48, and R520/RV515 16. There is an element of commonality between all of them and their ALU configurations: the batch sizes are all 4x the width of the ALU array (per command processor).
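(Working that rule backwards from the quoted batch sizes, purely as arithmetic: Xenos would be a 16-wide array, 4 × 16 = 64; R580/RV530 12-wide, 4 × 12 = 48; R520/RV515 4-wide, 4 × 4 = 16.)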
 
Makes me wonder what CPU they'll use in the future. My memory is a little hazy, but it seems like the only consoles that will have relatively common hardware architectures are the GCN and Wii. It would be "smart" for MS to follow a similar path there, and perhaps add in OOOE to the design specs. But then I wonder if they will turn out to follow a Cell-like asymmetry that AMD is also kind of planning with co-processors instead of duplicate cores. I guess that all depends on the next set of surveys on what developers want.

Question: If more eDRAM were included, would it be a simple matter to enable AA for the games that don't make use of predicated tiling this generation? (Speaking in terms of backward compatibility, e.g. enabling AA on Perfect Dark Zero on a supposed future console with 16MB eDRAM.)

Did the book mention plans for Vista cross-compatibility? I heard somewhere that one would be able to play an X360 title on PC using the same disc. Also, one of the interviews regarding Halo 2 for Vista mentioned that they would "implement content streaming from the DVD".
 