xbdestroya said:
I think 3.2 GHz is going to be a walk in the park for these chips - have you seen the shmoo plots for them?
I never said it wouldn't be. I just said low-clock graphics processors and high-clock CPUs can't have the same transistor density.
Also it's not a good comparison to use NV vs. R vs. AMD series chips; the architectures and the processes of different fab houses all play a role. NV suffering with a bad IBM process back in the NV30 days is just one example of a freak occurrence, a deviation from the norm.
I could keep giving you examples all day. NV30 was no bigger than it should have been, and its poor performance was down to architecture, not process. It's a fully valid example, with nothing "freak" about it in terms of transistor density. I could cite R520 vs. Winchester, where again the density is about 30% higher in the former. Prescott is denser than Winchester, but much of that is transistor-dense cache.
But even with all of that aside - and I grant that the dies might end up similar in size due to the different densities - surely you understand why Cell, with larger economies of scale and the ability to withstand defects, should naturally come in lower in price than RSX, right?
Umm, why? RSX and Cell will ship in very similar volumes, since Cell sales beyond the PS3 will be relatively light. RSX is so closely related to G70 that its yields will be easy to get high, relatively speaking. G70 is already hitting 500 MHz+ (judging by third-party clock speeds and rumours of a 550 MHz 7800GTX) on a 0.11 µm process.
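The yield argument being traded back and forth here can be made concrete with the classic Poisson die-yield model, Y = exp(-D·A), where D is defect density and A is die area. The defect density and die areas below are purely illustrative assumptions, not figures from this thread; the point is just that yield falls exponentially with area, and that redundancy (like Cell shipping with a spare SPE to disable) claws some of it back.

```python
import math

def poisson_yield(area_mm2, defects_per_mm2):
    """Fraction of dies with zero defects under a Poisson defect model."""
    return math.exp(-area_mm2 * defects_per_mm2)

D = 0.005  # assumed defects per mm^2 (illustrative, not a real fab number)

small, large = 200.0, 300.0  # hypothetical die areas in mm^2
print(f"{small:.0f} mm^2 die yield: {poisson_yield(small, D):.1%}")
print(f"{large:.0f} mm^2 die yield: {poisson_yield(large, D):.1%}")

# Redundancy lets a die survive one defect if it lands in a repairable
# block: a crude upper bound is the Poisson probability of <= 1 defect.
lam = large * D
print(f"large die, one repairable defect allowed: {math.exp(-lam) * (1 + lam):.1%}")
```

Under these made-up numbers the 50% larger die yields noticeably worse, but tolerating a single defect more than recovers the gap, which is the "withstand defects" point above.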
And to say nothing of the fact that on the same process, with ~70 million more transistors, and using the EE and GS as precedent with Sony, I fully expect RSX to be larger as well.
I've already explained why this is unlikely to be so by any significant margin. The EE and GS are irrelevant - NVidia is supplying the graphics this time.
I mean c'mon, this most recent BOM has the Cell at over three times the cost of RSX in certain places. WTF?
I see $160 vs $100 for Cell vs. RSX initial cost. I suppose you looked deeper into the document than me, but the final numbers are what matters.
Look at how big GPUs have been in the last few years, and how the price of a whole graphics card compares with a CPU. The GPU cores are nearly always bigger, and cost less. NVidia and ATI prefer more transistors over higher clock speed, because graphics parallelizes so well, so they build big, dense chips and sacrifice clock speed to keep yields acceptable. CPU makers chase high clock speed, so they sacrifice yields to get it.
Finally, IBM's cut for these chips will be much bigger than the cut ATI/NVidia/TSMC/NEC/Toshiba will take for theirs.
In any case, the point of this article is to compare XB360's price against PS3's. Cell has over 40% more transistors than Xenon, and when you take into account that larger chips have lower yields, $160 vs $100 for Cell vs. Xenon is very fair. If you think Cell should cost 30% less, then Xenon will cost 30% less as well. That makes the PS3 44% more expensive at launch than XB360 instead of the 46% currently in the report. Big deal.
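The percentages in that last paragraph check out. The console BOM totals below are not from the report; they are hypothetical values back-solved so that the quoted 46%-before / 44%-after figures hold exactly, given Cell and Xenon at $160 and $100 and the same 30% cost cut applied to both.

```python
# Hypothetical BOM totals (assumed, back-solved from the quoted ratios),
# chosen so the 46% gap holds before any cost reduction.
ps3_bom = 350.40    # assumed PS3 total, includes Cell at $160
xb360_bom = 240.00  # assumed XB360 total, includes Xenon at $100

cell, xenon = 160.0, 100.0
cut = 0.30  # the same 30% cost reduction applied to both CPUs

ps3_new = ps3_bom - cut * cell        # 350.40 - 48 = 302.40
xb360_new = xb360_bom - cut * xenon   # 240.00 - 30 = 210.00

print(f"before: PS3 is {ps3_bom / xb360_bom - 1:.0%} more expensive")  # 46%
print(f"after:  PS3 is {ps3_new / xb360_new - 1:.0%} more expensive")  # 44%
```

The gap barely moves because the CPUs are only one line item in each BOM, which is exactly the "big deal" shrug above.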