Oban: Allegedly very poor yields

BoardBonobo

My hat is white(ish)!
Veteran
Saw this over at SemiAccurate.

Basically, it's suffering terrible yields (yields that make Cell's look very good), and there are fears that this is going to have a large impact on MS's ability to deliver their console on time.
 
What exactly are you talking about? Please explain, for those of us who refuse to click SemiAccurate links on general principle.
 
I'm still wondering if Oban is just 32nm 360 with integrated eDRAM*.

*eDRAM, the manufacture of which isn't trivial what with deep trench capacitors.
 
This is so stupid. They wouldn't even be fabbing it now for a late '13 launch. There is nothing to have poor yields; it isn't coming out in Nov '12.

Let's not forget Charlie's last article claimed IBM was fabbing Oban (whereas now he claims it's x86) and that it was in mass production back in January 2012. Actually earlier, sometime in 2011, since his article was written in January 2012.

Charlie hates Nvidia and MS, and "poor yields" seems to be the unprovable charge he can always throw at both. At least, he's been throwing it at Nvidia for years (and there's never any way to tell if he's telling the truth, since we will never know yields). Works out well for him.
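For anyone curious why yield claims are so easy to throw around and so hard to check: here's a back-of-the-envelope sketch using the classic Poisson die-yield model. The die area and defect densities below are made-up illustrative numbers, not anything actually known about Oban.

```python
import math

def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    """Classic Poisson yield model: fraction of dice expected to be
    defect-free, given die area and random defect density."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

# Illustrative only: a ~2 cm^2 console SoC on an immature process
# (0.5 defects/cm^2) vs a mature one (0.1 defects/cm^2).
print(poisson_yield(2.0, 0.5))  # ~0.37, i.e. roughly a third of dice good
print(poisson_yield(2.0, 0.1))  # ~0.82 on the mature process
```

The point being that without knowing both the die size and the fab's defect density, an outsider can't distinguish "terrible yields" from perfectly normal early-process numbers.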

If there's any truth to Oban at all, the 32nm 360 with integrated eDRAM theory would be my guess.

Although I've been in a lot of pro/anti eDRAM debates on this board where people tell me eDRAM isn't difficult to integrate or manufacture at all, despite the fact that it's still not integrated on the 360 and lags nodes. Guess this would be more evidence against them, if that's the case. :p

Also, I had a source recently tell me Charlie's Oban stuff is complete bunk (at least with respect to XB720). While I already assumed as much out of common sense, it puts my mind further at ease.
 

What makes you say he hates Microsoft?
 
I don't know about hating MS specifically, but he's pretty violently anti-Windows. He does his reviews on Linux machines, and let's just say that if something has any driver issues with Linux, he considers the hardware completely broken (for instance, this utter drivel: http://semiaccurate.com/2011/01/02/sandy-bridge-biggest-disapointment-year/).

Charlie does seem to have some real sources and does get real information now and then, which makes him at least entertaining to read. The real problem is his terrible judgment, and I mean beyond how much his obvious biases against certain companies skew his opinions, since that's something people don't have a hard time recognizing. I've heard accounts where he overheard information, grossly misinterpreted it, and posted some blatantly wrong news as a result. His reviews and technical analysis pieces not only tend to come well after all the other sites have done theirs, but are embarrassing in how ignorant they make him look (for instance, http://semiaccurate.com/2012/09/06/a-brief-look-at-amds-steamroller-core/), and of course most of his readers take his word authoritatively. He mainly hears a presentation somewhere or sees some slides and draws some awful conclusions from it.

But I don't think anything can compare with that time he said Kepler is at a performance disadvantage because games must pay a routine CPU cost hand-scheduling code to manually manage all 1536 shaders at once...
 

I think "Fermi runs tessellation in software" is way better.
 