Cost to Nvidia per NV40 die produced

Xigen

Newcomer
How much does it cost for Nvidia to produce an NV40 in materials only?

Dave B stated that due to the size of the chip, there are 175 dies per wafer.

Due to the complexity and size of the chip, let's assume that each wafer results in only 100 working dies.

Alphawolf said:
$2500-$3300 /300mm wafer on a .13 process.

RussSchultz said:
As Alphawolf said, somewhere in the several $k range per wafer.

So if we take $3000, and divide it by 100, we get $30. So even if you add another $5 for packaging, the total materials cost only comes to $35.
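The arithmetic above as a tiny sketch; all figures are the thread's assumptions ($3000 per wafer, 100 good dies, ~$5 packaging), not confirmed Nvidia data:

```python
def material_cost_per_die(wafer_cost, good_dies_per_wafer, packaging=5.0):
    """Material-only cost of one packaged, working die."""
    return wafer_cost / good_dies_per_wafer + packaging

print(material_cost_per_die(3000, 100))  # 35.0
```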

So with these figures, it looks to me that it is easily economically viable to build high-end graphics chips of this size and still make a good profit. In fact, would it not be feasible to design a core that is even larger than the NV40 on the same process and still make good money selling it? (I am assuming that Nvidia sells NV40 chips for an average of $70-$100.)

Please tell me how wrong I am in my calculations above ;-)
 
Looks like you are talking about marginal cost, not total cost - i.e. you are not including any recouping of the R&D expenses to get the GPU as far as the fab.
 
Many of the costs are in the memory/board/packaging, along with long-term expenses such as marketing and R&D, as Gnep says.
Plus, remember AIBs want their margins, too.

I'd be surprised if there were 100 16x1 boards out of 175 - but considering the NU (non-Ultra) is 12x1, I guess that type of yield overall is perfectly plausible. But don't ask ME to say if it really is or not lol
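The 175-dies-per-wafer figure can be sanity-checked with the standard gross-die approximation. The ~290 mm² die area below is my assumption (a commonly quoted NV40 estimate), not a figure from the thread:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Classic gross-die estimate: wafer area over die area,
    minus a correction term for partial dies lost at the wafer edge."""
    r = wafer_diameter_mm / 2.0
    return math.floor(math.pi * r * r / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

# ~290 mm^2 is an assumed NV40 die size, on a 300mm wafer
print(dies_per_wafer(300, 290))
```

This crude estimate lands somewhat above Dave B's 175, since it ignores scribe lanes and reticle packing, but it is in the right ballpark.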


Uttar
 
Xigen said:
How much does it cost for Nvidia to produce a NV40 in materials only?
Dave B stated that due to the size of the chip, there are 175 dies per wafer.

Due to the complexity and size of the chip, lets assume that each wafer results in only 100 working dies.
AFAIK, if now they have more than 30-35 dies with all 4 quads working at 400MHz, they'll jump for joy :LOL: :D
 
Gnep said:
Looks like you are talking about marginal cost, not total cost - ie you are not including any recouping of the R&D expenses to get the GPU as far as the fab.

Yup, I was talking about 'materials only' cost. I was looking at just this aspect because the other costs will be reasonably static, whatever the die size is. Also, the cost of memory and assorted ICs is in the hands of the board makers; I just wanted to look at it from the GPU maker's standpoint.

So the materials-only cost decides what size of die is economically feasible. By my back-of-the-envelope calculations, although everybody is saying the NV40 is large, Nvidia could actually have produced an even larger chip, with better performance, and added just a few dollars to the cost to the consumer.

What do people think?
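One way to stress-test the "even larger die" idea: under a simple Poisson defect model, the cost per *working* die grows much faster than linearly with area, because gross dies per wafer fall as 1/area while yield drops exponentially. The 290 mm² baseline and the defect density below are my assumptions, not figures from the thread:

```python
import math

WAFER_COST = 3000.0       # thread's wafer price assumption
BASE_DIES, BASE_AREA = 175, 290.0  # Dave B's count at an assumed ~290 mm^2
D0_PER_CM2 = 0.5          # assumed defect density; real values are guarded

def cost_per_good_die(area_mm2):
    """Material cost per working die under a Poisson yield model."""
    gross = BASE_DIES * BASE_AREA / area_mm2        # gross dies scale ~ 1/area
    yield_frac = math.exp(-D0_PER_CM2 * area_mm2 / 100.0)  # Poisson yield
    return WAFER_COST / (gross * yield_frac)

for area in (220, 290, 360):
    print(area, round(cost_per_good_die(area), 2))
```

With these (guessed) numbers, each ~70 mm² of extra area roughly doubles the cost per good die, which cuts against the "just a few dollars more" intuition.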
 
Shipping ain't free either. Ever get something shipped from Taiwan? I shipped something to Korea and it cost a fortune. So you can probably add another $3-$5 per wafer at least.
 
Those numbers are probably in the ball park, though on the high side (from my guess). The real number could be more by maybe 20%, or less by 50%, depending on yield and cost per wafer.
 
Probably the biggest chore is the cost of memory. The more complex the core gets, the more R&D it requires, and the higher the cost of everything overall. I think that's why we see higher and higher costs on top-end video cards (and, on the other hand, longer and longer times to produce games). According to what I saw, though, this trend is going to be bucked because of the relatively lower prices the 6800 will supposedly bring.

Kinda early to tell just yet, I think.
 
At that die size they'll only be getting around 30% yield, so it's a pricey chip! That said, if they're using the 12-pipe trick to make use of some of the remaining 70% as non-Ultra parts, they can probably trade some margin on the sale price of the non-Ultra to bring down the cost of the Ultra...

John
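The binning trade-off John describes can be sketched numerically. The 30% figure is his guess from the thread; the 40% salvage rate and the prices implied are purely illustrative assumptions:

```python
# Dies that fail as full 16-pipe parts may still work with one quad
# disabled and be sold as 12-pipe non-Ultra chips.
wafer_cost = 3000.0   # thread's wafer price assumption
gross_dies = 175      # Dave B's dies-per-wafer figure

full_yield = 0.30     # John's guess: ~30% good as 16-pipe Ultra
salvage_yield = 0.40  # assumed: salvageable as 12-pipe non-Ultra

ultra = gross_dies * full_yield        # 52.5 full parts per wafer
non_ultra = gross_dies * salvage_yield # 70 salvaged parts per wafer

# If only Ultras counted, each would carry the whole wafer cost:
print(round(wafer_cost / ultra, 2))                # 57.14
# Spreading the cost across everything sellable cuts the effective per-die cost:
print(round(wafer_cost / (ultra + non_ultra), 2))  # 24.49
```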
 
Gnep said:
Looks like you are talking about marginal cost, not total cost - ie you are not including any recouping of the R&D expenses to get the GPU as far as the fab.

R&D is a sunk cost, i.e. not factored into ROI. The other costs you mentioned are variable, not marginal. There are no fixed costs to factor in.


The second page of Anand's review had a wafer shot; eyeballing it, there were around 150-160 chips per wafer. The question is, what's the yield, and what kind of deal does IBM have with nVIDIA? Are they paying per wafer or per die that passes bench testing? Given that IBM has been trying to break into the foundry business, I'd bet it's the second option, not the first.
 
Xigen said:
How much does it cost for Nvidia to produce a NV40 in materials only?

Dave B stated that due to the size of the chip, there are 175 dies per wafer.

Due to the complexity and size of the chip, lets assume that each wafer results in only 100 working dies.

Alphawolf said:
$2500-$3300 /300mm wafer on a .13 process.

RussSchultz said:
As Alphawolf said, somewhere in the several $k range per wafer.

Wow, I would have expected a 300mm 0.13u wafer to cost closer to $4000-5000, especially since IBM is not known to be a 'cheap' fab (i.e. price-competitive with TSMC/UMC.) And historically IBM's ASIC delivery contracts are priced on 'delivered known good die (KGD)', not processed whole wafers. But then again, I'm an engineer and not a businessman, so I haven't kept current with the business dealings between NVidia and its foundries -- Someone on this forum stated that NVidia had a KGD-contract for NV3x & TSMC, which is a major departure from TSMC's usual business practice.

http://www.eetimes.com/semi/news/sh...id=D3OVZNHGIB0UKQSNDBGCKHY?articleID=18901773 - IBM reports $150 million loss in IC unit, blames 300-mm fab yields

For Nvidia's sake, let's hope IBM's Fishkill troubles are behind them! Will IBM also be producing the 'mainstream' and 'value' NV4x parts? Or just the flagship high-end NV40?
 
It is the economics of diminishing returns. With every new push forward in GPUs/CPUs, more money is poured in for a smaller gain, which basically means it becomes less and less profitable as time goes on to press forward at the high speed GPUs have lately.
 
asicnewbie said:
...For Nvidia's sake, let's hope IBM's Fishkill troubles are behind them! Will IBM also be producing the 'mainstream' and 'value' NV4x parts? Or just the flagship high-end NV40?
Since they want to ensure high yields and very high output on the value and mainstream parts, I think TSMC would be the logical choice. 0.11µm seems to be doing quite well, btw, but 0.09µm is still a ways off.

But it is hard to tell really; since NV already has all of their design work done based on IBM's process, it would probably be cheaper if they can continue on that (this is based on "knowledge" from their 4/8/12/16-pipeline theory). I don't think that converting designs from one process to the other is THAT easy, but my knowledge of that is very limited, unfortunately.
 
asicnewbie said:
Wow, I would have expected a 300mm 0.13u wafer to cost closer to $4000-5000, ... And historically IBM's ASIC delivery contracts are priced on 'delivered known good die (KGD)', not processed whole wafers.
At $4000-$5000 per wafer with "100%" yield (175 dies), it comes out to roughly $23-$28 per die plus packaging and test.

About the same pricing.
 