NVIDIA GT200 Rumours & Speculation Thread

But the thing is that there's little if any relation between what it costs to build the card and what the card is sold for.
The actual manufacturing cost of a processor isn't very high at all.
What makes chip manufacturing expensive is mostly the development of the design and the investment in the production facilities for these chips. Once all that is in place, a chip doesn't cost anywhere near $400 to produce, let alone $600.
Most of the price of GPUs, CPUs and that sort of hardware is for return on investment, not for the manufacturing costs of the actual physical product.
I'd be surprised if a GTX260/GTX280 chip would cost more than $50 to manufacture.

You just can't compare it with most other products, where you mainly pay for the materials and the labour. A chip is very small, with very little material, but most chips cost more than their weight in gold. And it doesn't take much labour/time to build a chip.
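To make that cost split concrete, here is a minimal sketch of the argument; every number in it is a made-up illustrative assumption, not an actual NVIDIA or TSMC figure.

```python
# Minimal sketch: the sale price mostly recovers fixed costs (design, masks,
# fab investment), not the per-unit manufacturing cost.
# All numbers are made-up illustrative assumptions, not real figures.
development_cost = 400e6       # assumed total design + tape-out + tooling cost
units_sold = 5e6               # assumed lifetime volume for the chip
per_unit_manufacturing = 50.0  # assumed cost to fab, test and package one chip

total_cost_per_chip = per_unit_manufacturing + development_cost / units_sold
print(total_cost_per_chip)     # 130.0 -> still far below a $400-$600 selling price
```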

Guess what, you are wrong. It's well above $100.
 
Most of the price of GPUs, CPUs and that sort of hardware is for return on investment, not for the manufacturing costs of the actual physical product.
I'd be surprised if a GTX260/GTX280 chip would cost more than $50 to manufacture.

My estimates put it at 2-4x that, depending on actual yields and working from very rough wafer costs. There should be fewer than 100 chips to a wafer, and that's before binning them based on defects, clock speed, and TDP.
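For what it's worth, a rough gross-dies-per-wafer check is easy to sketch; the ~576 mm² die size and the standard edge-loss approximation below are my assumptions, not quoted figures.

```python
# Rough gross-dies-per-wafer estimate for a GT200-sized die on a 300 mm wafer.
# Die area and the edge-loss approximation are assumptions, not quoted figures.
import math

wafer_diameter = 300.0   # mm, standard TSMC 300 mm wafer
die_area = 576.0         # mm^2, roughly 24 mm x 24 mm (assumed GT200-class size)

wafer_area = math.pi * (wafer_diameter / 2.0) ** 2
# Classic approximation: naive dies per wafer minus partial dies lost at the edge.
gross_dies = wafer_area / die_area - math.pi * wafer_diameter / math.sqrt(2.0 * die_area)

print(round(gross_dies))  # ~95, consistent with "fewer than 100 chips to a wafer"
```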
 
Even if that were true, it wouldn't change my point. It wouldn't be a problem to sell a $400 card with a $100 chip.
But I don't believe you're right about the $100.

I believe he is; in fact, it's probably quite a bit over $100.

Then add in the PCB costs, memory, etc. The partners' margins are already tiny, so if you want to wipe $200 off the cost of a product, it's NVIDIA who has to take the hit.
 
My estimates put it at 2-4x that, depending on actual yields and working from very rough wafer costs. There should be fewer than 100 chips to a wafer, and that's before binning them based on defects, clock speed, and TDP.

Well, here are some interesting wafer costs: http://www.icknowledge.com/economics/wafer_costs.html

Would be nice to have some actual, recent info from TSMC.
But I don't think it's going to be orders of magnitude off.

Your estimates say they'd cost $100-$200 each...
And fewer than 100 chips from a wafer...
Let's assume 50% yield, which should be very pessimistic.
Then you'd get 50 working chips off a wafer... which would mean you get $5,000-$10,000 revenue out of a single wafer.
So what do you think a wafer costs? I don't think it's anywhere near $10,000 (for nVidia's bulk orders anyway).
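Spelling that arithmetic out, using the same assumed figures as above:

```python
# Revenue-per-wafer arithmetic from the post above (all inputs are the post's assumptions).
gross_dies = 100                            # "fewer than 100 chips to a wafer"
yield_rate = 0.5                            # pessimistic 50% yield
chip_value_low, chip_value_high = 100, 200  # the $100-$200 per-chip estimate

good_dies = int(gross_dies * yield_rate)    # 50 working chips per wafer
print(good_dies * chip_value_low, good_dies * chip_value_high)  # 5000 10000 -> $5,000-$10,000 per wafer
```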
 
When I say no yield issues, I mean that from the perspective of what Nvidia anticipated the yield to be.

The bogus rumor that's been going around for a few weeks, pulled from thin air, was that Nvidia was having yield problems with their 280s.

The bogus claim implied that Nvidia was not meeting their predetermined yield goals for the 280s. That claim is WRONG.

Congratulations to them, NV :D, for achieving their target on the 280's yield.

Anyway, we will see very soon whether the target is met or missed.

Thank you for your kind reply. :smile:
 
Those are for empty wafers, and only down to 90nm.

Funnily enough, TSMC will charge them extra to actually create chips on it.
 
I believe he is; in fact, it's probably quite a bit over $100.

Then add in the PCB costs, memory, etc. The partners' margins are already tiny, so if you want to wipe $200 off the cost of a product, it's NVIDIA who has to take the hit.

Not really. These cards are usually bundled with less memory, slower memory, and in the case of the 8800GTS, the PCB and cooler were slightly smaller as well, cutting more costs.
So yes, nVidia will be the one taking the hit, but no, the actual hit won't be the full $200. Some of it is just 'designed' away by using cheaper parts.
And some of it will be compensated by the fact that the GTX260 will be a higher volume product than GTX280.
 
Well, here are some interesting wafer costs: http://www.icknowledge.com/economics/wafer_costs.html

Would be nice to have some actual, recent info from TSMC.
But I don't think it's going to be orders of magnitude off.

Your estimates say they'd cost $100-$200 each...
And fewer than 100 chips from a wafer...
Let's assume 50% yield, which should be very pessimistic.
Then you'd get 50 working chips off a wafer... which would mean you get $50,000-$100,000 revenue out of a single wafer.
So what do you think a wafer costs? I don't think it's anywhere near $100,000.
Your math is suspect. If a wafer costs ~$8,000 (which is probably about right) and you only yield 50 working chips per wafer, then it costs you ~$160 to make a single working chip. If you turn around and sell those 50 chips for $100, then you've lost money.
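As a quick sanity check on that arithmetic, here it is as a tiny function; the $8,000 wafer cost, 100-die count and yield figures are the thread's assumptions, not confirmed numbers.

```python
# Cost per working chip under the assumptions above ($8,000 wafer, ~100 gross dies).
def cost_per_good_die(wafer_cost=8000.0, gross_dies=100, yield_rate=0.5):
    return wafer_cost / (gross_dies * yield_rate)

print(cost_per_good_die())                # 160.0 -> the ~$160 figure above
print(cost_per_good_die(yield_rate=0.8))  # 100.0 -> what better yields would buy you
```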

Your $50,000-$100,000 is assuming the selling price of a chip is $1000? How will you make a consumer product out of a chip at that price? Maybe you meant $5,000-$10,000, but now you might notice that you're in the price range of a wafer.

-FUDie
 
Even if that were true, it wouldn't change my point. It wouldn't be a problem to sell a $400 card with a $100 chip.
But I don't believe you're right about the $100.

Let's take a different point of view.
G200 has 10 clusters. 9 of them can run at 650. The 10th one has an imperfection and is going to hold the others back at 600. The bigger the die, the bigger the chance that this happens. That's why I believe size matters.
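That intuition matches the standard first-order yield model: the chance that a die picks up at least one defect grows roughly exponentially with area. A minimal sketch, with an entirely assumed defect density:

```python
# Probability that a die has at least one defect, using the simple Poisson
# yield model Y = exp(-D0 * A). D0 is an illustrative assumption, not a TSMC figure.
import math

defect_density = 0.4                  # defects per cm^2 (assumed)
for die_area_mm2 in (200, 330, 576):  # illustrative small / mid / GT200-sized dies
    yield_clean = math.exp(-defect_density * die_area_mm2 / 100.0)
    print(die_area_mm2, round(1.0 - yield_clean, 2))
# Bigger die -> much higher chance that some cluster has a flaw and drags clocks down.
```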
 
Your math is suspect. If a wafer costs ~$8,000 (which is probably about right) and you only yield 50 working chips per wafer, then it costs you ~$160 to make a single working chip. If you turn around and sell those 50 chips for $100, then you've lost money.

Your $50,000-$100,000 is assuming the selling price of a chip is $1000? How will you make a consumer product out of a chip at that price? Maybe you meant $5,000-$10,000, but now you might notice that you're in the price range of a wafer.

-FUDie

Right, should be one 0 less of course. Still, wafer costs at TSMC will be low for nVidia (high volume), and yields won't be as low as 50%, so I still end up estimating ~$50 per GPU (at the start of production; it will probably get lower over time as production matures).
Bottom line: the GPU production cost, even at the most pessimistic estimate here ($200), is only 1/3 of a $600 graphics card... and it doesn't cost $400 to put it on a PCB with some memory and a cooler either.
 
Your math is suspect. If a wafer costs ~$8,000 (which is probably about right) and you only yield 50 working chips per wafer, then it costs you ~$160 to make a single working chip. If you turn around and sell those 50 chips for $100, then you've lost money.

Your $50,000-$100,000 is assuming the selling price of a chip is $1000? How will you make a consumer product out of a chip at that price? Maybe you meant $5,000-$10,000, but now you might notice that you're in the price range of a wafer.

-FUDie

Well, "consumer" products sold at $1,000+ are nothing out of the ordinary. How much for a top-end Intel CPU ?
But, then again, the exact same G200 core can be sold at over $5,000 on a Quadro FX, or even much more in Quadro Plex/Tesla rack systems.
This is 10 times more, in a market segment already dominated by Nvidia. Not exactly loose change, wouldn't you say?
 
Let's have a different point of view.
G200 has 10 clusters. 9 of them can run at 650. The 10th one has an imperfection and is going to hold the others back at 600. The bigger the die, the bigger the chance that this happens. That's why I believe size matters.

Of course size matters, but that's all relative. It all depends on how well nVidia chooses its specs for the GTX260 and GTX280.
It's not a problem at all if the GTX260 outsells the GTX280 by 5:1 or so. That's only natural for cheaper products with better value for money. It would also mean that only one die in five has to be 'imperfection-free' as far as GTX280 specs are concerned. As long as at least 2 out of the remaining 4 also fit the GTX260, you're already at 60% yield on the wafer. If you're lucky, you get 4 out of 5, giving you 80%.
And something like that will probably happen: yields between 60% and 80%, with about 20% of them GTX280.
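The binning arithmetic in that paragraph, spelled out; the 1-in-5 and 2-or-3-of-the-remaining-4 ratios are the post's own assumptions, purely illustrative.

```python
# Binning arithmetic from the post above (ratios are the post's own assumptions).
group = 5   # consider dies in groups of 5
gtx280 = 1  # 1 in 5 comes out 'imperfection-free' -> full GTX280 spec
for gtx260 in (2, 3):  # 2-3 of the remaining 4 still meet GTX260 spec
    usable = gtx280 + gtx260
    print(f"{usable}/{group} usable = {usable / group:.0%}")  # 60% ... 80%
```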
 
Well, "consumer" products sold at $1,000+ are nothing out of the ordinary. How much for a top-end Intel CPU ?
I thought we were talking about graphics cards here.
But, then again, the exact same G200 core can be sold at over $5,000 on a Quadro FX, or even much more in Quadro Plex/Tesla rack systems.
This is 10 times more, in a market segment already dominated by Nvidia. Not exactly loose change, wouldn't you say?
And what is your point? Do you see Nvidia selling chips for $1000 into the consumer segment, or not? I assume not, because the estimated retail price of the 280 is going to be significantly lower than $1000 (~$650), which implies that the chips are going to be sold for far less than $1000.

Are you just looking for an argument?

-FUDie
 
Right, should be one 0 less of course. Still, wafer costs at TSMC will be low for nVidia (high volume), and yields won't be as low as 50%, so I still end up estimating ~$50 per GPU (at the start of production; it will probably get lower over time as production matures).
AMD K8 dual cores from an in-house fab are estimated to cost in the ballpark of $50 per chip for fabrication, testing, and packaging.
I seriously doubt a chip that is massively larger and produced at a foundry is that cheap.

Bottom line: the GPU production cost, even at the most pessimistic estimate here ($200), is only 1/3 of a $600 graphics card... and it doesn't cost $400 to put it on a PCB with some memory and a cooler either.
That $600 would be Nvidia's only if it made and sold its cards directly. Otherwise there are several layers of expenses for its customers and for retail that come out of the MSRP before the revenue reaches Nvidia.
 
Right, should be one 0 less of course. Still, wafer costs at TSMC will be low for nVidia (high volume), and yields won't be as low as 50%, so I still end up estimating ~$50 per GPU (at the start of production; it will probably get lower over time as production matures).
You're dreaming if you truly think G200 will only cost $50 to produce. I'd guess it's much closer to the $160 that I came up with.

-FUDie
 
Of course size matters, but that's all relative. It all depends on how well nVidia chooses its specs for the GTX260 and GTX280.
It's not a problem at all if the GTX260 outsells the GTX280 by 5:1 or so. That's only natural for cheaper products with better value for money. It would also mean that only one die in five has to be 'imperfection-free' as far as GTX280 specs are concerned. As long as at least 2 out of the remaining 4 also fit the GTX260, you're already at 60% yield on the wafer. If you're lucky, you get 4 out of 5, giving you 80%.
And something like that will probably happen: yields between 60% and 80%, with about 20% of them GTX280.

They don't choose; they're forced to lower the clock, and I'm sure the die size has something to do with that. It has a direct impact. Nothing relative about it.
 