NVIDIA GT200 Rumours & Speculation Thread


Because there are higher end models based on the same core which do turn a profit. Makes more sense to sell a low-end chip for below cost than to just throw it in the bin.

If the wafers contained entirely loss-making parts, then the product line would be end-of-lifed pretty sharply! It's also better to sell an AMD chip and hopefully make some money back on a 780G or 790X + 3870 than to have the person buy Intel.
 
And what is your point? Do you see nvidia selling chips for $1000 into the consumer segment, or not? I assume not because the estimated retail price of the 280 is going to be significantly lower than $1000 (~$650) which implies that the chips are going to be sold for far less than $1000.
It's not unheard of for cards to sell for $1000+, though (the 8800 Ultra OC variants come to mind).
 
Because there are higher end models based on the same core which do turn a profit. Makes more sense to sell a low-end chip for below cost than to just throw it in the bin.

But wasn't the whole discussion about the fact that such a strategy would not be valid for nVidia?
But still I really doubt that AMD throws away THAT much money.
Just look. If I buy 100+ of these at Newegg, I pay $42 a piece.
Now, Newegg itself isn't giving them away for free either... so they actually bought it from their distributor for less than $42 a piece.... which probably bought it from AMD for even less.
Let's say that there's an additional $5 per unit somewhere between AMD and the Newegg consumer, very optimistic (I know plenty of products where the price doubles or more from factory to consumer...).
That means AMD sells them for $37 a piece.
And they cost $50 to make... so they want to throw away $13 on every low-end CPU they sell? As in 26% loss? (Where X4200 is probably one of their highest volume parts)...
Inconceivable!
If you said they cost $40 to make... perhaps... but $50 is just too much.
I personally think they'll cost more in the range of $25-$30 to make. And even then they'll probably not turn in much profit if they end up in stores for $42.
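For what it's worth, here is that back-of-the-envelope arithmetic as a quick Python sketch. All the numbers in it are just the assumptions from this post (the $42 retail price, the $5 channel margin, the candidate unit costs), not known figures:

```python
# Quick sanity check of the margin chain above. All numbers are the
# assumptions from this post, not known figures.
retail_price   = 42.0   # Newegg price per unit at 100+ quantity
channel_margin = 5.0    # assumed combined distributor + Newegg margin
asp = retail_price - channel_margin   # what AMD would actually get per chip

for unit_cost in (50.0, 40.0, 30.0, 25.0):
    margin = asp - unit_cost
    print(f"cost ${unit_cost:.0f}: margin ${margin:+.0f} ({margin / unit_cost:+.0%} of cost)")
```

With a $50 unit cost that comes out at roughly a 26% loss per chip, which is exactly why the $50 figure looks implausible to me.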
 
You just can't compare it with most other products where you mainly pay for the materials and the labour. A chip is very small, with very little material, but most of them cost more than their weight in gold. And it doesn't take much labour/time to build a chip.
You're right. Which is why the cost of a chip is set almost entirely by a (relatively simple) function of die area, process type and product maturity.

You can derive the formula easily: number of dies on a wafer; the yield, set by the defect rate; costs of testing and packaging.

The defect rate is the only tricky bit - this is nonlinear: say you have 10 defects randomly distributed over a wafer with 10 dies, you get roughly 35% working dies, but if you have 100 dies on the same wafer with the same number of defects you get around 90% working dies. Then you also have the question of what you can get away with for binning and recovery (but as has been pointed out here many times before, recovery is potentially dangerous to your overall profitability, since the recovered chips always outsell the 'perfect' chips).

Of course, because of the nonlinear function, without information you won't have, you'll never be able to make a particularly accurate guess at the actual die cost - and being out by a factor of 2 makes a massive difference to assessing whether something's a profit or a loss...
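As a rough illustration of that function, here is a minimal Python sketch: gross dies per wafer from die area (with a standard edge-loss approximation), a simple Poisson yield model for the defect rate, plus a flat test/packaging adder. Every input in the example (wafer cost, defect density, packaging cost) is an assumed, illustrative figure, not real data, so treat the output as showing the shape of the curve rather than an actual die cost:

```python
import math

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2,
                      defect_density_per_cm2, test_and_package_cost):
    """Rough cost per working, packaged die.

    Gross dies per wafer uses a standard edge-loss approximation; yield
    uses a simple Poisson defect model, exp(-defect_density * die_area).
    """
    d = wafer_diameter_mm
    gross_dies = (math.pi * (d / 2) ** 2) / die_area_mm2 \
                 - (math.pi * d) / math.sqrt(2 * die_area_mm2)
    yield_fraction = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)
    return wafer_cost / (gross_dies * yield_fraction) + test_and_package_cost

# Example: a large die on a 300 mm wafer. Every input here is assumed and
# purely illustrative -- wafer cost, defect density and packaging cost are
# not known figures.
print(cost_per_good_die(wafer_cost=5000.0, wafer_diameter_mm=300.0,
                        die_area_mm2=576.0, defect_density_per_cm2=0.3,
                        test_and_package_cost=5.0))
```

Halve the assumed defect density or wafer cost and the answer moves by roughly a factor of 2 or more, which is exactly why guessing at someone else's die cost is so unreliable.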
 
But wasn't the whole discussion about the fact that such a strategy would not be valid for nVidia?
But still I really doubt that AMD throws away THAT much money.

You might want to look at their quarterly reports. :LOL:

Just look. If I buy 100+ of these at Newegg, I pay $42 a piece.
Now, Newegg itself isn't giving them away for free either... so they actually bought it from their distributor for less than $42 a piece.... which probably bought it from AMD for even less.

Perhaps you're missing the point. Other cores that come off the same wafers are selling for more. With the lowest-binned parts AMD has two choices: throw them in the garbage, or sell them at a loss. So while they are losing money on the bottom-end chips, they still lose less than by not selling them at all. And the entire wafer can still be profitable, because the higher-binned (5000+) parts still sell at a profit.
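To put that in numbers - a crude sketch with completely made-up bin splits and prices: since the wafer cost is already sunk, selling the lowest bin at any price above its incremental test/packaging cost beats discarding it, even if that bin looks loss-making when the wafer cost is spread evenly over every die.

```python
# Made-up illustration of the sunk-wafer-cost argument: none of these
# bin splits or prices are real numbers.
wafer_cost   = 5000.0   # sunk once the wafer is processed (assumed)
package_cost = 5.0      # incremental test/packaging cost per die sold (assumed)

bins = {                 # bin name: (good dies per wafer, selling price)
    "top bin": (40, 120.0),
    "mid bin": (60,  70.0),
    "salvage": (30,  35.0),   # looks "loss-making" if wafer cost is spread evenly
}

def wafer_profit(selected):
    revenue = sum(bins[b][0] * bins[b][1] for b in selected)
    packaging = sum(bins[b][0] for b in selected) * package_cost
    return revenue - packaging - wafer_cost

print("selling all bins:  ", wafer_profit(["top bin", "mid bin", "salvage"]))
print("discarding salvage:", wafer_profit(["top bin", "mid bin"]))
```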

Let's say that there's an additional $5 per unit somewhere between AMD and the Newegg consumer, very optimistic (I know plenty of products where the price doubles or more from factory to consumer...).
That means AMD sells them for $37 a piece.
And they cost $50 to make... so they want to throw away $13 on every low-end CPU they sell? As in 26% loss? (Where X4200 is probably one of their highest volume parts)...
Inconceivable!
If you said they cost $40 to make... perhaps... but $50 is just too much.
I personally think they'll cost more in the range of $25-$30 to make. And even then they'll probably not turn in much profit if they end up in stores for $42.

I don't know that the x4200 would be a high volume part when you can get 25% more performance for $25 more.

It's quite possible Newegg makes almost nothing on them. A low-priced CPU could reward them with a motherboard sale or other peripherals.
 
You're right. Which is why the cost of a chip is set almost entirely by a (relatively simple) function of die area, process type and product maturity.

The cost of the chip is, but the market price of the chip generally isn't. That's determined by its performance, and by what the competition is doing.
In general the manufacturer with the fastest chips will decide what the price/performance is for the market, and the other manufacturers have to position their chips within the scale set by the leading manufacturer.

A good example is Intel's introduction of the Core2 line. Because Intel delivered more performance for the same price, AMD's CPUs halved in price almost immediately, and continued dropping to a fraction of the original price in only a few months' time.
Obviously this has nothing to do with the production cost, because AMD's manufacturing was already reasonably mature by the time Core2 was introduced, and even if it wasn't, it wouldn't improve that much, that quickly. For example, the 6000+ tumbled from about $1000 to $500 quickly, and continued to drop... today it's sold for $139, while it is still basically the exact same chip as it was when Core2 was introduced.

A similar example is the recent 8800GTS vs 2900XT battle. While technically the 2900XT is more or less in the same league as the 8800GTX in terms of manufacturing, they had to sell it at 8800GTS price levels because of its performance.

Apparently the leading manufacturer generally has a comfortable profit margin built into their products, and they can have pretty spectacular price drops if required, without going bankrupt.
GPUs don't have such large margins as CPUs, but the market leader is still pretty comfortable. nVidia turns in very healthy numbers every quarter.
 
Just for fun - I like nVidia's reviewer's guides. They are pretty inconsistent and unfinished for every major launch. The GTX 200 reviewer's guide is again quite inconsistent in its marketing names, but nothing compared to the old G80 slides... do you remember?



This one was my favourite... it's quite topical these days :)

 
Don't underestimate the price differential of top GDDR3 ICs between early 2007 and today... ;)
I think my suggestion was that for the G80 to come into the $299 price range, Nvidia had to wait 4 months ..

Are you sure? Are you working for TSMC?

Hopefully "no yield issues" does not mean 0% defects! If it did, why bother producing the 260 at all, since NV could produce plenty of 280s and earn more profit :cool:
He's claiming to be under NDA, but his post history suggests otherwise!
 
Given that all signs point to G92b as a strictly midrange product for the 9800 GT (meaning, no 8800 GTS 512MB/9800 GTX-like variants), I'd say there's still quite a gap to be filled between $199 and $399~449.
Especially the 9800 GTX, because I believe it's a very short-term solution, just like the 9800 GX2.

There could well be a new G92b based GX2 product to fill in that gap - it would make sense if the G92b is smaller, cheaper and cooler, making the GX2 a more viable sub-G200 gap filler.

It would be interesting if G92b also launches on the 17th. I wonder how far behind the smaller GT200-derived parts with fewer than 240 SPs are.
 
I think my suggestion was that for the G80 to come into the $299 price range, Nvidia had to wait 4 months ..


He's claiming to be under NDA, but his post history suggests otherwise!

I am not claiming to be under NDA. In fact I am not under NDA.
 
A few years ago I was reading on a forum (http://realworldtech.com/ ) that the initial K8 yield was ~60% and AMD struggled to get it up to 80%. Given the source, I do believe that... and I somehow doubt NV + TSMC can get even 50% on the first batches of G2xx. Then again, maybe the rumoured 6-month delay is not just a rumour and they used the time to improve yields... they had the time...
 
You have to realize that for TSMC, 65 nm is not a new process (nor for nVidia, by the way; their current G92 is built on this process as well). They've been using it for years and it's very mature by now.
Aside from that, AMD is not a good example, because they have often struggled with their manufacturing.
 
You have to realize that for TSMC, 65 nm is not a new process (nor for nVidia, by the way; their current G92 is built on this process as well). They've been using it for years and it's very mature by now.
Aside from that, AMD is not a good example, because they have often struggled with their manufacturing.

Yes, there's not much reason to compare G200 to K8, especially when you consider that G200 is 3x the size.
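To put a rough number on the die-size point (using the same simple Poisson yield model as earlier in the thread, an illustrative defect density, and approximate die areas - this says nothing about the real K8 or GT200 yields):

```python
import math

defect_density = 0.3   # defects per cm^2 -- assumed, purely illustrative
dies_cm2 = {"K8-sized die (~193 mm^2)": 1.93,
            "GT200-sized die (~576 mm^2)": 5.76}   # approximate areas

for name, area in dies_cm2.items():
    print(f"{name}: yield ~ {math.exp(-defect_density * area):.0%}")
```

Because exp(-3x) = exp(-x)^3, tripling the die area cubes the (sub-unity) yield under this model: a defect density that gives a K8-sized die ~56% yield gives a GT200-sized die only ~18%.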
 
Fudo comments on pricing:

http://www.fudzilla.com/index.php?option=com_content&task=view&id=7888&Itemid=1

Price wars have already started. Nvidia already dropped GTX 280 to $499 and it looks that GTX 260 will sell for $399.

Nvidia is still telling its partners that $399 is the launch price for Geforce GTX 260 but it looks that this card will quickly go down to $349. With $349, GTX 260 looks much more attractive and it will do well against $329 priced Radeon HD 4870 but as we suggested once ATI feels threatened it will simply drive the prices down.

Nvidia sells Geforce 9800GTX at $299, which means if GTX260 price goes down, Nvidia has to drive 9800GTX down too. Radeon HD 4870 beats 9800GTX and there is your problem. At this point, at least when it comes to the average selling price of Nvidia cards, we simply have to say DAAMIT what have you done.

See also: http://www.fudzilla.com/index.php?option=com_content&task=view&id=7889&Itemid=1

and: http://www.fudzilla.com/index.php?option=com_content&task=view&id=7890&Itemid=1
 