Looking on newegg, half of them (13) have dual fan coolers and 9/26 are clocked over 900 MHz. That does not come cheap.
The clocks are irrelevant. A fan is dirt cheap.
"Making it look a lot better than it was" is easy enough: just check the reviews that pitted overclocked cards against AMD's stock cards. We even had Tom's increase the clocks on a stock card in a review vs Barts "because so many of them on newegg had overclocks."
I have no idea how this kind of 'reasoning' leads to a conclusion that a GF114 is sold with negative GMs.
Why didn't they just go with higher clocks?
Why didn't the 7970 go with higher clocks?
Why would that be ridiculous? Check out AMD's graphics segment revenues and you'll see they've been struggling to break even most of the year. This is with generally smaller chips. Why would Nvidia be any different? There is not a lot of money in consumer graphics cards when there is an ongoing price war.
When people don't use the right terminology, it's often an indication that they don't really know what they are talking about either, and it's a drag to deal with the imprecision and resulting confusion.
I've worked for many, many years for fabless chip companies. Pretty much without exception, they never made a profit. Yet without exception, gross margins were in the 40% range. Some of the CEOs may not have been brilliant, but they were not total idiots either: nobody deliberately sells silicon for a lower price than needed.
The reason companies (or, in this case, a division) don't make a profit is because of the huge NRE involved in bringing a chip into existence. But once it's there, it's pretty cheap to produce and you basically hope that you sell enough of them to recoup the NRE.
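The NRE-vs-margin point can be made concrete with some made-up numbers (none of these are actual Nvidia or AMD figures, just an illustration of the arithmetic):

```python
# Illustrative numbers only -- not actual Nvidia/AMD figures.
nre = 50_000_000          # one-time design/mask/validation cost ($), assumed
unit_price = 150          # selling price per chip ($), assumed
unit_cost = 60            # marginal production cost per chip ($), assumed

# Gross margin looks great per unit sold...
gross_margin = (unit_price - unit_cost) / unit_price

# ...but the division is still in the red until enough units ship.
break_even_units = nre / (unit_price - unit_cost)

print(f"gross margin: {gross_margin:.0%}")               # 60%
print(f"units to recoup NRE: {break_even_units:,.0f}")   # 555,556
```

Healthy gross margins and an unprofitable division are perfectly compatible: the per-unit margin is positive, but the fixed NRE hasn't been amortized yet.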
It's funny that you're using GF114 as your example, because semiaccurate, of all places, did your exercise for GF104. This was discussed at length on Beyond3D, which is why it's better to move this discussion to PM.
First observation: even semiaccurate admits that you can break even on a GF104 based card. GF114 is essentially the same die, so the same thing applies.
But we are now 20 months later: yields on 40nm are amazing and wafer cost has gone down. If you factor that in, even by their numbers, a GF114 die can be sold very profitably.
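The "yields up, wafer cost down" effect on die cost is easy to sketch. Every number below is a guess for illustration, not a sourced figure:

```python
# Rough sketch of per-die cost on a mature process; all numbers are assumptions.
wafer_cost = 5000         # mature 40 nm wafer cost ($), assumed
gross_dies = 150          # GF114-size die candidates per 300 mm wafer, assumed
yield_rate = 0.85         # mature-process yield, assumed

# Cost of one good die = wafer cost spread over the good dies on it.
cost_per_good_die = wafer_cost / (gross_dies * yield_rate)
print(f"cost per good die: ${cost_per_good_die:.2f}")
```

The point of the formula is the leverage: die cost falls with both rising yield and falling wafer price, so a number computed 20 months ago overstates today's cost on both axes at once.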
But here's the kicker: semiaccurate got it wrong, as usual. I've been involved in productization of consumer electronics gadgets:
- their PCB cost is a riot. A Chinese manufacturer can sell you a similar-size 10-layer PCB for $5, not $10. Nvidia can probably get it quite a bit lower.
- GDDR5 RAM does not cost $24. It's probably $15 or less.
- a dual fan heatsink? My guess is $7.
- packaging and accessories $10? Are you kidding me? How about $3?
If you ever have the misfortune to go to Shenzhen, you should go to the SEG Electronics Market and the surrounding shops, all on the same street. I did. It's where thousands of Chinese manufacturers sell their wares: one will only sell HDMI cables ($0.50?), another only fans, etc.
It's all dirt cheap, in volume.
Do the exercise: add the numbers. See how misguided your premise is.
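Doing the exercise with the corrected component estimates above (plus an assumed ~$40 for the die itself, which is my number, not sourced) looks like this:

```python
# BOM sketch using the corrected estimates; the die cost is an added assumption.
bom = {
    "die (GF114, mature 40 nm)": 40,   # assumed figure, not from the post
    "PCB (10-layer, Chinese manufacturer)": 5,
    "GDDR5 RAM": 15,
    "dual-fan heatsink": 7,
    "packaging & accessories": 3,
}
total = sum(bom.values())
print(f"estimated BOM: ${total}")   # $70
```

Even with a generous die cost, the bill of materials lands well under the street price of a 560 Ti class card, which is the whole point: there is room for positive gross margin.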
If you consider that the 6950 is selling for a bit more than the 560, and that AMD has barely made any profit in graphics most of the year, then by your reckoning they must be losing money on all of their bottom-end cards? I mean, if you really believe that these $200+ cards are making them a fortune, then there must be a loss elsewhere, right?
The loss is in NRE, marketing, buildings, whatever. It's all in the open: just read their 10-K statement. What's without question is that the gross margins on high-end silicon are much higher than for low-end silicon. This has been stated again and again in conference calls.
How many people do you think buy 560's believing they are 560 Ti's?
Not many.
How many people do you think buy 560 Ti's with single fan coolers and cheap components thinking that they all hit 950 MHz easily?
How many people do you think overclock in the first place?