NVIDIA GF100 & Friends speculation

Here are the results from Damien's Barts review: http://www.hardware.fr/medias/photos_news/00/29/IMG0029787.gif

At 800/1000, the GTX 460 is just 6% slower than the 470 (1920×1200 AA4X) and 1% slower than the 6870.

The GTX 560 could be up to 17% faster than that: (384/336)×(820/800) = 1.17. Let's call it 10% faster.
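Spelled out, that back-of-the-envelope estimate looks like this (a quick sketch; the 384 SP / 820 MHz figures are the leaked specs, not confirmed, and the 50% shader-scaling factor is just my guess):

```python
# Rumored GTX 560 vs. a GTX 460 at 800/1000.
# Leaked specs (unconfirmed): 384 SPs @ 820 MHz vs. 336 SPs @ 800 MHz.
sp_ratio = 384 / 336      # ~1.143: one extra SM's worth of shaders
clock_ratio = 820 / 800   # ~1.025

ceiling = sp_ratio * clock_ratio
print(f"Theoretical ceiling: {ceiling:.2f}x")        # ~1.17x

# Games rarely scale linearly with shader count; if the extra shaders
# deliver only ~half their paper gain:
realistic = (1 + 0.5 * (sp_ratio - 1)) * clock_ratio
print(f"More realistic guess: {realistic:.2f}x")     # ~1.10x
```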

And here, the 6950 is just 7% faster than the 470.

So I'm expecting HD 6870 < GTX 470 < GTX 560 < HD 6950, all within a ~12% performance range, going by that last graph. I guess drivers could help Cayman, though.

The question then remains: what are the prices? If these are GTX 460 replacements, then they won't want them priced close to the 6950, and especially not the GTX 570. And if they're priced near the GTX 460, then who will need a 3-billion-transistor GTX 570?
 
Out of curiosity, what are the available facts that show nVidia is losing money on the 460? All I see is people disagreeing based only on their own presumed omniscience.

Some people didn't realize that this article was the work of a comedian, especially the part about $24 for GDDR5, $15 for a PCB (WTF?), $10 for a heatsink, $10 for packaging and accessories (WTFx2? Do they ship in gold plated boxes with Monster cables?), $5 for assembly and testing and $7 for misc other components (WTFx3?).

Unbelievable that an editor allows this kind of BS to be published.

Change the $71 figure to something reasonable (~$35?) and you have a product that's highly profitable.
 
Why do you think those prices are out of bounds? Do you have any experience or better sources?
 
Why do you think those prices are out of bounds? Do you have any experience or better sources?
I know the production cost of other mass-produced electronic widgets. $35 may be a bit aggressive, but $71 is ridiculous.

Let's kick $4 off the RAM for a still generous $20.
$5 off the PCB.
$8 off the packaging and accessories. (What accessories? A $0.30 DVI cable?)
$3 off the misc components and you're already down to $51.
(No idea about the cooler, but $10 seems generous too.)
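Tallying those adjustments against the article's numbers (all figures in USD and purely illustrative; nobody outside Nvidia and its partners has the real BOM):

```python
# The article's claimed per-card costs vs. my adjusted guesses (USD).
article  = {"GDDR5": 24, "PCB": 15, "cooler": 10,
            "packaging/accessories": 10, "assembly/testing": 5, "misc": 7}
adjusted = {"GDDR5": 20, "PCB": 10, "cooler": 10,
            "packaging/accessories": 2, "assembly/testing": 5, "misc": 4}

print(f"Article total:  ${sum(article.values())}")   # $71
print(f"Adjusted total: ${sum(adjusted.values())}")  # $51
```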

Back in the day, the GTS 250 1GB launched at $150, with a PCB and cooler roughly the same size as the GTX 460's. It typically sold for around $130. Unless you believe that Nvidia loves to introduce products that can barely make a profit, that should give some indication that $71 is not realistic.
 
One instance: most reference GTX 460s ship with a mini-HDMI to HDMI converter (the EVGA one I have is gold plated as well); that adapter alone is going to cost far more than a few cents.
 
Why do you think those prices are out of bounds? Do you have any experience or better sources?

I think in this situation one should tend toward the common sense assumption that companies don't sell millions of a product at a loss just for kicks. It's the people who are claiming the product is selling at a loss that bear the burden of proof this time around.
 
I think in this situation one should tend toward the common sense assumption that companies don't sell millions of a product at a loss just for kicks. It's the people who are claiming the product is selling at a loss that bear the burden of proof this time around.

um... companies sell products for a loss all the damn time. (Because not selling at all is worse than selling for a loss).
 
um... companies sell products for a loss all the damn time. (Because not selling at all is worse than selling for a loss).

That also depends on how much the company has in the bank, what effect the total loss will have on your financial results, and whether or not your shareholders can handle the drop in profit or ASP quarter after quarter. At some point something will give: you either go bust, your share price drops, you get bought out, or you stop selling at a loss.
Retaining market share (which is why a company would want to sell at a loss in the first place) is not the end goal. Remaining in business at a healthy profit is.

As to NVIDIA specifically, they have an uncanny ability to make money. I am sure they wouldn't be in any real difficulty selling at a loss short term, but if anyone thinks they make a habit of it, I really can't see any proof of that.

I reckon NVIDIA still make a tidy profit on the 460 ...
 
Fair point about bandwidth. Let's look at it differently. Compared to the GTX 470 (which is a tad faster than the 6870), and assuming that the leaked specs are true (384 SPs, 820MHz) and that my math is correct, the GTX 560 would have:
You're comparing against the wrong target. Just look at the 460s, and let's use the hardware.fr data you pointed to:
http://www.hardware.fr/articles/804-23/dossier-amd-radeon-hd-6870-6850.html

An 18.5% clock speed increase gives the 460 OC a 15% boost. A 16% clock speed increase and a 17% shader count increase give the 6870 a 15% boost over the 6850. Since clock speed should scale equally on ATI and Nvidia, it's clear that the extra shader units aren't doing much in this suite of tests. A 6870 clocked at 6850 speeds would perform maybe 3-4% faster than the 6850.
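As a rough sanity check on that residual figure (a sketch only; it assumes the clock-scaling efficiency derived from the 460 OC carries over to ATI):

```python
# How much of the 6870's gain over the 6850 is left for its extra shaders
# once the clock bump is accounted for (hardware.fr aggregate numbers)?
clock_eff = 0.15 / 0.185             # ~0.81: 460 OC turned +18.5% clock into +15% perf

from_clock = 1 + clock_eff * 0.16    # 6870's +16% clock -> roughly +13% perf
from_shaders = 1.15 / from_clock     # measured +15% total gain over the 6850
print(f"Residual shader contribution: {(from_shaders - 1) * 100:.1f}%")
# ~2%, i.e. the same ballpark as the 3-4% estimate above.
```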

The 560, assuming these specs are correct, is going to be roughly the same speed as the 6870. At hardware.fr, it'll probably be a bit faster. Other sites, maybe a bit slower.
 
Err, if SPs meant nothing, then why are the most powerful chips the ones with the most SPs?

If you seriously believe a near-20% increase in the number of SPs will be worth only 3-4%, then go ahead, but I seriously doubt it.

My best estimate is GTX560 coming in 20-30% faster than GTX460, with the bigger gains in DX10 games. The new order will be:

6990 > GTX580 > 6970 = GTX570 > 6950 > GTX 560 > 6870 > GTS550 > 6850

ATI will have the fastest card, while Nvidia will have faster solutions in the single-chip categories, albeit with bigger die sizes. If Nvidia can imitate PowerTune and get a dual-chip card going, they will take the top-card crown too, but even so it will be a pretty crazy card.
 
Err, if SPs meant nothing, then why are the most powerful chips the ones with the most SPs?

If you seriously believe a near-20% increase in the number of SPs will be worth only 3-4%, then go ahead, but I seriously doubt it.
For Cypress, this is true. I can't find it now, but there was an article running the HD 5850 and HD 5870 at the same clocks. Outside of synthetics, the performance difference was 0.5-4% (for 10% more shaders), with an average of about 2% IIRC. Or just look at Barts: if more SPs meant much, it would be nowhere near Cypress performance. (Don't forget, those 10% extra shaders use nowhere near 10% of the die area, so it's not quite as bad as it sounds.) The same is true for Cayman, by the way, though I haven't seen anyone running the HD 6950 and HD 6970 at the same clock (but that SIMD scaling must be near nonexistent follows from the clock and performance difference between the two).
That said, I think there's reason to believe GF104/GF114 (or GF100/GF110, for that matter) scale better with more SMs. I was assuming something like roughly 50% scaling, so for that additional SM (+14%) you'd indeed land somewhere around a 10% improvement once the clock bump is included.
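To put that assumption in numbers (the 50% SM-scaling efficiency is purely my guess, and the SP counts and clocks are the leaked specs):

```python
# Expected GTX 560 gain if the extra SM scales at only ~50% efficiency.
sm_gain = 384 / 336 - 1          # ~0.14 from the rumored eighth SM
scaling = 0.5                    # assumed scaling efficiency (my guess)
from_sm = 1 + scaling * sm_gain  # ~1.07
overall = from_sm * (820 / 800)  # fold in the rumored clock bump
print(f"Overall gain: ~{(overall - 1) * 100:.0f}%")  # ~10%
```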
 
um... companies sell products for a loss all the damn time. (Because not selling at all is worse than selling for a loss).

Care to quantify "all the damn time" and explain what it has to do with proving nVidia's 460 is losing money for the firm? Selling a product at a loss is an undesirable and temporary situation - it doesn't happen "all the damn time" at all in the grand scheme of things. It certainly doesn't happen just because people think they can understand the economics of GPU die sizes with zero actual cost data. It's interesting how those popular theories don't explain AMD's relatively unimpressive GPU profits.
 
um... companies sell products for a loss all the damn time. (Because not selling at all is worse than selling for a loss).

No they don't. It happens, but if you do it all the time you go bankrupt. It is pretty obvious that they don't do it all the time.
 
True. You are missing the biggest fail of them all, though: AMD. They went from tech giant to near also-ran because they sold all of their consumer chips at a loss from 2006-2009 trying to keep up with Intel, and ended up spinning off their foundry business and seeking investment from the Middle East. Hector almost ruined them trying to maintain market share. This is what happens to a company that sells its products at a loss for too long.

Charlie has insinuated that Nvidia have been selling their whole lineup since the GTX 285 at a loss. I very much doubt that is true, and other people here have cast doubt on it for good reason, i.e. Nvidia have made a decent amount of profit, and the loss they did post was related to asset and inventory write-downs.

On a more general note, I would like to point out that any company that rests on its laurels will pay for it. AMD paid for it dearly and still are to some extent. They still haven't got an answer to Core, and all the mindshare, market share and profit they earned with Athlon turned to dust on the 14th of July 2006, all because they thought Intel were out of it and didn't bother with a new architecture, believing Athlon was the best. Have they learned from that experience? I'm not sure. ATI have, and they seem very proactive about keeping up, but AMD don't seem to have: delay after delay for Fusion has allowed Intel to steal the initiative on APUs (albeit more basic ones), and they can't seem to get Bulldozer out of the door while Intel have just shipped Sandy Bridge, their second 32nm processor architecture.

Nvidia seem very wary of making the same mistakes; that is why they have expanded into HPC and SoCs, areas where the traditional CPU makers are completely out of the game. Moorestown is terrible from what I have seen, and AMD don't have anything in that range, while Nvidia are inking deals to supply 10-15 devices with Tegra 2 next year, increasing their non-traditional revenue and protecting them further from market shocks (ATI hitting one out of the park and leaving them busted like AMD after Core).
 