GF100 evaluation thread

Whatddya think?

  • Yay! for both: 13 votes (6.5%)
  • 480 roxxx, 470 is ok-ok: 10 votes (5.0%)
  • Meh for both: 98 votes (49.2%)
  • 480's ok, 470 suxx: 20 votes (10.1%)
  • WTF for both: 58 votes (29.1%)

  Total voters: 199
  Poll closed.
Yield numbers in the 70% range are what I've heard through the grapevine, and make far more sense economically than some of the more ridiculous estimates out there.
70% sounds ridiculously high, at the opposite end of the spectrum. Wasn't Cypress claimed to be yielding ~60%?

Also, I would be wary of any grapevine propagated by IHV employees themselves, and for good reason.
 
Yield numbers in the 70% range are what I've heard through the grapevine, and make far more sense economically than some of the more ridiculous estimates out there.

No way the yields are that high. GT200, a die nearly as large, was yielding ~62% on a ~2-year-old process. On 40nm...
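For a rough sense of why a GF100-sized die on early 40nm is unlikely to out-yield GT200 on a mature process, here is a quick sketch of the classic Poisson yield model. The die sizes are approximate public figures; the defect densities are made up, picked only so that Cypress lands near the ~60% quoted in this thread, so treat the output as an illustration, not real fab data.

Code:
    import math

    def poisson_yield(die_area_mm2, defects_per_cm2):
        # Classic Poisson yield model: Y = exp(-A * D0)
        die_area_cm2 = die_area_mm2 / 100.0
        return math.exp(-die_area_cm2 * defects_per_cm2)

    # Approximate die sizes; defect densities are illustrative guesses only.
    for label, area_mm2, d0 in [
            ("GT200 (~576mm^2, mature 65nm)", 576, 0.08),
            ("Cypress (~334mm^2, early 40nm)", 334, 0.15),
            ("GF100 (~530mm^2, early 40nm)", 530, 0.15)]:
        print(label, "->", round(poisson_yield(area_mm2, d0) * 100), "% yield")

With the same made-up defect density that puts Cypress around 60%, a ~530mm² die comes out in the mid-40s, which is why 70% sounds so optimistic.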
 
Earlier trash talking aside, I find the discussion about "one-chip GPUs vs. two-chip GPUs" deeper than it seems. What I find interesting is that those who consider the HD5890 setup a valid option/comparison against the GTX 480 don't push their thinking further (I consider the comparison valid, but that's not my point). It has funny implications: if the aforementioned comparison is valid, we can almost state that AMD/ATI should have passed on Cypress altogether :oops:
I re-hacked through reviews of HD5750/HD5770 in CrossFire and the results are indeed pretty positive (by the way, I think the HD5750, alone or in CrossFire, offers the best bang for the buck of this "gen" of cards). One could come to the conclusion that ATI could have produced an HD5750 X2 and an HD5770 X2 instead of the HD5850 and HD5870, respectively.

Back on earth, it would actually mean that ATI had given up on the highest part of the market (I don't know how a CrossFire setup of X2 cards would scale, but I assume badly...), though honestly those are what I would call unreasonable setups. Still, I think ATI should have done this and gone even further with their sweet-spot strategy. It might even have allowed ATI to add back some tweaks that were removed from the rumoured original Cypress project while still offering a >200mm² chip. ATI could then have reallocated the savings from the canned Cypress development to CrossFire support and game support; going forward, I'm not sure it would have been a bad call.
 
70% sounds ridiculously high, at the opposite end of the spectrum. Wasn't Cypress claimed to be yielding ~60%?

Also, I would be wary of any grapevine propagated by IHV employees themselves, and for good reason.
Whatever the reality, with the GTX 480 cards selling at about $500, and the GTX 470 cards selling at about $350, we can sort of guestimate that nVidia might sell these chips at around $150-$200 or so. If the die size is ~500mm^2, then there would be around 45 chips per 300mm wafer. If the cost per wafer is ~$5000, and the yields are 70%, then the cost per chip would be ~$160/chip. This would just barely allow nVidia to sell these chips at a profit, provided the 70% yield rate is the rate for GTX 470-capable chips, but some appreciable fraction of those can be sold as GTX 480's.
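To make the arithmetic behind that guesstimate explicit, here is the same back-of-the-envelope calculation spelled out; every input (wafer price, candidate dies per wafer, yield) is just the assumption from the paragraph above, not a known figure.

Code:
    def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
        # Spread the whole wafer cost over the sellable dies only.
        return wafer_cost / (dies_per_wafer * yield_rate)

    # Same assumptions as above: ~$5000 per wafer, ~45 candidate dies,
    # and 70% of them good enough to ship as at least a GTX 470.
    print(round(cost_per_good_die(5000, 45, 0.70)))   # ~159 dollars per chip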
 
Whatever the reality, with the GTX 480 cards selling at about $500, and the GTX 470 cards selling at about $350, we can sort of guestimate that nVidia might sell these chips at around $150-$200 or so. If the die size is ~500mm^2, then there would be around 45 chips per 300mm wafer. If the cost per wafer is ~$5000, and the yields are 70%, then the cost per chip would be ~$160/chip. This would just barely allow nVidia to sell these chips at a profit, provided the 70% yield rate is the rate for GTX 470-capable chips, but some appreciable fraction of those can be sold as GTX 480's.

Who says Nvidia is selling at a profit? You're making a lot of assumptions there, and it wouldn't be the first time they spent money to keep marketshare.
 
What's the point of marketshare if not to earn profit?

Paying for it gives you breathing space in the short term while you sort out your yields, keeps your marketing rolling, leaves your partners with something to sell. At some point it becomes better to sell anything, even at no profit, than to cede the market to your competitors and watch your partners and the market move over to the opposition. Nvidia is spending money in the short term to try and make money down the line.

You can't backtrack from the retail price of the 470/480 (which we know is under pressure because the performance of the 470/480 is not much better than ATI's second- and third-tier products), work out how good a yield they would need to break even, and then claim that number must be the yield. Business, the market, yields, and pricing don't work that way.
 
Paying for it gives you breathing space in the short term while you sort out your yields, keeps your marketing rolling, leaves your partners with something to sell. At some point it becomes better to sell anything, even at no profit, than to cede the market to your competitors and watch your partners and the market move over to the opposition. Nvidia is spending money in the short term to try and make money down the line.
The thing is, though, there's almost no lock-in for GPU's, and game developers don't pay license money to IHV's. Let's take a scenario, for example, where we have the following situation:

1. nVidia has a chip, but low yields. Can sell chips at some loss.
2. Three months later, ATI releases a product that is better in every way (price, performance, power, features, etc.).
3. Three months after that, yields on nVidia's original chip are finally to the point that they could have sold it at a profit against ATI's part, but now they have to lower their price and continue making a loss for a while longer.

Or, this situation:
1. nVidia has a chip, but low yields. Can sell chips at some loss.
2. ATI has no new parts.
3. Six months later, yields on nVidia's original chips are finally to the point that they can sell it at a profit. But now they have to make up for six months of losses.

In other words, selling chips at a loss in the current GPU market is basically a lose-lose situation. You can make the money back if you're selling a closed platform, like Sony did with the PS3, because there they earn most of their money on game sales (some portion of console game sales goes to the console maker). There's also a good amount of "lock-in" for the next generation because only PS3's are backwards-compatible with PS2's, Xbox 360's with the original Xbox, and so on. By contrast, ATI and nVidia hardware are, to a large degree, interchangeable with one another.
 
Who says Nvidia is selling at a profit? You're making a lot of assumptions there, and it wouldn't be the first time they spent money to keep marketshare.

And aren't you making the exact opposite assumption? That they aren't even making a profit?
Also, by "they spent money to keep marketshare", I'm guessing you are referring to the "GT200 vs RV770" era, but those "assumptions" of yours weren't really backed up by the actual financials, were they?

That they didn't make as much money on each GT200 chip compared to RV770, there's no denying, but assuming that there was no profit at all is assuming way too much, don't you think? (Curiously, the same thing you accuse someone else of doing.)
 
In other words, selling chips at a loss in the current GPU market is basically a lose-lose situation. You can make the money back if you're selling a closed platform, like Sony did with the PS3, because there they earn most of their money on game sales (some portion of console game sales goes to the console maker). There's also a good amount of "lock-in" for the next generation because only PS3's are backwards-compatible with PS2's, Xbox 360's with the original Xbox, and so on. By contrast, ATI and nVidia hardware are, to a large degree, interchangeable with one another.


Again, you're just making guesses and still not addressing the point that you can't work backwards from a retail price, assume a given profit margin, and derive the yields from that. You can do all the handwaving you like, but you know businesses charge what the market will pay, not what they want to get. Even based on your assumptions, breaking even on chip manufacture is actually losing money on a product overall.

In case you hadn't noticed, Fermi is very late to market. Previous generation high-end Nvidia products were all but EOLed before Christmas. Their mainstream products are a rebadge of a rebadge, and are not DX11. ATI has had a full range of products on the go for the last half year.

If there's any time Nvidia's going to take some short term financial pain to get their faces back into the market, it's going to be now. It may not be financially desirable, but they obviously consider it to be worth the cost at this time instead of leaving the market to ATI until the end of the year and a B1 spin of Fermi that has little life because it runs into ATI's refresh/next gen products.

Maybe they think it's preferable to get something out of Fermi now, rather than have it blown away by ATI's next product at the end of the year by the time Fermi gets fixed.
 
:D
I was making a funny

And while we're at it, the graphics performance wouldn't be an issue for the blind, and the price wouldn't be an issue for the rich! I could go on all day!

Yeah I figured :) Was a great comment.
 
Again, you're just making guesses and still not addressing the point that you can't work backwards from a retail price, assume a given profit margin, and derive the yields from that. You can do all the handwaving you like, but you know businesses charge what the market will pay, not what they want to get. Even based on your assumptions, breaking even on chip manufacture is actually losing money on a product overall.

In case you hadn't noticed, Fermi is very late to market. Previous generation high-end Nvidia products were all but EOLed before Christmas. Their mainstream products are a rebadge of a rebadge, and are not DX11. ATI has had a full range of products on the go for the last half year.

If there's any time Nvidia's going to take some short term financial pain to get their faces back into the market, it's going to be now. It may not be financially desirable, but they obviously consider it to be worth the cost at this time instead of leaving the market to ATI until the end of the year and a B1 spin of Fermi that has little life because it runs into ATI's refresh/next gen products.

Maybe they think it's preferable to get something out of Fermi now, rather than have it blown away by ATI's next product at the end of the year by the time Fermi gets fixed.
You're not addressing my central complaint to your assertions: What profit can be earned by selling at a loss?
 
Just wanted to throw this out since I was being called out. I picked a random review (Firingsquad) and, after running the numbers through it, the results were:

GTX480 54% faster than GTX285
5870 60% faster than 4890

By the same yardstick by which Cypress was concluded to be "meh", GF100 is less than even a "meh" and into "WTF" territory, since its gain is even smaller :!: Unless the obvious double standards show up again. :devilish:

You are forgetting something here. AFAIK Cypress hit its target clocks and has all SIMDs enabled. GF100 didn't hit its target clocks, has one cluster disabled, and is perhaps missing 64 TMUs. Pack all that back into the package and the number would probably be much higher than that.

You can say it was too ambitious for the process node it was built on, no questions asked about that. But you can't deny the desired objective was higher than that. Meanwhile, Cypress fulfilled all it was intended to fulfill, at the right time, and still it was "only" 60% faster.
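For clarity, the gen-over-gen percentages quoted above are just relative averages over the review's benchmark results; here is a toy version of that arithmetic with made-up fps numbers (not Firingsquad's actual data), just to show how such a figure is derived.

Code:
    def avg_speedup(new_fps, old_fps):
        # Mean of the per-benchmark ratios, expressed as "x% faster".
        ratios = [n / o for n, o in zip(new_fps, old_fps)]
        return sum(ratios) / len(ratios) - 1.0

    # Made-up fps numbers purely for illustration, not review data.
    gtx480_fps = [62, 46, 80]
    gtx285_fps = [40, 30, 52]
    print(f"{avg_speedup(gtx480_fps, gtx285_fps):.0%} faster")   # -> 54% faster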
 
You're not addressing my central complaint to your assertions: What profit can be earned by selling at a loss?

Have you not heard of loss leading or selling at a loss to gain market share (which you then monetize down the line)? These are both common techniques used in business.

You still haven't addressed my original complaint to your assertions: You cannot calculate yields based on retail price and assumptions of profit margins on individual products.
 
Also, there is the cost of the R&D to cover.

I am sure even Cypress is technically selling at a loss until enough of them are sold to cover the initial development costs. Every Fermi sold contributes towards covering that cost.
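As a rough illustration of that point, with purely hypothetical numbers (none of these figures are known): each chip sold contributes its margin over manufacturing cost towards the fixed R&D bill, which is only recovered after a certain volume.

Code:
    def units_to_recover_rnd(rnd_cost, asp, unit_cost):
        # Chips that must be sold before per-chip margin pays back the R&D.
        return rnd_cost / (asp - unit_cost)

    # Hypothetical figures for illustration only.
    print(units_to_recover_rnd(rnd_cost=500e6, asp=200, unit_cost=160))
    # -> 12500000.0 chips before the design itself is paid off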
 
What's the point of marketshare if not to earn profit?

It is easier to compete when you have more market share, even if you are paying to keep that market share.

Ask MS/Intel. They have a _long_ history of buying market share to prevent newer competitors from eating their lunch.

Besides, the yield issue shouldn't take long to solve. If Charlie is anywhere close to accurate with his yield claims, then we'll see it turning up in the finance books/availability.
 
Have you not heard of loss leading or selling at a loss to gain market share (which you then monetize down the line)? These are both common techniques used in business.
I see no reason to believe that either is likely to get you anywhere in the extremely competitive and highly mobile GPU business.

You still haven't addressed my original complaint to your assertions: You cannot calculate yields based on retail price and assumptions of profit margins on individual products.
I didn't. I was just showing that, profit-wise, they're somewhat reasonable.
 