GF100 evaluation thread

Whaddya think?

  • Yay! for both: 13 votes (6.5%)
  • 480 roxxx, 470 is ok-ok: 10 votes (5.0%)
  • Meh for both: 98 votes (49.2%)
  • 480's ok, 470 suxx: 20 votes (10.1%)
  • WTF for both: 58 votes (29.1%)

  Total voters: 199 (poll closed)
Ask MS/Intel. They have a _long_ history of buying market share to prevent newer competitors from eating their lunch.
But there's also one heck of a lot more "lock-in" for CPUs than there is for GPUs. The GPU market changes at lightning pace compared to the CPU market.
 
I see no reason to believe that either is likely to get you anywhere in the extremely competitive and highly mobile GPU business.

I didn't. I was just showing how, profit-wise, they're somewhat reasonable.

Ahh, so although you were showing numbers, you were really telling us about your "beliefs". That's what I thought.

The barrier for entry into the GPU business is very high - you can't build up from nothing. The last thing any company like Nvidia would want is to have a year of losing market share and losing partners/OEM contracts, and that's worth spending money on, at least in the short term, to save themselves pain further down the line.
 
You're not addressing my central complaint to your assertions: What profit can be earned by selling at a loss?

All of this assumes a particular scenario, which we don't know one way or the other atm.

The possibility of making a profit tomorrow is much larger if NV is prepared to make a loss today (for a short term, on a niche product). What do you think is gonna happen if they refuse to fight until their B1 or GF104 is ready? Fab capacity and yields are rising all the time. Besides, the longer ATi has the whole market to itself, the more money they can make and throw into their next-gen parts, and the bigger the hill NV will have to climb.

Sometimes, kamikaze can be a "good enough in the short term" strategy as well. Don't forget that these things are priced so high that the losses will be effectively limited by the volumes. If A3 is a dead duck, then just dump the whole thing on the market and move on to GF104/B1. That's what happened with NV30. The wafers/PCBs/RAM have been bought and paid for. Why not recover at least some money?

This is not as ridiculous as it may seem. AMD made massive losses (for years) but didn't abandon the market totally.
 
Ahh, so although you were showing numbers, you were really telling us about your "beliefs". That's what I thought.
Well, as I said when I first waded into this, the 70% number wasn't one I came up with myself. I heard it through the grapevine.

The barrier for entry into the GPU business is very high - you can't build up from nothing.
And that would make a difference if ATI wasn't already on close to even footing with nVidia.
 
The barrier for entry into the GPU business is very high

But Nvidia losing share to AMD wouldn't lower those barriers for a new entrant.

Then you'll have to explain why marketshare in the GPU market is vastly more volatile than the CPU market.

Intel owns the ecosystem through the CPUs, chipsets and motherboards they produce, and through their pricing power and higher volumes due to more advanced manufacturing. Nvidia is in no way comparable; they still have to beg for a spot in the PCIe slot.
 
Well, as I said when I first waded into this, the 70% number wasn't one I came up with myself. I heard it through the grapevine.

Everyone else's rumours give a much, much lower figure.

And that would make a difference if ATI wasn't already on close to even footing with nVidia.

So why would Nvidia allow their position to deteriorate against ATI under any circumstances? Preventing that is worth spending some of their cash reserves.
 
If you already have the sunk costs, then it's a question of a big loss or a small one. A small loss is effectively a profit compared to a big loss.

If you take Charlie's "risk wafer" story from last year, they've already spent that $50 million; it's written off, and they're just recouping whatever costs they can.
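
To put rough numbers on that, here's a quick Python sketch of the arithmetic. Only the $50 million figure comes from Charlie's story; the per-unit cost, price, and volume are made-up assumptions purely for illustration.

[code]
# Sunk-cost arithmetic behind "a small loss is a profit over a big loss".
# Hypothetical numbers; only the $50M risk-wafer figure is from the story.

SUNK = 50_000_000     # wafers/PCBs/RAM already bought and paid for
INCREMENTAL = 40      # assumed remaining cost per board (assembly, shipping)
PRICE = 250           # assumed selling price per board
UNITS = 100_000       # assumed sellable volume

full_cost_per_unit = SUNK / UNITS + INCREMENTAL       # $540: on paper every sale "loses" $290
loss_if_shelved = SUNK                                # ship nothing, eat the whole $50M
loss_if_sold = SUNK - (PRICE - INCREMENTAL) * UNITS   # each sale above incremental cost claws back sunk spend

print(f"loss if shelved:     ${loss_if_shelved:>12,}")  # $50,000,000
print(f"loss if sold anyway: ${loss_if_sold:>12,}")     # $29,000,000
[/code]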
 
If you take Charlie's "risk wafer" story from last year, they've already spent that $50 million; it's written off, and they're just recouping whatever costs they can.

Which was my point, and Charlie, whether you love him or hate him, has at least seemed to have very good sources within Nvidia for the past year or so.
 
Then you'll have to explain why marketshare in the GPU market is vastly more volatile than the CPU market.

Resources. Intel has MUCH more resources and has historically not allowed AMD any opening, except for the P4 debacle; that was when AMD did best and Intel played its dirtiest. Intel's execution has been noted even outside semi-industry circles. Think NV30/R600.

They have a process advantage that nobody in the entire semiconductor industry can match (or has matched in the last ~15 years). All GPUs are fabbed at TSMC.

They have a cache density which is better than anyone else's (except eDRAM, ofc, but that's pie in the sky for GPUs). GPUs use standard memory cells.

They have enough employees to do a new uarch every 2 years while AMD can do one every 4. GPUs evolved from fixed-function hardware, so there wasn't much of a uarch to innovate upon in the beginning.

CPU bug fixes are expensive as hell: FDIV, Barcelona TLB. GPU bug fixes (a la R600's AA) are patched in the driver.

AMD will be lucky to gain *any* marketshare from the Intel of today. GPU marketshare has never (to my knowledge) been as skewed as 80:20, let alone 90:10.

AMD has never sold >$250 CPUs (if you exclude the Athlon heyday). AMD was served cake when NV surrendered the >$250 GPU market.
 
Dabs (sort of the British Newegg) has the GTX 480 priced at £446.50, that's $673, f*** me,
and the GTX 470 priced at £305 or $460.

5870s are between £305 ($460) and £352 ($531).

Most 480s at Newegg are $499 (£330).

In other words, we Brits are being overcharged $174 (£115).
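
Sanity-checking the maths in Python, using the exchange rate implied by the prices above (roughly $1.51 to the pound):

[code]
# Quick check of the UK markup at the GBP->USD rate implied by the post.

GBP_TO_USD = 673 / 446.50          # ~1.507, from the Dabs GTX 480 price pair

dabs_gtx480_gbp = 446.50
newegg_gtx480_usd = 499.00

dabs_gtx480_usd = dabs_gtx480_gbp * GBP_TO_USD        # back to $673 by construction
overcharge_usd = dabs_gtx480_usd - newegg_gtx480_usd  # $174
overcharge_gbp = overcharge_usd / GBP_TO_USD          # ~£115

print(f"overcharge: ${overcharge_usd:.0f} (~£{overcharge_gbp:.0f})")  # $174 (~£115)
[/code]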
 
Resources. Intel has MUCH more resources and has historically not allowed AMD any opening, except for the P4 debacle; that was when AMD did best and Intel played its dirtiest. Intel's execution has been noted even outside semi-industry circles. Think NV30/R600.

They have a process advantage that nobody in the entire semiconductor industry can match (or has matched in the last ~15 years). All GPUs are fabbed at TSMC.

They have a cache density which is better than anyone else's (except eDRAM, ofc, but that's pie in the sky for GPUs). GPUs use standard memory cells.

They have enough employees to do a new uarch every 2 years while AMD can do one every 4. GPUs evolved from fixed-function hardware, so there wasn't much of a uarch to innovate upon in the beginning.

CPU bug fixes are expensive as hell: FDIV, Barcelona TLB. GPU bug fixes (a la R600's AA) are patched in the driver.

AMD will be lucky to gain *any* marketshare from the Intel of today. GPU marketshare has never (to my knowledge) been as skewed as 80:20, let alone 90:10.

AMD has never sold >$250 CPUs (if you exclude the Athlon heyday). AMD was served cake when NV surrendered the >$250 GPU market.
That's all well and good, but now you've got to explain how losing money is going to net NV the same sort of situation, because they are certainly not in that situation now. Over the years we've seen the GPU market swing both ways, and rather quickly at that. Until one IHV completely dominates the market for a significant period of time, that isn't going to change.
 
You're not addressing my central complaint to your assertions: What profit can be earned by selling at a loss?
Why is it so strange to you? As others have already mentioned, you can price at whatever the market will pay, and even if that means selling at a loss short term, in the long run it can pay off (much better than not launching Fermi even now and waiting for B1, B2, etc.).

Let's take consoles, for example: both Sony and MS started selling the PS3/Xbox 360 at a loss. Guess why? Another hint: as JHH said, Nvidia is a software company :D
 
Let's take consoles, for example: both Sony and MS started selling the PS3/Xbox 360 at a loss. Guess why?
Because Sony and MS actually make money on that same sale further down the road, through licence fees?

Nvidia and ATI don't have a loss-leader business model because there is nothing left to sell to the same customer after the initial sale.
 
Which was my point, and Charlie, whether you love him or hate him, has at least seemed to have very good sources within Nvidia for the past year or so.
Indeed, strangely enough Charlie's sources were more reliable than the GPU details Rys got from Nvidia's lead architect :oops: That says a thing or two, and as much as Nvidia fans hate Charlie (I take his words with a grain of salt as well), it's a fact that he was more accurate about Fermi's journey than any other journalist, and well ahead of everyone else on timing too.
 