NVIDIA achieves record revenue and 37.84% Gross Margins

Geo, from Beyond3D's "3D Chip Chart":
G70: 334mm²
NV40: 287mm²
R420: 281mm²
We don't even know R520's transistor count, and the die size seems like even more of a mystery. Considering the publicly available information, lots of things are possible, but from what I've heard so far, I'd say most of the good ones are quite unlikely at this point. It's sad, really, considering the G70 didn't seem to me like such a hard part to counter; it's just a nice speed bump.

Uttar
 
Uttar, I'm pretty sure that Wavey would agree his measurements are approximations. Even Orton agrees in the interview that NV40 is 10-15% bigger than R420.

He also refers to R420 as a 16x16 die, giving 256mm².

http://www.beyond3d.com/interviews/daveorton/index.php?p=3

Edit: As for R520, I'm still relying on the Goldman analyst, which is pretty much all we have from a credible source thus far.
 
geo said:
Uttar, I'm pretty sure that Wavey would agree his measurements are approximations. Even Orton agrees in the interview that NV40 is 10-15% bigger than R420.
http://www.beyond3d.com/interviews/daveorton/index.php?p=3
Oh yes, I most definitely know that, sorry, I should have been clearer :) My point was just to emphasize that the die size difference is nowhere near the transistor count difference because of different design techniques (222/160 gives a 38.75% transistor count gap, well above the 10-15% die size gap).
So I don't know if I personally consider NVIDIA that much more aggressive in that timeframe; I wouldn't be surprised if part of that transistor count difference was redundancy for yields on the NV4x, although obviously R4xx has a certain amount of redundancy too.
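(For anyone who wants the arithmetic spelled out, here's a quick sketch using the figures quoted in this thread; the die sizes are Wavey's approximations, so treat the output as ballpark only.)

```python
# Rough comparison of the NV40 vs R420 gaps, using the approximate figures
# quoted in this thread (transistor counts as commonly reported, die sizes
# from Wavey's measurements above).

nv40_transistors_m = 222   # million
r420_transistors_m = 160   # million
nv40_die_mm2 = 287
r420_die_mm2 = 281

transistor_gap_pct = (nv40_transistors_m / r420_transistors_m - 1) * 100
die_gap_pct = (nv40_die_mm2 / r420_die_mm2 - 1) * 100

print(f"Transistor count gap: {transistor_gap_pct:.2f}%")  # ~38.75%
print(f"Measured die size gap: {die_gap_pct:.2f}%")        # ~2.1% (Orton puts it at 10-15%)
```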

Uttar
 
Uttar said:
I wouldn't be surprised if part of that transistor count difference was redundancy for yields on the NV4x, although obviously R4xx has a certain amount of redundancy too.

Uttar

Me neither. I've been convinced that there's something going on there redundancy-wise with NV for a while, and I think whatever it is, it is more complex than the way we usually think about it.
 
geo said:
Uttar, I'm pretty sure that Wavey would agree his measurements are approximations. Even Orton agrees in the interview that NV40 is 10-15% bigger than R420.

He also refers to R420 as a 16x16 die, giving 256mm².

Wavey's G70 die size approximation is reasonably accurate, at least according to a recent writeup from the Microprocessor Report.

"The complexity of the chip is quite amazing, with a total of 302 million transistors (a 36% increase from the GF6800).With the complexity of the design and the use of a more conservative process, Nvidia has taken the penalty in a larger die. And it is quite large. After evaluating the 300mm wafer photo (see Figure 2 wafer photo) provided by the company and handling the packaged part, we have concluded that the die size is roughly 350mm2 (18.7mm/side). Counting the gross die per 300mm wafer, we found roughly 166 possible die sites. In contrast, the smaller Cell processor, at 237mm2, has about 257 possible die sites per 300mm wafer. Using the In-Stat (previously MDR) Cost Model, we estimate the number of net good die per wafer at 41 and the total manufacturing cost at $175. The chip will be a significant part of the cost of the graphics boards that sold at launch for about $600. Even if Nvidia performs a die shrink to 90nm, the resulting die will still be a hefty 250mm2."
 
Nice find, kemosabe.

Thinking about your last post some more, Uttar, I'm not sure I agree that redundancy for yields isn't just another angle on high-end performance. You have to have enough of these things to sell at a price you can make a profit on--something NV has done a much better job of over the last year.
 
ChrisRay said:
Oh. I'd have to disagree. ATI sold a lot more R300 chips than Nvidia sold NV3x chips. Nvidia took a lot of damage back in the NV3x era. And ATI earned a lot of mindshare.

Agreed. The entire nV3x saga cost nVidia the top spot in the market, cost them the Xbox, and ate up a ton of whatever goodwill the company had accumulated. ATi not only surpassed nV in mindshare at the time, but also in marketshare--marketshare nV is furiously trying to recover even now. As JHH put it so well, "nV30 was a mistake." Amen...;) It's too bad it took them a couple of years to figure that out, though, as I believe that had they realized the mistake much sooner their damage would have been far less. I'm glad to see that nV isn't completely shattered, though, as competition is always a good thing. Uttar's report, while encouraging, is certainly not the watershed some think it is, however. ATi would have to make the same kind, type, and degree of mistakes, over the same span of years, that nVidia made with nV3x, and I consider that highly unlikely--although surely not impossible...;)
 
Uttar said:
Didn't seem too worried at all when it comes to the competition, laughed at the way the questions were presented when it comes to competitive pressure.
Well, it seems that ATI will cream them in the next cycle then. I hope that attitude does not percolate down, or else we will end up in the NV30 era again with choice A now or later.
 
Sinistar said:
See what you started....

I'm already kicking myself in the ass, don't need your help. :LOL: Quick, somebody lock it and point the finger of shame at me. . .
 
Uttar said:
....
- Didn't seem too worried at all when it comes to the competition, laughed at the way the questions were presented when it comes to competitive pressure.

Thanks for causing me to go back and look at this, Sxotty--I agree with you. Maybe Uttar is just putting his own interpretation on it--but somehow I doubt he'd be inclined to exaggerate something like this. If anything, I'd expect him to downplay it (sorry, Uttar...;)). But maybe he just played it straight and reported it as it was.

This hits a nerve with me, and is one of the aspects of the nV company management that I really don't like. nV seems to have no compunction about reordering reality to fit some kind of fantasy picture, both internally and for investors, obviously. I mean, does the top brass at nV really think that if they keep stating over and over again that they have no competition that eventually it will come true, or that investors will never discover just how much competition nV's got? It certainly seems that it's one or the other, if Uttar's observations are accurate.

It reminds me of the scuttlebutt I was hearing from various internal personnel within nVidia pre-R3x0--that "in five years we'll be bigger than Intel," etc. Never happened, of course--and never will, imo. But still, this kind of public reaction to vigorous competitive pressure, the "we have no competition because we say we don't" syndrome, strikes me as indicative of nVidia's Achilles' heel--a kind of lopsided hubris that twists reality all out of proportion. Laughing in public at the very mention of competitive pressure seems to indicate at the least a very jittery, nervous executive branch in the company, and at the most a management completely disconnected from reality.

I mean, even giant Intel doesn't present a smarmy, soporific view of AMD these days in public, and there's far more difference in relative size between Intel and AMD than there is between nVidia and ATi. In fact, by some measures ATi is the "larger" company at the moment. I think some folks within nV have yet to learn the world isn't mostly populated by utter fools who'll believe everything they're told, and to me that's the scariest nVidia dimension of them all.
 
YeuEmMaiMai said:
Their arrogance will lead to their downfall. It's just a matter of time.

Well, I really hope that never becomes the case. Competition is good and the lack of it isn't. That's exactly why I'm so hard on nVidia. I have absolutely no doubt that the company can do better. I've believed for a long time that there's something awry within the top management of the company, though it is difficult just from gleaning through their multitudinous public statements to know precisely where the faults lie--I suspect they originate from the top down--but of course I do not know this.

I said in an earlier post in this thread that I do not think it impossible for ATi to create its own "nV3x" debacle, only that I think it is highly unlikely. The reason I say it is not impossible is that prior to nV3x I would have considered it impossible for nVidia--for any company, really--to make the kind, type, and degree of mistakes nVidia made with nV30 over such a protracted period of time. nVidia proved me wrong, and so I have never forgotten it. I'd like to see them do much better, and it strikes me that laughing in public forums about the competition they most certainly have is exactly the wrong way to go about it.
 
Walt, by almost any measure ATI is a bigger company, and has been. You may be hard on Nvidia, but it wouldn't hurt to be hard on ATI too. Perhaps you are just more distraught by the failings of Nvidia than by those of ATI, but I am distraught by both. I want them both to produce the absolute best products possible. Certain products I have gotten recently are not as good as I would have hoped, and I sincerely desire for the company in question to do a better job next time. The reason I am not saying exactly what is that I am not 100% certain it is their fault, but it certainly is not impressing me.
@ geo, don't worry mate, I won't say anything too rude or get out of control. In fact I will probably let it die a peaceful death of obscurity.
 
Maybe I'm blind, but I don't see them laughing on public forums. People with nothing better to do might take it to public forums for debate, but the company itself is rarely the instigator or on the list of protagonists when the debate gets going.

Recent fodder for that has also gone both ways, both externally and internally, and in many respects both big IHVs are as bad as each other. Someone wise once told me that the 3D industry has never grown up. It's so true it's unreal, and ATI are massively to blame for that, just as much as NVIDIA are.

It's an exercise in futility to constantly remind people of the past, Walt, since all you're doing is perpetuating the toys being thrown from the pram. If there's anything that's not needed in the 3D game, it's that.

You may well believe you're treading just, righteous and wanted ground with what you do, but when you refuse to afford the same anger and vitriol in the other direction it just makes you look foolish. Maybe you're just not privy to the volume of shit being thrown from red to green, but I don't believe you're that blind or that stupid.

Take the apparent blinkers off and maybe have a go at analysing the other side's blatant public problems, with the same intensity and purpose you afford NVIDIA's bad bits. See what you dig up for us. It'll serve you well.
 
Walt,

No one wants to see a vendor go down, but if they are so arrogant as to think that they have no competition, then it might not hurt. No one wants a company that does not give consumers what they ask for. If Nvidia does go down, there will be someone to take their place.

Look at 3dfx: they were arrogant, thinking they were the king (they were), and they totally lost it.
 
Uttar said:
My point was just to emphasize that the die size difference is nowhere near the transistor count difference because of different design techniques (222/160 gives a 38.75% transistor count gap, well above the 10-15% die size gap).
So I don't know if I personally consider NVIDIA that much more aggressive in that timeframe; I wouldn't be surprised if part of that transistor count difference was redundancy for yields on the NV4x, although obviously R4xx has a certain amount of redundancy too.
Sorry for continuing to bump this old thread (but, on the plus side, I get to give Uttar rep pts for his work), but I thought the transistor count point was interesting in light of the NV42/6800GS, which reportedly has a higher transistor count than the identically-outfitted NV41/6800:
Because the two chips are manufactured using different fab processes that rely on different libraries, NVIDIA estimates their transistor counts differently. Thus, the NV41 purportedly has 190 million transistors, while the NV42 has 202 million transistors, although the two chips "are fundamentally the same architecture," according to NVIDIA.
So, two chips, both with a reported 12 pipes, 8 ROPs, and 5 vertex shaders, have a 12M transistor differential.
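(For what it's worth, the gap works out as follows; a trivial sketch using just the two counts quoted above.)

```python
nv41_transistors_m = 190  # NV41 / 6800, as quoted above
nv42_transistors_m = 202  # NV42 / 6800GS, as quoted above

diff_m = nv42_transistors_m - nv41_transistors_m
diff_pct = diff_m / nv41_transistors_m * 100
print(f"{diff_m}M more transistors (~{diff_pct:.1f}%) for the same functional units")
# -> 12M (~6.3%), purely down to how the counts are estimated, per NVIDIA
```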

TBH, I'm not sure if I'm noting this b/c it's interesting or simply b/c it's something I'll keep in mind next time I look at transistor counts.
 
I think there is far too much emphasis on this forum on profits and share price, but of course that is because neither interests me at all. Profits for both have been up and down in recent years, but they still churn out impressive video cards when they get it right.

I still think both have got it wrong, at least up to now, by not offering an AGP version of the 7800/X1800. I would say there are quite a few people waiting on their old Intel AGP systems who do not want to go to AMD because Intel's new processors based on the Pentium M are out in 2006, i.e. Conroe.

Maybe I should do a poll. Never done one; it would probably go horribly wrong, with just me voting and 300 replies saying that this is a stupid poll. :) Am I feeling brave?
 