The Official NVIDIA G80 Architecture Thread

Err, 97.28? Is it there with 97.02? One gets what one deserves with funky fly-by-night "My buddy Jose said this is some good sh*t here, man" drivers.

Unwinder/Guru3D-Forum
All

I've just finished investigating the problem with the constant 400MHz memory clock reading on G80 boards, which doesn't reflect memory overclocking.

Status: not a RivaTuner issue, but an NVIDIA driver bug: two of the 3 memory clock frequency generators are really running at 400MHz

G80 boards use 3 independent memory clock frequency generators (one generator for each pair of memory channels). When the PC is booting, core/memory clocks are set much lower compared to the clocks set in the Windows GUI, and at that time the BIOS programs all 3 memory generators to 400MHz. When the OS finishes loading, the driver must switch all 3 generators to the target memory clock. The 97.02 driver does this properly; 97.28 doesn't, and leaves 2 of the 3 memory clock generators running at the BIOS-defined 400MHz clock.

I'd recommend G80 owners roll back to the official 97.02 drivers.
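For illustration, here's a toy Python sketch of the behaviour Unwinder describes; the function names and the 900MHz target clock are invented, and this is obviously not actual driver code:

```python
# Toy model of the bug described above -- purely illustrative,
# not real driver code; names and values are invented.

BIOS_BOOT_CLOCK_MHZ = 400  # clock the BIOS programs at boot

def bios_boot(generators):
    """BIOS programs all 3 memory clock generators to 400MHz."""
    for i in range(len(generators)):
        generators[i] = BIOS_BOOT_CLOCK_MHZ

def driver_97_02(generators, target_mhz):
    """97.02 behaviour: switches all 3 generators to the target."""
    for i in range(len(generators)):
        generators[i] = target_mhz

def driver_97_28(generators, target_mhz):
    """97.28 behaviour: reprograms only 1 of the 3 generators,
    leaving the other two at the BIOS-defined 400MHz."""
    generators[0] = target_mhz

gens = [0, 0, 0]            # one generator per pair of memory channels
bios_boot(gens)
driver_97_28(gens, 900)     # hypothetical 900MHz memory target
print(gens)                 # [900, 400, 400] -- 2 of 3 stuck at 400MHz
```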

Errrr....no thank you :rolleyes:
 
Anyone interested in seeing the actual G80 die?

Here it is!

[Image: coresize.jpg]


Courtesy of VR-Zone.

Here's the link for more pics:

http://sg.vr-zone.com/?i=4352
 
Good grief! That is enormous!

But its size is comparable to that of the recently leaked R600 die shot, and that one is on 80nm compared to the G80's 90nm. If this area can pack 700M transistors, then at the same design density an R600 on 80nm would rather exceed 700M too :rolleyes:
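A quick back-of-envelope version of that argument, assuming (crudely) that transistor density scales with the inverse square of the feature size (real designs rarely get the full ideal shrink):

```python
# Back-of-envelope density scaling, assuming ideal inverse-square
# scaling with feature size (real shrinks are rarely this good).

g80_transistors = 700e6        # ~700M transistors on 90nm
density_gain = (90 / 80) ** 2  # ideal 90nm -> 80nm gain, ~1.27x

r600_same_area = g80_transistors * density_gain
print(f"Same die area on 80nm: ~{r600_same_area / 1e6:.0f}M transistors")
# -> Same die area on 80nm: ~886M transistors
```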
 
don't forget NVIO ...
You'd expect NVIO to have much lower density, though; so if it were integrated on the main chip, it certainly wouldn't take 49mm2. Now, how much it would take, I don't know :)


Uttar
 
Man, who would have thought they would double chip size, or actually 2.5x it, in one gen? nV really has to downsize this chip! It must be costing an arm and a leg, even if yields are good.
 
Man, who would have thought they would double chip size, or actually 2.5x it, in one gen? nV really has to downsize this chip! It must be costing an arm and a leg, even if yields are good.
I'm sure NVIDIA is very happy you care so much about their bottom line, but sorry, they're making good money with it ;)

A simple calculation will tell you as much, too, if you don't believe me. Look at my diagram and you'll notice I put 80 dies/wafer as functional (for the GTS and GTX), which is a number NVIDIA gave us. I've estimated the NVIO chip cost at about $2; increase that by a buck or two for the increase in board costs. For G80, if you take a 300mm wafer cost of $8K, you get a chip cost of $100. Add packaging/test/etc. costs to that.
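Spelling that arithmetic out, using only the numbers in the post above:

```python
# Per-chip cost using the figures quoted above.

wafer_cost_usd = 8000    # assumed cost of a 300mm 90nm wafer
good_dies = 80           # functional GTS/GTX dies per wafer (per NVIDIA)
nvio_usd = 2             # estimated NVIO chip cost
board_extra_usd = 2      # "a buck or two" extra in board costs

g80_die_usd = wafer_cost_usd / good_dies
print(f"G80 die: ${g80_die_usd:.0f}, "
      f"NVIO + board delta: ~${nvio_usd + board_extra_usd}")
# -> G80 die: $100, NVIO + board delta: ~$4 (packaging/test excluded)
```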

Do you honestly think NVIDIA sells those chips below $200? Furthermore, what's their corporate margin? You can easily see their G80 margins are ABOVE their corporate average (but then again, the same is true of the 7600GS, you could argue!)

The next logical step is an 80nm shrink, and the wafer costs are the same for 80GT as for 90GT afaik. So, let's say yields are slightly lower, so they'd only get ~75 chips/wafer; take a scaling factor of 20% and you're up to ~95 usable chips per wafer, cutting costs by more than 15% already. I think NVIDIA will do just fine with these margins for now ;)
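And the shrink scenario in the same terms; I'm reading the 20% figure as a 20% die-area reduction, so dies/wafer go up by roughly 1/0.8:

```python
# 80nm shrink scenario: same wafer cost, slightly worse yield,
# 20% smaller dies -> ~1/0.8 = 1.25x more usable chips per wafer.

wafer_cost_usd = 8000
good_dies_90nm = 80
good_dies_80nm = 75 / 0.8   # ~94, i.e. Uttar's "up to ~95" once rounded

cost_90 = wafer_cost_usd / good_dies_90nm   # $100/die
cost_80 = wafer_cost_usd / good_dies_80nm   # ~$85/die
saving = 100 * (1 - cost_80 / cost_90)
print(f"90nm: ${cost_90:.0f}/die, 80nm: ${cost_80:.0f}/die "
      f"(~{saving:.0f}% cheaper)")
# -> 90nm: $100/die, 80nm: $85/die (~15% cheaper)
```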

Anyway, we'll see if the low/mid-end parts are more efficient per mm2. IMO, G8x is already pretty much as efficient per mm2 as G7x, which is fairly impressive; given that they only get a 20% shrink with 80nm, though, if they want the same margins they'd currently only get a relatively small performance boost. We'll see if they improve things or not for the 80nm and 65nm refreshes.


Uttar
P.S.: I'm not sure why you didn't believe us when we originally said it was 21.5x22.5mm - did you think we were inventing numbers, or? ... (we didn't have the precise numbers, but we did the reverse calculations given the dies/wafer and the width/height ratio etc.)
 
Didn't mean it that way, Uttar. They are still getting their 20% for their bottom line with these cards for sure, especially since there really isn't anything to compete with them, so the price will remain where nV has set it. But I do think that if they don't go to 80 or possibly 65nm, they won't keep getting record revenues like we saw the past two quarters. Growth-wise, in their targeted market sector for these cards (high end) there really isn't much more to gain, so they might gain a little more revenue there, but not much. There is more room to grow in the lower and mid-range segments though, but this will be highly dependent on ATi's timing in these segments.

I did believe ya about the die size, I was being silly :p
 
Didn't mean it that way, Uttar. They are still getting their 20% for their bottom line with these cards for sure
20%? Not sure if you're talking of margins there, but their margins on G80 most likely are 50%+, imo.
But I do think that if they don't go to 80 or possibly 65nm
G81 and G84 are 80nm. You'd expect G80 to have been NVIDIA's last 90nm part.
they won't keep getting record revenues like we saw the past two quarters
Uhm, what? Revenues and margins are two separate things. Unless their parts are less competitive because of this and they have to lower the price point accordingly, I don't see your point. If you sell a chip at $50 and make $2 of profit, your revenue would be the same as if you sold the chip at $50 with a profit of $45.
Growth-wise, in their targeted market sector for these cards (high end) there really isn't much more to gain
It's all about mindshare in the ultra-high-end segment, and to a lesser extent, margins. This was more the case back in the days when margins were 20-25% in the mid-end, so 60% margins on the high-end were a great thing, financially speaking; nowadays, when it's more like 40% and 55% respectively, it matters a fair bit less. So the ultra-high-end is all about mindshare.

It's easy to see that AMD would be ready to lose money on every R600 sold, if that allowed them to have a ridiculously beefy chip at the same price point as G80. Given how important it is for them to make people realize they truly care about the high-end, and not just about shitty IGP equivalents, I'd in fact be very surprised if AMD cared whether their R600 margins were higher than, say, 0% ;) (although obviously, they might still prefer 15-20% margins or something... but given the low volume, my point is they'd probably prefer mindshare and marketshare to margins there - but the same is true for NVIDIA, of course, up to a certain extent)
There is more room to grow in the lower and mid-range segments though, but this will be highly dependent on ATi's timing in these segments.
Obviously. There, unlike in the high-end market, they'll be forced to have good margins, which forces them to have comparable perf/mm2 and, ideally, perf/watt for the laptop market. If they manage that, they might just manage a home run. Otherwise, it'll be more like a rerun of last year.


Uttar
 
I was just giving a baseline figure for what the minimum would be.

For the GTX, yeah, 50% looks about right, but for the GTS it's more like 35%. I really think there will be stiff competition with the R600; that was why I mentioned revenues (sorry, bad English in the last post). I really think the R600 is going to be a solid chip, and the only thing that will hurt it is the delay; I'm pretty sure its performance is as good as the G80's. I don't know, or actually don't think, AMD will drop prices too quickly; they're in a situation where they have to be looking at their bottom line for the next few quarters.

If nV gets its top-to-bottom lineup out on 80nm and 65nm by the end of this coming year, yeah, they will be in a much better situation than AMD on the technology front. Right now they have the lead in getting their products out first, but because of Vista it's a guessing game; sales have been slow. This might change once we know how the R600 performs, and its power usage of course. The G80's performance per watt is amazing, I agree, but again, there could be competition here; we just don't know yet :).
 
High-end margins still help pay the freight for the rest of the lineup, so even if there are other factors that would make AMD relatively less margin-conscious than NV, I'm certainly not expecting 0% margins to be on the table. :LOL: Executives like bonuses too, y'know. Divisions like to show they can pull their weight in the team effort. Besides, I'm not sure that "AMD the sugar daddy" can impress anyone knowledgeable about their commitment to high-end ATI Radeon with R600. We all know it was largely through its development cycle before they bought the company.
 
This just came up in a conversation, and I couldn't remember if it'd appeared anywhere full-size. So, just in case, for those interested: G80 Core

Not that it's any wildly informative pic, mind you. :)
 
This just came up in a conversation, and I couldn't remember if it'd appeared anywhere full-size. So, just in case, for those interested: G80 Core

Not that it's any wildly informative pic, mind you. :)

I've seen this very same photo right here at B3D before, just can't remember where exactly...:???:
 
Argh -- those bump pads are obscuring the circuit details. :p

Here is a shot of a wafer edge, though it doesn't give a much cleaner view.
 