GF100 evaluation thread

Whatddya think?

  • Yay! for both: 13 votes (6.5%)
  • 480 roxxx, 470 is ok-ok: 10 votes (5.0%)
  • Meh for both: 98 votes (49.2%)
  • 480's ok, 470 suxx: 20 votes (10.1%)
  • WTF for both: 58 votes (29.1%)

Total voters: 199. Poll closed.
And me. It seems the most popular criticism is power consumption - important, but not at the top of my list. If you were waiting on it to beat up on Cypress, it's definitely a meh. If, like me, you were waiting for a proper DX11 card from Nvidia that could beat up on GT200, you got what you wanted.
:LOL:

Funny, you found Cypress to be MEH and this one to be YAY. Just stating the obvious here.
 
The 480 would need to be barely on par with the 5850 with AA enabled and have no advantages in DX11. That's where the 2900 XT ended up next to the 8800GTS-640 IIRC.
Ah well, at least HD2900XT wasn't ~40-60% bigger than G80.

Anyone seen an "official" figure for GF100's die size, by the way? Anyone taken the lid off to measure?

Jawed
 
The ones with the "GT215" marker pen inscriptions were GF100, right?

Not a caliper shot, though, and I guess that's pretty much what you wanted.
 
I don't know. I've checked some pre-order sites, and a few in Germany and the Netherlands (Komplett) now state May 7th for launch and a 36-month warranty.

May 7th? That's quite a way off, really. I thought it was supposed to come in late March/early April? If that's true, then it does make it quite the paper launch.

As far as warranties are concerned, I don't think most enthusiasts will keep the card until it's out of warranty, so I doubt it's an out-of-pocket concern. But for second-hand buyers, people who keep cards longer in their second machines, and people who run them for something like Folding@home, it may be troublesome. I am speculating, however, as we won't have any data on the reliability of these cards until six months after launch, which will be too late for most people.
 
Ah well, at least HD2900XT wasn't ~40-60% bigger than G80.

Anyone seen an "official" figure for GF100's die size, by the way? Anyone taken the lid off to measure?

Jawed

I'm still waiting for that one. With everyone claiming 250W for it and a 20x21 die size, I'm wondering if it's a little nearer Charlie's numbers (and mine) than the other way around, since the first figure was already wrong...
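If that 20x21 figure means millimetres, the arithmetic is simple enough to check (a throwaway sketch in Python; the 529 mm^2 figure is the one quoted later in this thread):

```python
# Back-of-the-envelope check: assuming "20x21" means a 20 mm x 21 mm die.
width_mm, height_mm = 20.0, 21.0
area_mm2 = width_mm * height_mm
print(f"20 x 21 mm -> {area_mm2:.0f} mm^2")           # 420 mm^2

# Compare with the 529 mm^2 figure quoted elsewhere in the thread.
quoted_mm2 = 529.0
print(f"529 mm^2 is {(quoted_mm2 / area_mm2 - 1) * 100:.0f}% "
      f"larger than 420 mm^2")                        # ~26% larger
```

So the two figures are about 26% apart; they can't both be right.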
 
There's a chance of AMD coming out with a faster part in the near future. I don't think NV would like to see AMD release a refresh before the GTX480 actually sells, and they'd have to play second fiddle for another six months.


That still doesn't make any sense. It's still better to sell the B1 than to delay even further and ship a new architecture. The only way killing the B1 would be a win is if NVidia had a Fermi 2.0/58x chip already ready to tape out. Otherwise, killing the B1 would simply ensure that NVidia has no products on the market until 2011.

I can't see any scenario where killing the B1 somehow results in NVidia shipping a new architecture in less than 6 months.

The best way I can put this is, in the worst case, it's better to play second fiddle than to have no fiddle to play at all.
 
I certainly remember a lot of sites saying it would be smaller than GT200b, but that was months ago.
 
I chose "WTF on both"
Why?
Well, the performance isn't bad by any definition, the price... well, it's not TOO far off, but the power consumption is just plain and simple ridicilous
 
I think if you already own a card that is 6-12 months old, there's no practical reason to desire Fermi. I mean, even if it delivered, say, 30% better performance at the same power, chances are it's not going to matter much in practice. So if you've already got a current gfx card, 'meh' is pretty much a given and, I think, a vote of somewhat dubious value. The only people who upgrade otherwise are people who have an emotional need to simply own the newest, best thing.


The audience for something like Fermi is people who haven't bought a new card recently, that is, someone actually in the market for a new card.
 
Can you give us a hint as to where the missing 32 CUDA cores are?

You heard about the BIOS update Nvidia issued, didn't ya? That BIOS update probably disables those 32 cores, lowers power consumption, and maybe raises clocks a bit. I'd like for someone to release the original BIOS that was on these cards before that update was issued, so we could see the differences between the two.
 
"Meh" for me. The redesign's performance actually sucks more than I thought it would. But in the end they discovered higher core speeds yielded better overall gaming #'s than 32 additional "CUDA cores". It was a bonus because that decision probably increased producible quantities for the 480 sku. On the flip side, it evolved into a space heater, and not figuratively. 0.99 vGpu? Am I the only one who thinks this thing would run cooler without the heat spreader? I do think it has potential regarding the dx9 numbers for Dice & Dunia. Efficiency is a bust too. "Meh."
 
If they'd released it back in September '09, can you imagine how much more of "teh sucks" would have been attributed to it only being "beta" drivers?

I'm not sure I heard that argument much last year, but now it seems that a new architecture whose drivers have been in development almost as long as Evergreen's is suddenly excused.

I'm pretty sure that the release drivers will have adjusted fan/power profiles so the cards will behave quite differently in consumer hands.

I'm not sure how much they CAN tune that.

They've already lowered fan speed as much as possible without letting the card idle over 100°C. And it still sounds like a Dustbuster.

And that's in a well-ventilated case or open-air environment. In other circumstances it may well end up idling over 100°C.

Regards,
SB
 
You heard about the BIOS update Nvidia issued, didn't ya? That BIOS update probably disables those 32 cores, lowers power consumption, and maybe raises clocks a bit. I'd like for someone to release the original BIOS that was on these cards before that update was issued, so we could see the differences between the two.

In other words, you were wrong and you're grasping at straws to avoid admitting it.
 
It's pretty obvious that NVIDIA originally intended the highest bin to have all 512 SPs enabled. In other words, XMAN had information that was originally correct, it just became outdated at some point. That kind of thing happens all the time...
 
Though I think I saw a murmur about the AF hit being worse than GT200, somewhere on some random page in some random review.
You mean this one? http://www.computerbase.de/artikel/...geforce_gtx_480/5/#abschnitt_skalierungstests

Anyone seen an "official" figure for GF100's die size, by the way? Anyone taken the lid off to measure?
I think some reviews quoted the 529 mm^2 figure, originating from here I guess:
http://www.nordichardware.com/en/news/71-graphics/10909-geforce-fermi-die-measures-529-mm2-.html

Actually, I think I need to revisit my opinion about the Fermi architecture a bit. I think it's got potential, and it did in fact improve quite a bit in perf/area versus the competition (though I'd argue that's more due to bad scaling from rv770 to rv870), without closing the gap. But the chip is just too flawed to take any advantage of this (way late, no full configuration, low (mem) clocks, terrible power consumption).
Some random thoughts for that theory:
- GTX285 and GTX480 have about a similar lead over the top (single) AMD cards on average (HD4890 and HD5870 respectively) - though GTX480 is very close in a lot of titles, never really loses by much, and is sometimes quite a bit faster. And it would be better with the full configuration, obviously.
- The die-size difference is smaller in percentage terms (going by the 529 mm^2 figure): 282 mm^2 (rv790) vs. 480 mm^2 (gt200b, 70% larger), compared to 330 mm^2 (rv870) vs. 529 mm^2 (gf100, 60% larger).

But the implementation of the chip is just too broken for the cards to really be considered good.
More random thoughts:
- AMD doubled its units and transistor count from HD4890 to HD5870 going from 55nm to 40nm, keeping clocks the same - and, more importantly, (load) power draw is pretty much the same too.
- NVIDIA basically doubled its units (well, not exactly, but you get the idea) and transistor count from GTX285 to GTX480 going from 55nm to 40nm, keeping clocks (which were already a lot lower than on g92b) roughly the same (a bit lower, actually) - and nearly doubled power draw.
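
For anyone who wants to double-check those die-size ratios, a trivial Python sketch (all figures as quoted in this thread; the gf100 number is the unconfirmed 529 mm^2 one):

```python
# Double-checking the die-size ratios quoted above (figures from this thread;
# the gf100 number is the unconfirmed 529 mm^2 figure).
dies_mm2 = {
    "rv790":  282.0,
    "gt200b": 480.0,
    "rv870":  330.0,
    "gf100":  529.0,
}

for nv, amd in [("gt200b", "rv790"), ("gf100", "rv870")]:
    ratio = dies_mm2[nv] / dies_mm2[amd] - 1.0
    print(f"{nv} is {ratio * 100:.0f}% larger than {amd}")

# Output:
# gt200b is 70% larger than rv790
# gf100 is 60% larger than rv870
```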

So I have to agree with Jawed: it might be better to judge the architecture based on Fermi derivatives (GF104), or at least that B1 respin. Though I think Juniper is actually a tougher opponent to beat in the perf/area department than Cypress, things might not get too ugly for Nvidia if those derivatives don't suffer from the same problems... well, if they appear before N.I., that is...
 
Die size is 530 mm^2 according to Muropaketti. I'm going to take their word for it, so Rys was misled by his sources.
 