GF100 evaluation thread

Discussion in 'Architecture and Products' started by rpg.314, Mar 27, 2010.

Whatddya think?

Poll closed Apr 6, 2010.
  1. Yay! for both

    13 vote(s)
    6.5%
  2. 480 roxxx, 470 is ok-ok

    10 vote(s)
    5.0%
  3. Meh for both

    98 vote(s)
    49.2%
  4. 480's ok, 470 suxx

    20 vote(s)
    10.1%
  5. WTF for both

    58 vote(s)
    29.1%
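(Editor's aside, not part of the original page: the quoted percentages follow from the vote counts; a minimal Python sketch checking the arithmetic, with variable names of my own choosing.)

```python
# Sanity-check the poll: each option's share of the total votes, to one decimal.
votes = {
    "Yay! for both": 13,
    "480 roxxx, 470 is ok-ok": 10,
    "Meh for both": 98,
    "480's ok, 470 suxx": 20,
    "WTF for both": 58,
}
total = sum(votes.values())  # 199 votes in all
shares = {option: round(100 * n / total, 1) for option, n in votes.items()}
print(total, shares)
```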
  1. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    :lol:

    Funny, you found Cypress to be MEH and this one to be YAY. Just stating the obvious here.
     
  2. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    Ah well, at least HD2900XT wasn't ~40-60% bigger than G80.

    Anyone seen an "official" figure for GF100's die size, by the way? Anyone taken the lid off to measure?

    Jawed
     
  3. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
    The ones with the "GT215" marker-pen inscriptions were GF100, right?

    Although it's not a caliper shot, and you pretty much wanted that, I guess.
     
  4. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    May 7th? That's quite a way off, really. I thought it was supposed to come in late March/early April? If that's true, then it does make it quite the paper launch.

    As far as warranties are concerned, I don't think most enthusiasts will keep the card until it's out of warranty, so I doubt it's an out-of-pocket concern. But for second-hand buyers, those who keep cards longer in their second machines, and people who use them for, say, Folding@home and the like, it may be troublesome. I am speculating, however, as we won't have any data on the reliability of these cards until 6 months after launch, which will be too late for most people.
     
  5. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    I'm still waiting for that one. With everyone claiming 250W for it and a 20x21 die size, I'm wondering if it's a little closer to Charlie's numbers (and mine) than the other way around, since the first figure was already wrong...
     
  6. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California

    That still doesn't make any sense. It's still better to sell the B1 than to delay even further and ship a new architecture. The only way killing the B1 would be a win is if NVidia had a Fermi 2.0/58x chip already ready to tape out. Otherwise, killing the B1 would simply ensure that NVidia has no products on the market until 2011.

    I can't see any scenario where killing the B1 somehow results in NVidia shipping a new architecture in less than 6 months.

    The best way I can put this is, in the worst case, it's better to play second fiddle than to have no fiddle to play at all.
     
  7. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
    Surely you jest.
     
  8. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    I certainly remember a lot of sites saying it would be smaller than GT200b, but that was months ago.
     
  9. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Oh no, oh no. FUD was everywhere
     
  10. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    I chose "WTF for both".
    Why?
    Well, the performance isn't bad by any definition, and the price... well, it's not TOO far off, but the power consumption is just plain and simple ridiculous.
     
  11. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    I think if you already own a card that is 6-12 months old, there's no practical reason to desire Fermi. I mean, even if it delivered, say, 30% better performance at the same power, chances are it's not going to matter much in practice. So if you've already got a current gfx card, 'meh' is pretty much a given and, I think, a vote of somewhat dubious value. The only people who upgrade otherwise are people who have an emotional need to simply own the newest best thing.

    The audience for something like Fermi is people who haven't bought a new card recently, that is, someone actually in the market for a new card.
     
  12. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
    I voted Meh for both.
     
  13. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
    You heard about the BIOS update Nvidia issued, didn't ya? That BIOS update probably disables those 2 cores, lowers power consumption, and maybe raises clocks a bit. I'd like for someone to release the original BIOS that was on these cards before that update was issued, so we could see the differences between the two.
     
  14. jaredpace

    Newcomer

    Joined:
    Sep 28, 2009
    Messages:
    157
    Likes Received:
    0
    "Meh" for me. The redesign's performance actually sucks more than I thought it would. But in the end they discovered that higher core speeds yielded better overall gaming #'s than 32 additional "CUDA cores". It was a bonus, because that decision probably increased producible quantities for the 480 SKU. On the flip side, it evolved into a space heater, and not figuratively. 0.99 vGPU? Am I the only one who thinks this thing would run cooler without the heat spreader? I do think it has potential regarding the DX9 numbers for DICE & Dunia. Efficiency is a bust too. "Meh."
     
  15. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    I'm not sure how much they CAN tune that.

    They've already lowered fan speed as much as possible without letting the card idle over 100°C. And it still sounds like a dustbuster.

    And that's in a well-ventilated case or open-air environment. In other circumstances it may well end up idling over 100°C.

    Regards,
    SB
     
    #135 Silent_Buddha, Mar 27, 2010
    Last edited by a moderator: Mar 27, 2010
  16. (sic)Klown12

    Newcomer

    Joined:
    Mar 27, 2010
    Messages:
    2
    Likes Received:
    0
    In other words, you were wrong and you're grasping at straws to avoid admitting it.
     
  17. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
    Yeah, that's it, because... oye. I'd like to see the original BIOS compared to the one they issued.
     
  18. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    It's pretty obvious that NVIDIA originally intended the highest bin to have all 512 SPs enabled. In other words, XMAN had information that was originally correct, it just became outdated at some point. That kind of thing happens all the time...
     
  19. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    You mean this one? http://www.computerbase.de/artikel/...geforce_gtx_480/5/#abschnitt_skalierungstests

    I think some reviews quoted the 529mm^2 figure, originating from here I guess:
    http://www.nordichardware.com/en/news/71-graphics/10909-geforce-fermi-die-measures-529-mm2-.html

    Actually, I think I need to revisit my opinion about the Fermi architecture a bit. I think it's got potential, and it did in fact improve quite a bit in perf/area vs. the competition (though I'd argue that's more due to bad scaling from rv770 to rv870, but still), while not closing the gap. But the chip is just too flawed to take any advantage of this (way late, no full configuration, low (mem) clocks, terrible power consumption).
    Some random thoughts for that theory:
    - GTX285 and GTX480 have about a similar lead over the top (single) AMD cards on average (HD4890 and HD5870 respectively) - though GTX480 is very close in a lot of titles, never really loses by much and sometimes is quite a bit faster. And it would be better with full configuration, obviously.
    - Die size difference is smaller in percentage (going by the 529mm^2 figure):
    282mm^2 (rv790), 480mm^2 (gt200b, 70% larger) vs. 330mm^2 (rv870), 529mm^2 (gf100, 60% larger).
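(Editor's aside, not part of the original post: the quoted "70% larger" and "60% larger" figures check out; a quick Python sketch using the die sizes given above, variable names mine.)

```python
# Die sizes in mm^2, as quoted in the post.
rv790, gt200b = 282, 480   # previous generation (55nm)
rv870, gf100 = 330, 529    # current generation (40nm, gf100 per the 529mm^2 figure)

# How much larger the NVIDIA die is than the AMD die, in percent.
prev_gen_gap = (gt200b - rv790) / rv790 * 100  # roughly 70% larger
this_gen_gap = (gf100 - rv870) / rv870 * 100   # roughly 60% larger
print(round(prev_gen_gap), round(this_gen_gap))
```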

    But the implementation of the chip is just too broken for the cards to really be considered good.
    More random thoughts:
    - AMD doubled its units, transistor count from HD4890 to HD5870 going from 55nm to 40nm, keeping clocks the same - and more importantly, (load) power draw is pretty much the same too.
    - NVIDIA basically doubled its units (well not exactly but you get the idea) and transistor count from GTX285 to GTX480 going from 55nm to 40nm, keeping clocks (which were already a lot lower than on g92b) roughly the same (bit lower actually) - and nearly doubling power draw.

    So I have to agree with Jawed: it might be better to judge the architecture based on Fermi derivatives (GF104), or at least on that B1 respin. Though I think Juniper is actually a tougher opponent to beat in the perf/area department than Cypress, things might still not get too ugly for NVIDIA if those derivatives don't suffer from the same problems... well, if they appear before N.I., that is...
     
  20. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    Die size is 530mm^2 according to Muropaketti. I'm going to take his word for it, so Rys was misled by his sources.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.