NVIDIA GF100 & Friends speculation

Discussion in 'Architecture and Products' started by Arty, Oct 1, 2009.

  1. Babel-17

    Veteran

    Joined:
    Apr 24, 2002
    Messages:
    1,073
    Likes Received:
    307
  2. DarthShader

    Regular

    Joined:
    Jul 18, 2010
    Messages:
    350
    Likes Received:
    0
    Location:
    Land of Mu
    And that's even after removing World of Warcraft, where the 6870 got negative scaling, from the list of benchmarked games... still, the TPU review consists of old stuff that often gets CPU bound or doesn't scale at all. Their reviews aren't a good source for average numbers until they fix their choice of games.
     
  3. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,244
    Likes Received:
    4,462
    Location:
    Finland
    Some indications of scaling here too, though this is 570/580 SLI vs 6950/6970 CF
    http://www.computerbase.de/artikel/...vs.-geforce-gtx-500-sli/19/#abschnitt_ratings

    It varies a lot by API & resolution, but for example on their choice of DX10 games at 2560x1600, even HD6850 CF beats GTX 580 SLI.

    And remember: ComputerBase is one of the sites which, unfairly, uses AMD's High Quality settings vs NV's Quality settings.
     
  4. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    I don't think it's a matter of blaming anyone but a matter of potential sales opportunities. If I had a very small chassis, chances are, I bought it for a reason (or I am f***ed with Acer/HP microscopic cases and fitting PSUs anyway). So, chances are, that I won't upgrade this particular PC with a card unlikely to fit inside. Sales opportunity lost.

    I might, however, build another PC with sufficient space, but not all users will (be able to?) do that.
     
  5. DSC

    DSC
    Banned

    Joined:
    Jul 12, 2003
    Messages:
    689
    Likes Received:
    3
  6. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
  7. Unknown Soldier

    Veteran

    Joined:
    Jul 28, 2002
    Messages:
    4,047
    Likes Received:
    1,669
    Seems rumours are still not dying like you hoped.

    http://www.xbitlabs.com/news/video/...ship_Graphics_Card_from_Nvidia_Resurrect.html
     
  8. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,418
    Likes Received:
    10,311
    Whoa, 2x 8-pin connectors. A ridonkulous cooling solution with 3 fans. If that's what Nvidia is planning, it certainly looks like they are getting ready to give PCIe certification a big middle finger. :D

    Regards,
    SB
     
  9. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    Yes, clearly that's a reference design from Nvidia.
     
    #8029 AlphaWolf, Feb 1, 2011
    Last edited by a moderator: Feb 1, 2011
  10. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    I can't figure out why people seem to think that it's important for a specialty product like this to stay within some power number that's written in a technical spec, especially when the connectors make it abundantly clear that more than the usual power is required.

    There is a PCIe spec and committee and all that, but it's not as if they have any say about what product can or can not be released. There are tons of legacy PCI products that violated the spec one way or the other, but nobody cares as long as they work in commonly used configurations.

    In the worst case, excess power requirements result in the inability to put an official PCIe logo on the box, which, I'm sure, will highly concern some corporate enterprise IT manager.

    If anything, dual 8-pin connectors not being an official configuration is a clear indication that the PCIe spec is in dire need of an update. Nobody can claim they didn't see the growing power requirements coming.
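
    For reference, a quick sketch of the aux-power arithmetic (using the commonly cited budgets of 75W from the slot, 75W per 6-pin and 150W per 8-pin; a rough sketch, not quotes from the spec itself):

    [code]
    # PCIe board power budgets (commonly cited figures, not spec quotes):
    # slot = 75 W, 6-pin = 75 W, 8-pin = 150 W.
    SLOT_W = 75
    AUX_W = {"6pin": 75, "8pin": 150}

    def board_budget(*connectors):
        """Total board power implied by a set of aux connectors."""
        return SLOT_W + sum(AUX_W[c] for c in connectors)

    print(board_budget("6pin", "8pin"))  # 300 W -- the current official ceiling
    print(board_budget("8pin", "8pin"))  # 375 W -- the rumored dual 8-pin config
    [/code]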
     
  11. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    What should the next wattage limit be?

    A quad-card setup of 300W cards with the attendant system (a few hundred watts to CPU and misc.) is starting to nudge into the uncomfortable (out of code?) range for 15 amp house circuits.

    The SIG may not have an interest in defining the electrician's certification needed for the next greatest GPU slot specification.
     
  12. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    I'm waiting for the version with the external power cord.
     
  13. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    505
    Likes Received:
    189
  14. DarthShader

    Regular

    Joined:
    Jul 18, 2010
    Messages:
    350
    Likes Received:
    0
    Location:
    Land of Mu
    It's a custom EVGA-designed GTX 460 X2. Just compare how the area under the chips is populated: http://www.ixbt.com/video3/images/gf104/gtx460-scan-back.jpg

    That dual GF110 board is a few months old too. Old news.
     
  15. Xalion

    Regular

    Joined:
    May 26, 2007
    Messages:
    310
    Likes Received:
    19
    I figure this is probably tongue in cheek, but since it made me curious, let me share what I found.

    First, code in the US at least specifies that on a standard 110 volt outlet you have 15 amps, so 1650 watts of power available (remember, this assumes the nominal 110 V of the outlet).

    Assuming you could run 4 dual-GPU cards, at 300 watts each they would consume 1200W, meaning the rest of your system would have to consume over 450W to be pushing that limit. Keep in mind that while the individual circuits are rated at 15 amps, a lot of the newer wiring done in houses is rated for 20 amps, and the difference is a fuse in your fuse box. That pushes the limit up to 2200W, which is comfortable even given that setup. I think Europe is even better off, as they run at 220 volts with wire rated at 15 amps, for a total of 3300 watts.

    I think people who are really planning on running 4 dual-GPU cards (which I would guess cost in the $500-600 range each) either have enough money to pay an electrician to change a fuse or enough technical know-how to do it themselves. If they are really concerned (and the people who do this type of project often are), they could exploit the fact that many newer houses in the US have 2 circuits per room and just run 2 power supplies on separate circuits. I'm pretty sure they will have to have 2 power supplies anyway to feed that many cards.
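
    If you want to check my math, here it is as a quick sketch (nominal voltages and breaker ratings as above; the 450W system figure is my rough guess):

    [code]
    # Household circuit headroom for the hypothetical quad dual-GPU rig.
    gpu_draw = 4 * 300     # four cards at the 300 W PCIe ceiling
    system_draw = 450      # CPU, board, drives, PSU losses (rough guess)
    total = gpu_draw + system_draw

    for volts, amps in [(110, 15), (110, 20), (220, 15)]:
        budget = volts * amps
        print(f"{volts} V / {amps} A circuit: {budget} W available, "
              f"{budget - total} W headroom at {total} W load")
    [/code]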

    I think the whole situation is a bit of a red herring though. I think this is probably a PR card. AMD has held the "fastest single card" title for a long time on the back of their dual-GPU offerings. I have a feeling this is a chance for NVidia to challenge that title. I seriously doubt they plan on producing these in mass quantities or on requiring all future cards to have similar power draws. I think they are just looking to put the "AMD has the fastest card!" argument to rest. I have a feeling they figure the PR gain from that outweighs any PR loss from "Look at how many watts this thing pulls!".
     
  16. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    There is usually a safety margin built into the regulatory limit. I've seen 20% bandied about, but that isn't something I can state authoritatively. That would reduce the amount of margin significantly.

    Such a requirement is beyond what the SIG concerns itself with. There's no real interest in researching and ratifying a standard that requires home modification and contract work.
     
  17. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    505
    Likes Received:
    189
    Thanks for the correction. I guess we'll just have to wait a while longer to see how the rumors settle out.
     
  18. Xalion

    Regular

    Joined:
    May 26, 2007
    Messages:
    310
    Likes Received:
    19
    The safety margin only applies if you plan for 100% load, i.e. the video cards would have to be pushing their full 300W 24/7, 365 days a year. Otherwise it is fine to go up to max load as long as you don't run it more than 80% of the time.

    Unless someone is planning on running a year long Futuremark test, I don't see that as being a concern. As long as they don't exceed the maximum at any point in time they are fine.
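
    A quick sketch of that rule as I understand it (I'm assuming the usual US-code formulation, where a load running roughly 3+ hours counts as continuous and gets derated to 80% of the breaker rating):

    [code]
    # Assumed NEC-style continuous-load derating: continuous loads
    # (running ~3+ hours) are limited to 80% of the circuit rating.
    def max_load_watts(volts, breaker_amps, continuous):
        derate = 0.8 if continuous else 1.0
        return volts * breaker_amps * derate

    print(max_load_watts(110, 15, continuous=True))   # 1320.0 W for a 24/7 rig
    print(max_load_watts(110, 15, continuous=False))  # 1650.0 W for bursty loads
    [/code]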

    The SIG doesn't care how many cards the computer has. It currently allows 75W from the slot, one 8-pin connector (150W), and one 6-pin connector (75W), for a total of 300W. That is independent of whether you're running 1 card in a personal machine or 3,000 in a GPU farm.

    I really doubt they base the calculations for card wattage on a massively nonstandard setup (a machine with 4 PCIe slots instead of 2, running 4 dual-GPU cards). I think it kind of borders on absurd to suggest they would.

    Put another way, take a look at a board like this one:
    ASUSTeK. Now, combine that with a set of risers like this one:
    riser.

    That is only PCIe 2.0, and it still would pull around 900W with 6 cards. Yet PCIe 3.0 still went to 300W for a mind-blowing 1800W just from the video cards in that configuration. I don't think the SIG lost a minute's sleep over it.
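
    Tallying that scenario up (assuming 150W per card under the older budget versus the full 300W ceiling):

    [code]
    # Six-card riser scenario from above, at two assumed per-card budgets.
    cards = 6
    print(cards * 150)  # 900 W at a 150 W per-card budget
    print(cards * 300)  # 1800 W at the full 300 W PCIe ceiling
    [/code]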

    Like I said: red herring. This card is about getting the fastest single card on the market. I doubt NVidia cares what the PCI SIG thinks about their card, and I doubt the SIG will bother trying to research a specification based on one niche-market card setup that might pull too much power if Joe Random sets one up and starts a whole suite of benchmarks running nonstop all day for a year.
     
  19. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    I don't think the time window needs to be that long.
    It doesn't seem realistic to permit transient peaks past 80% of max rating, and then define transient as over 60,000 continuous hours. That seems pretty constant.

    The SIG cares enough to define a high-end specification that caters to GPU cards. I am not aware of other product lines that commonly approach the 300W limit. They would be aware of the design bullet points for these top end cards.

    There's probably an evaluation of many factors, but I would imagine that expected deployment of a product on commercially available, in-spec platforms is a worthwhile thing to consider.

    That board seems to put the limit at 3-way graphics.
    Pushing the card count higher would be taking the product beyond specification, so the PCIe spec would not be the first problem.
    The reasons for the limit on the motherboard itself may be interesting.

    Then the current 300W specification is not out of date and does not need increasing.
    The typical way these cards are sold is to ship them clocked down to meet the PCIe spec, so that the product can be plugged into a PCIe slot without immediately breaking warranties. The user can then decide to flip the switch, which absolves the manufacturer.
     
  20. Babel-17

    Veteran

    Joined:
    Apr 24, 2002
    Messages:
    1,073
    Likes Received:
    307
    Voodoo volts? :)

    http://en.wikipedia.org/wiki/File:Voodoo_5_6000.jpg

    http://en.wikipedia.org/wiki/Voodoo_5

     