NVIDIA GF100 & Friends speculation

Discussion in 'Architecture and Products' started by Arty, Oct 1, 2009.

  1. GZ007

    Regular

    Joined:
    Jan 22, 2010
    Messages:
    416
    Likes Received:
    0
The vapor chamber cooler doesn't cost them a single watt. You don't need a bigger fan when you can increase the area of the cooling fins or change materials for better heat conduction.
     
  2. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    My bad, I thought it was purely software but it's clearly not. I think Dave's explanation makes sense, though.

    In theory, that could be a problem. But realistically, power viruses don't just pop up every week. With Furmark and OCCT occupying that space, I doubt we'll see anything else appearing any time soon. Even if we did, NVIDIA is usually pretty quick to react.

    Plus, people who run power viruses usually know what they're doing.


    In this case the fan is slightly bigger (If I'm not mistaken), but I think the cooler's higher efficiency is mostly due to its vapor chamber.
     
  3. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    I thought the rated power of the 580 fan was 8 to 10 W lower, but I couldn't actually be bothered to check.
     
  4. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
I think rated power is fairly meaningless since they pretty much never reach their max. If they did, people would go deaf :).
Maybe that's the real reason for the power limiter after all: otherwise the fan noise wouldn't stay within safety regulations :).
     
  5. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
  6. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    505
    Likes Received:
    189
Thanks for the link. Figure 2 nicely illustrates the non-linearity of leakage power as a function of temperature. Assuming 2 °C/W as a previous poster did imposes a linear model, which the curve in Figure 2 suggests would lead to a large overestimation of the power saved by lower temperatures.
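To illustrate the point, here's a minimal sketch with made-up parameters (the 40 W reference and the 25 °C doubling interval are assumptions for illustration, not GF100 measurements): an exponential leakage model versus a linear extrapolation calibrated at the hot end, both evaluated over the same 90 °C → 70 °C drop.

```python
import math

def leakage_w(temp_c, p_ref=40.0, t_ref=90.0, doubling=25.0):
    """Leakage power with an assumed exponential temperature dependence:
    p_ref watts at t_ref, doubling every `doubling` degrees C.
    All numbers are illustrative, not measured GF100 data."""
    return p_ref * 2 ** ((temp_c - t_ref) / doubling)

hot = leakage_w(90.0)    # 40.0 W at the reference point
cool = leakage_w(70.0)   # ~23 W with the exponential model
saved_exp = hot - cool   # ~17 W actually saved

# A linear model calibrated at the hot end (slope = local derivative at 90 C)
slope = leakage_w(90.0) * math.log(2) / 25.0   # ~1.1 W per degree C
saved_lin = slope * 20.0                       # ~22 W "saved"

print(saved_exp, saved_lin)  # the linear extrapolation overstates the saving
```

With these (invented) parameters the linear model overstates the saving by roughly 30%, which is the qualitative effect the figure shows.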
     
  7. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    That figure also goes from 0°C to 120°C. From 70°C to 90°C, which is what we're interested in here, it's not exactly linear, but assuming that it is wouldn't lead to significantly off results.
     
  8. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,455
    Likes Received:
    471
Maybe it isn't related only to GPU temperature, but also to the temperature of the VRMs. I remember a test showing that different coolers tested on the HD2900XT 512MB GDDR3 (TDP 215W) resulted in up to 20W difference in power consumption under load (the higher the RPM, the lower the power consumption).

I think it's fair to say that different coolers on the GTX480/580 could also result in ~10% different power consumption.
     
  9. PSU-failure

    Newcomer

    Joined:
    May 3, 2007
    Messages:
    249
    Likes Received:
    0
I already pointed that out.

    Redesigned PCB around the power delivery area, refined VRM circuitry and better cooling together could be the only improvements as far as power efficiency goes.

When you have 20A flowing between the PEG aux plugs and the VRM, and something like 200A between the VRM and the GPU, Ohm's law isn't that meaningless. Hell, 1 milliohm translates to 40 watts of power dissipation in the PCB alone.
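That 40 W figure follows directly from Joule's law, P = I²R; a trivial check:

```python
def pcb_dissipation_w(current_a, resistance_ohm):
    """Joule heating in a conductor: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

# ~200 A between the VRM and the GPU through 1 milliohm of copper:
print(pcb_dissipation_w(200, 0.001))  # ~40 W, as stated above
# The 20 A on the 12 V input side through the same 1 milliohm is benign:
print(pcb_dissipation_w(20, 0.001))   # ~0.4 W
```

The quadratic dependence on current is why the low-voltage side of the VRM is where PCB resistance really hurts.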
     
  10. Ethatron

    Regular Subscriber

    Joined:
    Jan 24, 2010
    Messages:
    948
    Likes Received:
    417
    Let's wait for the Gold-Edition. :grin:
     
  11. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Looks like the 570 is still 1280MB
     
  12. psolord

    Regular

    Joined:
    Jun 22, 2008
    Messages:
    444
    Likes Received:
    55
That's not too good, but I guess they have to differentiate the 570 from the 580 somehow, other than killing one or two SMs that is. I just hope it ends up close to the 480 performance level, with better thermals and power draw. Maintaining the HSF would be very kind of them as well.

    I reckon that 1280MB of video RAM will be more than enough for up to 1920X resolutions for another year at least.
     
  13. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
    Any bets on clocks? I'm thinking somewhere right around 480 levels. 580 would still be a good deal faster and 570 would be a decent replacement for 470.
     
  14. psolord

    Regular

    Joined:
    Jun 22, 2008
    Messages:
    444
    Likes Received:
    55
    The 480 is 25% faster than the 470 and my best guess is that Nvidia may want to keep the same distance for the 580-570 again. That would put the 570 below the 480 while giving a decent replacement card as you said without enraging 480 owners.

TBH I am not sure whether Nvidia cares more about enraging previous-series owners or about countering AMD effectively, so they may want to let AMD launch their cards before launching theirs. This could be one possible reason why AMD delayed their 69XX cards: it denies Nvidia enough time to react and decide whether the 570 should be a -1SM or a -2SM part in time for the Christmas shopping spree. If that were the case, I would launch a 570 and a 575 at the same time! Too much segmentation? Maybe!

    In any case, returning to the original 25% performance difference, I think the 570 will be a 580 -2SMs at 580 clocks with 470 specs (rops,bus,framebuffer).
     
  15. -The_Mask-

    Newcomer

    Joined:
    Sep 20, 2009
    Messages:
    51
    Likes Received:
    0
    Location:
    The Nederlands
    Somewhat higher.
     
  16. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    That would put it really really close to the GTX 580… I'm expecting something like 650~675/1300~1350, which would put it very slightly below the 480, accounting for the fixed TMUs. Basically, a GTX 480 with a power draw similar to that of the 470.
     
  17. -The_Mask-

    Newcomer

    Joined:
    Sep 20, 2009
    Messages:
    51
    Likes Received:
    0
    Location:
    The Nederlands
Really close to the GTX480, to be exact.

    7xx/14xx/19xx MHz, 480SP, MC320, GDDR5 1280MB $ 3XX @ expreview

Somewhat higher than the GTX480 (700/1400/1848), maybe 73x/146x/19xx :roll:
     
  18. Ancient

    Newcomer

    Joined:
    Mar 17, 2010
    Messages:
    120
    Likes Received:
    0
    GeForce GTX 570 Specifications, Release Date Leaked:

    http://techpowerup.com/135450/GeForce-GTX-570-Specifications-Release-Date-Leaked.html

     
  19. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    808
    Likes Received:
    276
Those clocks seem quite high; the last rumours indicated clocks just shy of 700 MHz.

And what about yields? Surely they need another lower bin part to harvest more dies. Or maybe those will be used for Quadro/Tesla?
     
  20. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
Hmm, with these clocks it looks like it should perform really close to the GTX480: it loses ~13% memory bandwidth (and ROP throughput), but core/shader is 4% faster.
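For what it's worth, those percentages can be sanity-checked in a few lines. The GTX 570 clocks here (732 MHz core, 950 MHz memory) are read off the rumored 73x/146x/19xx figures above and are assumptions, as is treating the GTX 480 as 700 MHz core / 924 MHz memory on a 384-bit bus. With these numbers the bandwidth deficit comes out closer to 14%:

```python
def bandwidth_gbs(mem_clock_mhz, bus_bits):
    """Peak GDDR5 bandwidth: clock x 4 (quad data rate) x bus width in bytes."""
    return mem_clock_mhz * 4 * (bus_bits / 8) / 1000

bw_480 = bandwidth_gbs(924, 384)   # ~177.4 GB/s
bw_570 = bandwidth_gbs(950, 320)   # ~152.0 GB/s

core_gain = 732 / 700 - 1          # ~+4.6% core/shader clock
bw_loss = 1 - bw_570 / bw_480      # ~14% less bandwidth (and ROP throughput,
                                   # since the ROPs sit on the memory partitions)
print(f"core +{core_gain:.1%}, bandwidth -{bw_loss:.1%}")
```

Whether the net result lands just above or just below the GTX 480 then depends on how bandwidth-bound the workload is.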
     