NVIDIA Maxwell Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 9, 2011.

  1. DuckThor Evil

    DuckThor Evil Anas platyrhynchos
    Legend Veteran

    Joined:
    Jul 9, 2004
    Messages:
    5,878
    Likes Received:
    897
    Location:
    Finland
    2 x 970 is a pretty impressive package with regard to price, performance and features. A GTX 990 with two 980 cores wouldn't suck either; the low power consumption figures are basically begging for a card like that :)
    Perhaps a GTX 990 and a Maxwell Titan in early 2015? $999 each?
     
  2. Cookie Monster

    Newcomer

    Joined:
    Sep 12, 2008
    Messages:
    167
    Likes Received:
    8
    Location:
    Down Under
    Even if there were "one microsecond spikes" as suggested by THG, I don't think they would do much to the TDP (or the average heat dissipated from the GPU, plus some margin), because the temperature can't change that rapidly within such a short time period.

    As long as the average power dissipated by the GPU is within its TDP, a heatsink designed with that TDP in mind should be able to do its job nicely. Not sure why people are freaking out here :shock:

    Sounds to me like nit-picking because there must be a flaw somewhere, right? Anyway, if GM204 can do this given its relatively modest power requirements, think what the big Maxwells will be able to do :twisted:
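    The averaging argument above can be sanity-checked with some hypothetical numbers (none of these are THG's actual figures): even a spike that nearly doubles instantaneous power, if it lasts only 1 µs out of every millisecond, barely moves the time-averaged power that the heatsink has to handle.

    ```python
    # Hypothetical illustration (not measured data): a 1 us power spike
    # once per millisecond contributes almost nothing to the average.
    baseline_w = 165.0  # assumed steady-state board power, in watts
    spike_w = 300.0     # assumed instantaneous spike level, in watts

    # One second of power samples at 1 us resolution, with one
    # 1-us spike at the start of every millisecond.
    samples = [
        spike_w if t % 1000 == 0 else baseline_w
        for t in range(1_000_000)
    ]
    avg = sum(samples) / len(samples)
    print(f"average power: {avg:.3f} W")  # ~165.135 W, i.e. +0.1% over baseline
    ```

    The spikes would only matter thermally if they were frequent or long enough to raise that average, which is exactly the point about the GPU's thermal mass filtering them out.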
     
  3. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,943
    Likes Received:
    2,286
    Location:
    Germany
    I would be very interested in the type of compute load mentioned. Even if you only take plain averages, you should be able to see significantly higher power values at the wall socket integrated over time - especially with a difference as large as almost a hundred watts. Maybe Igor could elaborate?

    OMG, you're completely right! Frankly, I really don't get why any company bothered to release any graphics card after the mid-eighties, since it was very clear from the outset that they all lack the power to run convincing virtual-reality simulations down to the sub-nuclear level.
     
    #2263 CarstenS, Sep 20, 2014
    Last edited by a moderator: Sep 20, 2014
  4. firstminion

    Newcomer

    Joined:
    Aug 7, 2013
    Messages:
    217
    Likes Received:
    46
    It's a 970/980 GPU review with a methodology that measures GPU consumption on all rails at very high resolution, and it demonstrates that the reference 980 consumes +100 W on (unspecified) GPGPU workloads while the reference 970 consumes +72 W. The 750 Ti shows the same consumption in gaming and GPGPU, yes, but the 750 Ti is arguably irrelevant for GPGPU.

    And contrary to what was argued here, that consumption isn't a mere "spike": it's sustained for at least a minute. If anything, we see occasional "drops" instead of "spikes".

    [chart: power consumption over one minute]

    We also see this increase on AMD cards: the reference R9 290X in quiet mode consumes +61 W and the reference R9 280X +32 W. This means that while in gaming the reference 970 consumes 20% less than the R9 280X and the reference 980 consumes 25% less than the R9 290X in quiet mode, on those GPGPU workloads the 970 consumes only 1% less and the 980 only 7% less.

    So yes, I suppose we have evidence that there's some optimization going on, showing that Maxwell is the result of clever engineering, not magic fairy dust. And that it has its limitations.

    I would like to congratulate Igor Wallossek on his findings and call on the rest of the tech reviewers to follow up on them.
     
  5. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
    Strange results at THG.
    GTX 750 Ti (5 SMM @ >1020 MHz): 58 W
    GTX 970 (13 SMM @ >1050 MHz): 240 W

    The BIOS files at TPU do not mention power values, unlike those for GM1xx or GKxxx cards.
    Maybe power management is done by the GPU on GM2xx with the support of driver profiles? And THG found a non-profiled app; Furmark is restricted.
     
  6. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    501
    Likes Received:
    178

    Yes, me too. Hopefully there exist tech reviewers with a modicum of electrical engineering background out there who can explain publicly why this type of measurement is meaningless. It's just capturing noise.
     
  7. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    11,152
    Likes Received:
    2,216
    http://blogs.nvidia.com/blog/2014/09/18/maxwell-virtual-reality/

    This might make me switch to Nvidia for the first time since their FX line. The ability for each GPU to render what one eye sees is pretty big.

    Hopefully AMD can come up with something similar so that we have options for VR.

    I'm not planning to upgrade my 7950 till the consumer Rift hits, so it will be interesting to see how this shakes out.
     
  8. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    Where does Tom's say what applications they are measuring? If it's a custom workload, have they ever detailed it? Is the performance measured on the same workload so that Perf/W can be estimated on it? There's probably a very interesting story in there, but it's frustrating to see someone with access to such low-level tools not give all the data. It basically forces others to replicate the same analysis with more rigour if we want to conclude anything really interesting...
     
  9. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,282
    Likes Received:
    1,058
    Location:
    still camping with a mauler
  10. firstminion

    Newcomer

    Joined:
    Aug 7, 2013
    Messages:
    217
    Likes Received:
    46
    Possibly; it needs to be followed up on. Meanwhile we can compare his gaming numbers with what other reviews got, but power consumption tests are a mess.

    Anandtech's ratio between the 980 and 290X numbers seems to corroborate Tom's measurements. Guru3D's "calculated TDP" figures for the 970 and 980 are close enough, but the 280X and 290X are not (30-40 W over). Techreport is an oddball because it puts the 280X only 13 W over the 970 and the 290X 101 W over the 980.
     
  11. eastmen

    Legend Subscriber

    Joined:
    Mar 17, 2008
    Messages:
    11,152
    Likes Received:
    2,216
    Went from a Radeon 9700 Pro to the FX 5800, right back to the 9700 Pro lol, and then to the 9800 Pro.
     
  12. babcat

    Regular

    Joined:
    Sep 24, 2006
    Messages:
    656
    Likes Received:
    45
    If it is not the flagship, then it shouldn't have such a high price. For the slight performance advantage it offers over the 970, it should cost less.
     
  13. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,943
    Likes Received:
    2,286
    Location:
    Germany
    Maybe this is the key?
    "The measurement intervals need to be adjusted depending on the application in question, of course, in order to avoid drowning in massive amounts of data. For instance, when we generate the one-minute graphs for graphics card power consumption with a temporal resolution of 1 ms, we have the oscilloscope average the microsecond measurements for us first."
    [my bold]
    http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-11.html

    I don't know anything beyond what the text says, though. But from my experience with Nvidia's Boost, mildly higher clocks mean significantly higher voltages. Depending on the averaging and the specific compute load, this could lead to the measured amps being multiplied by a higher voltage than is actually being applied in reality.

    For example, when heated and loaded enough, and thus dropping to base clock (1126 MHz), our sample ran at 1.043 volts with around 80% of allowed board power according to Nvidia Inspector. At the highest boost state, 1278 MHz, it was supplied with 1.212 volts, which is quite a difference.
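    The suspected pitfall can be sketched numerically. All currents below are illustrative assumptions (only the two voltages loosely echo the figures quoted above): if the oscilloscope averages the current over a window, and that average is then multiplied by the boost-state voltage rather than the voltage actually applied sample by sample, the reconstructed power comes out too high.

    ```python
    # Hypothetical sketch of the averaging artifact: a load that alternates
    # between boost (high V, high I) and base clock (low V, lower I).
    volts = [1.212, 1.043] * 500   # per-sample supply voltage (from the post)
    amps  = [150.0, 120.0] * 500   # per-sample current (assumed values)

    # Correct: average the instantaneous volt-amp product.
    true_power = sum(v * a for v, a in zip(volts, amps)) / len(volts)

    # Pitfall: average the current first, then multiply by boost voltage.
    naive_power = (sum(amps) / len(amps)) * max(volts)

    print(f"true average power:   {true_power:.2f} W")   # ~153.48 W
    print(f"naive reconstruction: {naive_power:.2f} W")  # ~163.62 W, too high
    ```

    With these made-up numbers the naive reconstruction overstates power by roughly 7%, which is the kind of systematic error that could inflate a compute-load measurement without any real extra heat being dissipated.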
     
    #2273 CarstenS, Sep 20, 2014
    Last edited by a moderator: Sep 20, 2014
  14. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,515
    Likes Received:
    934
    The FX 5900 XT was actually pretty decent for its price.
     
  15. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    501
    Likes Received:
    178

    As with GK110, I expect the first flagship Maxwell to be $1000. You always pay more for the flagship...
     
  16. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,591
    Likes Received:
    673
    Location:
    WI, USA
    FX also tended to be very solid cards for the DX7/8 and OpenGL games that were prevalent back then. The 5800 and 5900 at least. The lower models were not so hot.
     
  17. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,591
    Likes Received:
    673
    Location:
    WI, USA
    Are we going to see the biggest GPU ever or is 20nm essentially required for the monster Maxwell? Adding in more hardware for HPC and such is sure to require a lot of transistors to move significantly ahead of GK110.
     
  18. fellix

    fellix Hey, You!
    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,506
    Likes Received:
    424
    Location:
    Varna, Bulgaria
    Nvidia could still throw another 28nm monster now that Maxwell is even more tightly packed. There's still like ~150mm² of leeway for a 384-bit HPC SKU, with gobs of DP throughput and even more L2.
     
  19. fbomber

    Newcomer

    Joined:
    Jun 9, 2004
    Messages:
    156
    Likes Received:
    17
    It is their flagship; they named it the GTX 980, and its market position speaks for itself.
     
  20. fbomber

    Newcomer

    Joined:
    Jun 9, 2004
    Messages:
    156
    Likes Received:
    17
    You are all overestimating the importance of low power consumption for the high end. I'm waiting for the real deal: GM210.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.