NVIDIA Maxwell Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 9, 2011.

Tags:
  1. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,579
    Likes Received:
    4,799
    Location:
    Well within 3d
    It's squiggly lines with no particular sign of context or understanding, and it makes comparisons to other architectures that it neglected to profile. Above all, it failed to log something as fundamental as which clock and voltage steps were in use. It's hard to credit the squiggles when they didn't keep track of that.

    It's very likely that there is a lot of microsecond-scale variation, but why stop at GPUs, heck why stop at chips made in this decade?

    For reference, AMD's marketing for the 290 pointed out their power control method could operate in 10 usec increments, so what are the odds that there would be signs of variation on the oscilloscope for Hawaii?


    One additional nitpick: what's with the "die shot" being bandied about?
    Is there entertainment value in putting some grayscale anonymous chip as a base layer and then playing space invaders on top?
    Is this somehow preferable to AMD's method of *nothing*?
     
  2. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York
    I suspect most consumer electronics would exhibit similarly spastic power consumption if measured at microsecond granularity.
     
  3. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    All good points, but I for one wouldn't have expected the squiggly lines to be, well, that squiggly. I mean, you can see power going up and down by about 150W within just 30µs. And since there's no guarantee that the sampling frequency is high enough to correctly capture the signal, reality might be even more stark. I guess the squiggly lines are pointless if you already knew that, but I didn't.

    I'd heard about PowerTune, of course, and I knew that all chips can have spikes above their TDP, I just thought power draw was far more continuous. So I'm glad I've learned something, even if it is quite poorly framed in the article.
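    To make the undersampling worry concrete, here's a toy sketch (every number in it is invented for illustration, none of it is the article's data): a load that swings 150W within each 30µs period, read by a slow meter that happens to sample once per period. The slow meter reports a flat line and misses half the picture entirely.

```python
# Toy illustration of aliasing in power measurement. The waveform, period
# and sample rates are made-up assumptions, not measured values.

def power_at(t_us):
    # Hypothetical load: 350 W for half of each 30 us period, 200 W for
    # the other half -- a 150 W swing, like the traces discussed above.
    return 350.0 if (t_us % 30.0) < 15.0 else 200.0

# "True" trace: one sample every 0.1 us across one 30 us period.
fine = [power_at(t * 0.1) for t in range(300)]

# Slow meter: one sample every 30 us, i.e. exactly once per period.
coarse = [power_at(t * 30.0) for t in range(10)]

true_avg = sum(fine) / len(fine)   # 275 W
print(true_avg, set(coarse))       # the slow meter reads a flat 350 W
```

    The point is only that a sample rate below the signal's bandwidth can hide both the peaks and the true average, so the real trace could be even worse than what the scope shows.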

    I think the goal is to promote the notion that CUDA cores are real cores, since you can "clearly see them, just look!". I suspect it's actually quite effective as a marketing technique.

    Whether those pictures should appear in independent reviews, however, is another story. But I'm not sure all reviewers realize they're not real die shots.
     
  4. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York
    Well yeah there is entertainment value and from a marketing standpoint it's far more effective than nothing. Of course if you're looking for an actual die shot then not so much. It all depends on what you're looking for :)

    You raise an interesting point though. Why is it that AMD shares so few details about its architecture? For example, compare how they unveiled Tonga's compression capabilities with how nVidia went a step further in detailing how its compression works and what was changed.

    Did AMD publish a white paper or something similar for either Hawaii or Tonga?
     
  5. firstminion

    Newcomer

    Joined:
    Aug 7, 2013
    Messages:
    217
    Likes Received:
    46
    Spend the bandwidth.

    "That's not the only offering that makes a good impression, though. Nvidia's reference GeForce GTX 980 does well too, as long as you don’t focus on the idle power measurement. And the party ends as soon as you look at the compute-based stress test results. A taxing load just doesn't give Maxwell any room for its optimizations to shine.

    When it comes down to it, our most taxing workloads take Maxwell all the way back to Kepler-class consumption levels. In fact, the GeForce GTX 980 actually draws more power than the GeForce GTX Titan Black without really offering more performance in return."
     
    #2245 firstminion, Sep 19, 2014
    Last edited by a moderator: Sep 20, 2014
  6. lanek

    Veteran

    Joined:
    Mar 7, 2012
    Messages:
    2,469
    Likes Received:
    315
    Location:
    Switzerland
    I did it more simply, after having seen some odd behaviour in TDP calculations (like Furmark landing 60W under a game result, which is the first time we've seen that)... so I collected data on it from reviews, which was a long job; with proper equipment it would be much easier. The first thing that intrigued me a bit is how Guru3D, who compute a TDP figure themselves, arrived at 171W, when in general they land well under the official TDP. The Furmark result made me think of some really aggressive TDP limiter in the software settings...
     
    #2246 lanek, Sep 20, 2014
    Last edited by a moderator: Sep 20, 2014
  7. LordEC911

    Regular

    Joined:
    Nov 25, 2007
    Messages:
    877
    Likes Received:
    208
    Location:
    'Zona
    I never said it didn't consume, on average, less power than Hawaii. I simply stated that in overclocked situations, it consumes as much as Hawaii and that their new boost is causing power consumption to be much higher than the TDP.

    Again, you are putting words in my mouth. See above.
    Your chart proves my point that the GTX980 and GTX770, with current power measurements, are consuming roughly the same despite a 65W difference between their TDPs.

    I'm simply pointing out there is something funky going on here and it is misleading.

    Where have I said anything pro AMD here? I'm trying to discuss why these changing metrics from manufacturers are OK...

    GM204 is pretty much exactly what I thought it would be; it's a great GPU. I wasn't overly surprised by the launch, other than the high-resolution performance and Nvidia sticking to a false TDP.
    I heard word about a month ago that the "rated" TDP for the cards is not accurate. That seems to be the case. I apologize for trying to discuss the matter here; I didn't realize this was a praise-Jen-Hsun-only zone.
     
  8. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
  9. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York

    Yep.
     
  10. Cookie Monster

    Newcomer

    Joined:
    Sep 12, 2008
    Messages:
    167
    Likes Received:
    8
    Location:
    Down Under
    I don't think I've ever seen someone do a power measurement over a window of 1ms. That's only useful when measuring losses through, say, a MOSFET, e.g. switching losses.

    Plus, all of that could just be high-frequency noise (which may look like very high peaks on the scope, and can be alarming to the novice, but isn't), which is present on almost ALL power supply rails one might come across. To minimize it, one has to measure with the oscilloscope's ground loop minimized, and perhaps also filter it out / bandwidth-limit the scope. Not to mention the noise/accuracy of the current sense resistor, or whatever they use to measure the current.

    Until he shows what's really on the scope and the accuracy of the measurements, because that is what matters most (calibration of the measurements, not just of the equipment itself), it may not provide an accurate picture of Maxwell's power efficiency, or for that matter of any of the cards measured this way.

    edit - or it could even be ripple on the 12V rail. The GPU's actual power consumption is taken after the power circuitry (which can add an extra 1~30W of loss depending on its efficiency, which also happens to be related to cost), and the voltage at the GPU is usually very, very stable, with the current fluctuating with the load. I would have assumed that nVIDIA did all the proper measurements here.
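    A crude sketch of that bandwidth-limiting point (the rail voltage, noise amplitude, and window size are all invented for illustration): a simple moving average stands in for the scope's bandwidth-limit button, and the apparent "spikes" shrink dramatically once the noise is band-limited.

```python
import random

# Toy noisy 12 V rail: the +/-0.5 V uniform noise is an invented figure,
# just to show how band-limiting tames apparent spikes on a scope trace.
random.seed(0)
rail = [12.0 + random.uniform(-0.5, 0.5) for _ in range(1000)]

def moving_average(xs, n):
    # Boxcar filter over the last n samples (a crude stand-in for the
    # oscilloscope's bandwidth-limit function).
    out, acc = [], 0.0
    for i, x in enumerate(xs):
        acc += x
        if i >= n:
            acc -= xs[i - n]
        out.append(acc / min(i + 1, n))
    return out

smooth = moving_average(rail, 50)
peak_raw = max(abs(v - 12.0) for v in rail)
peak_flt = max(abs(v - 12.0) for v in smooth[100:])  # skip filter warm-up
print(peak_raw, peak_flt)  # the filtered excursions are far smaller
```

    Which of those two views reflects what the silicon actually draws depends on where the noise comes from, which is exactly the calibration question raised above.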
     
    #2250 Cookie Monster, Sep 20, 2014
    Last edited by a moderator: Sep 20, 2014
  11. babcat

    Regular

    Joined:
    Sep 24, 2006
    Messages:
    656
    Likes Received:
    45
    It is a disgrace in my opinion that nvidia released a new flagship card that is only slightly more powerful than the 780 Ti. In my opinion, they should have waited until they could release a card with a good leap in performance, or not released anything at all. What they really need to do is push for access to 20 or 16nm. If they had waited to launch this card on 16nm with 3000+ CUDA cores, it would have been far more powerful than the 780 Ti.
     
  12. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    Absolutely! How dare they release a product that gets unanimous positive reviews!
     
  13. ams

    ams
    Regular

    Joined:
    Jul 14, 2012
    Messages:
    914
    Likes Received:
    0
    Well, he is overgeneralizing here, because the Maxwell-based GTX 750 Ti has MUCH lower power consumption in Tom's "Torture GPGPU" test than any comparable Kepler or Radeon GPU. And to actually gauge efficiency, one would need to see both GPGPU power consumed and performance (the latter of which was not provided at all, as far as I can tell).

    Note that GTX 980 actually handily outperforms GTX 780 Ti with respect to compute performance in most cases (excluding double precision compute of course):

    http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/20

     
    #2253 ams, Sep 20, 2014
    Last edited by a moderator: Sep 20, 2014
  14. babcat

    Regular

    Joined:
    Sep 24, 2006
    Messages:
    656
    Likes Received:
    45
    The reviewers are nuts, or they're sold out. The GTX 980 is nowhere near powerful enough to cost $549; it should be $399 at most. If they want to sell a graphics card as a flagship, it needs to be significantly more powerful than the previous flagship. I think NVIDIA should have waited to put out a card on a smaller process node.

    The fact that it uses less power is also almost meaningless to me. I think any flagship should reach whatever max thermal limit exists. For example, they should have kept adding units to this card until it hit 250 watts.
     
  15. boxleitnerb

    Regular

    Joined:
    Aug 27, 2004
    Messages:
    407
    Likes Received:
    0
    You neither understand what TDP means, nor do you understand that these are GeForce cards meant for gaming.

    First, TDP is an average value. Short power spikes are completely irrelevant here, since energy transfer to the cooler is a much, much slower process. Second, the maximum-load test Tom's ran is not a gaming workload, so it's not relevant either. If this were a Quadro or Tesla card... but it isn't.

    It is also quite unlikely that the OC'ed card in their review consumes LESS power than the reference model. That makes no sense at all, since it presumably uses higher voltages and (thanks to the cooling solution) sustains higher boost clocks. This alone eats at the credibility of their measurements. And where do they show power measurements for the Kepler cards they compare it to, to give these values some context?
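    The "TDP is an average, the cooler is a slow filter" argument can be sketched numerically. This is a toy first-order thermal model, and every number in it (the 155 W baseline, 10 ms time constant, spike height and rate) is an assumption for illustration, not a measured value:

```python
# Toy model of why brief spikes don't break a TDP rating: the heatsink
# behaves like a low-pass filter, so only a running thermal average
# matters. All numbers here are illustrative assumptions.

def thermal_filter(samples_w, dt_s, tau_s):
    # First-order RC model: each step pulls the filtered value toward
    # the instantaneous power with weight dt/tau.
    alpha = dt_s / tau_s
    filtered = samples_w[0]
    out = []
    for p in samples_w:
        filtered += alpha * (p - filtered)
        out.append(filtered)
    return out

dt = 100e-6   # one sample every 100 us
tau = 10e-3   # assumed cooler time constant: 10 ms
# 155 W baseline with a brief 300 W spike every 10th sample (~169.5 W avg)
trace = [300.0 if i % 10 == 0 else 155.0 for i in range(2000)]

smoothed = thermal_filter(trace, dt, tau)
print(max(trace), max(smoothed[1000:]))  # spikes hit 300 W; thermally it stays near 170 W
```

    A scope sees the 300 W spikes; the cooler, and hence the TDP rating, only ever sees something close to the average.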
     
  16. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    505
    Likes Received:
    189

    GM204 is not the flagship. I don't think you'll have to wait too long to see what Maxwell looks like at 250W... All these complaints about GM204 not being a flagship are a little short sighted IMO.
     
  17. Wesker

    Regular

    Joined:
    May 3, 2008
    Messages:
    299
    Likes Received:
    186
    Location:
    Oxford, UK
    Heh, it's been a while, LordEC. I think last time I posted here we were all going on about RV770 vs GT200. Those were good times.

    I think your point is entirely valid. While Maxwell is a very impressive GPU uArch, some of the thermal characteristics of the GTX 980 ought to be explored. I don't see any premise for the rash, knee-jerk reactions that some users here have resorted to.
     
  18. LordEC911

    Regular

    Joined:
    Nov 25, 2007
    Messages:
    877
    Likes Received:
    208
    Location:
    'Zona
    I know exactly what TDP means and I understand GeForce is for gaming. Thanks for checking though.

    I wasn't specifically talking about the Tom's measurements, I actually wasn't aware of that until after my first few posts. I typically don't read Tom's reviews.

    So feel free to actually read my posts and follow along before replying next time.

    Been way too long. Good to see you back and posting.
    Edit- Checked your post history, Dec 2009 was your last post. Almost 5 years.
     
  19. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,112
    Location:
    New York

    The 980 is a better card than the 780 Ti in nearly every way and is cheaper too. That's a good thing for anyone who doesn't already own the previous flagship - which is a lot of people. I assume you're so disgusted by the whole thing cause you're already using a 780 Ti.
     
  20. boxleitnerb

    Regular

    Joined:
    Aug 27, 2004
    Messages:
    407
    Likes Received:
    0
    You're still wrong. The power target of the card caps consumption at 165W, which is the TDP. It may not be the fastest implementation, hence some spike measurements, for instance at TPU, but under sustained gaming load, please prove that the TDP is exceeded before making such claims.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.