AMD Vega Hardware Reviews

Discussion in 'Architecture and Products' started by ArkeoTP, Jun 30, 2017.

  1. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    8,572
    Likes Received:
    2,292
    BacBeyond likes this.
  2. Leier

    Newcomer

    Joined:
    Jun 30, 2017
    Messages:
    31
    Likes Received:
    22
    You don't say, Capt. Obvious? :)
     
  3. yuri

    Newcomer

    Joined:
    Jun 2, 2010
    Messages:
    178
    Likes Received:
    147
    It seems AMD's firmware team will never learn. It's always been like a bunch of juniors with no QA.

    They've got brilliant engineers like Mr. Naffziger inventing all the shiny power/AVFS stuff like adaptive clocking, fine-grained sensors, voltage adaptation on boot, silicon aging probing/calibration, etc. On the other hand they have Vega running its HBM2 about 0.5GHz lower than expected, the horrible Ryzen launch AGESA, Kaveri dropping its speed to exactly 3.0GHz any time you put any load on the GPU, etc.
     
  4. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,930
    Likes Received:
    1,626
    Lightman, Leier and Geeforcer like this.
  5. Cat Merc

    Newcomer

    Joined:
    May 14, 2017
    Messages:
    124
    Likes Received:
    108
    Notice that they said that for absolute stability, they needed 1130mV. A 70mV safety band is pretty tame. I wouldn't call Vega overvolted. NVIDIA has an even larger safety band from what I've seen, but you don't see anyone running around blaming them for overvolting at stock.

    Simply put, asking for lower voltage is unrealistic.
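
    Spelling out the arithmetic (not a new measurement, just what the 70mV figure implies): a 70mV band on top of the 1130mV stability floor puts the stock voltage at about 1200mV, so the margin works out to roughly 70/1200, or about 6%.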
     
    xpea, Silent_Buddha, T1beriu and 7 others like this.
  6. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,297
    Likes Received:
    464
    ArkeoTP, ShaidarHaran, Malo and 3 others like this.
  7. Leier

    Newcomer

    Joined:
    Jun 30, 2017
    Messages:
    31
    Likes Received:
    22
    First of all, NVIDIA has a much better physical layout (see perf/power on GP108) and also much better binning. Better binning = lower voltage.
     
  8. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    Nice way to miss the point, but at least you seem to have company.

    There are other display technologies that have the same behavior: plasma, some OLED, some LCD backlights, ... Anything where there is some kind of pulsing involved. And since this is a field of active research, who knows what the future will bring.

    HDMI is an interconnect standard and the core spec is about being able to push large numbers of pixels.

    The job of HDMI is not to enforce the presence of specialty features. Variable refresh rate is a specialty feature, just like, for example, audio or 3D stereo are specialty features.

    It'd be ridiculous to enforce a specialty feature that can't even be supported by all displays.

    Furthermore, variable refresh rate is useless for, say, a future 8K workstation monitor that only needs bandwidth.

    It makes total sense to separate a core minimum spec from side features that are not needed in many cases.

    Do you expect the HDMI spec to contain a rule "variable refresh rate is not required, except for LCDs with the following backlight, for those support is mandatory" ?
     
    pharma likes this.
  9. BacBeyond

    Newcomer

    Joined:
    Jun 29, 2017
    Messages:
    73
    Likes Received:
    43
    Who says the displays have to require it? It could merely be required for all output devices to support it. So if you want a device that can output 2.1, it has to be able to support VRR; it's not that VRR is required for all screens.
     
    Silent_Buddha likes this.
  10. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,277
    Likes Received:
    3,726
  11. sir doris

    Regular

    Joined:
    May 9, 2002
    Messages:
    651
    Likes Received:
    110
    Surely if the voltage is a set value, reducing the power will reduce the current.
    P = I x V or
    I = P / V
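
    As a made-up example of that relationship: at a fixed 1.0V, a 200W limit implies I = 200W / 1.0V = 200A, while a 150W limit implies I = 150A, so dialing the power down pulls the current down with it.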
     
    #331 sir doris, Jul 20, 2017
    Last edited: Jul 20, 2017
  12. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,277
    Likes Received:
    3,726
    Yah, I get that.

    There's just some terminology confusion here. They talk about increasing power target while "under-volting" the card, but also conclude it draws less W in that state ... so "power target" really is current? Current should increase when you lower voltage, but then power is the wrong word. Just strange, because they use power to mean watts or current interchangeably.

    Or I guess the load is fixed in this case, so if you lower the voltage the current will drop?
     
    #332 Scott_Arm, Jul 20, 2017
    Last edited: Jul 21, 2017
  13. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,996
    Likes Received:
    4,570
    Power target is a setting in Wattman and it defines the GPU's power consumption in watts. If you fix the core voltage V, then per the P = V x I formula, setting the power P is done by changing the current I.
    If V is constant then dP = V x dI
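
    A quick Python sketch of that (purely illustrative; the numbers are invented and Wattman obviously isn't driven from code like this):

    def current_draw(power_target_w, core_voltage_v):
        # I = P / V: current (A) implied by a power target (W) at a fixed core voltage (V)
        return power_target_w / core_voltage_v

    print(current_draw(220.0, 1.200))  # ~183 A at a stock-ish 1.200 V
    print(current_draw(220.0, 1.070))  # ~206 A undervolted: same power target, lower V, higher I
    print(current_draw(180.0, 1.200))  # 150 A: lower power target at a fixed V means less current

    Which is also where the terminology confusion above comes from: at a fixed power target, lowering V raises I, while at a fixed V, lowering the power target lowers I.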
     
  14. MDolenc

    Regular

    Joined:
    May 26, 2002
    Messages:
    690
    Likes Received:
    425
    Location:
    Slovenia
    So an IPTV set-top box that only does an MPEG decode at constant rate would also need to support variable refresh rate?
     
  15. Mize

    Mize 3dfx Fan
    Moderator Legend Veteran

    Joined:
    Feb 6, 2002
    Messages:
    5,048
    Likes Received:
    1,097
    Location:
    Cincinnati, Ohio USA
    I'm confused. Why would variable refresh exclude constant refresh? Isn't the latter a subset of the former?
     
    Silent_Buddha likes this.
  16. ieldra

    Newcomer

    Joined:
    Feb 27, 2016
    Messages:
    149
    Likes Received:
    116
    Never! The poor thing is a reference edition running at pitifully low clocks; it's well behind even a 1070.
     
  17. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,153
    Likes Received:
    928
    Location:
    still camping with a mauler
    All the cards in that chart seem to be reference, so I'm not sure what you're getting at. I guess you could argue that the 980Ti overclocks very well (especially compared to AMD cards) and as such will come out better in a comparison of overclocked cards.

    But my goodness, I was not expecting this level of suck from Vega. It makes R600 look like a resounding success. What the hell happened?
     
    #337 homerdog, Jul 21, 2017
    Last edited: Jul 21, 2017
  18. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    How would it make sense to require all output devices to support variable refresh rate when the vast majority of source content isn't variable refresh and probably never will be?

    How would you even test for it?

    "Hey 8K BlueRay player, produce some variable refresh rate content out of thin air, otherwise you don't get the HDMI stamp!"
     
  19. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    It wouldn't. All I'm saying is that it'd be dumb for HDMI to require devices to support variable refresh rate in order to get certified.
     
  20. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,061
    Likes Received:
    1,021
    Well, that is what all the debate is about, isn't it? Just expanding Polaris to 64 CUs with 512-bit GDDR5 and running it at 1400MHz would be a bit smaller than Vega, wouldn't draw more power, and would perform much better than what we have seen so far. So why Vega, and why more than a year later?
    I'm not comfortable with the assumption that the AMD graphics folks are totally incompetent, however. I hope AMD introduces Vega properly, explaining their design goals, the features of the new design, et cetera, giving us something better than just a few puzzling benchmark scores.
     
    digitalwanderer and Cat Merc like this.