AMD RX580 Reviews

Discussion in 'Architecture and Products' started by Clukos, Apr 18, 2017.

  1. Clukos

    Clukos Bloodborne 2 when?
    Veteran Newcomer

    Joined:
    Jun 25, 2014
    Messages:
    4,462
    Likes Received:
    3,793
    The average power consumption is even worse

[Image: average power consumption chart]

What do you mean about worst case scenario? AMD released the 580 like that themselves; nobody demanded that they refresh Polaris, AFAIK.
     
  2. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,804
    Likes Received:
    475
    Location:
    Torquay, UK
I'm not defending this product; for me, the last two generations of high-end AMD GPUs didn't cut it. What I'm objecting to is the totalitarian notion of a state-imposed 'criminal' metric. There are different shades of grey too.

Regarding your point that no one asked AMD to release the RX 580, I disagree. That's not how OEMs and the market in general work. I don't like it, though.
     
  3. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
I have read most of the reviews of this card; it's at least 40% to 66% worse in power efficiency than the 1060 (anywhere from a 60-watt to a 100-watt differential), which definitely puts it in the range of the GTX 970's perf/watt, or below. This puts this Polaris card vs. Pascal in a much worse spot than Fiji or the R9 390X vs. Maxwell!

There is no defending this. Anyone trying to defend it can look at many if not ALL the reviews to see it's not just TPU that came up with that differential. Shit, anything above 180 watts is 1080 level, and this card is pulling over 200 watts at times; that's 1080 Ti level! Polaris's power efficiency SUCKS; it's pushed way too high, out of its range.
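To put that differential in perspective, here's a back-of-the-envelope sketch. The wattages below are illustrative assumptions, not figures from any specific review; with roughly equal performance, the perf/watt deficit reduces to the power ratio:

```python
def perf_per_watt_deficit(power_test_w, power_ref_w, perf_ratio=1.0):
    """Fractional perf/watt deficit of a test card vs. a reference card
    (0.40 means 40% worse). perf_ratio is the test card's performance
    relative to the reference."""
    eff_test = perf_ratio / power_test_w  # performance per watt, test card
    eff_ref = 1.0 / power_ref_w           # performance per watt, reference card
    return 1.0 - eff_test / eff_ref

# Illustrative: a GTX 1060 drawing ~120 W vs. an RX 580 drawing ~200 W
# at roughly equal performance:
print(round(perf_per_watt_deficit(200, 120), 3))  # → 0.4, i.e. 40% worse
```

A larger assumed gap (say 220 W vs. 120 W) pushes the deficit toward the upper end of the quoted range.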
     
    homerdog, CSI PC, xpea and 2 others like this.
  4. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,933
    Likes Received:
    1,628
  5. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,496
    Likes Received:
    910
    Has anyone tried to downclock/downvolt the RX 580 to see if Polaris 20 brings any efficiency improvement over Polaris 10 at the same clock speed?
     
  6. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,001
    Likes Received:
    4,573
    Anandtech did, with mixed results.

Though I don't think @Ryan Smith lowered the RX580's vcore to match the reference RX480's, which IIRC is a bit lower. Much like Hawaii -> Grenada, it'll probably come down to BIOS adjustments and small manufacturing advances.
     
    Alexko likes this.
  7. firstminion

    Newcomer

    Joined:
    Aug 7, 2013
    Messages:
    217
    Likes Received:
    46
    AdoredTV got some very interesting results from undervolting.

     
    Alexko likes this.
  8. snarfbot

    Regular Newcomer

    Joined:
    Apr 23, 2007
    Messages:
    575
    Likes Received:
    188
    i dont wanna watch a video tell me what happens pls
     
  9. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,001
    Likes Received:
    4,573
That's not just undervolting; it's Radeon Chill working as intended.
Chill is awesome, and it might even more than compensate for the power-consumption gap to the GTX 1060 measured in the reviews.

Undervolting brings GPU power consumption down to 145-150W. Radeon Chill in WoW (an ideal title given its simplicity, though) gets it to between 40 and 70W.


    I hope AMD can eventually make Radeon Chill robust enough to just have it enabled by default by setting the framerate target to the monitor's highest refresh rate.
    So many people with 60Hz monitors thinking they're getting something from 200 FPS.
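The mechanism behind a refresh-rate cap is simple enough to sketch. This is a generic software frame limiter (hypothetical names, not AMD's actual FRTC/Chill implementation), where the sleep at the end of each frame is exactly the idle time that saves power:

```python
import time

def frame_budget_s(refresh_hz):
    """Per-frame time budget for a given refresh rate (~16.7 ms at 60 Hz)."""
    return 1.0 / refresh_hz

def run_capped(render_frame, refresh_hz, frames):
    """Render `frames` frames, sleeping away the unused part of each
    frame's budget so the GPU idles instead of racing to 200 FPS."""
    budget = frame_budget_s(refresh_hz)
    for _ in range(frames):
        start = time.monotonic()
        render_frame()                    # the actual rendering work
        elapsed = time.monotonic() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle time is where power is saved
```

Chill goes further than a fixed cap by varying the target with player input, which is why its savings depend so heavily on the game and playstyle.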
     
    Alexko, snarfbot, Lightman and 2 others like this.
  10. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

AMD already has that tech, and it's not Radeon Chill: it's FRTC (Frame Rate Target Control).

The way Radeon Chill works, it just won't be able to do that all the time, so no, it will never be able to compete with the 1060 in power consumption unless you just don't move around, or only move in a straight line.
     
  11. firstminion

    Newcomer

    Joined:
    Aug 7, 2013
    Messages:
    217
    Likes Received:
    46
Please watch the video again: in Ashes, undervolting got the power down to 130W-140W, the temperature decreased by 6º, and performance was better (the GPU was able to hold its maximum boost).

In WoW, that 145W-150W figure was at default voltage. Undervolting shaved roughly 30W from that, and then Chill halved it.
     
    Lightman likes this.
  12. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
I think Radeon Chill really needs to be monitored correctly, with the right power-measurement tools, which only three sites use.
This is Tom's result with The Witcher 3, and TBH it really cannot be compared in any way to the power-consumption behaviour of an efficient design like Pascal, which is much more consistent in all situations.

[Image: Tom's power-consumption measurement, The Witcher 3 with Radeon Chill]

Even if the gains are better when measured with scopes on a 580 (only PCPer and Tom's use oscilloscopes for measuring power behaviour, with Hardware.fr the next closest in terms of analysis), it will still suffer greatly from its more dynamic, variable behaviour.
The games involved will of course affect this, and the benefit from Radeon Chill, differently, along with at-times unpredictable and inconsistent performance from a gaming perspective; I know a few people who find it causes a stutter/judder-type effect for their playstyle in a supported MOBA when they enable it.
Undervolting could be considered for all GPU manufacturers; for Nvidia, more so with their higher-tier GPUs.
    Cheers
     
    #32 CSI PC, Apr 21, 2017
    Last edited: Apr 21, 2017
    pharma likes this.
  13. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,001
    Likes Received:
    4,573
    Games like Civilization (don't know if it's supported BTW) would probably get immense gains from Chill. As shown in the video above, simpler (yet very popular) games like WoW get in the same boat as well.
    This would need to be a case-by-case study and it could vary immensely from gamer to gamer.

My point is that for someone who plays lots of Civilization and WoW, using an RX480 with Chill at 40-75FPS (a typical FreeSync range, BTW) could result in lower total power consumption than for someone using an identical system with a GTX 1060.
For someone playing lots of Battlefield 1 and COD, the GTX1060 would obviously draw less power.
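That comparison is just a weighted sum over play time. A toy version, with every hour and wattage below an illustrative assumption rather than a measurement:

```python
def weekly_energy_kwh(hours, watts):
    """GPU energy over a week of mixed play, in kWh."""
    return sum(hours[game] * watts[game] for game in hours) / 1000.0

# Hypothetical week: 10 h of WoW, 2 h of Battlefield 1.
play = {"wow": 10, "bf1": 2}
rx480_chill = weekly_energy_kwh(play, {"wow": 55, "bf1": 150})  # Chill cuts the WoW draw
gtx1060 = weekly_energy_kwh(play, {"wow": 110, "bf1": 115})

print(rx480_chill, gtx1060)  # → 0.85 1.33
```

Flip the mix toward Battlefield 1 and the ordering reverses, which is exactly the case-by-case point.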


    You might be right. I confess I skipped through the video a bit because the guy's accent makes my brain hurt from trying to understand what he's saying.
    :embarrased:
     
  14. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,798
    Likes Received:
    2,056
    Location:
    Germany
You are aware that GeForce cards clock down dynamically as well, as long as you do not select "prefer maximum performance"? Even more so with a frame limiter or VSync in place. But I agree that it would be an interesting comparison to see whether AMD's profile-based technique has a more pronounced effect here.
     
  15. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,933
    Likes Received:
    1,628
  16. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,001
    Likes Received:
    4,573
A mindless data dump that results in a terrible, very ignorant article.
I'd never heard of that website, and I guess there's a really good reason for that.

    1 - The SteamVR test obviously didn't have the "-multigpu" command enabled, so he's probably just running AFR forcing both cards to render two viewpoints each, which is a terrible idea. Using 2*290X, I get the full 11 score with no drops using the multigpu command, and I doubt two RX580 would get him less than that. A 2-minute google search would have given him that information.

2 - Ashes of the Singularity supports explicit multi-GPU and gets very good scaling on both brands, but explicit mGPU means he'd have to disable Crossfire in the settings. The guy didn't bother doing a 1-minute google search before running the tests, so he gets zero scaling.

    3 - Doom doesn't support mGPU in Vulkan. A 30-second google search would have told him that. Let's just spend tens of hours testing 20+ graphics cards to reach that conclusion instead.

    4 - Deus Ex Mankind Divided gets ridiculous scaling using DX12, but let's just test it on the DX11 runtime that doesn't support mGPU, because reasons. Or rather because the guy lacks google search skills.



    What really bothers me is the guy somehow has access to a myriad of graphics cards but uses them to make largely ignorant articles.

It's not like multi-GPU is great for everyone nowadays; it largely depends on which games one is willing to play. Two RX580s at their current prices are certainly not worth it, IMO, but if one could get two 8GB RX480s for $180 each (like I got one a couple of weeks ago), or even two 8GB RX470s for $150 each using rebate promotions and the like, then it's definitely worth a look.
     
    Kej, Silent_Buddha, Malo and 7 others like this.
  17. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,042
    Likes Received:
    3,114
    Location:
    Pennsylvania
    Said no one ever.
     
    ImSpartacus likes this.
  18. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,933
    Likes Received:
    1,628
  19. entity279

    Veteran Regular Subscriber

    Joined:
    May 12, 2008
    Messages:
    1,229
    Likes Received:
    422
    Location:
    Romania
I'm definitely going for an RX 550 as a sidegrade until Vega comes, and as a spare afterwards. I *need* a replacement for my GTX 460, which is noisy, has an aging video decode engine, and can only drive one of my monitors at a 30 Hz refresh rate.

Tempted to buy one right now, but I'd very much like a passively-cooled design.
     
    Silent_Buddha likes this.