AMD RX580 Reviews

Discussion in 'Architecture and Products' started by Clukos, Apr 18, 2017.

  1. Clukos

    Clukos Bloodborne 2 when?
    Veteran Newcomer Subscriber

    Joined:
    Jun 25, 2014
    Messages:
    4,449
    Likes Received:
    3,778


    Looks like an overclocked 480. I'd expect faster memory at least, given that Nvidia will be re-releasing the 1060 with faster memory.
     
    CSI PC likes this.
  2. gamervivek

    Regular Newcomer

    Joined:
    Sep 13, 2008
    Messages:
    693
    Likes Received:
    206
    Location:
    india
    Pretty disappointing: all the increased power draw and they still don't beat the 1060. To add insult to injury, it often doesn't even reach 1.5GHz, let alone go beyond that.

    A better end to this thread would be if AMD were preparing a Polaris GPU with a configuration based on Scorpio's hardware.
     
    Razor1, DavidGraham and Ike Turner like this.
  3. ImSpartacus

    Regular Newcomer

    Joined:
    Jun 30, 2015
    Messages:
    251
    Likes Received:
    199
    TR is showing the 580 offering slightly smoother frame rates.

    http://techreport.com/review/31754/amd-radeon-rx-580-and-radeon-rx-570-graphics-cards-reviewed/7


    Though that's with overclocked cards (on both "sides"):

    http://techreport.com/review/31754/amd-radeon-rx-580-and-radeon-rx-570-graphics-cards-reviewed/2

    TR does an infamously middling job of communicating the overclocks of their tested GPUs, but they technically state the models on the methodology page (and then proceed to magically pretend that they are generic versions of each card in every other exhibit in the review).

    Anandtech also has some articles.

    http://www.anandtech.com/show/11278/amd-radeon-rx-580-rx-570-review

    http://www.anandtech.com/show/11280/amd-announces-the-radeon-rx-500-series-polaris

    C'mon, did we really expect refreshed Polaris to magically compete with Pascal on perf/watt? I honestly don't know if Vega will be true competition for GP104's ruthless efficiency. Pascal is a damn good architecture.


    I see refreshed Polaris acting like Grenada. AMD is pushing well out of their ideal operating conditions in order to get more performance, and perf/watt suffers.
     
    #3 ImSpartacus, Apr 18, 2017
    Last edited: Apr 18, 2017
    ToTTenTranz and Razor1 like this.
  4. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    Didn't expect it to beat Pascal, but yeah, it's acting exactly like Grenada. Damn, some of the overclocked versions are hitting over 200 watts for the card; that isn't acceptable for the kind of performance they give.
     
  5. Pressure

    Veteran Regular

    Joined:
    Mar 30, 2004
    Messages:
    1,290
    Likes Received:
    227
    OEMs should be pleased as 5X0 is a higher number than 4X0.

    Generally speaking, these should just have been named 485 and 475.
     
    homerdog, sebbbi and ToTTenTranz like this.
  6. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Sapphire, I think, are being a bit cheeky with their "Limited Edition"; it sounds like they are selecting binned 580s for that range.
    The MSI has only a slightly lower boost clock and averages a slightly lower TBP.


    TBH I think that shows how poor Sapphire's 480 Nitro+ was in terms of power demand.

    Here is the MSI 580 (blue) vs the 480 STRIX (orange) vs the 1060 (grey), for context on power consumption:


    So that does make the Limited Edition Sapphire 580 look as if it is not being that selective on binned parts.
    Cheers

    Edit:
    For reference, in the last chart the Asus Strix 480 they use has a boost clock of 1330MHz.
     
    #6 CSI PC, Apr 18, 2017
    Last edited: Apr 18, 2017
    Razor1 and ImSpartacus like this.
  7. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Actually, I'll revise my point a little on the Sapphire Nitro 580 Limited Edition.
    It is binned, judging by the voltage, albeit still with a high TDP, but it could be a good candidate if you're looking to watercool and the card can feed enough power to the GPU for good overclocking.
    Tom's shows it peaking at 1.19V for 1450MHz, while TPU shows it using 1.125V at 1411MHz.
    It just comes down to the card's limit for feeding power to the GPU, and it will possibly need watercooling at the higher frequency-voltage points.
    Although this also depends on how well the voltage scales to 1.3V.
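    A rough first-order sanity check on that last point: dynamic power scales roughly with f·V², so you can estimate how much extra power a 1.3V run would pull relative to the reported operating points. Only the 1411MHz/1.125V and 1450MHz/1.19V figures come from the reviews above; the 1500MHz/1.3V point is purely hypothetical.

```python
# Dynamic power scales roughly as f * V^2 (first-order approximation,
# ignoring leakage). Baseline is TPU's reported 1411MHz @ 1.125V point.
def rel_dynamic_power(freq_mhz, volts, base_freq=1411.0, base_volts=1.125):
    """Power relative to the 1411MHz @ 1.125V baseline."""
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

print(round(rel_dynamic_power(1450, 1.19), 2))  # Tom's point: ~1.15x
print(round(rel_dynamic_power(1500, 1.30), 2))  # hypothetical 1.3V OC: ~1.42x
```

    So even before leakage, that hypothetical 1.3V point is pulling roughly 40% more power than TPU's measured operating point, which is why watercooling comes into the picture.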
    Cheers
     
  8. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,523
    Likes Received:
    4,179
    As expected, people looking for good value are much better off looking at heavily discounted RX480 models than at the latest RX580.
    Yes, AMD did release a lower MSRP for the RX500 series, but no one was following the MSRP for the RX400 anyway, so in reality people are just paying a lot more for what seems to be a tiny upgrade over the pre-existing RX480 cards with 3rd-party cooling solutions.


    I think it was a smart move to up the TBP. No one (but tiny niches, nitpickers and Nvidia fanboys) actually cares whether the cards are consuming 150W or 180W.
    30W more while playing games isn't going to break anyone's PSU (no one puts a sub-400W PSU in a gaming PC anyway) or electricity bill. They weren't going to win the TDP war against Pascal anyway, at least not with Polaris, so going for more performance per buck suits them better.
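    For what it's worth, the electricity-bill part is easy to sanity-check. A quick back-of-envelope (the 4 hours/day of gaming and the $0.12/kWh rate are my assumptions, not figures from any review):

```python
# Annual cost of an extra 30W of board power while gaming.
# 4h/day and $0.12/kWh are assumptions for illustration only.
extra_watts = 30
hours_per_day = 4
price_per_kwh = 0.12  # USD, assumed

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")
# -> 43.8 kWh/year -> $5.26/year
```

    A few dollars a year, i.e. noise next to the price of the card itself.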


    I have no idea why AMD is calling it Polaris 20 behind the scenes. It just seems like a stupid trick. The only effective "upgrade" I'm seeing is the ability to clock down the memory in multi-monitor setups (apparently not working yet) and for high-bitrate videos (apparently working already). I bet lots of RX480 models will get that via a BIOS update, though.

    The rebrand should have used the +5 moniker (e.g. RX485), and I guess the only reason AMD didn't do this was OEM pressure to launch something "new and shiny". MSI is releasing a ridiculous number of RX500 cards.




    BTW, beware of quick "RX580 vs GTX1060" conclusions. Every site seems to be reaching a different one. HardwareCanucks, for example, paints the RX 580 as a clear winner in most of their games; Anandtech paints the complete opposite picture.


    IMHO, Anandtech are pushing themselves into irrelevancy in graphics card reviews with their games portfolio.
    A 3.5-year-old Battlefield 4, a >4-year-old Crysis 3, a 2-year-old GTA V?
    With so many recent shooters like Battlefield 1, CoD: IW and Doom available, why include only years-old games?
     
    #8 ToTTenTranz, Apr 18, 2017
    Last edited: Apr 18, 2017
  9. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72
    The average Joe may not care about the TDP of the 580, but it's a bit embarrassing that it needs 100 watts more to match or beat the 1060, and the difference in power draw can mean having to buy a more expensive PSU. Needing more power to match or exceed Nvidia's performance has been the norm for AMD, though.
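    To put that gap in perf/watt terms: assuming roughly equal gaming performance, the ratio is just the ratio of board powers. The 120W and 220W figures below are illustrative round numbers for a 1060 and a factory-OC 580, not measurements from any one review.

```python
# Rough perf/watt comparison, assuming roughly equal gaming performance.
# 120W and 220W are illustrative typical board powers, not measured values.
def perf_per_watt(rel_perf, watts):
    return rel_perf / watts

gtx_1060 = perf_per_watt(1.00, 120)
rx_580 = perf_per_watt(1.00, 220)
print(f"1060 delivers {gtx_1060 / rx_580:.2f}x the perf/watt of the 580")
# -> 1060 delivers 1.83x the perf/watt of the 580
```

    Under those assumptions the 1060 comes out over 80% ahead on perf/watt, which is the gap people keep pointing at.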
     
  10. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Yeah, it is going to swing either way depending on the games used and the resolution, but the higher clocks will nudge it a bit more towards AMD. At 1440p my money would be on the 580, as it generally benefits more there unless the game is well optimised for Nvidia; maybe the 1060 refresh with higher-speed memory will help.
    Regarding HardwareCanucks, they used the single-fan Superclocked 1060 rather than the dual-fan SSC 1060 or any of the higher-clocked non-throttling custom cards, but the 580 would still have had a good showing, as the cards were trading blows quite often even before the refresh.
    And yeah, Anandtech needs to reconsider their games, including when reviewing CPUs.
    Cheers
     
    #10 CSI PC, Apr 18, 2017
    Last edited: Apr 18, 2017
  11. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,768
    Likes Received:
    1,496
    https://www.extremetech.com/gaming/...viewed-amd-takes-fight-gtx-1060-mixed-results
     
  12. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,553
    Likes Received:
    4,458
    Review consistency. This allows the user to easily compare new cards to cards they might already own to see how much of an upgrade they will get. I wouldn't want to see them replace the games they use, but it would certainly be nice if they added some newer games from time to time. Perhaps gradually phase out older games over time so that their benchmarking load doesn't become excessive.

    Regards,
    SB
     
    ArkeoTP, Entropy, CSI PC and 2 others like this.
  13. ImSpartacus

    Regular Newcomer

    Joined:
    Jun 30, 2015
    Messages:
    251
    Likes Received:
    199
    It looks like this is as bad as it will get.

    https://twitter.com/RyanSmithAT/status/854469124625375232

    Ryan just tweeted that they are refreshing their choices for Vega after someone brought up the BF4 issue.

    I have a hunch that their helpful "Bench" tool caused some of this. AT does a lot of testing, and changing the benchmark suite causes that wealth of historical data to suddenly lose its link to the present. Or maybe I'm just an Anandtech fanboy rationalizing. :3
     
    Lightman and Razor1 like this.
  14. Ryan Smith

    Regular Subscriber

    Joined:
    Mar 26, 2010
    Messages:
    596
    Likes Received:
    941
    Location:
    PCIe x16_1
    Aye. The benchmark suite gets updated once per year, generally around a new product/architecture launch. This year it'll get updated for Vega. (And to be clear, this has been the plan for a while now)

    Bench, and consistency in general. There are nearly 40 cards in Bench; with a yearly rotation, by definition it has taken nearly a year to collect all of that data. But even if Bench didn't exist, I'd want some consistency so that you can go back to, say, the GTX 1080 Ti review and be able to reasonably compare results.
     
    #14 Ryan Smith, Apr 19, 2017
    Last edited: Apr 19, 2017
  15. Infinisearch

    Veteran Regular

    Joined:
    Jul 22, 2004
    Messages:
    739
    Likes Received:
    139
    Location:
    USA
    Could you tell us what the new games will be?
     
  16. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,428
    Likes Received:
    548
    Location:
    WI, USA
    What a thrilling retread of a prior boring release.

    Launch Vega already.
     
  17. Transponster

    Newcomer

    Joined:
    Feb 24, 2016
    Messages:
    74
    Likes Received:
    13
    In an advanced society this would be downright illegal; selling highly inefficient products like these to unsuspecting customers is socially irresponsible, to say the least.
     
  18. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,553
    Likes Received:
    4,458
    I know right? Just like the Geforce FX 5900, GTX 480, and Radeon HD 2900! :p Hmmm, where'd I put that sarcasm tag?

    Regards,
    SB
     
    3dcgi and RootKit like this.
  19. Clukos

    Clukos Bloodborne 2 when?
    Veteran Newcomer Subscriber

    Joined:
    Jun 25, 2014
    Messages:
    4,449
    Likes Received:
    3,778
    The 970 might actually be more power efficient at 28nm than the 580 is at 14nm, and even if it isn't, it's probably close enough.


    That is bizarre. Hopefully Vega addresses perf/watt more than it does raw performance.
     
    homerdog likes this.
  20. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,791
    Likes Received:
    458
    Location:
    Torquay, UK
    @Transponster
    Clearly you went out of your way to pick the worst-case scenario for Polaris power consumption, omitting the efficiency of that chip at the things it does well. Compute is one of its stronger aspects, not to mention that even in gaming, power load will vary heavily between games and even between locations within a game.
    So yes, I fully support your closed-minded view of the world and vote to ban all server processors too while we're at it, as they suck big time when you try to game on 18-core Xeons.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.