AMD RX580 Reviews

Clukos


Looks like an overclocked 480. I'd expect faster memory at least, given that Nvidia will be re-releasing the 1060 with faster memory.
 
Pretty disappointing: all the increased power draw and they still don't beat the 1060. To add insult to injury, it often doesn't even get to 1.5GHz, let alone go beyond that.

A better end to this thread would be if AMD were preparing a Polaris GPU with a configuration based on Scorpio's hardware.
 
TR is showing the 580 offering slightly smoother frame rates.

http://techreport.com/review/31754/amd-radeon-rx-580-and-radeon-rx-570-graphics-cards-reviewed/7

[chart: 99th-percentile frame times (99thvalue.png)]


Though that's with overclocked cards (on both "sides"):

http://techreport.com/review/31754/amd-radeon-rx-580-and-radeon-rx-570-graphics-cards-reviewed/2

TR does an infamously middling job of communicating the overclocks of their tested GPUs, but they do technically state the models on the methodology page (and then proceed to magically pretend that they are generic versions of each card in every other exhibit in the review).

Anandtech also has some articles.

http://www.anandtech.com/show/11278/amd-radeon-rx-580-rx-570-review

http://www.anandtech.com/show/11280/amd-announces-the-radeon-rx-500-series-polaris

185 watts for the stock 1340MHz, that is crazy. That is similar to what we saw from the RX 480 at 1340MHz.

Yeah, GTX 1080 power draw for a mid-range card.

C'mon, did we really expect refreshed Polaris to magically compete with Pascal on perf/watt? I honestly don't know whether Vega will be true competition for GP104's ruthless efficiency. Pascal is a damn good architecture.


I see refreshed Polaris acting like Grenada. AMD is pushing well out of their ideal operating conditions in order to get more performance, and perf/watt suffers.
 
Didn't expect it to beat Pascal, but yeah, it's acting exactly like Grenada. Damn, some of the overclocked versions are hitting over 200 watts for the card; that isn't acceptable for the kind of performance they give.
 
Sapphire, I think, are being a bit cheeky with their "Limited Edition"; it sounds like they are selecting binned 580s for that range.
The MSI has only a slightly lower boost clock and averages slightly lower TBP.

[Tom's Hardware chart: wattage overview (00-Wattage-Overview.png)]


TBH I think that shows how poor Sapphire was with their 480 Nitro+ in terms of power demand.

Here is the MSI 580 (blue) vs the 480 STRIX (orange) vs the 1060 (grey), for power-consumption context:

[chart: power consumption in The Witcher 3 (power-witcher3.png)]


So that does make the Limited Edition Sapphire 580 look as if it is not that selective with binned parts.
Cheers

Edit:
For reference, in the last chart the Asus Strix 480 they use has a boost clock of 1330MHz.
 
Actually, I'll revise my point a little on the Sapphire Nitro 580 Limited Edition.
It is binned, judging by the voltage, albeit still with a high TDP, but it could be a good candidate if you're looking to watercool and if the card can feed the GPU enough power for good overclocking.
Tom's shows it peaking at 1.19V for 1450MHz, while TPU shows it using 1.125V at 1411MHz.
It just comes down to the card's limit for feeding power to the GPU, and possibly needing watercooling at the higher frequency/voltage points.
Although this also depends on how well the voltage scales up to 1.3V.
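As a very rough sanity check on what the extra voltage costs (a back-of-the-envelope using the usual dynamic-power approximation, assuming roughly constant utilisation, not measured figures):

P_dyn ∝ f · V²
(1450 / 1411) · (1.19 / 1.125)² ≈ 1.03 · 1.12 ≈ 1.15

So on the order of 15% more dynamic power for about 3% more clock, and scaling towards 1.3V would only be harsher.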
Cheers
 
As expected, people looking for good value are much better off with heavily discounted RX480 models than with the latest RX580.
Yes, AMD did set a lower MSRP for the RX500 series, but no one was following the MSRP for the RX400 anyway, so in reality people are just paying a lot more for what seems to be a tiny upgrade over the pre-existing RX480 cards with 3rd-party cooling solutions.


I think it was a smart move to up the TBP. No one (but tiny niches, nitpickers, and Nvidia fanboys) actually cares whether the cards are consuming 150W or 180W.
30W more while playing games isn't going to break anyone's PSU (no one runs a sub-400W PSU in a gaming PC anyway) or electricity bill. They weren't going to win the TDP war against Pascal anyway, at least not with Polaris, so going for more performance per buck fits them better.
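For a sense of scale (assuming, purely for illustration, 3 hours of gaming a day and a ~$0.12/kWh rate):

30 W × 3 h/day × 365 days ≈ 33 kWh/year, i.e. roughly $4 a year.

Not exactly PSU- or wallet-breaking territory.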


I have no idea why AMD is calling it Polaris 20 behind the scenes. It just seems like a stupid trick. The only effective "upgrade" I'm seeing is the ability to clock down the memory in multi-monitor setups (apparently not working yet) and during high-bitrate video playback (apparently working already). I bet there will be lots of RX480 models getting that with a BIOS update, though.

The rebrand scheme should have used the +5 moniker (e.g. RX485), and I guess the only reason AMD didn't was OEM pressure to launch something "new and shiny". MSI is releasing a ridiculous number of RX500 cards.




BTW, beware of quick "RX580 vs GTX1060" conclusions. It seems every site is reaching a different verdict. HardwareCanucks, for example, paints the RX 580 as a clear winner in most of their games; Anandtech paints the complete opposite picture.



IMHO, Anandtech are pushing themselves into irrelevancy in graphics card reviews with their games portfolio.
A 3.5-year-old Battlefield 4, a >4-year-old Crysis 3, a 2-year-old GTA V?
With so many recent shooters like Battlefield 1, COD: IW, and Doom, why only include >3-year-old games?
 
The average Joe may not care about the TDP of the 580, but it's a bit embarrassing that it needs 100 watts more to match or beat the 1060, and the difference in power draw can mean having to buy a more expensive PSU. Then again, needing more power to match or exceed Nvidia's performance has been the norm for AMD.
 
BTW, beware of quick "RX580 vs GTX1060" conclusions. It seems every site is reaching a different verdict. HardwareCanucks, for example, paints the RX 580 as a clear winner in most of their games; Anandtech paints the complete opposite picture.

IMHO, Anandtech are pushing themselves into irrelevancy in graphics card reviews with their games portfolio.
A 3.5-year-old Battlefield 4, a >4-year-old Crysis 3, a 2-year-old GTA V?
With so many recent shooters like Battlefield 1, COD: IW, and Doom, why only include >3-year-old games?

Yeah, it is going to swing either way depending on the games used and the resolution, but the higher clocks will nudge it a bit more towards AMD. At 1440p my money would be on the 580, as it generally benefits more there unless the game is well optimised for Nvidia; maybe the 1060 refresh with higher-speed memory will help.
Regarding HardwareCanucks, they used the single-fan Superclocked 1060 rather than the dual-fan SSC 1060 or any of the higher-clocked, non-throttling custom cards, but the 580 would still have a good showing, as the cards traded blows quite often even before the refresh.
And yeah, Anandtech need to reconsider their games, including when reviewing CPUs.
Cheers
 
What the RX 580 does prove, beyond any shadow of a doubt, is that the GCN architecture in its various incarnations from 1.0 – 1.4 is tapped out. Polaris got a kick from a new process node and better clock scaling, but AMD is slamming into the same problem they had with GCN at 28nm: This GPU is not designed to hit high clock speeds, and it can’t do so without blowing out the power consumption improvements that the RX 480 delivered almost a year ago.
...
Unfortunately, higher-end AMD customers aren't going to find a lot to like here. If you didn't consider the RX 480 sufficient reason to upgrade from an R9 290/290X/390/390X, you probably won't be sold on the RX 580, either. And the comparison against the GTX 1060 just isn't great: The RX 580 is 96 percent as fast as the GTX 1060 at 1080p and 98 percent as fast at 1440p across DX11, DX12, and Vulkan. In DX12/Vulkan, the situation improves somewhat. Here, the RX 580 is 1.03x faster at 1080p and just 1.01x faster at 1440p. It's hard to get too excited about AMD matching the GTX 1060 tit-for-tat when the latter is nine months old and the former only manages it at drastically increased power consumption.
https://www.extremetech.com/gaming/...viewed-amd-takes-fight-gtx-1060-mixed-results
 
With so many recent shooters like Battlefield 1, COD: IW, and Doom, why only include >3-year-old games?

Review consistency. This allows the user to easily compare new cards to cards they might already own to see how much of an upgrade they will get. I wouldn't want to see them replace the games they use, but it would certainly be nice if they added some newer games from time to time. Perhaps gradually phase out older games over time so that their benchmarking load doesn't become excessive.

Regards,
SB
 
IMHO, Anandtech are pushing themselves into irrelevancy in graphics card reviews with their games portfolio.
A 3.5-year-old Battlefield 4, a >4-year-old Crysis 3, a 2-year-old GTA V?
With so many recent shooters like Battlefield 1, COD: IW, and Doom, why only include >3-year-old games?

It looks like this is as bad as it will get.

https://twitter.com/RyanSmithAT/status/854469124625375232

Ryan just tweeted that they are refreshing their choices for Vega after someone brought up the BF4 issue.

We'll be refreshing the benchmark suite for the Vega launch

I have a hunch that their helpful "Bench" tool caused some of this. AT does a lot of testing, and changing the benchmark suite causes that wealth of historic data to suddenly lose its link to the present. Or maybe I'm just an Anandtech fanboy who's rationalizing. :3
 
Ryan just tweeted that they are refreshing their choices for Vega after someone brought up the BF4 issue.
Aye. The benchmark suite gets updated once per year, generally around a new product/architecture launch. This year it'll get updated for Vega. (And to be clear, this has been the plan for a while now)

I have a hunch that their helpful "Bench" tool caused some of this. AT does a lot of testing, and changing the benchmark suite causes that wealth of historic data to suddenly lose its link to the present. Or maybe I'm just an Anandtech fanboy who's rationalizing. :3
Bench and consistency in general. There are nearly 40 cards in Bench; with a yearly rotation, by definition it has taken nearly a year to collect all of that data. But even if Bench didn't exist, I'd want some consistency so that you can go back to, say, the GTX 1080 Ti review and be able to reasonably compare results.
 
In an advanced society this would be downright illegal; selling highly inefficient products like these to unsuspecting customers is socially irresponsible, to say the least.
 
The 970 might actually be more power efficient at 28nm than the 580 is at 14nm, and even if it isn't, it's probably close enough.



That is bizarre. Hopefully Vega addresses perf/watt more than it does raw performance.
 
@Transponster
Clearly you went out of your way to pick the worst-case scenario for Polaris power consumption, omitting the efficiency of that chip at the things it does well. Compute is one of its stronger aspects, not to mention that even in gaming, power load will vary heavily between games and even between locations in a given game.
So yes, I'm fully supporting your closed-minded view of the world and vote to ban all server processors too while we're at it, as they suck big time when you try to game on 18-core Xeons.
 