AMD RX580 Reviews

@Transponster
Clearly you went out of your way to pick worst case scenario for Polaris power consumption

The average power consumption is even worse

power_average.png


What do you mean about worst case scenario? AMD released the 580 like that themselves, nobody demanded for them to refresh Polaris afaik.
 

I'm not defending this product; for me, the last two generations of high-end GPUs from AMD didn't cut it. What I object to is the totalitarian notion of a state-imposed 'criminal' metric. There are different shades of grey too.

Regarding your point that no one asked AMD to release the RX 580, I disagree. That's not how OEMs and the market in general work. I don't like it, though.
 
I have read most of the reviews of this card, and it's at least 40% to 66% worse in power efficiency than the 1060 (anywhere from a 60 W all the way to a 100 W differential). That definitely puts it in the range of the GTX 970's perf/watt, or below it. This makes Polaris vs Pascal look much worse than Fiji or the R9 390X vs Maxwell!

There is no defending this. Anyone trying to defend it can look at many if not ALL of the reviews to see it's not just TPU that came up with that differential. Shit, anything above 180 watts is 1080 territory, and this card is pulling over 200 watts at times; that's 1080 Ti level! Polaris's power efficiency SUCKS; it's pushed way too high, out of its range.
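For anyone who wants to sanity-check those percentages, here's a rough back-of-the-envelope calculation; the GTX 1060 baseline wattages below are just assumptions for illustration, not figures from any single review:

Code:
# Back-of-the-envelope: how a 60-100 W gap turns into an efficiency percentage.
# The GTX 1060 baselines are assumptions for illustration, not measured values.
baselines = (120, 145)       # plausible GTX 1060 gaming board-power figures, in watts
differentials = (60, 100)    # the 60-100 W gap mentioned above

for base in baselines:
    for diff in differentials:
        rx = base + diff
        print(f"1060 at {base} W vs 580 at {rx} W: "
              f"{100 * diff / base:.0f}% more power for similar performance")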
 
Has anyone tried to downclock/downvolt the RX 580 to see if Polaris 20 brings any efficiency improvement over Polaris 10 at the same clock speed?
 
AnandTech did, with mixed results.

Though I don't think @Ryan Smith altered the RX 580's vcore to match the reference RX 480's, which IIRC is a bit lower. Pretty much like Hawaii -> Grenada, it'll probably come down to BIOS adjustments and small manufacturing advances.
 
AdoredTV got some very interesting results from undervolting.
That's not just undervolting, it's the Radeon Chill working as intended.
Chill is awesome, and it might even more than compensate for the power consumption gap versus the GTX 1060 measured in the reviews.

I don't wanna watch a video, tell me what happens pls
Undervolting brings GPU power consumption down to 145-150W. Radeon Chill on WoW (ideal title for its simplicity though) gets it between 40 and 70W.


I hope AMD can eventually make Radeon Chill robust enough to just have it enabled by default by setting the framerate target to the monitor's highest refresh rate.
So many people with 60Hz monitors thinking they're getting something from 200 FPS.
 
AMD already has that tech, and it's not Radeon Chill; it's FRTC.

The way Radeon Chill works, it just won't be able to do that all the time, so no, it will never be able to compete with the 1060 in power consumption, unless you just don't move around or only travel in a straight line.
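For what it's worth, the basic idea behind a driver-side frame cap like FRTC is simple enough to sketch: never render faster than the display can show. A toy illustration in Python (a sketch of the general concept only, nothing to do with AMD's actual driver code; Chill additionally scales the target with user input):

Code:
import time

def capped_render_loop(render_frame, refresh_rate_hz=60, frames=1000):
    """Toy frame limiter: stop submitting frames faster than the display refresh."""
    frame_budget = 1.0 / refresh_rate_hz
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                        # draw one frame (callable supplied by caller)
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Idle for the rest of the frame budget; this idle time is where the
            # GPU drops to lower clocks and the power saving comes from.
            time.sleep(frame_budget - elapsed)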
 
Undervolting brings GPU power consumption down to 145-150W. Radeon Chill on WoW (ideal title for its simplicity though) gets it between 40 and 70W.


Please watch the video again: in Ashes, undervolting got the power down to 130-140 W, temperature decreased by 6 °C, and performance was better (the GPU was able to hold its maximum boost).

On WoW that value 145W-150W was with default voltage. Undervolting shaved roughly 30W from that and then Chill halved it.
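To put those watt figures in perspective, here's a quick estimate of what they mean over a gaming session; the session length and electricity price are made-up assumptions, and the wattages are just the rounded figures from above:

Code:
# Energy used in a 2-hour WoW session at the wattages quoted above.
# Session length and electricity price are arbitrary assumptions for illustration.
hours = 2.0
price_per_kwh = 0.25   # assumed cost per kWh

scenarios = {
    "stock voltage":        150,   # ~145-150 W
    "undervolted":          120,   # roughly 30 W shaved off
    "undervolted + Chill":   60,   # middle of the 40-70 W range
}

for name, watts in scenarios.items():
    kwh = watts * hours / 1000
    print(f"{name:>22}: {kwh:.2f} kWh (~{kwh * price_per_kwh:.2f} per session)")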
 
I think Radeon Chill really needs to be monitored with the right power measurement tools, which only three sites use.
This is Tom's result with The Witcher 3, and tbh it really cannot be compared in any way to the power consumption behaviour of an efficient design such as Pascal, which is much more consistent in all situations.

06-Power-Consumption.png


Even if the gains are better when measured with scopes on a 580 (only PCPer and Tom's use oscilloscopes to measure power behaviour, with Hardware.fr the next closest in terms of analysis), it will still suffer greatly from having a much more dynamic and variable behaviour.
The games involved will of course affect this, and the benefit from Radeon Chill, differently, along with at times unpredictable and inconsistent performance from a gaming perspective; I know a few people who find it gives a stutter/judder type effect for their playstyle in a supported MOBA if they enable it.
Undervolting could be considered for all GPU manufacturers, and for Nvidia more so with their higher-tier GPUs.
Cheers
 
Games like Civilization (I don't know if it's supported, BTW) would probably see immense gains from Chill. As shown in the video above, simpler (yet very popular) games like WoW are in the same boat as well.
This would need to be a case-by-case study and it could vary immensely from gamer to gamer.

My point is that for someone who plays lots of Civilization and WoW, using an RX 480 with Chill at 40-75 FPS (a typical FreeSync range, BTW) could result in lower total power consumption than an identical system with a GTX 1060.
For someone playing lots of Battlefield 1 and COD, the GTX 1060 would obviously draw less power.
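A minimal sketch of that "it depends on your game mix" argument; every number below (hours and per-title wattages) is an illustrative guess, not a review result:

Code:
# Weekly GPU energy for a given play mix; all figures are illustrative guesses.
play_mix_hours = {"Civilization": 6, "WoW": 8, "Battlefield 1": 4}

# Assumed average GPU power per title, in watts. Chill helps most in slow-paced games.
rx480_with_chill = {"Civilization": 70,  "WoW": 60,  "Battlefield 1": 160}
gtx1060          = {"Civilization": 110, "WoW": 100, "Battlefield 1": 120}

def weekly_kwh(power_by_game):
    return sum(power_by_game[game] * hrs for game, hrs in play_mix_hours.items()) / 1000

print(f"RX 480 + Chill: {weekly_kwh(rx480_with_chill):.2f} kWh/week")
print(f"GTX 1060:       {weekly_kwh(gtx1060):.2f} kWh/week")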


Please watch the video again: in Ashes, undervolting got the power down to 130-140 W, temperature decreased by 6 °C, and performance was better (the GPU was able to hold its maximum boost).

On WoW that value 145W-150W was with default voltage. Undervolting shaved roughly 30W from that and then Chill halved it.
You might be right. I confess I skipped through the video a bit because the guy's accent makes my brain hurt from trying to understand what he's saying.
:oops:
 
My point is that for someone who plays lots of Civilization and WoW, using an RX 480 with Chill at 40-75 FPS (a typical FreeSync range, BTW) could result in lower total power consumption than an identical system with a GTX 1060.

You are aware that GeForce cards clock down dynamically as well, as long as you do not select "prefer maximum performance"? Even more so with a frame limiter or VSync in place. But I agree that it would be an interesting comparison, to see whether AMD's profile-based technique has a more pronounced effect here.
 

Mindless data dump that results in a terrible and very ignorant article.
I had never heard of that website, and I guess there's a really good reason for that.

1 - The SteamVR test obviously didn't have the "-multigpu" command enabled, so he's probably just running AFR forcing both cards to render two viewpoints each, which is a terrible idea. Using 2*290X, I get the full 11 score with no drops using the multigpu command, and I doubt two RX580 would get him less than that. A 2-minute google search would have given him that information.

2 - Ashes of the Singularity supports explicit multi-GPU and gets very good scaling on both brands, but explicit mGPU means he'd have to disable CrossFire in the settings. The guy didn't bother doing a 1-minute google search before doing the tests, so he gets zero scaling.

3 - Doom doesn't support mGPU in Vulkan. A 30-second google search would have told him that. Let's just spend tens of hours testing 20+ graphics cards to reach that conclusion instead.

4 - Deus Ex Mankind Divided gets ridiculous scaling using DX12, but let's just test it on the DX11 runtime that doesn't support mGPU, because reasons. Or rather because the guy lacks google search skills.



What really bothers me is the guy somehow has access to a myriad of graphics cards but uses them to make largely ignorant articles.

It's not like multi-gpu is great for everyone nowadays. It largely depends on what games one is willing to play, really. Two RX580 at their current prices are certainly not worth it IMO, but if one could get two 8GB RX480 for $180 each (like I got one a couple of weeks ago), or even two 8GB RX470 for $150 each using rebate promotions and the like, then it's definitely worth a look.
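Rough value math behind that last point; the prices come from the post above, but the scaling factors and the single-card reference are assumptions for illustration:

Code:
# Toy price/performance check for the cheap-CrossFire argument.
# Performance numbers are guesses (relative to one RX 480 = 1.0) and assume a
# game where multi-GPU actually scales; they are not benchmark results.
options = {
    "2x RX 480 8GB ($180 each)": (360, 1.7),   # assumes ~70% CrossFire scaling
    "2x RX 470 8GB ($150 each)": (300, 1.5),   # assumes ~0.9x a 480 per card
    "1x faster single card":     (380, 1.6),   # e.g. a GTX 1070-class card, guessed
}

for name, (price, perf) in options.items():
    print(f"{name}: ~{perf:.1f}x RX 480 performance for ${price} "
          f"(${price / perf:.0f} per performance unit)")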
 
I'm definitely going for an RX 550 as a sidegrade until Vega comes, and as a spare afterwards. I *need* a replacement for my GTX 460, which is noisy, has an aging video decode engine, and can only provide a 30 Hz refresh rate for one of my monitors.

Tempted to buy one right now but I'd very much like a passively-cooled design.
 