AMD: Speculation, Rumors, and Discussion (Archive)

Status
Not open for further replies.
He used powered riser cards and ran 3 graphics cards (RX 480) through the ATX cable. It could have been weakened over time, as he apparently ran the setup with 3x Radeon 290X before.
He also stated this happened when he was not at home. If it had caused a fire with children in the home, it would have been negligence on AMD's part.
 
He used powered riser cards and ran 3 graphics cards (RX 480) through the ATX cable. It could have been weakened over time, as he apparently ran the setup with 3x Radeon 290X before.
He used UN-powered riser cards. If he had used powered ones, the ATX cable wouldn't have had to carry the +250W of load.

Even if this happened at +250W, it would likely have happened just as well at a plain +225W, which would have been within spec.
 
Let's just hope this doesn't happen again ...



https://bitcointalk.org/index.php?topic=1433925.msg15438647#msg15438647

That one was fishy as hell, because he claims he had 3x RX 480 in that board, which has only 2 PCIe x16 connectors, and he couldn't use one of the x1 connectors because then he wouldn't have space for all the 2-slot coolers. Unless he somehow also had one of those waterblocks that, AFAIK, aren't being sold yet.

EDIT: Apparently he was using riser cards.
 
That one was fishy as hell, because he claims he had 3x RX 480 in that board, which has only 2 PCIe x16 connectors, and he couldn't use one of the x1 connectors because then he wouldn't have space for all the 2-slot coolers. Unless he somehow also had one of those waterblocks that, AFAIK, aren't being sold yet.

EDIT: Apparently he was using riser cards.
Yes. He was using unpowered riser cards. And it was a +3.3V pin plus one GND pin that burned for him, not the +12V.

EDIT: Nope. Those were actually the two 12V pins. I forgot to mirror the diagram.
 
It would have also failed with one single card pulling that many watts, such as a heavily overclocked 290X.

Also, he was mining ETH + Decred (dual mining). That puts a similar sort of load on the card as FurMark.
 
I need to dig around a bit, but whenever someone has released a benchmark covering some of the older cards, the AMD ones have done relatively better than their NVIDIA counterparts of the day.

Newest thing I found on our site:
http://www.pcgameshardware.de/Star-...950/Specials/Beta-Technik-Benchmarks-1173656/
Why my colleagues didn't include the GTX 470 in the final review version, I do not know. The HD 6950 got a little slower there:
http://www.pcgameshardware.de/Star-Wars-Battlefront-Spiel-34950/Specials/Test-Benchmark-1178154/

Maybe soon(tm) we'll see more generational overviews. :)
 
He used UN-powered riser cards. If he had used powered ones, the ATX cable wouldn't have had to carry the +250W of load.

Even if this happened at +250W, it would likely have happened just as well at a plain +225W, which would have been within spec.
In standard form, the 12V portion of the ATX 24-pin can take around 144W, because each contact is rated at 6A if it is not the HCS implementation.
HCS, I think, increases that to 9A, giving a total of 216W for the 24-pin connector (none of this is applicable to the PCIe slot in the context of an HCS rating).
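A quick sketch of that arithmetic, assuming the two +12V contacts of the 24-pin connector and the 6 A / 9 A contact ratings mentioned above:

```python
# Reproducing the connector-rating arithmetic: the ATX 24-pin carries
# +12V on two contacts; standard contacts are rated at 6 A each,
# HCS (High Current System) contacts at roughly 9 A each.
PINS_12V = 2
VOLTAGE = 12.0

for label, amps_per_pin in [("standard", 6.0), ("HCS", 9.0)]:
    watts = PINS_12V * amps_per_pin * VOLTAGE
    print(f"{label}: {watts:.0f} W of +12V through the 24-pin")
# standard -> 144 W, HCS -> 216 W, matching the figures quoted above
```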

He should really have checked before using 3 GPUs on un-powered risers. It is fair to expect the average draw from the mainboard's 12V to be around 35-45W per card, for a total of around 135W at the higher end, but that should never be taken for granted.
Anyway, I do think the concern around any of this is longer-term use, then pulling cards out and putting new ones in; that is where one would more likely see the 480 damage a motherboard a few years from now when it is replaced. The caveat being that this would only apply to a minority of cases IMO, excluding mGPU (especially insane 3-card setups) or raised OC/power targets in this context.
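The estimate above can be sketched as follows; the 35-45 W per-card slot draw is the poster's assumption, not a measured figure:

```python
# Sketch of the per-card slot-draw estimate: three cards, each pulling
# an assumed 35-45 W of +12V through the motherboard PCIe slot.
CARDS = 3
LOW_W, HIGH_W = 35.0, 45.0  # assumed per-card slot draw in watts

total_low = CARDS * LOW_W    # 105 W
total_high = CARDS * HIGH_W  # 135 W
print(f"Estimated slot load: {total_low:.0f}-{total_high:.0f} W "
      f"(standard 24-pin 12V rating: 144 W)")
```

Even the high end of the estimate sits just under the 144 W standard rating, which is why the margin disappears once real cards draw more than expected.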

Just to clarify some other posts, he used 3 x 280X.
No, it's an Asus P7P55-LX, it was the 1st rig I built. Ran for 3 years with 3 280x and non-powered risers. 6 hours with the 480s and poof!!

The worst 290X I found measured was the MSI, with around 32.5W average pulled through the mainboard PCIe slot and 258W total across all power connections.
The 380X Nitro pulls, in the worst-case scenario (Metro Last Light 4K), 48W average through the mainboard PCIe slot (191W total across all power connections). Three of those would possibly put long-term stress on the motherboard's ATX12V 12V supply without extra power, though it also depends on whether that board model is HCS rated.
The 280X has no detailed analysis and only shows a total of 207W across all power connections, above the 380X Nitro, but we cannot assume whether the distribution is more like the 290X or the 380X.
So it is possible the 280X was at, or very close to, the limit of the motherboard's capability, and the situation finally failed with the combined average of 246W through the ATX12V 24-pin connector.
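Putting that failure scenario into numbers, assuming the 246 W average quoted above is split across the two +12V contacts of the 24-pin:

```python
# The failure scenario: ~246 W of +12V averaged through the two
# 12V contacts of the ATX 24-pin connector.
TOTAL_W = 246.0
VOLTAGE = 12.0

amps_total = TOTAL_W / VOLTAGE   # total 12V current: 20.5 A
amps_per_pin = amps_total / 2.0  # per contact: 10.25 A
print(f"{amps_total:.1f} A total, {amps_per_pin:.2f} A per contact "
      f"(rated 6 A standard / 9 A HCS)")
```

At roughly 10 A per contact, the load exceeds even an HCS contact rating, which fits the melted-pin outcome described earlier in the thread.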
However, an important point: this goes to show that one cannot rely on the ATX power protection to shut down the machine before a catastrophic failure.

The biggest point though, at least AMD has a fix coming very soon.
Cheers
 
FWIW, Metro is a medium load at best and far from a worst-case scenario.
We've found that our Anno 2070 savegame, as well as the one from Risen 3, produces the highest power draw of all games in our testing regimen* (we're using Crysis 3 as an additional intermediate case).
That said, the 290(X)s were not that hard on the slot's 12V. The maximum I've found in our database (we have been measuring the individual rails for ten years or so) was a Sapphire R9 290X Tri-X OC 8GB, which drew about 3.6A. A Sapphire R9 380X Nitro was close to the max at 5.45A.

R9 280s were mostly between 1.9 and 2.7A, with some OC models as high as 4.7A.
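For context, the measured slot currents above can be converted to watts and compared against the commonly cited PCI Express CEM limit of 5.5 A (66 W) on the slot's 12V rail; the limit value is an assumption here, not something stated in the thread:

```python
# Convert measured slot 12V currents to watts and check them against
# the (assumed) PCIe CEM slot limit of 5.5 A on the 12V rail.
SLOT_12V_LIMIT_A = 5.5
VOLTAGE = 12.0

measurements = [
    ("Sapphire R9 290X Tri-X OC 8GB", 3.6),
    ("Sapphire R9 380X Nitro", 5.45),
    ("worst R9 280 OC model", 4.7),
]
for name, amps in measurements:
    watts = amps * VOLTAGE
    status = "over" if amps > SLOT_12V_LIMIT_A else "within"
    print(f"{name}: {amps} A = {watts:.1f} W ({status} the 5.5 A limit)")
```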


edit:
Actually, those numbers come from a couple more games: we did the last bigger roundup before choosing the games for our standard testing procedure, so the candidates were in there too.
Out of Anno 2070, AC: Unity, BioShock Infinite, Crysis 3, Crysis Warhead, Dragon Age: Inquisition, Ethan Carter, Far Cry 4, F1 2015, Grid Autosport, GTA 5, Metro LL Redux, Project Cars, Risen 3, Mordor, Shadow Warrior, Skyrim, Watch Dogs, The Witcher 3 and Wolfenstein ToB, we had, for example, the Fury X averaging 266.9 watts in total and the 980 Ti 221.3 watts (graphics cards only, both reference models). Metro was at 270.6 and 224.9 respectively, so just slightly above average.
 
Newest thing I found on our site:
http://www.pcgameshardware.de/Star-...950/Specials/Beta-Technik-Benchmarks-1173656/
Why my colleagues didn't include the GTX 470 in the final review version, I do not know. The HD 6950 got a little slower there:
http://www.pcgameshardware.de/Star-Wars-Battlefront-Spiel-34950/Specials/Test-Benchmark-1178154/

Maybe soon(tm) we'll see more generational overviews. :)
Seems like at least in that game the line should be drawn at GCN after all, though the HD 5000/6000 series haven't aged any worse either. Could be just my memory failing, or I'm thinking of a time long before those.
 
Seems like at least in that game the line should be drawn at GCN after all, though the HD 5000/6000 series haven't aged any worse either. Could be just my memory failing, or I'm thinking of a time long before those.
I can remember tests implying this for the Radeon X1K series vs. GeForce 7.
 
Readily available here, except for the 4 GB version, of which apparently only a small initial batch hit the shops to satisfy the claims of $199. Partners should follow up in the coming weeks.
 
Readily available here, except for the 4 GB version, of which apparently only a small initial batch hit the shops to satisfy the claims of $199. Partners should follow up in the coming weeks.
Well, I can't buy a 480 from Europe; I mean I can, but then the shipping will be stupid. Nor can I pay 320 for a 480. AMD, I want to give you my money, why won't you accept it!!!!? :runaway:

I think I will just wait for the 470 and see how the market reacts. Still can't believe that 380Xs are about 200 dollars and 970s 300... :oops:
 
Well, the hack is programming/changing parameters in the PWM controller.
For the 1500MHz overclock I linked earlier, they also had to do a similar thing to change power-related settings. They used a nice tool called the Elmor EVC, which is basically very similar but has both a hardware and a software interface speaking I2C to the PWM controller, so one has much better flexibility over the options.
In the Elmor setup they use a hardware I2C master, so anything is fair game there. But I'm talking specifically about pure software access: I thought that critical power-control infrastructure would be protected by requiring signed binaries or cookies to avoid tampering.

It is pretty standard communication, and I would say BIOS viruses are more likely to be an issue, IMO.
Good point. I thought those had to be signed as well. ;)
 
In the Elmor setup they use a hardware I2C master, so anything is fair game there. But I'm talking specifically about pure software access: I thought that critical power-control infrastructure would be protected by requiring signed binaries or cookies to avoid tampering.
The Stilt showed how to do it with MSI Afterburner and the command line, and he has also provided a custom BIOS with the same "hack" built in, so you don't need Afterburner to do it.
 