CryptoCurrency Mining with GPUs *spawn*

Smells fishy to me.
If these numbers are achievable (43 MH/s at under 180 W), there will be dozens of confirmations and how-to videos forthcoming.
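As a back-of-the-envelope check on the claim (a quick Python sketch; the 43 MH/s and 180 W figures are the claimed ones, not measurements):

claimed_hashrate_mhs = 43.0   # claimed hashrate, MH/s
claimed_power_w = 180.0       # claimed upper bound on power draw, W

# Mining efficiency in MH/s per watt; higher is better for miners.
efficiency = claimed_hashrate_mhs / claimed_power_w
print(f"Claimed efficiency: {efficiency:.3f} MH/s per W")   # ~0.239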

HWInfo is meaningless. Measure at the wall.
Exactly. WCCFTech duplicated the results, but with much higher power consumption measured at the wall:
We tried to use identical settings to the tester (mining drivers were used): 1000MHz core at 1000mV, 1100MHz Memory, and -25 Power Limit. We were easily able to get the 43.5 mHash/s rate that the tester got but our power specs were a whole different story. While HWInfo was happily reporting a power draw of ~130 Watts, the Kill-A-Watt meter told a different story. Our test bed had an idle power draw of around 138 Watts and forms the base line for our test. The power draw at load however was 386 Watts. This yields a delta of ~248 watts, which is obviously significantly higher than what the reports claim to have achieved with the card.
http://wccftech.com/amds-rx-vega-64-etherium-mining-248-watts-of-power-draw/
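Making the arithmetic in that quote explicit (a small Python sketch using only the numbers WCCFTech reported):

idle_wall_w = 138.0    # test bed idle power draw at the wall, W
load_wall_w = 386.0    # power draw at the wall while mining, W
hashrate_mhs = 43.5    # hashrate they reproduced, MH/s

delta_w = load_wall_w - idle_wall_w    # 248 W attributed to the mining load
efficiency = hashrate_mhs / delta_w    # ~0.175 MH/s per W
print(f"Wall delta: {delta_w:.0f} W, efficiency: {efficiency:.3f} MH/s per W")

That works out to roughly 0.175 MH/s per watt, well short of the ~0.24 MH/s per watt the original claim implies.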
 
Wccftech's results are measured at the wall (not at the PSU level), and they're using a substantially higher core clock.
I have no idea why wccftech would use ~1150MHz core clocks instead of the reddit post's ~950MHz. Besides, 248W is substantially more than what a Vega 64 consumes using the Power Saving profile (a little above 210W), which puts the core clock at an average of 1250MHz. So compared to the Power Saving profile, they not only managed to decrease the card's efficiency, they did so at lower core clocks.
It doesn't look like wccftech has much expertise in tweaking voltages and clocks for mining.


The recent 130W -> 140W change must be taking into account the ~20W the two cards draw when idling, which HWInfo now properly detects.
Regardless, he should just make these measurements at the wall.
 
Yup, the Wccftech measurement also does not look trustworthy, IMO.

I guess I'll have to wait until I get one myself, or until more reliable sites with some expertise in cryptomining (e.g. whattomine) publish their results.
 
Yup, the Wccftech measurement also does not look trustworthy, IMO.

I guess I'll have to wait until I get one myself, or until more reliable sites with some expertise in cryptomining (e.g. whattomine) publish their results.

What about the Wccftech results looks untrustworthy?
 
They didn't mention any effort to reduce power consumption (e.g. power target setting, undervolt amount). They seemed to be focused only on reaching the MH/s numbers.

Edit: But now the article seems to have been updated yet again, addressing my concerns.
 
They didn't mention any effort to reduce power consumption (e.g. power target setting, undervolt amount). They seemed to be focused only on reaching the MH/s numbers.


I'm confused by your conclusion, since they state:

"We tried to use identical settings to the tester (mining drivers were used): 1000MHz core at 1000mV, 1100MHz Memory, and -25 Power Limit (Update: -40% power limit). We were easily able to get the 43.5 mHash/s rate (40 mHash/s with the -40% power limit) that the tester got but our power specs were a whole different story. While HWInfo was happily reporting a power draw of ~130 Watts, the Kill-A-Watt meter told a different story. Our test bed had an idle power draw of around 138 Watts and forms the base line for our test. The power draw at load however was 386 Watts (Update: 330 Watts with the new test). This yields a delta of ~248 watts (Update: ~200 Watt), which is obviously significantly higher than what the reports claim to have achieved with the card."
 
I'm confused by your conclusion, since they state:

"We tried to use identical settings to the tester (mining drivers were used): 1000MHz core at 1000mV, 1100MHz Memory, and -25 Power Limit (Update: -40% power limit). We were easily able to get the 43.5 mHash/s rate (40 mHash/s with the -40% power limit) that the tester got but our power specs were a whole different story. While HWInfo was happily reporting a power draw of ~130 Watts, the Kill-A-Watt meter told a different story. Our test bed had an idle power draw of around 138 Watts and forms the base line for our test. The power draw at load however was 386 Watts (Update: 330 Watts with the new test). This yields a delta of ~248 watts (Update: ~200 Watt), which is obviously significantly higher than what the reports claim to have achieved with the card."

They 'Update:' this article every 15 minutes, so it's hard to follow!
The latest 200W is not that far off HWInfo's 140W when you take into account PSU efficiency and how much power the non-idle system (CPU, motherboard, drives, etc.) takes. These can easily swing readings by a lot between platforms. Ideally we would need a test of pure GPU power draw, separated from the rest of the system, using clamps on the PCIe slot and the PCIe 8-pin connectors.
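For example (a rough sketch in Python; the ~90% PSU efficiency and ~25 W of extra non-GPU load while mining are illustrative assumptions, not measurements):

wall_delta_w = 200.0     # wall-power delta from the updated article, W
psu_efficiency = 0.90    # assumed PSU efficiency at this load
system_load_w = 25.0     # assumed extra CPU/board/drive draw while mining vs. idle, W

dc_delta_w = wall_delta_w * psu_efficiency    # power actually delivered by the PSU, W
gpu_estimate_w = dc_delta_w - system_load_w   # rough GPU-only estimate, W
print(f"Estimated GPU-only draw: {gpu_estimate_w:.0f} W")   # ~155 W with these assumptions

Small changes to those two assumptions move the estimate by tens of watts, which is exactly why clamp measurements on the slot and 8-pin connectors would be ideal.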
 
@Mize, the original article claimed 248W for their Vega 64, which, as I pointed out above, is very high power consumption for the card (above the Power Saving profile). In the meantime they've published another article and, as you point out with the bolded updates, they're still doing quite a bit of trial and error. Not only that, but they're measuring at the wall with no mention of power supply efficiency.
 
They 'Update:' this article every 15 minutes, so it's hard to follow!
The latest 200W is not that far off HWInfo's 140W when you take into account PSU efficiency and how much power the non-idle system (CPU, motherboard, drives, etc.) takes. These can easily swing readings by a lot between platforms. Ideally we would need a test of pure GPU power draw, separated from the rest of the system, using clamps on the PCIe slot and the PCIe 8-pin connectors.

That's overkill. Just measure consumption at the wall on integrated graphics then install Vega, start mining and compare.

Repeat with 580 & 1070.
 
That's overkill. Just measure consumption at the wall on integrated graphics then install Vega, start mining and compare.

Repeat with 580 & 1070.

I agree on the 580 and 1070 part, but not the integrated graphics. An integrated GPU will have no mining load, so the CPU will be totally idle. My Ryzen adds about 20W from true idle to very light load, and different CPUs and motherboards will show different deltas.
If WCCF ran the same test on the same platform, just swapping the GPU for a GF1070 and comparing deltas, it would give us a much better comparison, I agree!
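A sketch of that delta comparison in Python (the commented-out GF1070 line is hypothetical and would need a real measurement from the same rig):

def wall_delta(load_w, idle_w=138.0):
    # Difference between the rig's mining load and its idle draw, both measured at the wall.
    return load_w - idle_w

vega_delta = wall_delta(330.0)     # 330 W is WCCFTech's updated load figure -> ~192 W delta
# gf1070_delta = wall_delta(...)   # would come from the same rig with the 1070 installed
print(f"Vega 64 wall delta: {vega_delta:.0f} W")

On the same platform the PSU efficiency and the CPU/board overhead largely cancel out, so the deltas can be compared directly.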
 
That's overkill. Just measure consumption at the wall on integrated graphics then install Vega, start mining and compare.

Repeat with 580 & 1070.
iGPUs don't consume 0W when being used, and not everyone will be using the same PSUs with the same efficiency. Knowing the actual power consumption of a card with decent precision is very important for miners, especially if they're going to order dozens or hundreds of them.
 
Claymore 10.0 has been out for a few days.

Highlighted changes:

- added assembler kernels for ETH+LBC mining mode (AMD cards only), major speedup for LBC.
- about 1% ETH speedup for Vega cards.
- fixed issues with voltage/clocks management for latest AMD blockchain drivers (not completely).
- new GPU sorting method for NVIDIA cards. Now GPUs are sorted by physical bus index (it matches AfterBurner list of GPUs).
- for ETH+LBC mining mode maximal "dcri" value is 1000 now.
- added "-platform" option.
- added "ESTALE" option support in failover file epools.txt (see "-estale" option for details).
- several minor bug fixes and improvements.
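For context, a hypothetical command line combining a few of these options for ETH+LBC dual mining (the pool URLs, wallets and the -dcri value are placeholders, and the exact flag spellings should be double-checked against Claymore's own readme):

EthDcrMiner64.exe -epool <eth_pool_url> -ewal <eth_wallet> -epsw x -dcoin lbc -dpool <lbc_pool_url> -dwal <lbc_wallet> -dcri 35 -platform 1

-dcri sets the dual-coin intensity (this release raises its maximum to 1000 for ETH+LBC), and -platform is the newly added option for choosing which GPUs the miner runs on.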
 
Cryptomining needs to die; it only raises the prices of the cards that we gamers/uber nerds/prosumers want to buy. And the number of cards sold to miners is not enough to make GPU makers a substantial profit.
 
And the number of cards sold to miners is not enough to make GPU makers a substantial profit.
How do you figure that? They basically haven't been able to meet demand and are selling cards for more than normal, so they're actually making much better profits.
 
On the contrary (and also for practical reasons, since you cannot force it to die): GPU-feasible crypto-mining needs to grow and prosper quickly so that Nvidia and AMD will be motivated to bring specialized mining chips to market (in the sense of what GV100 is for AI, though on a smaller scale). Funds raised there can be used to improve GPU development in general, from which gamers also profit.
 