CryptoCurrency Mining with GPUs *spawn*

*Knee-jerk reaction:* There's €0.20 of tax on a kWh in Germany, only €0.05 in Romania.
Cheers
Almost as bad as you describe. About 53% is taxes and similar fees, about 24% goes to infrastructure maintenance, accounting and the like, and the remaining 23% is what the providers or resellers get for their "efforts" of buying/producing electricity, their margins etc.
 
Glad you identified the 128W, Carsten :)

As a reference, here is a comparison by Tom's Hardware using a scope to measure the power demand of several generations of AMD dual-GPU cards; interesting article:
http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799-14.html

http://media.bestofmicro.com/9/V/430627/original/11-R9-295X2-Power-Consumption-Gaming-Detail.png


http://media.bestofmicro.com/9/U/430626/original/07-HD-7990-Power-Consumption-Gaming-Detail.png



Below, possibly more important/relevant: the article's page on Power Consumption: General-Purpose Computing.
The 7990's power demand increases far more for such work, well beyond its gaming measurement, while the 295X2 draws only a bit more than during gaming. That looks like an important consideration if you're using the 7990 and weighing energy cost versus performance for mining, and for how much optimisation is needed.
http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799-15.html

http://media.bestofmicro.com/2/M/430366/original/12-R9-295X2-Power-Consumption-GPGPU.png



http://media.bestofmicro.com/2/H/430361/original/08-HD-7990-Power-Consumption-GPGPU.png


Cheers
 
(except if it does crazy numbers like 50-60 MH/s, but I doubt it)

I don't think Vega will do better than Fury. Probably similar.

The reason is HBM. The fact that timings make a lot of difference suggests that HBM is not so good in the latency department. That's probably also the reason the GTX 1080 is slower: GDDR5X.

If they wanted to make a beast of a card, they should double up Polaris: a 4096-SP version with a 512-bit wide GDDR5 bus. With GDDR5 timings and speed kept similar to Polaris, twice the memory subsystem should mean twice the hashrate, so about 50 MH/s out of the box.

Interestingly, if Intel could provide their eDRAM modules in large enough capacities and with similar latency on AMD cards, that would turn out to be the absolute fastest. Can't beat 50ns latency.

AFAIK mining software uses very little RAM, so I guess they could launch mining editions with less dense VRAM chips.

You could, but then you can't use miners with in-memory DAG generation like Claymore, because the DAG alone takes more than 2GB of VRAM. In-memory DAG generation saves a lot of time at startup, which otherwise can take hours.
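
For context, the DAG size isn't something the miner chooses; it's fixed by the Ethash spec and grows every epoch (30,000 blocks). Here's a minimal Python sketch of the published size formula showing where it crosses the 2GB and 4GB marks (the epochs printed are just illustrative picks):

```python
# Ethash DAG size per epoch, using the constants and size formula from the
# published Ethash spec. A minimal sketch to see when the DAG outgrows a
# given amount of VRAM.

DATASET_BYTES_INIT = 2**30      # 1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23    # ~8 MiB of growth per epoch
EPOCH_LENGTH = 30000            # blocks per epoch
MIX_BYTES = 128                 # mix hash width

def is_prime(n: int) -> bool:
    """Trial division is plenty fast for numbers this small (~2**24)."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def dag_size(epoch: int) -> int:
    """Full dataset size in bytes for a given epoch, per the spec."""
    sz = DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch - MIX_BYTES
    while not is_prime(sz // MIX_BYTES):
        sz -= 2 * MIX_BYTES
    return sz

for epoch in (0, 128, 140, 384):
    block = epoch * EPOCH_LENGTH
    print(f"epoch {epoch:3d} (from block {block:>8}): "
          f"{dag_size(epoch) / 2**30:.2f} GiB")
```

By this formula the DAG crosses 2 GiB around epoch 128 and won't fit a 4GB card past epoch ~384, so 2GB cards are already out for in-memory DAGs.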
 
HBM also scales with clocks*, so Vega should be able to do at least 50% (edit: maybe 100%; I was thinking of 700-ish clocks at first) better than Fury.


*To be more precise, Claymore's GPU miner for Ethereum scales with HBM clocks just as it does with GDDR5(X) memory.
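
To put rough numbers on that: Ethash is memory-bound, so to first order hashrate tracks memory clock at the same bus width. A quick sketch with assumed figures (Fury at ~28 MH/s on 500MHz HBM1; 700MHz is the guess above, ~945MHz a roughly-doubled HBM2 clock) reproduces the "50%, maybe 100%" range:

```python
# First-order sanity check of the clock-scaling claim: Ethash is
# memory-bound, so at the same bus width hashrate roughly tracks memory
# clock. All figures are assumptions for illustration, not measurements.

def scaled_hashrate(base_mhs: float, base_mem_mhz: float,
                    new_mem_mhz: float) -> float:
    """Naive linear scaling of hashrate with memory clock."""
    return base_mhs * new_mem_mhz / base_mem_mhz

FURY_MHS = 28.0     # assumed Fury Ethash rate, MH/s
FURY_HBM = 500.0    # Fury's HBM1 clock, MHz

for vega_hbm in (700.0, 945.0):   # the "700-ish" guess vs a ~2x HBM2 clock
    mhs = scaled_hashrate(FURY_MHS, FURY_HBM, vega_hbm)
    print(f"{vega_hbm:.0f} MHz HBM -> ~{mhs:.0f} MH/s "
          f"({mhs / FURY_MHS - 1:+.0%} vs Fury)")
```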
 
HBM also scales with clocks; Vega should be able to do 50% better than Fury at least.

Would be nice... Until then, solo Fury X it is... (I'll never be a full-on miner; I just consider it "bonus" money for when I'm not using my graphics card.)
 
60 MH/s ETH doesn't feel impressive to me at 300W. Two OC'd 1070s can already achieve that, or get close (two OC'd 580s as well). So Vega has to do more, IMO.
 
I doubt Vega will really use 300W at full load; 300W is the theoretical max, IMO. Plus mining only uses the compute side, not the whole GPU, so it won't consume that much anyway... My Fury X draws 180-185W when mining here, a lot less than when gaming.
 
HBM also scales with clocks*, so Vega should be able to do at least 50% (edit: maybe 100%; I was thinking of 700-ish clocks at first) better than Fury.


*To be more precise, Claymore's GPU miner for Ethereum scales with HBM clocks just as it does with GDDR5(X) memory.

Don't forget Vega uses HBM2, which changes the characteristics again. For Ethereum it's likely going to penalize it, as GDDR5X did versus GDDR5. Fury is first-generation HBM; if they took that HBM to Vega-level clocks, it would be different.
 
About that, does HBM2 have better latency than HBM1? I can't find the answer on Google :eek:

I hope Kraken will have my account verified (T1+T2) by then; it's been 3 weeks... (they advertise 2-4 weeks)...
 
Don't forget Vega uses HBM2, which changes the characteristics again. For Ethereum it's likely going to penalize it, as GDDR5X did versus GDDR5. Fury is first-generation HBM; if they took that HBM to Vega-level clocks, it would be different.
Does HBM gen2 change the access granularity, or does it lower the command clock but double the bits transferred per clock?
 
Why does the article cite VRAM size as the issue, but then go on to say the R9 290 is likely to be less affected than the RX 480, when it has less or equal VRAM?

Yeah, it's strange. Maybe a change aimed at Polaris, not VRAM-related? Or just a BS article...
 
Maybe the article is actually referring to the 4GB RX 480s & 580s? Even then, those are not the majority... Otherwise it would have to do with particularities of Polaris's memory system, and I can think of none compared to a 290X.

Sounds like a news/rumor that is presented but not understood by the author of that piece. But yeah, somebody should grab Claymore's miner and test it out.
 
The State of Mining: Guide to Ethereum

But why is mining so popular now all of a sudden? It's hard to pinpoint the exact causes, but currency value has exploded in the past year which at least explains the amount of attention Bitcoin is receiving again. After trading for about $400 a year ago, a single Bitcoin went for ~$750 by year's end, and it is currently exchanging for as much as $2,600.

The Ethereum coin has also gained tremendous popularity and is the current coin of choice for miners, having also exploded from less than $50 in mid-2016. Current prices are going for $300+.
...
First you want to make sure you can actually make a profit doing it. Otherwise you're just wasting electricity. There are plenty of great calculator tools, I personally like this one over at MyCryptoBuddy. To use the tool, you need to figure out the hashrate of your graphics card. Search Google for "*graphics card name* ethereum hashrate." For example, an RX 480 will produce 25MH/s and a GTX 1070 can produce around 35MH/s. These numbers are different for each card and can change depending on overclocks and binning.

Once you have that, the basic steps are as follows:

  • Get an Ethereum wallet
  • Download some mining software
  • Join a mining pool
  • Configure and run your miner
  • Profit?
http://www.techspot.com/article/1423-state-of-mining-ethereum/
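
For reference, the math behind those calculator sites is simple: your share of the network hashrate times the daily block reward, minus power cost. A minimal sketch; every number in the example call is a made-up illustration, not a current figure:

```python
# Back-of-the-envelope version of what the online profit calculators
# compute: your share of the network hashrate times the daily block
# reward, minus electricity cost.

def daily_profit_usd(hashrate_mhs: float,       # your rig, MH/s
                     power_watts: float,        # wall draw while mining
                     network_ghs: float,        # network hashrate, GH/s
                     block_time_s: float,       # average block time
                     block_reward_eth: float,   # ETH per block
                     eth_price_usd: float,
                     electricity_usd_per_kwh: float) -> float:
    blocks_per_day = 86400 / block_time_s
    share = (hashrate_mhs * 1e6) / (network_ghs * 1e9)
    eth_per_day = share * blocks_per_day * block_reward_eth
    revenue = eth_per_day * eth_price_usd
    power_cost = power_watts / 1000 * 24 * electricity_usd_per_kwh
    return revenue - power_cost

# Example: an RX 480 at 25 MH/s drawing an assumed 150 W at the wall,
# against assumed network stats and prices.
print(f"~${daily_profit_usd(25, 150, 70_000, 14.4, 5.0, 300, 0.25):.2f}/day")
```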
 
You can force the benchmark to use upcoming DAG epochs (Claymore's miner takes an epoch number in benchmark mode, IIRC) and see for yourselves which card loses how much performance.
 