Cryptocurrency Mining with GPUs *spawn*

Show me where.


What you mine today *may* be worth quite a bit less tomorrow

Did you think I made it up?

Whattomine

And yes, what you mine today could be worth less at some future time. Kind of redundant to say, though, when the original statement was already expressing uncertainty. It *may* also stay roughly the same. Now all bases are covered.
 
Dunno, I see $2.37 there, not $3. Also, are you paying 10c per kWh? That's cheaper than what 95+% of Americans pay.
I was looking at this site:
https://www.cryptocompare.com/mining/calculator/eth?HashingPower=40&HashingUnit=MH/s&PowerConsumption=230&CostPerkWh=0.1&MiningPoolFee=1
which seems to give a different answer from the same inputs, so who's correct?

The profitability numbers changed from when I looked at them earlier.

Whattomine is the site most often referenced on the various forums and subreddits I've been on. It's fairly comprehensive in the coins it tracks, you can see exactly where they derive their numbers from (which exchange's price they are basing their $ conversion on, etc.), and it shows how the various factors that determine profitability are trending. This is useful, since you might not want to point your miners at a coin that became super-profitable if the only reason it became profitable is that the network hashrate just dropped 70%.
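For what it's worth, all of these calculators run the same basic arithmetic; the different answers mostly come down to which exchange price and network stats they plug in. Here's a minimal sketch of that math, where the block reward, block time, network hashrate, and coin price are illustrative placeholders rather than live values:

```python
# The same math whattomine/cryptocompare run; differences between sites
# come from which price feed and network stats they use. All inputs below
# are illustrative placeholders, not live values.
def daily_profit(my_hashrate, net_hashrate, block_reward, block_time_s,
                 coin_price, power_watts, usd_per_kwh, pool_fee=0.01):
    blocks_per_day = 86400 / block_time_s
    my_share = my_hashrate / net_hashrate              # my slice of the network
    coins_per_day = my_share * blocks_per_day * block_reward
    revenue = coins_per_day * coin_price * (1 - pool_fee)
    power_cost = power_watts / 1000 * 24 * usd_per_kwh
    return revenue - power_cost

# Example: a 40 MH/s ETH card drawing 230 W, paying $0.10/kWh.
print(f"${daily_profit(40e6, 230e12, 3.0, 15, 850.0, 230, 0.10):.2f}/day")
```

Every term in there moves daily, which is exactly why two sites sampling at different moments disagree.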
 
Finally managed to get my VEGA Frontier Edition chugging along at 2050 H/s at 1408 MHz (890 mV) core and 1100 MHz HBM (870 mV).

Long story short: download DDU, the Adrenalin driver, and the Blockchain driver.
1st, run DDU, which removes all graphics card drivers. It should reboot automatically.
2nd, install the Adrenalin driver and reboot.
3rd, run regedit and search for “CrossFire”. Change the CrossFireLink and Ulps values to “0”. Reboot and verify the changes.
4th, open Device Manager and update the VEGA FE's driver with the Blockchain driver. Verify in Device Manager that the driver date reads 8.8.2017.
5th, disable the graphics card and re-enable it.
6th, run OverDriveNTool and apply your settings.
7th, run the miner.

Rinse and repeat at EVERY restart :p
 
Finally managed to get my VEGA Frontier Edition chugging along at 2050 H/s at 1408 MHz (890 mV) core and 1100 MHz HBM (870 mV)... Rinse and repeat at EVERY restart :p

You should not have to do all those steps at each reboot, just steps 5 through 7.
 
The profitability numbers changed from when I looked at them earlier.
Fair enough, I realize it's volatile.
One thing to keep in mind, though, is that the page only tells you how much the GPUs themselves consume.

I had a quick google for what the whole system consumes:
https://bitcointalk.org/index.php?topic=1851579.0
I have a 6 GPU RX 470 4GB rig that pulls ~907-910 watts at the wall at full stock settings
whattomine.com only assumes a power usage of 600 watts.

So, for 6x RX 480 8GB without overclock or undervolt, running Windows 10, mining ETH, they're seeing a peak of around 1225 watts and an average of 1150 watts
whattomine.com only assumes a power usage of 720 watts.

So actual power usage in reality is a fair bit higher than what whattomine.com assumes. There's also the fee you have to pay to convert your cryptocurrency into dollars.
I just can't see why anyone(*) bothers unless they go full-out with 1000s of GPUs.

(*) Exceptions for people in third-world countries, or I suppose children, who could use it as pocket money.
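To put a number on that gap, here's roughly what the unmetered wall draw does to the daily estimate (rig figures from the bitcointalk post above; $0.10/kWh is an assumption):

```python
# Cost of the watts whattomine doesn't account for, at an assumed $0.10/kWh.
# Wall-draw figures come from the bitcointalk thread linked above.
USD_PER_KWH = 0.10
rigs = {
    "6x RX 470": {"wall_w": 908, "whattomine_w": 600},
    "6x RX 480": {"wall_w": 1150, "whattomine_w": 720},
}
for name, r in rigs.items():
    hidden_w = r["wall_w"] - r["whattomine_w"]
    print(f"{name}: {hidden_w} W unaccounted for ≈ "
          f"${hidden_w / 1000 * 24 * USD_PER_KWH:.2f}/day off the estimate")
```

Roughly $0.75 to $1 a day per rig, which eats a big chunk of thin margins.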
 
Fair enough, I realize it's volatile... I just can't see why anyone(*) bothers unless they go full-out with 1000s of GPUs.

Personally, I do this because it's free money from hardware I already own (a sunk cost) using power I'd be using anyway to heat the room the PC is in. I've already mostly funded an entire second PC via cryptocurrency I've mined (paid like $200 out of pocket) and have since accumulated enough to be halfway to a 1080 Ti (if they ever get back to MSRP), or whatever makes the most sense at the time I accumulate enough to actually purchase something. If values go down from here, this takes longer. If they go up, it happens sooner. Either way it's a free upgrade.
 
You should not have to do all those steps at each reboot, just steps 5 through 7.

Unfortunately, the current beta Blockchain driver does not correctly identify the FE after it has been disabled, so you need this workaround to enable the compute “switch”.

Otherwise you are stuck at 1520 H/s.
 
And 5, 6, 7 could be executed by a single script.

Yeah, I could run JJ's script, but sometimes the undervolt values don't stick.
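For anyone who'd rather roll their own than fight with JJ's script, something along these lines covers steps 5 through 7. This is only a sketch: the device ID pattern, the OverDriveNTool profile name, and the miner path are hypothetical placeholders you'd replace with your own, and it assumes devcon.exe is on the PATH and the console is elevated:

```python
# Steps 5-7 from the post above as one script: toggle the GPU, reapply the
# OverDriveNTool profile, launch the miner. GPU_ID, ODNT_PROFILE and MINER
# are hypothetical placeholders -- substitute your own values.
import subprocess
import time

GPU_ID = r"PCI\VEN_1002&DEV_6863*"   # placeholder; list IDs with: devcon find PCI\*
ODNT_PROFILE = "vega_fe_mining"      # placeholder OverDriveNTool profile name
MINER = r"C:\mining\miner.exe"       # placeholder path to your miner

# Step 5: disable and re-enable the card (needs an elevated prompt).
subprocess.run(["devcon", "disable", GPU_ID], check=True)
time.sleep(5)
subprocess.run(["devcon", "enable", GPU_ID], check=True)
time.sleep(10)                       # give the driver a moment to settle

# Step 6: reapply clocks/voltages; -p0 applies the named profile to GPU 0.
subprocess.run(["OverDriveNTool.exe", "-p0" + ODNT_PROFILE], check=True)

# Step 7: start mining.
subprocess.run([MINER])
```

Set to run at logon via Task Scheduler, that would take care of the "rinse and repeat at every restart" part.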

Fair enough, I realize it's volatile... actual power usage in reality is a fair bit higher than what whattomine.com assumes.

If you hover the mouse over the GPU selection button, the tooltip tells you the settings needed to obtain those numbers.

It also goes the other way: my VEGA FE under load only draws about 150 watts more than idle, but the VEGA 64 is listed at 200 watts and only 1850 H/s.

Perhaps it's a difference in ASIC quality and binning.
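For comparison, the efficiency those figures imply; note the FE number uses my measured load-minus-idle delta rather than total card draw, so it flatters the FE a bit (a rough sketch using only the numbers above):

```python
# Hashing efficiency implied by the figures above. The FE uses my measured
# load-minus-idle delta (150 W), not total card power, so take it with salt.
cards = {
    "VEGA FE (measured delta)": (2050, 150),       # H/s, watts
    "VEGA 64 (whattomine listing)": (1850, 200),
}
for name, (hs, w) in cards.items():
    print(f"{name}: {hs / w:.1f} H/s per watt")
```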
 
It also goes the other way: my VEGA FE under load only draws about 150 watts more than idle, but the VEGA 64 is listed at 200 watts and only 1850 H/s... Perhaps it's a difference in ASIC quality and binning.

I found whattomine's numbers to be conservative with regard to hashrates (I almost always managed to 'beat' them). On the other hand, I usually draw more power doing so. Perhaps @ocminer doesn't measure power from the wall like I do, or he has the patience (and luck!) to reach really low voltages, which I haven't bothered all that much with.

But in whattomine you can always input your own hashrates and your own power draws, so just use that for better estimates, people.
 
Finally managed to get my VEGA Frontier Edition chugging along at 2050 H/s at 1408 MHz (890 mV) core and 1100 MHz HBM (870 mV).
What fan speed are you running at to keep the 1100 MHz HBM below 80°C?
 
Can the unified XMR-Stak make use of both the CPU cores and the built-in Vega iGPU?

Yes, but the small L3 cache for the CPU cores and the low memory bandwidth available to the iGPU don't make it a stellar miner:

http://www.legitreviews.com/amd-ryzen-5-2400g-mining-performance-nicehash-xmr-stak_202662

270 H/s total (90 H/s per CPU thread plus 90 H/s from the iGPU) at 68 W. Any Ryzen with 16 MB of L3 will get considerably better results. Even the Ryzen 5 1500X can probably do over 400 H/s, since it has 4 MB of L3 per core.
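The bottleneck is that each CryptoNight thread wants a ~2 MB scratchpad resident in L3, so the usable thread count is roughly bounded by cache size and core count. A back-of-envelope sketch, where the ~90 H/s per-thread figure is an assumption lifted from the 2400G review above:

```python
# CryptoNight uses a ~2 MB scratchpad per mining thread, so usable threads
# are roughly min(L3 / 2 MB, cores). The 90 H/s per-thread figure is an
# assumption taken from the 2400G review linked above.
SCRATCHPAD_MB = 2
HS_PER_THREAD = 90

cpus = {  # name: (L3 in MB, physical cores)
    "Ryzen 5 2400G": (4, 4),
    "Ryzen 5 1500X": (16, 4),
}
for name, (l3_mb, cores) in cpus.items():
    threads = min(l3_mb // SCRATCHPAD_MB, cores)
    print(f"{name}: ~{threads} threads -> ~{threads * HS_PER_THREAD} H/s "
          f"(plus SMT headroom when cache allows)")
```

The 2400G comes out to two CPU threads (~180 H/s, matching the review), while the 1500X's 16 MB of L3 leaves room for SMT threads on top of its four cores, which is where the 400+ H/s estimate comes from.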
 
Yes, but the small L3 cache for the CPU cores and the low memory bandwidth available to the iGPU don't make it a stellar miner... 270 H/s total (90 H/s per CPU thread plus 90 H/s from the iGPU) at 68 W.

I wonder if this 50% overclock on the Vega 8, along with using the Blockchain driver, would get much better numbers. It looks like the above review did not use the Blockchain driver.

https://wccftech.com/amd-ryzen-3-2200g-vega-8-overclocked-1600mhz-performance

As you noted, with only 4 MB of L3 cache two threads is all you can run, so the Ryzen 3 2200G at $99 would be a better buy than the 2400G.
 
What fan speed are you running at to keep the 1100 MHz HBM below 80°C?

The fan only runs at ~2600 RPM. Average temperature is ~67°C, so it could go even lower. That's at 1408 MHz / 890 mV core and 1100 MHz / 870 mV HBM; I haven't tested lower voltages yet.

It's an old repurposed Mac Pro, so it has great airflow (two front fans at 600 RPM) and power readings directly from the PSU. If anything, I should remove one of the Xeon processors, as I don't do any mining on them; at around 180 H/s they're not worth the extra power.
 
RX 550 at 10.4 MH/s (PC)
RX 580 at 25.1 MH/s (Asus laptop)

Dunno if those are fast or slow or normal.

But on the laptop it makes the fans spin super loud, and the temperature stays around 74-80°C.
 