CryptoCurrency Mining with GPUs *spawn*



Miners with current cards would simply use an older driver.

Like pretty much everyone mining XMR is using the blockchain driver from August 2017.


Now if AMD could release e.g. a "Vega 56/64 XT" with 2.4 Gbps HBM2, better tuned power consumption (running this thing in anything other than power saving mode by default is pretty dumb IMHO) and a firmware that isn't compatible with the August blockchain driver, then they could indeed solve the problem.
Keep making the old 16GB FE as a card for "blockchain pioneers" and call it a day.
 
Not really sure if he was talking about a software method of disabling it, but anyway... just posted this as it's relevant to the gaming card availability issue which was tackled here.

I definitely wouldn't want that to happen.
Actually, I wouldn't want anything that would make things worse for consumers (gamers and miners alike) to happen.

I'm all for IHVs increasing their production capacity through the roof. Not that it's a likely event, but hey.
 
I'm all for IHVs increasing their production capacity through the roof. Not that it's a likely event, but hey.
What would that achieve other than providing more GPU chips for vendors to sell to miners?
 
What would that achieve other than providing more GPU chips for vendors to sell to miners?

A miner's revenue is inversely proportional to the number of GPUs in the market (because all coins have a fixed amount issued per unit of time, more GPUs mean fewer coins for each).
More GPUs will thus give everyone a chance to participate in mining, but the profits for each miner will start to decrease.

Hence it absolutely won't make sense to buy GPUs for mining indefinitely.
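A minimal sketch of that relationship in Python (the issuance, price and GPU counts are made-up numbers, purely to illustrate the inverse proportionality):

# Hypothetical illustration: the coin issuance per day is fixed, so per-GPU
# revenue scales as 1 / (number of GPUs mining), all else being equal.
def daily_revenue_per_gpu(coins_issued_per_day, coin_price_usd, gpus_mining):
    # Every GPU gets an equal share of a fixed daily issuance.
    return coins_issued_per_day * coin_price_usd / gpus_mining

# Doubling the number of GPUs halves everyone's take:
print(daily_revenue_per_gpu(20_000, 1_000, 1_000_000))  # 20.0 USD/day
print(daily_revenue_per_gpu(20_000, 1_000, 2_000_000))  # 10.0 USD/day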
 
A miner's revenue is inversely proportional to the number of GPUs in the market (because all coins have a fixed amount issued per unit of time, more GPUs mean fewer coins for each).
More GPUs will thus give everyone a chance to participate in mining, but the profits for each miner will start to decrease.

Hence it absolutely won't make sense to buy GPUs for mining indefinitely.
Any idea what the threshold might be?
A quick Google search for the global hash rate brought me to a page for Ethereum mining, where the hash rate grew by two orders of magnitude over the course of the last year.
How long would AMD and Nvidia need to build extra cards, and how many factors of 10?

People are paying multiples over MSRP right now.
 
Any idea what the threshold might be?

A quick Google search for the global hash rate brought me to a page for Ethereum mining, where the hash rate grew by two orders of magnitude over the course of the last year.

Well, as always, there are many variables. But do note that the price of ETH also increased by two orders of magnitude over this one-year period, so that compensates for a lot of the fact that people now mine much less ETH than they used to (with the same hardware).

I think August-November were the months in which we had some GPU availability, which even seemed to slowly improve. Back then my 8x RX 580 rig used to earn $10 per day after electricity costs, mining ETH, so I'd use that as the threshold. Now this rig makes $22 per day mining ETH. So, roughly, we need a doubled hashrate (i.e. 400ish TH/s) in the ETH network (the reasoning is of course applicable to other coins) to significantly reduce the amount of GPUs purchased for mining, at the current price (1000ish $ per ETH).

We could generalise this reasoning further by factoring in what percentage of the GPUs are mining ETH as opposed to any other coin. 33%?
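To make the threshold reasoning concrete, here's a rough Python sketch; the daily ETH issuance and the rig's power cost are my own assumptions, only the rig size, ETH price and hashrate figures come from the numbers above:

# Per-rig profit: the rig's share of the network hashrate, times the fixed
# daily ETH issuance, times the ETH price, minus electricity.
ETH_ISSUED_PER_DAY = 20_000      # assumed, order-of-magnitude figure
ETH_PRICE_USD      = 1_000       # "current" price quoted above
RIG_HASHRATE_MHS   = 8 * 29      # 8x RX 580 at ~29 MH/s each (assumed tuning)
POWER_COST_PER_DAY = 2           # assumed daily electricity cost for the rig, USD

def rig_profit_per_day(network_hashrate_ths):
    share = RIG_HASHRATE_MHS / (network_hashrate_ths * 1e6)
    return share * ETH_ISSUED_PER_DAY * ETH_PRICE_USD - POWER_COST_PER_DAY

print(rig_profit_per_day(200))   # ~21 USD/day, close to today's earnings
print(rig_profit_per_day(400))   # ~10 USD/day, i.e. back at the threshold

Doubling the network hashrate at a constant price roughly halves the rig's income, which is where the 400ish TH/s figure comes from.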

Further, it is of note that the team behind the ETH coin has on past occasions taken an interest in what they called the cost of security - comparing the electricity consumed with the difficulty of compromising the network via 51% attacks. They concluded (I sadly wasn't able to follow their reasoning) that their network was the most expensive (or behind just BTC, I can't recall exactly), and they have taken steps to reduce this cost by artificially increasing the difficulty or decreasing the block rewards. Both measures reduce the amount of currency issued per time unit.
If the deployment of their proof-of-stake implementation is delayed some more, it is conceivable the developers will make more attempts to reduce inflation.
 
I think August-November were the months in which we had some GPU availability, which even seemed to slowly improve. Back then my 8x RX 580 rig used to earn $10 per day after electricity costs, mining ETH, so I'd use that as the threshold. Now this rig makes $22 per day mining ETH. So, roughly, we need a doubled hashrate (i.e. 400ish TH/s) in the ETH network (the reasoning is of course applicable to other coins) to significantly reduce the amount of GPUs purchased for mining, at the current price (1000ish $ per ETH).

We could generalise this reasoning further by factoring in what percentage of the GPUs are mining ETH as opposed to any other coin. 33%?
I tried to get some rough bounds for how many RX 580 cards would need to be made to get ~200 TH/s.
It seems like an unoptimized 580 does ~18 MH/s, with heavy tweaking getting it just under 30 MH/s.
A 580 chip is ~232 mm^2 (~17.3 mm x 13.3 mm) based on a die shot.
An online die-per-wafer calculator gave ~248 per 300 mm wafer.
I assumed something like 85% yielded a full 580, but I do not know if that is anywhere close.

200 TH/s needs between 6.7M and 11.2M RX 580s, given the uncertainties of optimization and uptime.

That's somewhere between 24,600 and 45,000 300 mm wafers.
If the idea is that only a third of these cards mine ETH, that reaches 81,000 to 135,000 wafers.
GlobalFoundries Fab 8 has a peak rate of 60k wafers/month. If it only made RX 580s, up to a full quarter's worth of manufacturing would be taken up before reaching this threshold.
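A quick Python version of this back-of-the-envelope estimate (all inputs are the rough assumptions stated above; depending on how yield and uptime are folded in, the outputs land in the same ballpark as the figures quoted rather than matching them exactly):

TARGET_HASHRATE_MHS = 200e6              # 200 TH/s expressed in MH/s
CARD_MHS_LOW, CARD_MHS_HIGH = 18, 30     # stock RX 580 vs heavily tweaked
DIES_PER_WAFER = 248                     # ~232 mm^2 die on a 300 mm wafer
YIELD = 0.85                             # assumed fraction of dies usable as a 580

cards_low  = TARGET_HASHRATE_MHS / CARD_MHS_HIGH   # best case: every card tweaked
cards_high = TARGET_HASHRATE_MHS / CARD_MHS_LOW    # worst case: stock settings

for cards in (cards_low, cards_high):
    wafers = cards / (DIES_PER_WAFER * YIELD)
    print(f"{cards / 1e6:.1f}M cards -> {wafers:,.0f} wafers")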
 
Now if AMD could release e.g. a "Vega 56/64 XT" with 2.4 Gbps HBM2, better tuned power consumption (running this thing in anything other than power saving mode by default is pretty dumb IMHO) and a firmware that isn't compatible with the August blockchain driver, then they could indeed solve the problem.
The firmware would just be modified to restore the mining performance, using the original Vega firmware as a template.

The only time you can get away with this is before any products from a new architecture launch. Otherwise it's going to be moderately difficult at best to figure out what was changed to throttle performance and undo it.
 
Coincheck: World's biggest ever digital currency 'theft'
One of Japan's largest digital currency exchanges says it has lost some $534m (£380m) worth of virtual money in a hacking attack on its network.
Coincheck suspended deposits and withdrawals for all crypto-currencies except Bitcoin as it assessed its losses in NEM, a lesser-known coin.

If the theft is confirmed, it will be the largest involving digital currency.
http://www.bbc.com/news/world-asia-42845505
 
Welcome to the club! I'm now using them to mine ETH actually - Cryptonight is just marginally more profitable... plus many other reasons :p

Maybe with $1000 per Vega it's now a great deal for ETH too? It easily reaches 44 MH/s with above-average cooling (or extreme noise on the stock blowers).
 
I've considered the same, but they cannot be had for $750 here any longer. Now Vega 56/64s are going for $1200-1400 for the non-stock designs. Crazy.
Meanwhile to buy the "cloud" hashrate I have in my house already (400 MH/s ETH) would cost me nearly $15000/year with a minimum 1 year buy.
You'd be lucky to make $10k in a year with that hash rate.
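For what it's worth, a small Python sketch of that comparison; the network hashrate and daily issuance are my own assumptions, only the contract price is the one quoted above:

# Cloud contract: ~$15,000/year for 400 MH/s, i.e. ~$37.5 per MH/s per year,
# compared against the gross yield of 1 MH/s at roughly current conditions.
ETH_ISSUED_PER_DAY   = 20_000    # assumed
ETH_PRICE_USD        = 1_000     # assumed current price
NETWORK_HASHRATE_MHS = 200e6     # ~200 TH/s, assumed

contract_cost_per_mhs_year = 15_000 / 400
gross_yield_per_mhs_year = (1 / NETWORK_HASHRATE_MHS) * ETH_ISSUED_PER_DAY * ETH_PRICE_USD * 365

print(contract_cost_per_mhs_year)  # 37.5 USD per MH/s per year
print(gross_yield_per_mhs_year)    # ~36.5 USD per MH/s per year at constant difficulty

So at a constant difficulty the contract only just pays for itself, and with the difficulty rising over the year, "lucky to make $10k" sounds about right.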
 
All calculators show NiceHash to be the most profitable though, so Cryptonight it is.

My bet is ETH will skyrocket or at least increase substantially this year. I'm betting on it overtaking BTC's market cap as well. ETH will have Raiden, sharding and possibly some early implementation of Casper. Most of the cryptocurrency infrastructure is built on ETH already, and ETH can already handle 3 times more transactions than BTC.
So I'll hold my coins for a while.

Plus, my Vegas are on the main desktop I use for everything, gaming included. I can mine ETH and still use the computer for everything. Mining Cryptonight while watching a movie means dropping more than 20% of the hashrate while the movie drops frames.


I've considered the same, but they cannot be had for $750 here any longer. Now Vega 56/64s are going for $1200-1400 for the non-stock designs. Crazy.
Meanwhile to buy the "cloud" hashrate I have in my house already (400 MH/s ETH) would cost me nearly $15000/year with a minimum 1 year buy.
You'd be lucky to make $10k in a year with that hash rate.

In my country, excluding the GTX 1060 & RX 560, I can only buy a 1070 Ti @ $930. A Vega under $1200 is a better deal IMO. At $1400, of course not.

I never understood the economics of cloud mining. Isn't it a scam? It's ALWAYS cheaper to just buy the damn coin instead of mining it.
So you can maybe only turn a profit IF the difficulty for the target coin drops substantially AND you wait to sell when the prices increase (if they ever do...).
 
Seems coins are tanking a bit. Difficulty going up, prices going down. Probably better to trade coins than actually mine them right now, unless you already have a large rig.
 
Seems coins are tanking a bit. Difficulty going up, prices going down. Probably better to trade coins than actually mine them right now, unless you already have a large rig.

I'm not mining, I'm heating my home with my computer!
While it's cold, I don't mind mining for little to no immediate profit.
 
So AMD has publicly stated they'd increase (or have increased to an extent) the production of GPUs and are limited by the memory supply ( https://www.anandtech.com/show/12380/amd-to-ramp-up-gpu-production-ram-a-limiting-factor )

Hopefully this at least puts to rest the "AMD is allocating dies for Ryzens/TRs instead of GPUs" sentiment that I saw going around here.
Not really expecting inventory for gamers / hobby miners to appear anytime soon.
 