Designing GPU-based mining ICs without the GPU fluff?

Mobius1aic

With the massive increase in mining demand and other uses, it seems that both AMD and Nvidia should be using their IP to design "GPUs" without the fluff necessary for actual graphics computing. Obviously, any decode, media, or display silicon is unnecessary and takes up valuable die space while possibly consuming extra power. I wanted to know how feasible and practical it would be for them to use their GPU architectures in a stripped-down manner for cryptocurrency, machine learning, etc.

I'm going to guess that both TMUs and ROPs serve no purpose in mining or machine learning. Without messing with a complete CU/SM redesign, a large array with the necessary internal communication buses and memory controllers could be a quick and feasible way to use existing architecture to make a more optimized general mining/AI processor. While a mining-specific ASIC always comes along eventually, I think crypto is here to stay, and in turn GPU demand for new cryptocurrencies will keep up the pressure unless supply catches up to demand. I understand memory supply is a major part of the issue, and honestly I'm not sure how important memory bandwidth is to mining or machine learning, so perhaps there could be some savings there too.
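For what it's worth, my rough understanding is that the Ethash-style mining most GPUs are doing today hammers memory with pseudo-random reads from a multi-gigabyte dataset, so bandwidth matters a lot there, while the actual math per hash is trivial. Here's a deliberately simplified, hypothetical Python sketch of that access pattern (not the real Ethash code; the dataset size, mixing function, and hash are all stand-ins) just to show why such a chip would still want a fat memory interface:

[code]
# Simplified, hypothetical sketch of a memory-hard proof-of-work inner loop.
# NOT the real Ethash: dataset size, mixing and hashing are stand-ins; only the
# access pattern (many dependent, scattered reads per hash) is the point.
import hashlib
import random

DATASET_WORDS = 1 << 20   # stand-in for the multi-gigabyte DAG miners keep in VRAM
READS_PER_HASH = 64       # Ethash does 64 dependent dataset lookups per nonce

dataset = [random.getrandbits(64) for _ in range(DATASET_WORDS)]

def toy_hash(header: bytes, nonce: int) -> int:
    """One candidate hash: a little cheap ALU work, dominated by random dataset reads."""
    mix = int.from_bytes(hashlib.sha256(header + nonce.to_bytes(8, "little")).digest(), "little")
    for _ in range(READS_PER_HASH):
        index = mix % DATASET_WORDS                   # pseudo-random, cache-hostile index
        mix = (mix ^ dataset[index]) * 0x100000001B3  # trivial mixing, almost no FLOPs
        mix &= (1 << 256) - 1
    return mix

# Each candidate hash is ~64 scattered reads from a dataset far larger than any cache,
# so hashrate is capped by memory latency/bandwidth rather than by ALU throughput.
print(hex(toy_hash(b"block header bytes", nonce=42)))
[/code]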

This just seems like one way to partition off part of the market in a way that benefits the GPU makers, while mitigating some of the effects on the market that actually needs a GPU's full capabilities.
 
Is it really taking up valuable space?
Considering NV put a full suite of 3D pixel rendering stuff into Volta even though that chip is so clearly aimed at the not-consumer field, it can't really take up all that much space in the big scheme of things...
 
Considering NV put a full suite of 3D pixel rendering stuff into Volta even though that chip is so clearly aimed at the not-consumer field, it can't really take up all that much space in the big scheme of things...

Well, then again, not-consumers can include people doing stuff like CAD, movie rendering, information/scientific visualisation, etc.
 
I think there's some effort required to make drivers for such a GPU without display support, and IHVs would prefer to avoid that.
 
Well, then again, not-consumers can include people doing stuff like CAD, movie rendering, information/scientific visualisation, etc.
In a professional setting it wouldn't be impossible to imagine a system where you have one primary graphics adapter coupled to a headless computing accelerator.
 
In a professional setting it wouldn't be impossible to imagine a system where you have one primary graphics adapter coupled to a headless computing accelerator.
What about a gamer setting?

[image: apocalypse3dxbox.jpg]


[image: physx_asus_p1_box.jpg]
 
What about a gamer setting?
:D :love: :!:

I had one of those Apocalypse 3D boards. Actually I had two of them - a PowerVR PCX1 and then later a PCX2 board. The latter I blew up by overclocking it too much, fitting an oscillator that was too fast for it. It died totally, irrevocably.

Anyway, that was 20-ish years ago, and much has changed in the PC marketplace since then. Add-in 3D boards were the norm early on; Voodoo Graphics set that trend much more than PowerVR ever did. Most people today, though, never experienced that. An increasing number of HTPCs these days don't even have more than one PCIe slot and couldn't take a dual-card setup anyway.
 
Heh. Do you remember S.T.U.N RUNNER?

I sure do!

Virtua Racing as well... Damn, those early flat-shaded polygon games looked friggin fabulous to me. You also had that Namco mecha shooting game, what was it called? Twin joysticks for manoeuvring your guy, years before dual sticks became a mainstream thing on consoles.
 
Heh. Do you remember S.T.U.N RUNNER?

I sure do!

Virtua Racing as well... Damn, those early flat-shaded polygon games looked friggin fabulous to me. You also had that Namco mecha shooting game, what was it called? Twin joysticks for manoeuvring your guy, years before dual sticks became a mainstream thing on consoles.
Virtual-On?
 
In a professional setting it wouldn't be impossible to imagine a system where you have one primary graphics adapter coupled to a headless computing accelerator.

If crypto remains a thing, I sure hope that both companies start producing mining-specific versions of their cards, perhaps with 2 full GPUs, full software support, etc., to make them more attractive to miners. Obviously you can't charge too much of a premium for them, otherwise miners go back to regular graphics cards and we're stuck with the same problem we already have.

I guess supply really just needs to catch up, but I'm still sticking to my idea that it's worthwhile for both vendors to create mining accelerators without the "fluff". Go purely vector unit (plus the scalar component), cut the TMUs along with everything else unnecessary and get even more GFLOPS per mm².
 
Crypto-dedicated "GPUs" IMO are a very bad idea.

It would destroy the concept of decentralized proof-of-work once again, since only mining companies would be purchasing them.
At the same time, they would be competing directly against Bitmain's hardware efforts, which may not be a very smart thing to do in this day and age, because Bitmain is shady as hell and its range of influence is completely unpredictable right now.

In practice, they would be launching mining ASICs that are resistant to anti-centralization algorithm forks.
The result would be for cryptocoin developers to target even more specific hardware requirements (which is getting harder), or to give up on fighting centralization, which would again hand the whole market to the much smaller and more efficient fixed-function ASICs from Bitmain.


I don't see how that would be a win for AMD or Nvidia. Their best move is to keep cautiously adjusting GPU production according to general demand, and to launch new graphics cards that lower the value of second-hand GPUs in terms of performance and efficiency.
Despite their official statements, it's probably not in AMD's or Nvidia's best interests to end GPU mining, as mining significantly increased the ASP of 2-year-old video cards and made them lots of money. But it's not like they want to go another full year without being able to sell graphics cards to PC gamers either.


AMD could have a very practical answer to the dGPU shortage for gamers, which would be to launch a high-performance APU with HBM.
I think they probably regret having shelved that Zeppelin+Greenland APU. That thing would have probably been a Q4'17-Q3'18 bestseller with a very high ASP of 500-800€, even if it had to bundle an AiO watercooler.
 
Wasn't Greenland 16+ x86 cores? Fine for servers and scientific computing, but amateur miners would still be going after GPUs even if it existed. Plus the x86 cores would be a waste.

I agree APUs are a way to deal with miners, but you need to straddle the line of being cheap, capable, and valuable enough for gamers who need both the x86 cores and the compute units. But not so skewed toward the graphics side that it becomes a new target for miners who might find better value in said APU than in a price-hiked graphics card.
 
Wasn't Greenland 16+ x86 cores?
Yes, it was basically a Threadripper with a Vega 10/20 hybrid that combined 2*HBM2 stacks with 1/2 FP64 throughput.
It's obviously overkill for gamers, but a toned-down consumer version (e.g. a single Zeppelin die with dual-channel DDR4 + half of Vega 10 + single-stack HBM2) would have been great.

I agree APUs are a way to deal with miners, but you need to straddle the line of being cheap, capable, and valuable enough for gamers who need both the x86 cores and the compute units.
At the prices gaming GPUs have been going for these past several months, a high-performance APU wouldn't need to be cheap at all.
Just look at Kaby Lake G: it's not really a Vega chip and it doesn't even perform that well but it's being touted everywhere as a great solution nonetheless.

But not so skewed toward the graphics side that it becomes a new target for miners who might find better value in said APU than in a price-hiked graphics card.
It wouldn't be attractive to miners because you could only get one in each motherboard.
And that would make it available on the shelves, which is more than we can say of anything above the GTX 1050 Ti and RX 560.
 