Designing GPU-based mining ICs without the GPU fluff?

Discussion in 'Architecture and Products' started by Mobius1aic, Apr 29, 2018.

  1. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,649
    Likes Received:
    244
    With the massive increase in mining demand and other compute uses, it seems both companies should be using their IP to design "GPUs" without the fluff needed for actual graphics work. Obviously, any decode, media, or display silicon is unnecessary there, takes up valuable die space, and possibly consumes extra power. I'd like to know how feasible and practical it would be for AMD and Nvidia to use their GPU architectures in a stripped-down form for cryptocurrency, machine learning, etc.

    I'm going to guess that neither TMUs nor ROPs serve any purpose in mining or machine learning. Without messing with a complete CU/SM redesign, a large array with the necessary internal communication buses and memory controllers could be a quick, feasible way to turn existing architecture into a more optimized general mining/AI processor. While a mining-specific ASIC always comes along eventually, I think crypto is here to stay, and in turn GPU demand for new cryptocurrencies will keep up the pressure unless supply catches up to demand. I understand memory supply is a major part of the issue, and honestly I'm not sure how important memory bandwidth is to mining or machine learning, so perhaps there could be some savings there too.
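    For reference on why texturing and raster hardware contribute nothing here: a proof-of-work miner is essentially a hash loop. The sketch below is purely illustrative (a toy, not any specific coin's algorithm) — it hashes a header with an incrementing nonce until the digest falls below a difficulty target.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int, max_nonce: int = 2_000_000):
    """Toy proof-of-work loop (illustrative only, not a real coin's algorithm).

    Finds a nonce such that the double SHA-256 of header||nonce, read as a
    big-endian integer, falls below the difficulty target. Note there is no
    texturing, rasterization, or display work anywhere -- just hashing.
    """
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_nonce):
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
    return None  # no solution within the search budget

# Each extra difficulty bit doubles the expected number of hashes needed,
# which is why miners care about raw hashes-per-second-per-watt above all.
result = mine(b"example block header", difficulty_bits=12)
```

    Every difficulty bit doubles the expected work, so the only hardware that matters is whatever executes that inner hash fastest per watt — which is why TMUs and ROPs are dead weight for this workload.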

    This just seems like one way to partition off some of the market that benefits the GPU makers while mitigating some of the effects on the market that actually needs a GPU's full capabilities.
     
    #1 Mobius1aic, Apr 29, 2018
    Last edited: Apr 29, 2018
  2. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    12,380
    Likes Received:
    8,598
    Location:
    Cleveland
    Is it really taking up valuable space? Those blocks seem trivial in terms of die space, and they shouldn't draw power when not in use.
     
    pharma, RecessionCone and Grall like this.
  3. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    Considering NV put a full suite of 3D pixel-rendering hardware into Volta even though that chip is so clearly aimed at the non-consumer market, it can't really take up all that much space in the big scheme of things...
     
  4. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,491
    Likes Received:
    908
    Well, then again, not-consumers can include people doing stuff like CAD, movie rendering, information/scientific visualisation, etc.
     
  5. entity279

    Veteran Regular Subscriber

    Joined:
    May 12, 2008
    Messages:
    1,229
    Likes Received:
    422
    Location:
    Romania
    I think there's real driver effort required to support a GPU without display output, and IHVs would prefer to avoid that
     
    ImSpartacus likes this.
  6. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    In a professional setting it wouldn't be impossible to imagine a system where you have one primary graphics adapter coupled to a headless compute accelerator.
     
    ImSpartacus likes this.
  7. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,828
    Likes Received:
    4,450
    What about a gamer setting?

    [images of PowerVR Apocalypse 3D add-in accelerator boards]
     
    sonen, Kej, milk and 2 others like this.
  8. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    :D <3

    I had one of those Apocalypse3D boards. Actually I had two of them: a PowerVR PCX1, then later a PCX2 board. The latter I blew up by overclocking it too hard, fitting an oscillator it couldn't handle. It died totally, irrevocably.

    Anyway, that was 20-ish years ago, and much has changed in the PC marketplace since then. Add-in 3D boards were the norm early on; Voodoo Graphics set that trend much more than PowerVR ever did. Most people today never experienced that, though. An increasing number of HTPCs these days don't even have more than one PCIe slot and couldn't run a dual-card setup anyway.
     
  9. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,491
    Likes Received:
    908
    It's fun to think that there was a time when "arcade-level 3D performance" was actually a good thing. :)
     
    Kej, CarstenS, milk and 2 others like this.
  10. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    Heh. Do you remember S.T.U.N. Runner?

    I sure do!

    Virtua Racing as well... Damn, those early flat-shaded polygon games looked friggin' fabulous to me. You also had that Namco mecha shooting game too, what was it called? Twin joysticks for maneuvering your guy, years before dual sticks became a mainstream thing on consoles.
     
    pharma likes this.
  11. Ryan Smith

    Regular

    Joined:
    Mar 26, 2010
    Messages:
    609
    Likes Received:
    1,036
    Location:
    PCIe x16_1
    Virtual-On?
     
    Grall likes this.
  12. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,649
    Likes Received:
    244
    If crypto remains a thing, I sure hope both companies start producing mining-specific versions of their cards, perhaps with 2 full GPUs, full software support, etc., to make them more attractive to miners. Obviously you can't charge too much of a premium for them, otherwise miners go back to regular graphics cards and we're stuck with the same problem we already have.

    I guess supply really just needs to catch up, but I'm still sticking to my idea that it's worthwhile for both vendors to create mining accelerators without the "fluff". Go purely vector units (plus the scalar component), cut the TMUs along with everything else unnecessary, and get even more GFLOPS per mm².
     
    milk likes this.
  13. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,828
    Likes Received:
    4,450
    Crypto-dedicated "GPUs" IMO are a very bad idea.

    It would destroy the concept of decentralized proof-of-work once again, since only mining companies would be purchasing them.
    At the same time, they would be competing directly against Bitmain's hardware efforts, which may not be a very smart thing to do in this day and age, because Bitmain is shady as hell and its range of influence is completely unpredictable right now.

    In practice, they would be launching mining ASICs that are resistant to anti-centralization algorithm forks.
    The result would be for cryptocoin developers to target even more specialized requirements (which is getting harder), or to give up on fighting centralization, which would again hand the whole market to much smaller and more efficient fixed-function ASICs from Bitmain.
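
    As a side note on what those anti-ASIC forks actually do: ASIC resistance in practice usually means memory hardness — the hash is forced to make data-dependent reads from a dataset far too large for on-chip SRAM, so memory bandwidth rather than ALU count bounds throughput. A minimal sketch of the idea (a hypothetical mixing function, not Ethash or any real algorithm):

```python
import hashlib

# Stand-in for a multi-gigabyte DAG; a real memory-hard coin sizes this
# well past any affordable on-die SRAM, so an ASIC gains little over a GPU.
DATASET = [(i * 2654435761) & 0xFFFFFFFF for i in range(1 << 16)]

def memory_hard_hash(seed: bytes, rounds: int = 64) -> bytes:
    """Hypothetical memory-hard hash sketch (NOT Ethash or any real coin).

    Each round performs a data-dependent lookup into DATASET: the next
    index depends on the running mix, so reads can't be precomputed or
    batched, and throughput ends up bound by random-access bandwidth.
    """
    mix = int.from_bytes(hashlib.sha256(seed).digest(), "big")
    for _ in range(rounds):
        idx = mix % len(DATASET)              # unpredictable memory access
        mix = (mix * 31 + DATASET[idx]) & ((1 << 256) - 1)
    return hashlib.sha256(mix.to_bytes(32, "big")).digest()
```

    A fixed-function ASIC can shrink the arithmetic, but it still has to buy the same DRAM bandwidth as everyone else — which is exactly why these algorithms kept GPUs competitive.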


    I don't see how that would be a win for AMD or Nvidia. Their best move is to keep cautiously adjusting GPU production to general demand, and to launch new graphics cards that undercut second-hand GPUs on performance and efficiency.
    Despite their official statements, it's probably not in AMD's or Nvidia's best interests to end GPU mining, as it significantly increased the ASP of 2-year-old video cards and made them lots of money. But it's not like they want to go another full year without being able to sell graphics cards to PC gamers either.


    AMD could have a very practical answer to the dGPU shortage for gamers: launch a high-performance APU with HBM.
    I think they probably regret shelving that Zeppelin+Greenland APU. It would probably have been a Q4'17-Q3'18 bestseller with a very high ASP of 500-800€, even if it had to bundle an AiO water cooler.
     
    Lightman likes this.
  14. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,649
    Likes Received:
    244
    Wasn't Greenland 16+ x86 cores? Fine for servers and scientific computing, but amateur miners would still be going after GPUs even if it existed. Plus the x86 cores would be a waste.

    I agree APUs are a way to deal with miners, but you need to straddle the line: cheap, capable, and valuable enough for gamers who need both the x86 cores and the compute units, yet not so skewed toward graphics that it becomes a new target for miners, who might find better value in such an APU than in a price-hiked graphics card.
     
  15. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,828
    Likes Received:
    4,450
    Yes, it was basically a Threadripper with a Vega 10/20 hybrid that combined 2× HBM2 stacks with 1/2-rate FP64 throughput.
    It's obviously overkill for gamers, but a toned-down consumer version (e.g. a single Zeppelin die with dual-channel DDR4 + half of Vega 10 + single-stack HBM2) would have been great.

    At the prices gaming GPUs have been going for these past several months, a high-performance APU wouldn't need to be cheap at all.
    Just look at Kaby Lake-G: it's not really a Vega chip and it doesn't even perform that well, but it's being touted everywhere as a great solution nonetheless.

    It wouldn't be attractive to miners because you could only fit one per motherboard.
    And that would keep it available on the shelves, which is more than we can say for anything above the GTX 1050 Ti and RX 560.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.