Intel Xe Architecture for dGPUs

Discussion in 'Architecture and Products' started by DavidGraham, Dec 12, 2018.

Tags:
  1. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    981
    Likes Received:
    1,108
With C++ everybody can choose how many language features they want to use, and many request OOP on GPU.
Also, many consider something like OpenCL too cumbersome to work with, although it's luxury in comparison to low level gfx APIs.
Though, going from NV lock-in to Intel lock-in seems like no win.
Maybe Intel is trying to be more open. The article mentions they plan a mix of C++ and SYCL (https://www.khronos.org/sycl/).
Never tried SYCL. Could be interesting for tools development, maybe? Actually I shy away from using CL myself here - afraid of the code becoming too hard to maintain :|
     
  2. tuna

    Veteran

    Joined:
    Mar 10, 2002
    Messages:
    3,271
    Likes Received:
    428
    What exactly does this mean?
     
  3. tuna

    Veteran

    Joined:
    Mar 10, 2002
    Messages:
    3,271
    Likes Received:
    428
As far as I understand, Intel's compute stack will be based on open standards with multiple implementations (unlike nVidia's CUDA).
     
  4. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    981
    Likes Received:
    1,108
I only mean one can use just the C subset if they want, for example.
(And I assume CUDA does not have many OOP features, and is a similar C-like language to shading languages or OpenCL 1.x)
     
  5. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,512
    Likes Received:
    873
    Location:
    France
CUDA is too well entrenched right now to be displaced by something else, imo.
     
    egoless and xpea like this.
  6. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    11,076
    Likes Received:
    5,626
    Nothing is too big to fail in the software world.

    Just look at flash and java.
     
    Kej, AlphaWolf, Alexko and 2 others like this.
  7. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,512
    Likes Received:
    873
    Location:
    France
You can make the argument that Java & Flash are going away because they were not well taken care of.

Of course CUDA can fail, but as long as nVidia doesn't let it die, I don't see big companies moving away from it; it's too far ahead of other "languages" in this sector right now. And you don't just have to propose a similar alternative, but a much better one, to get all the big players to move from it. Will it happen someday? Sure, but my guess is it will be replaced by another nVidia thing rather than taken down by anyone else.
     
    pharma likes this.
  8. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    981
    Likes Received:
    1,108
Why ahead? (Seriously asking - never used it myself. It's no option for games.)

If Intel spans the whole field from CPU, GPU and FPGA to tensor hardware as said, and they do a uniform programming model well, NV might have a hard time competing in the long run.
     
  9. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    11,076
    Likes Received:
    5,626
The API is very complete and lets you do the same as competing open source APIs, but much faster to implement, with less code needed. Then there's a company behind it with a bunch of people whose job is to provide technical support. Then Nvidia themselves provide research grants for students to work with their API.

    None of this means that Intel + ARM + Google + AMD + PowerVR + Apple + many others can't work together towards a better solution that is IHV agnostic.
     
    Rootax, pharma and JoeJ like this.
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,602
    Likes Received:
    643
    Location:
    New York
    Java is failing? That’s news to me. What’s it being replaced by?

In the Anandtech article posted above they claim Intel is investing in CUDA conversion tools because they acknowledge Nvidia's current API advantage. I don't know how technically feasible that is, but the API itself is just one aspect.

Nvidia has been pushing CUDA for a long time at a grassroots level and has built a very strong ecosystem of tools and frameworks covering everything from raytracing to physics and AI. Intel has a tall hill to climb.
     
  11. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    3,575
    Likes Received:
    2,294
    Also strengthened with CUDA now running on ARM processors.
    https://hothardware.com/news/nvidia-cuda-software-stack-arm-exascale-computing
     
  12. Samwell

    Newcomer

    Joined:
    Dec 23, 2011
    Messages:
    127
    Likes Received:
    154
One of my questions here would be: which CPUs? Will Intel really build something with good ARM support? I have my doubts. But CUDA is supporting ARM now, and with Amazon, Huawei, Fujitsu etc. we also have some not-so-small players there. Not to forget IBM and POWER.
Of course x86 market share at the moment is overwhelming, but maybe ARM can win some market share with the support it has.
     
  13. yuri

    Newcomer

    Joined:
    Jun 2, 2010
    Messages:
    211
    Likes Received:
    184
Nowadays the whole client stack has shrunk to the browser (well, in fact Chrome) and its HTML5 and JS engine. It's simple: the legacy client stuff was insecure and unfriendly by design - Flash, Java applets and Web* or Silverlight.

The OneAPI talks remind me of the early days of AMD Fusion fantasies. Let's see.
     
  14. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    981
    Likes Received:
    1,108
    The day after CUDA runs on AMD :)
     
    xpea likes this.
  15. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,515
    Likes Received:
    934
  16. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,734
    Likes Received:
    2,213
    Location:
    Winfield, IN USA
  17. BRiT

    BRiT Verified (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    15,907
    Likes Received:
    14,822
    Location:
    Cleveland
    People haven't learned and keep making the same mistakes. :lol:
     
  18. yuri

    Newcomer

    Joined:
    Jun 2, 2010
    Messages:
    211
    Likes Received:
    184
    "Data analysis", ML, being the 1st (introductory) lang taught at unis, etc.
     
    Kej, pharma, Frenetic Pony and 2 others like this.
  19. pcchen

    pcchen Moderator
    Moderator Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    2,824
    Likes Received:
    251
    Location:
    Taiwan
Python is quite easy to learn, and it's now the prime choice of teaching language in high schools.
     
  20. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,286
    Likes Received:
    3,543
Intel shared the first ever live demo of its dGPU, called DG1, running Destiny 2 in a laptop form factor.

The game appears to be running at sub-30fps, with low graphics/texture settings, and with horrible AA!

     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.