Intel Xe Architecture for dGPUs

Discussion in 'Architecture and Products' started by DavidGraham, Dec 12, 2018.

  1. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,791
    Likes Received:
    2,602
    Built on 10nm and scalable for the entire market.

    [IMG]
     
    #1 DavidGraham, Dec 12, 2018
    Last edited: Dec 12, 2018
  2. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,933
    Likes Received:
    1,629
    Your IMG doesn't display.
     
  3. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    8,485
    Likes Received:
    332
    Location:
    Treading Water
    Is that 'graph' actually supposed to be indicative of the performance of the various SKUs?
     
  4. giannhs

    Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    37
    Likes Received:
    40
    a typical intel graph

    literally no one knows wtf they are saying
     
  5. Dayman1225

    Newcomer

    Joined:
    Sep 9, 2017
    Messages:
    57
    Likes Received:
    78
    Probably better to talk about Gen11 graphics, since Intel has actually given details on that, like tile-based rendering and whatnot.
     
  6. Dayman1225

    Newcomer

    Joined:
    Sep 9, 2017
    Messages:
    57
    Likes Received:
    78
    Lightman likes this.
  7. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,496
    Likes Received:
    910
    Or exponential growth? As in x^e.
     
  8. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,496
    Likes Received:
    910
    Wait, no, that would be e^x, brain fart. For some reason I can't edit or delete my post, and now everyone will now how stupid I am until the end of time.
     
  9. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,496
    Likes Received:
    910
    *Know.

    Goddammit.
     
    ZoinKs!, Pete, Kej and 6 others like this.
  10. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    379
    Likes Received:
    3
    I assume both the Xeon and the Xe dGPU would be based on 7nm. Since they are shooting for at least 5 times Summit's processing power, I don't think that would be achievable on 10nm, assuming a similar total power target.
     
  11. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,001
    Likes Received:
    4,574
    I think Intel's 10nm is closer to TSMC's 7nm than to TSMC's (or Samsung's) 10nm node.
     
    sir doris likes this.
  12. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,062
    Likes Received:
    1,024
    Initial information from Intel would suggest so, and it is quite likely still the case, more or less. On the other hand, it is difficult to imagine that Intel would insist on a process that just hasn't worked out for them, so it is widely rumoured and assumed that the process we finally see widely deployed will differ from the initial plans. In what ways, and to what extent, nobody outside Intel knows or shares. It will be very interesting when a third-party analysis goes public, to see what changes had to be made (if any) to make the process viable. (But by the way their PR spins things, we could see their 7nm products before their 10nm ever gets to broad volume production. :))
     
  13. JasonLD

    Regular

    Joined:
    Apr 3, 2004
    Messages:
    379
    Likes Received:
    3
    Yeah, I know that. Since Aurora is expected to be completed in late 2021, I assumed it might be using Intel's future 7nm EUV process instead of their upcoming 10nm. Intel's 7nm should be closer to TSMC's and Samsung's future 3nm.
     
  14. Dayman1225

    Newcomer

    Joined:
    Sep 9, 2017
    Messages:
    57
    Likes Received:
    78
    To be fair, it's being installed in 2021, so unless Intel is ramping new 7nm chips in late 2020 it'll likely be 10nm-based stuff. According to Tom's, it's being "stood up" in early 2021 and will be fully operational by the end of 2021.
     
  15. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,791
    Likes Received:
    2,602
    OCASM, egoless, iroboto and 1 other person like this.
  16. iroboto

    iroboto Daft Funk
    Legend Regular Subscriber

    Joined:
    Mar 6, 2014
    Messages:
    7,906
    Likes Received:
    6,192
    OCASM, egoless and BRiT like this.
  17. Ike Turner

    Veteran Regular

    Joined:
    Jul 30, 2005
    Messages:
    1,884
    Likes Received:
    1,759
    This is some nice & tasty nothing burger...

    GPU acceleration for Intel Embree & OSPRay (currently CPU-only via AVX2/SSE2), most probably via OpenCL or Vulkan.

    The exact Intel quote

    "Xe architecture roadmap for data center optimized rendering includes ray tracing hardware acceleration support for the Intel® Rendering Framework family of API’s and libraries."

    Fanboy clickbait interpretation:

    "Intel's next dGPU to support hardware ray tracing."

    EDIT: Here's the full PR, and it doesn't sound like anything more than Intel porting Embree & OSPRay to GPU; they are treating RT "as a general computational technique" (similar wording to Microsoft's DXR announcement, btw). I don't see anything hinting at an "RT Core"-like implementation...

     
    #17 Ike Turner, May 1, 2019
    Last edited: May 2, 2019
    egoless, Lightman, BRiT and 2 others like this.
  18. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,042
    Likes Received:
    3,114
    Location:
    Pennsylvania
    Yeah, that's basically saying the GPUs will be supported by the Intel rendering framework APIs.

    Clickbait interpretations like that are what WCCF specializes in, and yet they're still spread everywhere.
     
    egoless likes this.
  19. OCASM

    Regular Newcomer

    Joined:
    Nov 12, 2016
    Messages:
    922
    Likes Received:
    881
    Well, NVIDIA mentioned that the Turing RT cores could also be used for physics and audio simulations, so it can be interpreted either way :twisted:
     
  20. keldor

    Newcomer

    Joined:
    Dec 22, 2011
    Messages:
    74
    Likes Received:
    107
    All I see is a bunch of PR drivel. And comparing their product due sometime in the next few years with the programming model of GPUs about a decade ago. I'm not impressed.
     
    A1xLLcqAgt0qc2RyMz0y and pharma like this.


  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.