Intel Xe Architecture for dGPUs

Discussion in 'Architecture and Products' started by DavidGraham, Dec 12, 2018.

Tags:
  1. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,224
    Likes Received:
    2,496
    Is that the diplomatic way of saying they lied?
     
    Lightman and xpea like this.
  2. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,507
    Likes Received:
    928
    Pretty much.
     
    Cuthalu likes this.
  3. yuri

    Newcomer

    Joined:
    Jun 2, 2010
    Messages:
    190
    Likes Received:
    163
    Raja was the lead of Radeon Technologies Group during that period. So he was fully responsible for that situation.
     
  4. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,994
    Likes Received:
    2,935
Anyway, it's nice to see such a major shift in the industry. A few years ago some people were arguing that dGPUs were going to die; now we have the largest semiconductor company in the world (Intel) making huge dGPUs and betting big on them to drive its future growth—to the point that it's downplaying CPUs in favor of GPUs and FPGAs. We've come full circle!

    https://wccftech.com/intel-ceo-beyond-cpu-7nm-more/
     
    digitalwanderer, Lightman and pharma like this.
  5. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,507
    Likes Received:
    928
    Ultimately, yes, but what I mean is that I don't know whether he was responsible for the decision to put so much pressure on engineering teams, whether he knew said engineers weren't telling the truth and just ran with it or whether he was fooled, whether he tried to set more realistic targets and/or budgets and was overruled, and if so, how much he fought, etc. I only have one person's perspective on what happened, and I don't know what role he played exactly—or at all, actually.
     
    digitalwanderer and Leovinus like this.
  6. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,396
    Likes Received:
    1,863
    Location:
    Winfield, IN USA
There was a LOT of infighting at AMD during that time, much of it between the CPU division and RTG. Ryzen's success became a huge internal political force, and I think Threadripper threw everyone for a loop, even at AMD; they weren't sure how to deal with it for a while.

    I haven't heard all the details or any from Raja, but I know RTG was being controlled more by the bean counters than Raja after a while and I'm pretty sure that had a large part to do with the whole RTG exodus.

    It's not all on Raja, he had bosses too. I had a smoke and a good conversation with one at Siggraph in 2016, (I think, a year +/- because I'm foggy), and he seemed really cool; but then again I didn't work for him.

    RTG wasn't independent of AMD, it wasn't like Raja had total say. AMD held the purse strings and could, (and did), steal talent from his group at their discretion. (All my own conjecture based on a myriad of things)

He had a ton of responsibility, but nowhere near the authority to carry it out. He was fighting with both arms tied behind his back at some points. I don't think it ended very pretty. :(

    That being said I haven't kept up on where the Intel GPU is at except for the hirings and firings there, and those have been pretty damned interesting especially of late! AdoredTV released a new video today that I still haven't fully digested, but it's looking like we're in for a wild ride! I mean seriously, don't the terms "real world vs synthetic" benchmarks just take you back to the days of FX? I get warm fuzzies. :D
     
    Leovinus and Malo like this.
  7. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,098
    Likes Received:
    1,064
What bothers me a bit with posts like this, and the "gaming tech web" in general, is the idea that Intel will be targeting gaming with their dGPUs, whereas everything I've heard out of them speaks of general compute.

I can't for the life of me believe that Intel could make a competitive gaming GPU, at least not without subsidising it heavily. And their $10 billion mobile debacle should have made them a bit cautious about buying their way into markets. The gaming dGPU market is not exactly promising huge future revenue either—at least buying their way into mobile made some sense.

No, Intel needs graphics for their CPUs, and they need stronger parallel compute for certain server tasks, and they can combine these to some extent. But going after gamers? Nah. I haven't heard anything in their PR that says this is their goal. Which makes perfect sense. That impression is created elsewhere.
     
    #47 Entropy, Dec 8, 2019
    Last edited: Dec 8, 2019
  8. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    1,128
    Likes Received:
    313
Even worse for consoles, as margins are even lower there.
     
  9. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,098
    Likes Received:
    1,064
    Of course.
Gaming revenue expansion is in mobile, not PC or console, where revenue stays largely constant while unit numbers are dropping (fewer people pay more, essentially).

    If you are going to invest to break into a market it had better be an expanding one, otherwise the best you can hope for is a slice of the pie with depressed margins all around. (You want a market where you can help shape future development and directions, thus steering the future revenue your way more exclusively, improving margins.)

Investing in data center computation makes sense for Intel. Creating dGPUs so gamers can play Red Dead Redemption a bit cheaper doesn't.
     
    #49 Entropy, Dec 8, 2019
    Last edited: Dec 8, 2019
  10. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,287
    Likes Received:
    679
    Location:
    France
I think going after gamers would be the cherry on the cake: only if their architecture happens to do well in games too, then why not go there?
It's a big if. But it could be a way to fight AMD on another front.
     
  11. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    2,994
    Likes Received:
    2,935
    Their PR made it clear it's not their immediate top priority goal, but they frankly stated that dGPUs for gamers are coming.
     
    Cuthalu and PSman1700 like this.
  12. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    3,082
    Likes Received:
    1,755
    This really doesn't look like a GPU destined for the data center.
[image]
     
    PSman1700 likes this.
  13. Leovinus

    Newcomer

    Joined:
    May 31, 2019
    Messages:
    81
    Likes Received:
    31
    Location:
    Sweden
    I believe that's the recipe for burning people out. And that situation is sort of what I gleaned during and after Vega's launch. Which is why I give Raja the benefit of the doubt generally. Anyone in those positions to be honest. I just hope he finds the work on Xe fulfilling and restorative.

    Moving to the present. I haven't had time to check the AdoredTV video yet. Has anyone gleaned what the rumoured reasons behind the rumoured performance deficits are?
     
    digitalwanderer likes this.
  14. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,287
    Likes Received:
    679
    Location:
    France

What, you don't want an RGB-illuminated datacenter?!
     
    Lightman, PSman1700 and pharma like this.
  15. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    13,330
    Likes Received:
    10,074
    Location:
    Cleveland
Just don't plug any speakers into those data center servers. You'd be amazed at what sometimes makes it into production code, like playing an audio file every time an exception is thrown.
     
  16. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    711
    Likes Received:
    802
The first thing my BIOS does seems to be running a lightweight libJpg to display the logo, with shiny compression artifacts :O

    I think all those designs were fanart.
     
  17. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    711
    Likes Received:
    802
    digitalwanderer likes this.
  18. Leovinus

    Newcomer

    Joined:
    May 31, 2019
    Messages:
    81
    Likes Received:
    31
    Location:
    Sweden
I read that as meaning that the architecture is capable of running x86 code, not that it is essentially an x86 design. As far as I know, most (all?) x86 CPUs these days are more akin to RISC cores with an x86 translation layer on top anyway (an extremely simplified explanation...). So for Intel to develop an architecture, plus a compatibility layer in hardware or software, that maximises execution efficiency while still remaining, in essence, a traditional GPU wouldn't be entirely beyond the realm of feasibility. But it sounds dubious to me to let x86 compatibility shape GPU design in any way—at least if it in any way threatened to introduce inefficiencies on the GPU side of things.
     
    digitalwanderer likes this.
  19. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    711
    Likes Received:
    802
Yeah, a translation layer in every core does sound inefficient and wasteful. It already did for Larrabee. What's the point? You need to rewrite code for GPUs anyway; x86 won't help there.
But it may be a marketing stunt to compete with CUDA, and I hope this does not end up in gaming GPUs—if true at all.
     
    digitalwanderer likes this.
  20. PSman1700

    Veteran Newcomer

    Joined:
    Mar 22, 2019
    Messages:
    1,128
    Likes Received:
    313
With the PC gaming market only growing, there is no reason to declare dGPUs dead. Just the wet dreams of some.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.