Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

Any educated guesses as to how far we are from Intel releasing actual desktop gaming dGPUs? Six months, a year, two years?
 
2-3 years at the minimum, if they don't give up in the meantime. They'd be launching real mobile discrete GPUs before desktop. It looks like DG1 is a complete non-product even for that, so we'll be waiting for DG2 for a real mobile discrete GPU at this point, and that will likely be mobile only because desktop performance won't be anywhere close to useful. Maybe by DG3 they'd have something stable enough and powerful enough that a desktop release makes sense.
 
Any educated guesses as to how far we are from Intel releasing actual desktop gaming dGPUs? Six months, a year, two years?
Under 12 months, they've appeared in the drivers already. The bigger question is how competitive they are. Also they're Xe-HP, not Xe-LP like this DG1 is.
 
Seems likely the architecture has been in the works longer than we think, then. Like they started on it around 2014.

I wouldn't be surprised if they have had various discrete projects over the years that just weren't productized.
 
TBH I find it unlikely that they had discrete products in development before. And certainly no GPU architecture takes 5 years to develop; 3 is closer to reality. With Intel's manpower, and by using GenXX as a base, it's entirely possible that Gen12/Xe development started under Koduri's watch.
 

Intel i740 says hello. :)

Yes, I know you likely meant "in recent years", but I couldn't help it. :)

Regards,
SB
 
It says Intel in big letters on it. :D Next thing you know, you consider Xe an AMD product because of Raja …
Only if you're suggesting Raja brought an AMD design with him to use as the base ;-)
Being serious though, i740 was a joint effort by Intel, Real3D and Chips and Technologies, and it was built on Real3D's tech.
 
This slide was thought to be nothing more than a fake, an April Fools' joke.

[Slide image: intel-xe-page-002]


However, according to Jim from AdoredTV, it is NOT! Intel is really building its GPUs this way: multiple dies connected together, with the exponent "e" referring to the number of dies within a GPU.


He then proceeds to explain how, "according to his sources", the entire GPU project at Intel is a complete mess and a "clown show", and gives context for the recent departures of high-level officials from the GPU division. I normally don't pay much attention to Jim, but these departures do lend some credibility to his information.
 
I'm expecting terrible GPUs, if anything is released at all.
 
I don't expect first-gen Xe to be more efficient than the other two, but if Intel manages to compete on price/performance, that's already quite good.
 
I don't expect the first dGPU to be on par with the competition either. Maybe 2nd or 3rd gen, then...

The issue I care about is RT: yes, or probably no. If no, this could delay the desired "RT on minimal specs" once more.
 
I'm pretty sure this can be good for some compute things, but given all that we are seeing, I have doubts about gaming. Which is... sad? I mean, the iGPUs were not very powerful, but they were pretty decent for their size, and the drivers were getting better and better. Maybe they should have expanded on that rather than what they're doing right now... As I say that, I know it's too soon to judge; we'll see...
 
I think AdoredTV was quite clear that while DG1 is definitely a compromise product, DG2 would be the one that Raja actually architected from the ground up. With that in mind, I don't understand the rush for Intel to release a card at all if all they had to offer was a horrible mishmash of features stapled onto a terribly maladapted architecture (see below), especially if DG2 is a clean-slate product. What gain is there? A bookkeeping exercise so as not to have to book the sunk cost as a total loss?

More bewildering still is that, as I understood it, Intel has the Gen 11 graphics components from their iGPU designs available, but DG1 is actually based on an entirely different architecture that was originally aimed at server-side streaming demands. How interrelated these architectures are, if at all, was never mentioned, but it would be interesting to know.
 
Probably, which is likely the reason Intel still releases "new" CPUs every year.

If that is truly the reason, it's the shittiest way of running a hardware business I've ever heard of. Though, thinking about how sensitive these companies are to investor and analyst scrutiny, I'm sure it's not unique or even rare. Sadly.
 
That's Arctic Sound.
DG1 is TGL iGPU config on a card with discrete memory.

AdoredTV is clear that Arctic Sound was morphed into Xe, which is the thing that has me entirely confused since, you know, they have a perfectly acceptable iGPU design in Gen11. I feel as though I've been led astray somewhere here with regard to the timeline and architecture transmutations.

Was Arctic Sound initially tinkered with by Raja (as alleged by AdoredTV at around the 13:20 mark) in an attempt to make it marketable as a GPGPU for Xe, only to drop Arctic Sound for the Gen11 iGPU in a refocusing effort to push something out the door? For what gain? If DG2 is a clean-slate design, surely Intel isn't doing itself any favours by releasing a bastard step-child with little to no connection to its proper sibling.

Granted, they're still making inroads with developers, getting driver teams up and running, shoring up the iGPU support, etc., so the effort isn't completely wasted. But as a way of marketing their entry into the discrete graphics game, it's frankly bewildering.
 