Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

AdoredTV is clear that Arctic Sound was morphed into Xe, which is the thing that has me entirely confused since, you know, they have a perfectly acceptable iGPU design in Gen11. I feel as though I've been led astray somewhere here with regard to timeline and architecture transmutations.

Was Arctic Sound initially tinkered with by Raja (as alleged by AdoredTV at around the 13:20 mark) in an attempt to make it marketable as a GPGPU for Xe, only to drop Arctic Sound in favour of the Gen11 iGPU in a refocusing effort to push something out the door? For what gain? If DG2 is a clean-slate design, surely Intel isn't doing itself any favours by releasing a bastard step-child with little to no connection to its proper sibling.

Granted, they're still making inroads with developers, getting driver teams up and running, shoring up iGPU support, etc. So the effort isn't completely wasted. But as a way of marketing their entry into the discrete graphics game it's frankly bewildering.
Intel has specifically said that Xe includes parts from Gen-architectures
 
I think AdoredTV was quite clear that while DG1 is definitely a compromise product, DG2 would be the one that Raja actually architected from the ground up. With that in mind, I don't understand the rush for Intel to release a card at all.

Intel has been hyping that they will have a discrete GPU card in 2020 for what feels like forever; for them not to release anything would clearly show that all that hype was just that, hype.
 
Even his „leaks“ related to the Ryzen 3000 specifications were completely off. I have no idea how he got such a large audience.
 
Uh? The specs were right but the prices were off, no? And he was one of the first talking about chiplets, if I'm correct.

I believe he has sources, but sometimes he overinterprets things.
 
Or he put too much trust in these new sources who told him about Navi, bringing his credibility down.
Regardless, at the moment there don't seem to be many people still giving him much weight.
 
About Navi, I didn't watch all the videos, but I remember him saying (well, AMD sources were saying) that it was too hot and too power hungry, so they wouldn't be able to compete at the high end again, which was true in the end.
 
Yes, but the list with specs, prices and names he showed over and over again in late 2018 / early 2019 was completely off.

Then, after being completely proven wrong, instead of just saying "look, I'm sorry, I trusted this source too much and I'll try to correct this in the future", he went through endless mental gymnastics to show how, under some over-the-top perspective, he was right all along.
That did it for me.

EDIT: This list.

[Image: hTFoFZb.png — AdoredTV's leaked Ryzen 3000 spec and price list]
 
Aaaah, I didn't see that, indeed it was off :D Yeah, he goes crazy sometimes. He did the same thing with the 16-core Ryzen. He was kind of right, but the timing was off, and still he can't handle any criticism.
 
Uh? The specs were right but the prices were off, no? And he was one of the first talking about chiplets, if I'm correct.

I believe he has sources, but sometimes he overinterprets things.
[Image: ryzen_3000_adoredtv_specs.png — AdoredTV's leaked Ryzen 3000 spec list]

Prices are wrong, release dates are wrong, TDPs are mostly wrong, boost clocks are wrong, base clocks are wrong, and only some names are OK (they seem to be guesses based on the Ryzen 1000/2000 series). The relation between names and core configs is also wrong. An educated guess would have been closer to the real specs. Even WCCFTech, which doesn't have a very good reputation here, had better leaks.
 
Sounds like Intel took the easy and cheap way out. A multi-die GPU means its consumer variant will have terrible gaming performance, especially since it requires very high skill in writing drivers.

The multi-die strategy will meet much more success in compute or AI, though at 500 watts it seems their architecture has terrible efficiency.
 