Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

I admit I didn't actually count the difference and see if it's really faster on average, but if it isn't it's close.
It costs the same and it's close in DX12/RT but will probably lose in other APIs, and that's from Intel's own benchmarks.
Doesn't look like a very attractive price/perf to me.
 
I admit I didn't actually count the difference and see if it's really faster on average, but if it isn't it's close.
[Attachment: Intel benchmark chart comparing the A770 to the RTX 3060]
According to this, the A770, which is their top dog, is 14% faster on average than the 3060 in RT-heavy scenarios + raster.


Whatever the case, it has no shot at competing against the 2080 Ti. Maaaybe the 2080/3060 Ti at best and even that's stretching it.
 
It costs the same and it's close in DX12/RT but will probably lose in other APIs, and that's from Intel's own benchmarks.
Doesn't look like a very attractive price/perf to me.
In Intel's own numbers (I listened to the video too rather than just browsing the benchmark numbers), across DX12 and Vulkan the A750 averages 5-6% ahead, and apparently higher if you drop Vulkan.
Here's normalized DX12 + Vulkan
[Attachment: normalized DX12 + Vulkan relative-performance chart]
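For anyone wondering what "averages 5-6%" means in practice, here's a minimal sketch of how you might aggregate normalized per-game results like these, with and without the Vulkan titles. The game names and numbers are made-up placeholders, not Intel's data or methodology.

```python
# Sketch only: averaging normalized per-game results, optionally filtered by API.
# All entries below are placeholder values, not real benchmark data.
from statistics import geometric_mean

# relative performance: A750 fps / RTX 3060 fps, one entry per game
results = {
    "Game A": {"api": "DX12",   "rel": 1.08},
    "Game B": {"api": "DX12",   "rel": 1.11},
    "Game C": {"api": "Vulkan", "rel": 0.97},
    "Game D": {"api": "DX12",   "rel": 1.05},
}

def avg_uplift(data, apis=None):
    """Geometric mean of relative performance, as a % ahead of the baseline card."""
    vals = [g["rel"] for g in data.values() if apis is None or g["api"] in apis]
    return (geometric_mean(vals) - 1.0) * 100.0

print(f"DX12 + Vulkan: {avg_uplift(results):+.1f}%")
print(f"DX12 only:     {avg_uplift(results, apis={'DX12'}):+.1f}%")
```

Geometric mean is the usual choice for averaging relative performance because a single outlier title doesn't skew the result the way an arithmetic mean would.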
 
I don't know if they were slated to originally release earlier, but it'd be tragic if Intel ends up shelving these because they missed the boat. Anything launched in the past year would've been gobbled up by the market, and now they're left competing against the 3060 and quite possibly Navi33 in a few months, which would make mincemeat out of these. The rasterization advantage of the latter would be big enough that improved RT would not matter, even if Intel's drivers somehow get up to par.
 
I don't know if they were slated to originally release earlier, but it'd be tragic if Intel ends up shelving these because they missed the boat. Anything launched in the past year would've been gobbled up by the market, and now they're left competing against the 3060 and quite possibly Navi33 in a few months, which would make mincemeat out of these. The rasterization advantage of the latter would be big enough that improved RT would not matter, even if Intel's drivers somehow get up to par.
The only shelving happening to these is appearing on store shelves in less than two weeks.
 
with what kind of RT/ML performance?
For MM performance it's competitive with Ampere's highest end. It's 1024 ops/cycle per Xe core, which is double Nvidia's 512 per cycle per SM (without sparsity); there's a big difference in Xe core count vs SMs (32 vs 84 for the 3090 Ti), but Intel also clocks about 20% higher, so it's similar in the end. However, this excludes sparsity, which doubles Nvidia's numbers again, and against the cards at its price point (3060, 3050) it doubles their MM performance again, excluding sparsity. For RT, given the limited info we have so far, it looks competitive with Ampere on a per Xe core/SM basis, but that's marketing material. There's probably a lot of room for driver improvement given the shambles it's been so far, but we don't have long to find out.
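As a rough illustration of that per-part arithmetic, here's a back-of-the-envelope sketch using the figures above; the clock speeds are assumptions picked to reflect the "~20% higher" point rather than measured values, and sparsity is excluded as noted.

```python
# Back-of-envelope dense matrix-math throughput (no sparsity).
# Clock values below are illustrative assumptions, not measured numbers.
def mm_ops_per_second(units, ops_per_cycle_per_unit, clock_ghz):
    return units * ops_per_cycle_per_unit * clock_ghz * 1e9

a770      = mm_ops_per_second(units=32, ops_per_cycle_per_unit=1024, clock_ghz=2.1)
rtx3090ti = mm_ops_per_second(units=84, ops_per_cycle_per_unit=512,  clock_ghz=1.75)

print(f"A770 vs 3090 Ti (dense MM): {a770 / rtx3090ti:.2f}x")  # roughly ~0.9x, i.e. similar
```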

It's a big and likely painful first step into a big new market, but I hope they stick it out for several generations at least; a competitive third player will hopefully be great for consumers. If the worst comes to the worst, at least they have a really good-looking reference card. I also hope they keep being open and forthcoming about their products and arch. Yes, it's marketing, but it's a good change and I'm sure a lot of people find it informative; AMD and Nvidia could learn a thing or two.
 
The only reason to buy this GPU is for novelty's sake. Other than that, you are buying subpar DX11 performance, emulated DX10/DX9/DX8 games with all the bugs and performance issues associated with emulation, and nonexistent support for anything older than DX8.

OpenGL games are a toss-up, driver features too; bugs and artifacts are rampant, and worst of all the history and future are unreliable, as this whole Intel endeavor is under threat of ending, and then your card could be nothing more than a paperweight.
I think emulating legacy APIs is the proper way to deal with the problem. For old games, subpar performance is good enough, and it's better they concentrate their effort on DX12/VK, where it's actually needed.
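To illustrate what "emulating legacy APIs" means at a very high level, here's a toy sketch of a mapping layer: old immediate-style calls get recorded into a modern command-list-style object and flushed at present. All class and method names are invented for illustration; this is not how D3D9On12 or Intel's driver is actually implemented.

```python
# Toy illustration of a legacy-API shim on top of a modern command-list API.
class ModernCommandList:
    """Stand-in for a DX12/Vulkan-style command list."""
    def __init__(self):
        self.commands = []

    def record(self, command):
        self.commands.append(command)

    def submit(self):
        print(f"submitting {len(self.commands)} recorded commands")
        self.commands.clear()


class LegacyDeviceShim:
    """Stand-in for an old immediate-mode API, mapped onto the modern one."""
    def __init__(self):
        self._cmdlist = ModernCommandList()

    # old-style calls just record equivalent modern commands
    def set_texture(self, slot, tex):
        self._cmdlist.record(("bind_texture", slot, tex))

    def draw_primitive(self, vertex_count):
        self._cmdlist.record(("draw", vertex_count))

    def present(self):
        # flush everything recorded this frame
        self._cmdlist.submit()


# usage: an "old" game calls the legacy API, the shim translates underneath
dev = LegacyDeviceShim()
dev.set_texture(0, "brick_albedo")
dev.draw_primitive(vertex_count=36)
dev.present()
```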
I'm not that worried about driver issues either. This will only get better, and considering the very high compute power, we can assume this GPU will age well. I also expect good support from devs regarding new games. They just saved our business.

The only problem I see is the seeming requirement of ReBAR for good performance. Most people don't have this.
Likely that's just a driver issue as well, but in the worst case it could also hint that the GPU needs more communication with the CPU for some bad reason. Idk.

We will see how it goes, but those who said 'Cool, Intel joins the party - I'll still buy my favourite brand, but it's fine, Intel helps to lower prices' might change their mind if they're smart.
This is beyond all expectations in every measure.

The king is dead - long live the king... ;P
 
For old games, subpar performance is good enough
Artifacts and bugs are not. Games not even launching is not, low performance is definitely not.

The only problem I see is the seeming requirement of ReBAR for good performance. Most people don't have this.
ReBAR is mandatory for good performance on these GPUs, which kind of defeats the purpose of their low price anyway. And no, it's not a driver problem, it's a memory controller problem (Intel has already answered this), so it only gets fixed next gen.
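For what it's worth, here's a minimal sketch (Linux-only, using standard sysfs paths, and purely illustrative rather than any official tool) of checking whether ReBAR is actually in effect by looking at a GPU's PCI BAR sizes: without ReBAR the big prefetchable BAR is typically capped at 256 MB, with it enabled it covers most or all of the VRAM.

```python
# Illustrative only: estimate each display adapter's largest PCI BAR on Linux.
import glob

def bar_sizes(dev_path):
    sizes = []
    with open(f"{dev_path}/resource") as f:
        for line in f:
            start, end, _flags = (int(x, 16) for x in line.split())
            if end > start:
                sizes.append(end - start + 1)
    return sizes

for dev in glob.glob("/sys/bus/pci/devices/*"):
    with open(f"{dev}/class") as f:
        if not f.read().startswith("0x03"):   # display controllers only
            continue
    largest = max(bar_sizes(dev), default=0)
    print(f"{dev.split('/')[-1]}: largest BAR {largest / 2**20:.0f} MiB")
```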
 
ReBAR is mandatory for good performance on these GPUs, which kind of defeats the purpose of their low price anyway. And no, it's not a driver problem, it's a memory controller problem (Intel has already answered this), so it only gets fixed next gen.
That's a pity.
However, what counts is that you can now get a new cutting-edge gaming PC for 1000. \:D/
 
I think he/she meant shelving the GPU line entirely after these models and quitting the market.
They won't. People who even suggest that seem to forget that if Intel cut this short, they would only save the cost of designing a couple fewer chips, and it would lose them more than just Arc. They would still need to continue developing the architecture for iGPUs and data center anyway.
 
I think he/she meant shelving the GPU line entirely after these models and quitting the market.

Yeah, the first 'these' referred to whatever happens next with Intel's consumer dGPUs, of course.

The second 'these' was for the 3060/Intel equivalents. Navi33 should put them out of their misery sooner or later. It'd be great if Intel could continue and be competitive, but I just don't see the signs right now, and to reiterate, they've missed the boat.
 
I think he/she meant shelving the GPU line entirely after these models and quitting the market.
I think there's a middle ground where Intel gives up on the high end and sticks to low-end to mid-range discrete GPUs and laptop GPUs. According to MLiD, that's what his Intel sources have said is happening (obviously take it with a lot of grains of salt).
 
Yeah, the first 'these' referred to whatever happens next with Intel's consumer dGPUs, of course.

The second 'these' was for the 3060/Intel equivalents. Navi33 should put them out of their misery sooner or later. It'd be great if Intel could continue and be competitive, but I just don't see the signs right now, and to reiterate, they've missed the boat.
It almost sounds like you think Intel only had one chance ever and they ruined it because their first discrete gen was delayed :rolleyes:

Also, Alchemist being late doesn't mean anything for future gens.
I think there's a middle ground where Intel gives up on the high end and sticks to low-end to mid-range discrete GPUs and laptop GPUs. According to MLiD, that's what his Intel sources have said is happening (obviously take it with a lot of grains of salt).
Pretty sure he has covered all the bases already: "sources" saying they'll axe everything (never going to happen; they still have datacenter and iGPUs, which need architecture development), and different "sources" saying they'll cut all but midrange/low-end/laptops. The only confirmed source is Intel, who has called BS on all these rumors.
 
Pretty sure he has covered all the bases already: "sources" saying they'll axe everything (never going to happen; they still have datacenter and iGPUs, which need architecture development), and different "sources" saying they'll cut all but midrange/low-end/laptops. The only confirmed source is Intel, who has called BS on all these rumors.
Has Intel explicitly denied the rumours? The only thing I have seen is from Raja, who said they were not "helpful".
 