Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

That just makes me remember my first few computers that didn't require a heatsink. I don't think I had a heatsink on a computer that I owned until I got my first 386DX in the early 90's (I couldn't afford a 486 in the 80's; a PC with one cost around 10k USD at the time, well over 20k USD in today's dollars, and over 30k USD if inflation were calculated the same way it was in the 70's and 80's). And it was an absolutely tiny heatsink with no fan. :p

Regards,
SB
 
I don't think many people even had a 386 in the '80s. 486 was alien tech. By 1994 it was cheap though! Because competition.
 
RGT isn't the most reputable source for leaks IIRC, but this one seems legit. And excluding the enthusiast part (while still claiming to stay within a <=225W budget) fits pretty well with what Intel has said (there's still more to come for Alchemist, Battlemage is on schedule, etc.).
 
Finding the price/performance ratio compelling enough despite knowing about the bugs, performance issues and general lack of polish of this architecture, I purchased an A750. It has been pretty frustrating to own over the life of the product, but I can say that the user experience is much improved on nearly all fronts, which is about all you can ask for. I needed a card that was smallish, with relatively modest power draw, to fit in an older mini ITX case, and hardware AV1 decode was also something I wanted. At the time I bought in there wasn't anything else that checked all the boxes for me, and the aggressive pricing pretty much clinched the deal.

It's been unfortunate that this forum has been closed for the entire time I've owned the product so hopefully business picks up in this thread again.
 
Talking about improvements..., a new update has been released with some nice performance enhancements, especially in certain DX11 games.

 
A 386 was my first PC ever at the tender age of 8 years old! I played my first game, The Simpsons, on it and after finishing it I managed to accidentally delete it using a GUI for DOS. Learned what delete was that way (my main language is Portuguese and of course my English skills then were lacking haha).
 
February 18, 2024
A few months ago, we revisited Intel's user-facing drivers (including Arc Control) to see how they'd improved over one year of patches.
Now, we're benchmarking the Intel Arc lineup in more depth to see how they compete with NVIDIA and AMD graphics cards in 2024.

We're seeking to answer whether Intel Arc is any good (yet), and specifically, how it competes in the value side of the market with AMD. Large parts of this video will be simple discussion about the interesting way this battle is playing out: Namely, with AMD and Intel fighting over the modern "low end" while NVIDIA leaves it relatively untouched with new parts.
 
Intel responded with a thanks to Gamers Nexus. They are working on improving performance in Starfield and GTA V (the issue turned out to be a simple setting, mid or high MSAA, which shows how difficult it is to write drivers, because any other game with MSAA works fine with Arc).

 
Really impressive results for Arc. I get that it's outside the scope of these videos, but I would like to see DXVK results on those few problematic DX11 titles too, to see if it can help mitigate their issues. Hell, I use DXVK rather often for older DX9/11 titles on my 3060 as it is.
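
For anyone curious about trying that, a per-game DXVK install on Windows is basically just dropping the release DLLs next to the game's executable. A minimal sketch with hypothetical paths and DXVK version (only the general approach comes from DXVK's own instructions):

```python
# Minimal sketch of a per-game DXVK install on Windows (paths are hypothetical).
# DXVK ships d3d9.dll / d3d11.dll / dxgi.dll replacements that translate to Vulkan;
# they just need to sit next to the game's executable.
import shutil
from pathlib import Path

dxvk_x64 = Path(r"C:\tools\dxvk-2.3\x64")      # extracted DXVK release folder (assumption)
game_dir = Path(r"C:\Games\SomeOldDX11Game")   # folder containing the game .exe (assumption)

for dll in ("d3d11.dll", "dxgi.dll"):          # use d3d9.dll instead for DX9 titles
    shutil.copy2(dxvk_x64 / dll, game_dir / dll)

print("DXVK DLLs copied; set DXVK_HUD=fps in the launch environment to confirm it's active.")
```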

They're actually progressing faster than I would have expected.
 
I still have the Serpent Canyon NUC with the 16GB A770M. I have been using it a lot lately with Stable Diffusion, where the VRAM comes in handy. Work sharing with the Xe iGPU on the 12700H in Adobe Premiere Pro (Deep Link Hyper Encode) has also always seemed to work. If MathWorks were to add Arc GPU support to MATLAB I could use it for work too, but that has always been Nvidia only, so I can't ditch my 3080 yet.
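
For reference, getting Stable Diffusion onto Arc usually goes through the "xpu" device that Intel Extension for PyTorch registers, plus the usual diffusers pipeline. A rough sketch under those assumptions (the package names are real, but the checkpoint and settings below are placeholders, not necessarily the exact setup described above):

```python
# Rough sketch of Stable Diffusion on an Arc GPU via the "xpu" device.
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401 - registers torch.xpu for Arc
from diffusers import StableDiffusionPipeline

assert torch.xpu.is_available(), "Arc GPU not visible to IPEX"

# fp16 keeps the UNet + VAE comfortably inside the A770M's 16 GB of VRAM.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint (assumption)
    torch_dtype=torch.float16,
).to("xpu")

image = pipe("a photo of an astronaut riding a horse", num_inference_steps=30).images[0]
image.save("out.png")
```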

I've noticed this NUC has been on sale for cheap lately, if anyone is interested. The GPU is identical to the desktop A770 16GB except TDP limited to 125W, but you still get between 1700-1800MHz when it is maxed out, and the NUC does a good job of keeping it at ~70-75°C. It can be a bit noisy under stress, though.
 