Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

Intel throws some minor shade at game artists!

In game development, it is not uncommon for artists to accidentally stack copies of duplicate geometry in the same location, or to transform large numbers of objects to the origin, below the ground, to hide them from view.
In a rasterization context, this sort of sloppiness, while wasteful, will generally not cause catastrophic failures, because Z-buffering can remove redundant pixels, and performance degrades linearly with vertex count.
In ray tracing, a large number of primitives placed in exactly the same position can easily result in a Timeout Detection and Recovery (TDR) if even a single ray happens to pass through that region. This happens because the spatial hierarchy degrades to a flat primitive list, which results in a traversal cost that is orders of magnitude higher than normal.
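
For intuition, here is a minimal toy sketch in C++ of why that degenerate case gets so expensive (this is not any real driver's BVH builder; the types and the 100k figure are made up for illustration). When every primitive shares the same bounds, no split can separate them, so any ray that enters that region must be tested against all of them, and the usual O(log N) traversal turns into an O(N) scan:

#include <cstdio>
#include <vector>

struct Aabb { float min[3], max[3]; };

// Degenerate leaf: the builder could not find a useful split, so every
// co-located primitive ends up in the same node.
struct Leaf {
    Aabb bounds;
    std::vector<int> primitiveIds;
};

// Simple cost model: one box test, plus one primitive test per entry
// whenever the ray actually enters the bounds.
long long testsPerRay(const Leaf& leaf, bool rayEntersBounds) {
    return rayEntersBounds ? 1 + static_cast<long long>(leaf.primitiveIds.size()) : 1;
}

int main() {
    Leaf stacked;
    stacked.primitiveIds.resize(100000);  // e.g. 100k duplicate meshes parked at the origin
    std::printf("intersection tests for one unlucky ray: %lld\n",
                testsPerRay(stacked, /*rayEntersBounds=*/true));
    // Multiply that by the millions of rays shot per frame and a TDR becomes plausible.
}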

 

One Arc Xe Core can’t pull a lot of bandwidth across the memory hierarchy. VRAM bandwidth is especially poor. An AMD 6600 XT WGP or Nvidia RTX 3060 Ti SM can get 63 or 34.4 GB/s from VRAM, respectively. A Xe Core’s 8 GB/s makes it look like an old CPU core in comparison. Latency is definitely a culprit here, but there’s more to the story. A single CU from AMD’s old HD 7950 can also pull several times more bandwidth from VRAM, even though it has similar VRAM latency. Cache bandwidth isn’t great either, suggesting that Arc’s Xe Core has limited memory level parallelism capabilities compared to other GPUs.
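
To illustrate the memory-level-parallelism point, here is a small CPU-side C++ analogy (my own sketch, not the article's benchmark; the 64 MiB buffer size is arbitrary). A dependent pointer chase can only ever have one miss in flight, so its throughput is capped at roughly one cache line per round-trip latency, while independent streaming reads let the hardware keep many misses outstanding; the gap between the two is a rough measure of how much MLP a core has, and the article's numbers suggest a single Xe Core tracks far fewer outstanding requests than an RDNA 2 WGP or an Ampere SM:

#include <cstdint>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

// Dependent chain: the next address comes out of the previous load, so at most
// one cache miss is outstanding -> throughput ~= line_size / memory_latency.
uint32_t chase(const std::vector<uint32_t>& next, size_t steps) {
    uint32_t i = 0;
    for (size_t s = 0; s < steps; ++s) i = next[i];
    return i;
}

// Independent reads: many misses can be in flight at once -> throughput scales
// with how many outstanding requests the core can track.
uint64_t stream(const std::vector<uint32_t>& data) {
    return std::accumulate(data.begin(), data.end(), uint64_t{0});
}

int main() {
    const size_t n = size_t{1} << 24;  // 64 MiB of uint32_t, larger than any on-chip cache
    std::vector<uint32_t> next(n);
    std::iota(next.begin(), next.end(), 0u);
    std::mt19937 rng{42};
    for (size_t i = n - 1; i > 0; --i) {  // Sattolo's algorithm: one big random cycle
        std::uniform_int_distribution<size_t> pick(0, i - 1);
        std::swap(next[i], next[pick(rng)]);
    }
    // Time these two calls separately to compare latency-bound vs MLP-bound throughput.
    std::printf("chase: %u  stream: %llu\n",
                chase(next, n), static_cast<unsigned long long>(stream(next)));
}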

At high occupancy, Intel’s Arc A770 finally shines, with competitive bandwidth throughout its memory hierarchy. Intel has done a good job of making sure their shared L2 cache and memory controllers can scale to feed large GPU configurations. However, Intel’s architecture will require high occupancy workloads to really shine, and high occupancy is not a guarantee.
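
A back-of-the-envelope Little's Law estimate shows why occupancy matters so much here: bytes in flight = bandwidth × latency, so sustaining high bandwidth at DRAM-like latencies requires hundreds of outstanding requests, and those only exist when enough work is resident. A tiny C++ sketch using the per-core VRAM figures quoted above and an assumed ~400 ns latency (an illustrative round number, not a measurement from the article, and it pretends both parts have the same latency):

#include <cstdio>

int main() {
    const double latency_s  = 400e-9;  // assumed VRAM round-trip latency, for illustration only
    const double line_bytes = 64.0;    // typical memory transaction size
    const double xe_core_bw = 8e9;     // ~8 GB/s per Arc Xe Core (figure quoted above)
    const double wgp_bw     = 63e9;    // ~63 GB/s per RX 6600 XT WGP (figure quoted above)

    // Little's Law: bandwidth = bytes_in_flight / latency, so
    // lines_in_flight = bandwidth * latency / line_size.
    std::printf("Xe Core: ~%.0f lines in flight\n", xe_core_bw * latency_s / line_bytes);  // ~50
    std::printf("WGP:     ~%.0f lines in flight\n", wgp_bw * latency_s / line_bytes);      // ~394
}

Under those assumptions, one Xe Core only keeps around 50 lines in flight where a WGP keeps roughly 400, which lines up with the article's point that Arc needs a lot of parallel work resident before its bandwidth numbers look competitive.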
 
So some of the poor performance in older games might not be as driver-related as thought; it might be that the older games just don't load the GPU up enough to make the best use of its caches and bandwidth.
 
Awesome article. Reminds me of the old hardware.fr deep dives.
The BW is so low in this situation, TeraScale 2 level, that I wonder if it's more a kind of hardware bug they couldn't fix in time rather than a design choice.
 
The article says they weren't able to get resizable BAR or PCIe 4.0 working for A770.... I wonder what's up there. That's supposed to be crucial for the card.
 
Resizable BAR is a mobo thing; as long as it is enabled in the mobo, the GPU just uses it. It is crucial, yes. PCIe 4.0, though, isn't, as stated by Intel:


and by benchmarks that show that for this GPU PCIe 3.0 vs PCIe 4.0 makes no difference:

 
GPU has finally been delivered :)

 
Yep, but this is what the Chips & Cheese article said:
Unfortunately, we weren’t able to get the A770 set up with PCIe 4.0 or resizable BAR
They had resizable BAR working with the other cards though.
 
Just want to say, they were using my A770 to do the article, which was running on an i9 9900K with ReBAR enabled. It was a communication error on my side: I originally told them it was refusing to boot with ReBAR enabled, so they took that as meaning it wasn't enabled during the tests. I did some troubleshooting and asked some Intel reps I know, and they told me to turn off CSM in the BIOS, which fixed the issue. All tests in this article were run with ReBAR enabled.
 

Oh that's great to know. Thanks for clearing it up.

Though I had been hoping that it wasn't at its best. :D
 
Good to know Intel helped you. It might depend on the mobo. In my experience, when I tried to enable Resizable BAR in the BIOS, a message appeared saying that in order to do that you must disable CSM. The mobo I have is okay but not the most advanced.
 
I have a Z390 Aorus Master and it has no such warning, unfortunately 🤦
 
That seems to be a pretty high-quality mobo. How is your experience with the GPU in new and old games?

Initial impressions:

POSITIVES (DIFFERENT/INTERESTING) 👌
- You can enable Adaptive Tessellation by default for all games, so they don't use unnecessary polygons.
- CMAA antialiasing can be forced instead of MSAA and offers good quality. I enabled it 'cos it consumes fewer resources than MSAA and the games run well. Interesting Intel article about CMAA: https://www.intel.com/content/www/u...-morphological-anti-aliasing-cmaa-update.html
- It allows you to manually choose the maximum power consumption. I liked this a lot. The default value is 190W max, which is a little more than what my GTX 1080 consumed (170W), and the maximum is 228W.
- The in-game overlay is super complete, and it's also very pleasant to look at.
- The Intel Graphics interface, as usual with Intel, imho, is snappier and more pleasant to look at than the competition's (I've seen the interface of my RX 570, Vega cards, Radeon VII, etc., and of course nVidia's control panel).
- The packaging is really nice, kinda less angular and less aggressive. Looks premium.

NEITHER HERE NOR THERE 😐
- Default colour looks different compared to the GTX 1080 (HDR on in the OS settings)

WHAT I DON'T LIKE 😤
- Unlike nVidia, I don't see the option to set the colour space to Full RGB or Limited RGB. It's more of a curiosity 'cos the TV auto-adapts to that, but still....
- On a very rainy and cold day, the GPU sits at 48ºC. It's not much, but the GTX 1080 was probably at 44ºC under the same conditions, if not less.
- Power consumption under normal use of the OS stays at 34W.
- You can create custom resolutions, but you cannot select the Hz for them, unlike nVidia (never tried that with my RX 570, so I can't speak for AMD).

Overall, things look promising with this GPU, and when Battlemage comes out, Intel has the potential to be one heck of a contender in the GPU space.
 
GPU has finally been delivered :)

Nice. I'm still waiting for my A750 from an Intel competition "before they're even available" :rolleyes:

Curious to try UE5 and DXVK; there also seems to be DLSS2XeSS by the dev who worked on similar patches for FSR.

Not sure what I think of the performance yet. I know 3DMark isn't very representative, but surely every IHV optimizes for it, so there should be some potential in these cards. Blender support seems OK; I wonder if Intel's hardware RT support will land before AMD's.

There is a newer driver that mentions A Plague Tale and some settings that may help with idle power.
 
That seems to be a pretty high-quality mobo. How is your experience with the GPU in new and old games?
Haven't had much time with the card yet (just started a new job), but I did some tests and posted them on my Twitter. I'll link them below; there's a lot, so I'll put it behind a spoiler tag.

 
Some testing done (gosh, how grateful I am that I have fiber now instead of radio internet, so I can easily uninstall, download, and re-download games):

- Resident Evil Village with RT effects all turned up to the max, native 4K, 60fps, is one of the games that has impressed me the most in my life. At times it looks like CGI. I'm not exaggerating.

- Resident Evil 2 Remake, also with RT enabled, 4K. With RT it's like playing a different game, even though I know the game almost by heart. Now I understand this RT thing, when they tell you that "it's the future". After all, it's the most natural lighting you can find in any game.

- Divinity: Original Sin Enhanced Edition. One of my favorite games. Perfect performance at 4K (simple graphics), with an odd error in the menus: corrupt characters along a diagonal line splitting the display. The corrupted diagonal line disappears while in game.

[attached screenshot of the corrupted menu characters]

- Skyrim Enhanced Edition. Everything maxed out (default settings recommended by the game) and 4K, but it doesn't run fine. It runs well and then suddenly stops for a moment. I imagine it has to do with the 64-bit render targets and things they don't use for anything substantial, but I didn't even touch the settings (I left everything at default), I just wanted to start it.

- Return to Castle Wolfenstein, a 2001 game. It works. If you change the resolution, it gives an error (something along the lines of something missing from OpenGL) and the game exits. Upon returning, the resolution you had selected before is fully functional. I use an HD widescreen mod for this game.

The native 4K RTCW performance with that mod didn't surprise me; it performs like the GTX 1080 and it's not 60fps. Occasional graphical artifacts appear. Perfectly playable.

- Expeditions Rome. 4K, maxed out. Perfect performance.

- Commandos 3 HD. Very good also in everything.

- Doom 3 won't start. This happened to me sometimes with the GTX 1080, especially when changing a display, changing the refresh rate of a display, having HDR enabled, etc. I guess it will work fine if I delete the game config files.

- Age of Empires 2 Definitive Edition has semi-visible menus, a bit weird. It looks great at 4K but has occasional moments of lag. Age of Empires 1 Definitive Edition renders well, although it has the typical graphical error in the menus (the diagonal line). It works perfectly fine in game (no diagonal line, good performance).
 
FIFA 23: Good. 4K60 maxed out. 190W power consumption.

NBA 2K23: Good. 4K 60fps with 16x MSAA antialiasing (CMAA off), all maxed out. If you turn CMAA on at 16x AA, during the presentation the entire screen looks like it's covered in a depth of field effect and you see nothing. When the game starts it looks fine, although at 16x AA with CMAA on there is sometimes some odd ghosting. CMAA doesn't appear to be compatible with 16x MSAA. It works perfectly with 2x, 4x and 8x MSAA though.

Need for Speed Hot Pursuit. Perfect. 4K 60fps all maxed out.

Grim Dawn (x64, DX11): great performance at max settings, 4K. The main menu looks strange but everything is perfectly legible; the typical (for Intel, it seems) diagonal line of corrupted pixels appears in the menu but disappears in game.

Grim Dawn, DX9 renderer. It looks good at 4K with maxed-out settings, although being DX9 it looks nothing like the DX11 version; it lacks many effects (it's not the GPU's fault). Performance is not good though. 1440p improves things, but it's not perfect performance either. 1080p is fine. Fully playable in any case.

Garfield Kart Racing. Native 4K works fine. It's a badly optimized game that did not run well at 4K on the GTX 1080 (you had to set graphics to medium); with the Intel you can set everything to max at 4K, with 100W of consumption, and it runs super smooth.

Shadows Awakening. 4K max settings. Perfect performance. 185W consumption.

Tell Me Why. Spectacular. 4K max settings. Smooth like silk.

Alien Isolation. 4K max settings. Runs like a dream, this was impossible on the GTX 1080 at 4K. 85W consumption.

Diablo 3. Runs well but it's not perfect. It's smooth but it's kind of twitchy. 4K, 75W consumption.

Sonic All Stars Racing Transformed. Solid but not perfect 4K performance. 60fps. Very low energy consumption.

Star Wars Battlefront II. 4K 60fps, super smooth in DirectX 11. Awesome-looking game. If you set the game to run in DirectX 12, it renders nothing but a blue background and you can't do anything; you see icons of other players and a landscape as if you were floating on a cloud. A strange case, because DX12 is usually where the A770 stands out, but it's the other way around here. DX11 wins.

Mass Effect Andromeda. Meh. 4K 60fps, very good performance at max settings. But it has a problem: this little screen appears:
[attached screenshot of the DirectX error dialog]


Apparently this has also happened with nVidia, although they fixed it, and that's why I found a solution here: https://answers.ea.com/t5/Mass-Effe...load-DirectX-error-on-Windows-10/td-p/6021461

In general, most games are running fine for me. Old games can be hit and miss though, 'cos some of them run at 30-40fps at 4K for no apparent reason; it seems like the GPU prefers being fed big chunks of data.
 