Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

Could you see if Arc can run this CFD benchmark, please -- https://github.com/ProjectPhysX/FluidX3D/releases
Done. Why are you interested in the result of this test, if that's not asking too much? Just curious...

FP16C: [benchmark screenshot]

FP32: [benchmark screenshot]


19.661 TFLOPS... I thought peak processing power was around 17 TFLOPS.
 
19.661 TFLOPS... I thought peak processing power was around 17 TFLOPS.
Intel reports their clocks (and thus TFLOPS) pretty conservatively; the rated clock is the "average clock where they usually hang out", but depending on load the difference can be hundreds of MHz in either direction (not sure I have ever heard anyone report clocks going under the rated value in practice, though).
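
For what it's worth, a back-of-envelope check makes the measured figure consistent with that. Assuming the A770's 4096 FP32 lanes and Intel's 2100 MHz rated clock (my numbers, not from the posts above):

```cpp
// Peak FP32 TFLOPS = 2 FLOPs per FMA * lanes * clock. The rated 2100 MHz gives
// ~17.2 TFLOPS, and the measured 19.661 TFLOPS implies a ~2.4 GHz actual clock.
#include <cstdio>

int main() {
    const double lanes           = 4096.0;  // assumed A770 FP32 ALUs (32 Xe cores x 128)
    const double rated_ghz       = 2.1;     // Intel's rated clock
    const double measured_tflops = 19.661;  // from the FluidX3D run above

    double rated_tflops = 2.0 * lanes * rated_ghz / 1e3;
    double implied_ghz  = measured_tflops * 1e3 / (2.0 * lanes);
    std::printf("rated: %.1f TFLOPS, implied clock: %.2f GHz\n",
                rated_tflops, implied_ghz); // rated: 17.2 TFLOPS, implied clock: 2.40 GHz
}
```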
 
I did those tests too on Vega 56:
[Vega 56 FluidX3D screenshot]
I'm interested in FP16:FP32 scaling, which seems similar on Intel and worth it.

Am I right in assuming no NVidia GPUs benefit from FP16?
I would be happy if there were at least a benefit from reduced LDS usage or maybe less register pressure, if anyone can tell.

Edit: Reading about the project reveals FP16 is only about memory compression; all arithmetic is done in FP32.
But Intel has twice the FP16 TFLOPS over FP32, so I guess it's similar to AMD, while NV lacks the ALU advantage.

Edit2: I see Turing had a double FP16 rate, but neither Ampere nor Ada does. Too bad. Seems they concluded it's not worth it.
 
More tests with OLD GAMES or those that have a certain "age", and I'll leave it there.

THE GOOD, AND WHAT WORKS EVEN IF IT DOESN'T WORK PERFECTLY WELL

Quake 1, Quake 2, Quake 3, Quake 4. 4K 60 maxed out for Quake 4. The rest run fine; Quake 2 and 3 work perfectly, although their maximum resolution, these being old engines, is 2048x1536 or so.

Morrowind and Oblivion. Perfect. Morrowind accepts resolutions typical of old engines, the maximum being something like 2048x1536; Oblivion allows 4K -or any native resolution of your display- to be used.

Defense Grid: The Awakening, Defense Grid 2: perfect, 4K all maxed out. The latter with Auto HDR.

Doom 1, Doom 2, Doom 3, Doom 3: Resurrection of Evil, Doom 3 BFG Edition all work perfectly. Doom 3 on PC Game Pass does not start after launching, but I imagine it has to do with old config files from the GTX 1080.

Call of Juarez DX10 benchmark (perfectly smooth).

Call of Juarez DX10 in-game (smooth but framepacing not good, 4K maxed out).

Call of Juarez Gunslinger. 4K maxed out. Good framerate but framepacing issues.

Age of Mythology. Perfect performance without graphical problems.

Ultra Street Fighter IV. Sheer perfection. 4K all maxed out, MSAA x8. Super stable 60fps. 60W consumption.

Street Fighter 30th Anniversary. Perfect.

Metro Last Light Redux. 4K maxed out. SSAA x 0.5 -that is, a little more than 4K native internally-. Smooth as silk. Auto HDR.

Metro 2033 Redux. Idem. I tried it with SSAA x4 -which would be like internal 8K, 4 times the pixels of 4K- and it works, but of course it's not 60fps at 8K.

Double Dragon. Perfect.

Arma: Cold War Assault. 4K maxed out. It works fine but has framepacing issues.

Resident Evil 4. It runs and performs well at native 4K, but once you start playing it suffers from many graphical glitches.

[screenshots of the glitches]


EDIT: Might have to redo this entire part, 'cos a bug affecting the GPU's DirectX API, caused by some old game, corrupted DirectX during a session. When I restarted the computer, some of the mentioned games started to work fine. I guess the bug was caused by Call of Juarez's DirectX version, though I'm not yet 100% sure. However, it was after that point that other games started to fail one after another.
 
I was hunting for a NUC for a home file and media server and ended up with this. Pricier than the smaller NUCs, but I was also keen to try Arc, so I thought why not; plus I can use it to beam games around the house via Steam. It has a 16GB 770M (full G10 die as used in the 770 LE, but TDP restricted). My experience, for those curious:

  • default Windows 11 VGA driver is very choppy (had to use this before I got Wifi working) with hitching and long pauses
  • earlier Arc drivers do not include Arc Control. I had to update to the latest 'launch day' driver to get it
  • no problems with video playback or Windows so 2D is fine
  • Idles at ~32W according to Arc Control
  • Tried Civ 6 first as it's a quick download. No problems in DX12 mode, haven't tried DX11 yet
  • 770M boosts to 2000MHz @ 0.996V by default. Memory running at 16GHz effective data rate (512GB/s)
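
That 512GB/s figure checks out if the 770M keeps the full 256-bit bus of the desktop A770 (my assumption, it isn't stated in this post). A quick sanity check:

```cpp
// GDDR6 bandwidth check: 16 Gbps effective per pin * 256-bit bus / 8 bits per
// byte = 512 GB/s. The bus width is my assumption for the 770M.
#include <cstdio>

int main() {
    double gbps_per_pin = 16.0;  // the "16GHz" effective data rate per pin
    double bus_bits     = 256.0; // assumed 256-bit memory interface
    std::printf("%.0f GB/s\n", gbps_per_pin * bus_bits / 8.0); // prints: 512 GB/s
}
```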

Looks like lots of issues in old titles as per Cyan above. Really interested to see how it progresses over time.

Edit: The Arc overlay seems very confused between the Xe graphics included in the 12700H and the 770M. I've had to use HWiNFO to properly see clocks and TDP data. The 770M is TDP limited to 120W in balanced mode.
 
as for old games, darn, I gotta redo the ENTIRE message with all those games that supposedly don't work.

Apparently there was some game in the list that required DirectX and it corrupted DirectX for any game or app that needs it -many of the games I installed started to install some version of DirectX themselves-.
I noticed the bug because when launching the Epic Store I got a message that I had an unsupported version of DirectX, which had never happened to me using said store.

So I restarted the PC and the Epic Store now works perfectly for me. :) As for all those games I was testing that seemed not to work: I was surprised that they failed one after another -once one started to fail, everything failed- but I didn't think much of it at the time.

While it needs better support for old games, the GPU works OK with most of them; it isn't an nVidia or AMD GPU in that regard, but still. It took me some time to test those games, sigh. :(

To give an example, one of the games that didn't work no matter what, Rise of Nations, worked perfectly after restarting, along with the Epic Store -which had never had a problem before-.
 
Also a nice in-depth analysis by Chips and Cheese. Summary of the article from the AT forums:

Someone was saying the performance issue in games like CS:GO seems to be due to a lack of parallelism. The micro-level benchmarks done by Chips and Cheese reflect that; in that particular instance the A770 was on the level of the A380.

-Higher execution unit latency
-Low memory controller and cache performance, and bad at hiding them
-Difficulty scaling memory bandwidth with threads, yet it especially needs that to make up for the above-mentioned deficiencies

So how much can be solved by software and how much by hardware? They also mention part of the problem is due to the "iGPU" mentality.

Some aspects are not just a generation behind, but two, three, or four generations behind. Some micro-level tests put it on par with AMD's TeraScale 2 GPUs!
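
For context on how numbers like "execution unit latency" are usually measured, here is a minimal pointer-chasing sketch in plain C++ (CPU-side and purely illustrative; Chips and Cheese use their own GPU microbenchmarks). Each load depends on the previous one, so there is no parallelism available to hide the access latency:

```cpp
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const size_t n = 1 << 22;  // ~32 MB of indices, larger than most caches
    std::vector<size_t> next(n);
    std::iota(next.begin(), next.end(), 0);

    // Sattolo's algorithm: builds a single n-element cycle, so the chase visits
    // every slot exactly once instead of getting stuck in a short loop.
    std::mt19937_64 rng{42};
    for (size_t k = n - 1; k > 0; --k) {
        std::uniform_int_distribution<size_t> d(0, k - 1);
        std::swap(next[k], next[d(rng)]);
    }

    size_t i = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (size_t step = 0; step < n; ++step)
        i = next[i];  // serially dependent loads
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / double(n);
    std::printf("~%.1f ns per dependent load (i=%zu)\n", ns, i); // print i so the loop isn't optimized away
}
```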

@Cyan: Yes I thought it was odd that it was mostly OK and then bam! Nothing working.
 
Am I right in assuming no NVidia GPUs benefit from FP16?
I would be happy if there were at least a benefit from reduced LDS usage or maybe less register pressure, if anyone can tell.

Edit: Reading about the project reveals FP16 is only about memory compression; all arithmetic is done in FP32.
But Intel has twice the FP16 TFLOPS over FP32, so I guess it's similar to AMD, while NV lacks the ALU advantage.

Edit2: I see Turing had a double FP16 rate, but neither Ampere nor Ada does. Too bad. Seems they concluded it's not worth it.
Indeed, this CFD simulator uses a lower-precision data format for memory storage, i.e. during load/store operations. This might explain underutilization bottlenecks related to the memory subsystem on some GPU architectures. Here is a 1080 Ti, gaining quite an advantage with FP16C packing:

[1080 Ti FluidX3D FP16C screenshot]
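
To make the store-as-16-bit / compute-in-FP32 idea concrete, here is a minimal sketch using standard IEEE FP16 packing in plain C++ (FluidX3D's FP16C is its own custom 16-bit format, and this is not the project's code): the field lives in memory at half the size, halving load/store traffic, while every arithmetic operation still happens in FP32.

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>
#include <vector>

// FP32 -> FP16 bit pattern (truncating; no NaN/denormal handling, for brevity).
static uint16_t f32_to_f16(float f) {
    uint32_t x; std::memcpy(&x, &f, 4);
    uint16_t sign = (x >> 16) & 0x8000u;
    int32_t  exp  = int32_t((x >> 23) & 0xFF) - 127 + 15; // rebias exponent
    uint32_t mant = (x >> 13) & 0x3FFu;                   // keep top 10 mantissa bits
    if (exp <= 0)  return sign;                           // flush small values to zero
    if (exp >= 31) return uint16_t(sign | 0x7C00u);       // overflow -> infinity
    return uint16_t(sign | (uint32_t(exp) << 10) | mant);
}

// FP16 bit pattern -> FP32.
static float f16_to_f32(uint16_t h) {
    uint32_t sign = uint32_t(h & 0x8000u) << 16;
    uint32_t exp  = (h >> 10) & 0x1F;
    uint32_t mant = h & 0x3FFu;
    uint32_t x = (exp == 0) ? sign // zero/denormal -> zero
               : sign | ((exp - 15 + 127) << 23) | (mant << 13);
    float f; std::memcpy(&f, &x, 4);
    return f;
}

int main() {
    // The field is stored as FP16: half the bytes, half the memory traffic.
    std::vector<uint16_t> field(1024, f32_to_f16(1.5f));

    // Arithmetic happens in FP32: load+unpack, compute, pack+store.
    for (uint16_t& h : field) {
        float v = f16_to_f32(h); // load/unpack
        v = v * 0.5f + 0.25f;    // all math in FP32
        h = f32_to_f16(v);       // pack/store
    }
    std::printf("field[0] = %g\n", f16_to_f32(field[0])); // prints: field[0] = 1
}
```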
 
Given how Vulkan is so integral to the Linux gaming experience and Arc has its best performance on modern APIs, I'm surprised that Linux testing, or DXVK testing on Windows, isn't done more for Arc.
GTA IV, for example, almost gets "fixed" with DXVK on modern GPUs.
 
Updated from a previous post that showed several "old" games failing to run. Found the culprit: it was the original Call of Juarez, which installed a DirectX 9 version itself that rendered any DirectX call invalid, causing any game or app -like the Epic Store app, WTF?- to fail at launch, saying the installed version of DirectX was not compatible.

GAMES RUNNING WELL (resolution set to 4K, 60fps, maxed out settings)
Supreme Commander Forged Alliance. Superb.

Guilty Gear XX. Runs like a charm.

NBA 2K Playgrounds 2. Perfect, no stuttering, nothing.

Call of Juarez: Bound in Blood. Running perfectly fine. It shows a loading message during the story intro, but then it runs. Smooth framerate and framepacing.

Rise of Nations. Perfect. Auto HDR on. Which is also why I don't show a picture, 'cos the capture gets converted from HDR to SDR and looks really odd.

GAMES THAT WON'T LAUNCH (haven't searched for potential solutions but maybe most could run with DXVK or some setting from the command line, etc.)
Ultimate Marvel vs Capcom 3.
King of Fighters XIII.
Resident Evil 1 Remake, Resident Evil 0, Resident Evil Revelations, Resident Evil Revelations 2, Resident Evil 6.
 
More tests with OLD GAMES or those that have a certain "age", and I'll leave it there.

Great reports btw, very interested to see these.

Call of Juarez DX10 in-game (smooth but framepacing not good, 4K maxed out).

Call of Juarez Gunslinger. 4K maxed out. Good framerate but framepacing issues.

Pretty sure they both have framepacing issues on Nvidia too btw, just general behavior with those games. You need RivaTuner/forced vsync to remedy it.

Metro Last Light Redux. 4K maxed out. SSAA x 0.5 -that is, a little more than 4K native internally-. Smooth as silk. Auto HDR.

Actually no, it's an oddly named setting - 0.5 means half resolution, not "native + SSAA": at 4K output that's 1920x1080 internal (0.5 per axis). Turn SSAA off entirely to just render at native res.
 
GAMES RUNNING WELL (resolution set to 4K, 60fps, maxed out settings)
Supreme Commander Forged Alliance. Superb.

You could play the campaign and see if you can hit 100% GPU utilization. I think it will happen at 4K. It certainly does with the 1080 Ti in an intense battle.

Maybe grab this mod if you feel like it. ;)
https://www.moddb.com/mods/forged-alliance-enhanced-campaign-ai

Also, I was reading about it with DXVK, and for a while there were missing particle effects. I think they have that fixed now. But that sort of thing can go unnoticed and get mistakenly interpreted as better performance.
 
Actually no, it's an oddly named setting - 0.5 means half resolution, not "native + SSAA": at 4K output that's 1920x1080 internal (0.5 per axis). Turn SSAA off entirely to just render at native res.
thanks for the explanation, didn't know that, as I thought 0.5 SSAA would be equal to a 50% increase in pixels, given it's usually an up-sampling effect. Knowing that, Metro 2033 Redux and Metro Last Light Redux run perfectly fine at native 4K 60fps.

On another note, DXVK does wonders for this GPU. In fact, it even improves the performance of a few older DX11-and-earlier games on nVidia and AMD GPUs.

Tbh, I am going to use DXVK for every single game running under DirectX 11 and below from now on, even if the game runs fine at 4K 60fps.

Two games that weren't working or were showing graphics issues are now working like a charm.

Resident Evil 4, which was running fine but showed graphics issues... now with DXVK the corrupt graphics are no more. 4K 60fps, no stuttering at all.

The King of Fighters XIII, which didn't launch at all, is now running thanks to DXVK.

And this is achieved by only copying two files into the folder containing the game's main executable: d3d9.dll and dxgi.dll. Edit: for DirectX 9 games the dxgi.dll is not necessary; for any other version you must use it -thanks to @Flappy Pannus for clearing that up-.
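
If you end up doing this for many games, a throwaway helper along these lines automates the copy; all paths and the game choice below are placeholders, and note that official DXVK releases also ship d3d10core.dll/d3d11.dll for D3D10/11 titles:

```cpp
// Hypothetical DXVK install helper, following the steps described above:
// d3d9.dll always, dxgi.dll only for non-DX9 titles.
#include <cstdio>
#include <filesystem>
namespace fs = std::filesystem;

int main() {
    const fs::path dxvk = "C:/tools/dxvk/x32"; // extracted DXVK release; pick x32 or x64 to match the game's bitness
    const fs::path game = "C:/Games/KOFXIII";  // folder containing the main executable
    const bool dx9_title = true;               // KoF XIII is a D3D9 game

    fs::copy_file(dxvk / "d3d9.dll", game / "d3d9.dll",
                  fs::copy_options::overwrite_existing);
    if (!dx9_title) // dxgi.dll is only needed for DX10/11 games, per the post
        fs::copy_file(dxvk / "dxgi.dll", game / "dxgi.dll",
                      fs::copy_options::overwrite_existing);
    std::puts("DXVK DLLs copied");
}
```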

Note: KoF XIII has 3 DX files in the main executable's directory, called D3DX0_43.dll, D3DX9_43.dll and D3DX9_43_2.dll. The game will launch and work fine with those installed, but in-game they cause weird effects, like the background not being rendered: you can only see your characters and the rest of the image is a black background. So you gotta remove them, and everything will be fine.
 