Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

thanks for the explanation, didn't know that, as I thought 0.5 SSAA would be equal to a 50% increase in pixels, given it's usually an up-sampling effect. Knowing that, Metro 2033 Redux and Metro Last Light Redux run perfectly fine at native 4K 60 fps.

On another note, DXVK does wonders for this GPU. In fact, it improves the performance of a few older games on nVidia and AMD GPUs running DX11 and earlier DirectX versions.

What I wanted to see, and I expected it would solve many issues. I highly recommend you use the dxvk-async branch as well, btw, to significantly reduce the initial shader compilation stutter.

And this is achieved by copying just two files to the folder where the main game executable is: d3d9.dll and dxgi.dll.

d3d9.dll is all you should need for DX9 games; dxgi.dll and d3d11.dll are what you need for D3D11 games.
 
HOW TO USE DXVK WITH ANY GAME ON WINDOWS
THIS TECHNIQUE ALSO IMPROVES PERFORMANCE OF MANY GAMES RUNNING ON NVIDIA AND AMD GRAPHICS

It's sooooooooooooooo simple. 4 steps.

1. Go to the releases page of DXVK or DXVK-ASYNC (thanks to @Flappy Pannus for the help) and download and extract the tar.gz file containing the binaries (you might need 7-Zip to extract it).

https://github.com/doitsujin/dxvk/releases

https://github.com/Sporif/dxvk-async/releases (the DXVK-ASYNC version significantly reduces the initial shader compilation stutter)

2. After extracting, there are two folders, x32 and x64, each containing the 32-bit or 64-bit versions of the DLLs for each DirectX API (11, 10, 9, etc.).



3. https://www.pcgamingwiki.com/wiki/Home PCGamingWiki is your friend. Search for the game you want to run with DXVK, scroll down and look for the DirectX API it uses, and whether it is a 32-bit or 64-bit binary.





4. If the game, like in the example, runs on DirectX 9 32-bit, then go to the x32 folder of DXVK and copy the files d3d9.dll (more generally d3dx.dll, x being the DirectX version number) and dxgi.dll (this might not be necessary for DX9, but use it with any other version; thanks to @Flappy Pannus for the tip) to the directory where the main executable of the game is. This is very important: it doesn't necessarily have to be the root directory of the game, it must be where the game's executable is.



That's it! Run the game and enjoy.
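The four steps above can be sketched as a tiny script. This is just a sketch, not an official tool: the function name, the folder paths, and the per-API DLL table are mine, based on the guide's own notes (dxgi.dll is usually unneeded for DX9 but needed for DX10/11).

```python
import shutil
from pathlib import Path

# Which DXVK DLLs to copy for each DirectX version, per the guide above.
# (Assumption: table reconstructed from the guide's notes, not from DXVK docs.)
DXVK_DLLS = {
    "d3d9":  ["d3d9.dll"],
    "d3d10": ["d3d10.dll", "d3d10_1.dll", "d3d10core.dll", "dxgi.dll"],
    "d3d11": ["d3d11.dll", "dxgi.dll"],
}

def install_dxvk(dxvk_dir, game_exe_dir, api="d3d9", arch="x32"):
    """Copy the DXVK DLLs for the chosen API/arch next to the game's executable."""
    src = Path(dxvk_dir) / arch      # the x32 or x64 folder from the tar.gz
    dst = Path(game_exe_dir)         # MUST be the folder holding the game's .exe
    for dll in DXVK_DLLS[api]:
        shutil.copy2(src / dll, dst / dll)
    return [dst / dll for dll in DXVK_DLLS[api]]
```

To undo the wrapper, you just delete the copied DLLs again; the game falls back to the system's native DirectX libraries.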

EDIT (TO USE THE ADVANTAGES OF DXVK-ASYNC; thanks to @Flappy Pannus)
There is another step you need with dxvk-async btw: create a dxvk.conf file in the same folder as the DLLs, and add:

dxvk.enableAsync = true

to enable asynchronous shader compilation. Just the DLLs alone will not do it.
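In script form, that extra step is just writing a one-line file next to the DLLs. A minimal sketch (the helper name is mine):

```python
from pathlib import Path

def enable_dxvk_async(game_exe_dir):
    """Create dxvk.conf next to the copied DXVK DLLs to turn on async shader compilation."""
    conf = Path(game_exe_dir) / "dxvk.conf"
    conf.write_text("dxvk.enableAsync = true\n")
    return conf
```

Note this option only exists in the dxvk-async fork; stock DXVK ignores it.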
 
New to the fun game of DLL injection with older PC gaming? Ever try Alias Isolation?

I think you should compare games with both DXVK and the normal D3D9on12 and watch for any visual differences. Beyond RE4 apparently being quite broken with D3D9on12. :D
 
What I wanted to see, and I expected it would solve many issues. I highly recommend you use the dxvk-async branch as well, btw, to significantly reduce the initial shader compilation stutter.



d3d9.dll is all you should need for DX9 games; dxgi.dll and d3d11.dll are what you need for D3D11 games.
you are awesome, thanks. I added those details to the mini-guide on how to run any DirectX (below DX12) game on Windows using DXVK.

As I mentioned, DXVK also improves the performance of quite a few games on nVidia and AMD cards. That's not always the case, but there are many examples.

GTA IV


Several games improved in this video, ranging from The Witcher 2 to Final Fantasy XIII, etc. The guy is using a GTX 1650.


God of War, the video's author mentions how it helps all AMD cards.

 
That's it! Run the game and enjoy.

There is another step you need to use with dxvk-async btw: Create a dxvk.conf file in the same folder as the .dll's, and add:

dxvk.enableAsync = true

To enable asynchronus shader caching. Just the dll's alone will not do it.

God of War, the video's author mentions how it helps all AMD cards.

This was true for many DX11 games previously; however, that video is 7 months old, well before AMD's new DX11 driver. That driver has significantly improved the CPU bottleneck in DX on midrange/lower-end CPUs for Radeons, so DXVK is not really as necessary for these problem games on AMD GPUs anymore. Fixing AMD's CPU-bottlenecked DirectX drivers was a big reason DXVK on Windows started getting noticed.


On DXVK use as a whole across various GPUs:

Certainly DXVK can provide notable uplifts in some games in some scenarios, even on systems with stable DX9/11 drivers. For example, on my i5-12400 and 3060, some areas in ME1 Legendary can have fps dip below 60 at any res, likely due to single-threaded bottlenecks caused by the crappy DX9->DX11 migration EA did for the remaster. DXVK can uplift those same problem areas to 120+ fps. Those are likely the areas where you may see the most benefit: games with extremely poor GPU utilization due to threading bottlenecks.

However, there are some caveats you should know as well that I've run into from my year+ of experimenting with it:

Compatibility. Bear in mind DXVK was created for Linux gaming first and foremost, as part of Proton. As such, the bulk of its compatibility work is focused on that platform. DXVK on Windows is basically off-label usage. If it works, hey, great, but it's really not 'supported'. As such, there are a number of games that will work perfectly on Linux but fail to load on Windows. Arkham City and Origins are two such games that just crash when trying to run DXVK with the DX11 path (it works on DX9). So there's really no place to go to report bugs on Windows; you just gotta hope the next release happens to fix it.

Multiplayer. Shouldn't be too much of a surprise as you're replacing DLLs, but DXVK will not work with anti-cheat software. There are varying levels of success with this depending on the game and how aggressive its anti-cheat implementation is, but even if you get into a match without incident, there is the possibility it may be detected as a hack.

Latency. This can vary significantly per game, but ime DXVK has worse latency than DX when using Vsync. Just in general, but also because, being Vulkan, you cannot take advantage of other latency-reducing options for DX9/11 titles, such as Low Latency or Fast Sync. Far less of a concern on VRR displays where you can just disable Vsync, of course.

Edit:

HDR: Almost forgot this. No HDR, and it's not available through Special K or AutoHDR on Windows 11 either. Probably not a huge loss for many on most PC displays, but worth mentioning.
 
New to the fun game of DLL injection with older PC gaming? Ever try Alias Isolation?

I think you should compare games with both DXVK and the normal D3D9on12 and watch for any visual differences. Beyond RE4 apparently being quite broken with D3D9on12. :D
sure I did, Alias Isolation is superb. My only gripe with it is that it didn't work very well with Special K (HDR injector). Both ran correctly together, but then the game became very stuttery.
 
sure I did, Alias Isolation is superb. My only gripe with it is that it didn't work very well with Special K (HDR injector). Both ran correctly together, but then the game became very stuttery.
Yeah, unfortunately Windows 11 AutoHDR also breaks Alias Isolation. MotherVR doesn't work with it either.
 
Yeah, unfortunately Windows 11 AutoHDR also breaks Alias Isolation. MotherVR doesn't work with it either.
does AutoHDR enable when you are playing Alien Isolation? That's new, because it never worked for me a few months ago, as much as I wanted it to; that's the reason why I resorted to Special K.
 
does AutoHDR enable when you are playing Alien Isolation? That's new, because it never worked for me a few months ago, as much as I wanted it to; that's the reason why I resorted to Special K.
I've had problems with it not enabling before, but it does indeed work at the moment. Alias Isolation causes it to turn into a smeared mess though.
 
@Cyan if you disable vsync and use the frame rate cap in RivaTuner Statistics Server that's bundled with MSI Afterburner, it should completely remove all of the frame pacing problems you have in certain games.
 
Someone needs to try this DXVK hack on the 2007 release of Crysis and see if it helps with the CPU performance.

So I tried this on the 2007 release of Crysis.

The game behaved weirdly, though: when using the hack, the game booted in DX9 mode according to RivaTuner and the game's built-in stats.

But I had the Very High graphics options available to select, which in DX9 mode are greyed out and not accessible, so the game can't possibly have been in DX9 mode. I'm not sure if the hack is even working correctly.

Anyway, zero performance improvement between the game's native x64 DX10 renderer and the DXVK hack.
 
Someone needs to try this DXVK hack on the 2007 release of Crysis and see if it helps with the CPU performance.
Dunno if someone managed to get it working; I couldn't for now. The original works with this GPU, although no matter the resolution, the game caps at 24 fps even in the menus. This is 4K, but the same happens at 1024x768, 1600x1200, 720p, etc.





Some other tested games.

FEAR 2. 4K 60 fps, power consumption 60W. Super smooth.

FEAR won't launch without DXVK. With DXVK it boots up and runs pretty well at 4K. Graphical glitches (leaflets and decals appearing/disappearing), although fully playable. The game tells you when setting it up that the GPU has very little memory, that 4K is too much. :D It's a classic, and the graphics look a bit bland nowadays.

Crysis Remastered. This is a curious beast. A message appears saying that raytracing will be disabled because it uses an old version...



...and the option is effectively greyed out.

However, in-game, if you change the global settings to whatever setting you prefer, like Very High or High, then the magic happens and the raytracing option is there for you to enable. 👌

RT at the "Can it run Crysis?" setting and 4K, with the rest of the settings at High or Very High, is quite demanding, but there you have it: the Intel ARC running the game as fast as it can, and while not impressive, the game is playable. The DLSS option appears greyed out, of course, but it'd be ideal to use it with such a crazy raytracing setting.

Crysis 2. 4K 60fps. Looks great. Nothing more to say.

Changing the renderer to DXVK doesn't improve performance.

Crysis 3. Default renderer: 4K 40 fps, every setting at max. A little optimization and you get 4K 60 fps in a jiffy. Switching to DXVK doesn't improve performance; it's more or less the same.

Dragon Age Origins. 4K 60fps no problem.
 
So I tried this on the 2007 release of Crysis.

The game behaved weirdly, though: when using the hack, the game booted in DX9 mode according to RivaTuner and the game's built-in stats.

But I had the Very High graphics options available to select, which in DX9 mode are greyed out and not accessible, so the game can't possibly have been in DX9 mode. I'm not sure if the hack is even working correctly.

Anyway, zero performance improvement between the game's native x64 DX10 renderer and the DXVK hack.
which performance numbers did you get? As I mentioned, I haven't managed to get it working with DX10 (the x64 version, which GOG uses by default). I copied the three DXVK files (from the x64 folder) related to DirectX 10:

d3d10.dll
d3d10_1.dll
d3d10core.dll


and dxgi.dll, but a message appears saying that you need a certain component of DX11, and then the game doesn't load.
 
FEAR won't launch without DXVK. With DXVK it boots up and runs pretty well at 4K. Graphical glitches (leaflets and decals appearing/disappearing), although fully playable. The game tells you when setting it up that the GPU has very little memory, that 4K is too much. It's a classic, and the graphics look a bit bland nowadays.

FEAR works with dgVoodoo 2, and then you can get AutoHDR with it.
 
FEAR works with dgVoodoo 2, and then you can get AutoHDR with it.
thanks! That's another wrapper/injector to try then.

A video published a few hours ago on how to improve performance with DXVK (as he mentions, this benefits other GPUs as well):

 
FEAR works with dgVoodoo 2, and then you can get AutoHDR with it.
imho, it's from limitations that the greatest innovations take place. I.e., how nVidia made the impossible possible (real-time raytracing), or how wrappers like DXVK or dgVoodoo 2 can improve the performance of, and add new features to, older games, like the ability to have Auto HDR.

That's a great discovery, since it works very similarly to DXVK (really simple stuff, like copying the files to the directory where the main game executable is), but it even goes further in certain areas. From the webpage linked below:

dgVoodoo 2 is a graphics wrapper that converts old graphics APIs to Direct3D 11 or Direct3D 12 (as of version 2.7) for use on Windows 7/8/10.

Fixes many compatibility and rendering issues when running old games on modern systems as well as enabling the usage of various overrides and enhancements.

Enables the use of third-party tools, such as ReShade, to enhance or improve the gaming experience.


 
Given how Vulkan is so integral to the Linux gaming experience and Arc has its best performance on modern APIs, I'm surprised that more Linux testing, or DXVK testing on Windows, isn't done for Arc.
GTA IV, for example, almost gets "fixed" with DXVK on modern GPUs.
yup, Intel wants to just use low level APIs (DirectX 12 and Vulkan).

Intel has just opted for a fairly logical compromise solution, which is to make hardware that does without legacy burdens, opting for solutions that already exist and are viable (AMD and Nvidia also know this very well and will end up doing it too, as does Microsoft, which in fact has active developments in this area as well). We have been using wrappers of different types and origins for many years, and they work.

Relying on translation software is more sensible, and easier to maintain, than further encumbering a driver or even the operating system, and the result can even be optimal. A patch over a patch compromises stability and robustness; it complicates things. No one loses anything, everyone wins, and companies enter this game by "sponsoring" and collaborating, as well as hiring personnel; other projects will be made freely, but there they are, and companies cannot ignore that advantage.

This can be somewhat seen in this video from Linus Tech Tips with two Intel engineers (linked at that very moment, from 2:13 on, section "Why is DX12 better?"):

 
I.e. how nVidia made possible the impossible (real-time raytracing),
Seriously though, this is how history gets rewritten. People repeat the same false information long enough for it to become the truth. Caustic did it first on an FPGA; Imagination bought them, improved on it, made an ASIC out of it, and then integrated it into a GPU, years before NVIDIA did anything. The only difference is NVIDIA managed to push it into the market with force.
 