Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

Btw, I just don't bother using V-Sync in games. Currently I am using an Intel technology called Smooth Sync.

It is a shader-based dithering filter that smooths out tearing. It is implemented at the driver level and blends the seam between misaligned frames, with less input lag than V-Sync.
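
For anyone wondering what "dithered blending across the tear" even means, here is a toy numpy sketch of the idea. To be clear, this is just my own illustration of the concept (names and numbers are made up), not Intel's actual driver code, which runs on the GPU during scanout:

```python
import numpy as np

def smooth_tear(prev_frame, new_frame, tear_row, band=64, rng=None):
    """Toy illustration: hide a tear line by dithering between the two
    frames over a band of rows instead of switching abruptly."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = prev_frame.shape[:2]
    out = prev_frame.copy()
    out[tear_row + band:] = new_frame[tear_row + band:]  # fully the new frame below the band
    for i, row in enumerate(range(tear_row, min(tear_row + band, h))):
        p = (i + 1) / band                    # probability ramps 0 -> 1 across the band
        take_new = rng.random(w) < p          # per-pixel dither mask
        out[row, take_new] = new_frame[row, take_new]
    return out

# Two solid-colour 1080p "frames" torn at row 500
old = np.full((1080, 1920, 3), 40, dtype=np.uint8)
new = np.full((1080, 1920, 3), 200, dtype=np.uint8)
blended = smooth_tear(old, new, tear_row=500)
```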

This is the well-known Pendulum demo running with Smooth Sync.

 
Only difference is NVIDIA managed to push it into markets with force.
There is no force involved. NVIDIA has been gradually experimenting with ray tracing on the GPU ever since they invented CUDA; they pushed it hard with Fermi through GPGPU/CUDA, and then again with Maxwell through various shadow and lighting implementations (VXAO, HFTS, VXGI), and even integrated those into games already.

With Turing, they developed the hardware (with the right mix of RT/FP32/INT32 units) and the API (DXR, the Vulkan RT extensions), innovated on the denoising algorithms and developed many implementations (such as RTXGI, RTXDI, etc.), then brought the upscaling portion into the equation. So they did the legwork ahead of everybody, meaning they didn't half-ass it, and they did it right. They went far beyond anything Caustic and Imagination ever did... so you see, developing the hardware alone is not enough; you have to build the complete ecosystem with it as well. That's when you can push anything onto the world. What they did is anything but force. The same thing happened with CUDA, by the way.
 
"with force" meant more in terms of volume than anything else. Caustic/Imagination HWRT hardware together with open API to use it with did ship too, just not enough for anyone to care.
 
dgVoodoo 2 is a graphics wrapper that converts old graphics APIs to Direct3D 11 or Direct3D 12 (as of version 2.7) for use on Windows 7/8/10.

Dege has been involved with wrapping since Glide on Windows 98, and eventually got into Direct3D and DirectDraw wrapping with a later iteration of dgVoodoo 2.

Dege actively works with people on VOGONS, working around all sorts of quirks in old games.
https://www.vogons.org/viewforum.php?f=59
 
More games tested.

Pacer. Perfect 4K 60fps, every setting on blast processing. :ROFLMAO:
Might & Magic: Clash of Heroes. 4K 60fps maxed out (not that it matters, a Game Boy could run this), perfect.
Dark Souls Remastered. 4K 60fps full throttle, perfect.
Death Rally (Classic). Perfect. OpenGL. Very old.
Age of Wonders III. 4K 60fps max settings, despite being DirectX 9 (I was surprised it uses DX9, because it's not that old a game).
Dark Souls 2: Scholar of the First Sin. 4K 60fps, maxed-out settings, perfect.
Age of Empires II (2013). Direct3D 9. It works perfectly, although DX9 generally runs better with DXVK or dgVoodoo 2; anyway, no problem.
Blazblue: Calamity Trigger. Perfect.
Blazblue: Continuum Shift Extended. Perfect.
Bioshock 2 Remastered. Perfect 4K 60fps maxed out.

The game warns you with a strange message when launching it, but never mind, click Yes and there you go.
Bioshock Infinite. No weird messages. 4K 60fps ultra settings, 80W. Perfect in that sense. Native DX11 game. There are texture streaming problems; I don't know if that's typical of the old Unreal engine, because it already happened to me with the RX 570 and the nVidia GTX. With the A770 that engine issue seems to be more noticeable. DXVK doesn't solve it, and for once the game actually runs better in native DX11 than through DXVK: 110W of consumption with DXVK, with stuttering, vs 80W in DirectX 11 without any stuttering.
Crysis, original. @davis.anthony Correction to a previous post of mine where I mentioned that the game was locked at 24fps. 4K with everything maxed out and 8x antialiasing: between 70 and 80fps. Not recommended to play this game with V-Sync off. I was getting 24fps before because of how the game handles the display mode: the menu lets you pick the resolution, 4K for example, but if you enable V-Sync the game switches the TV to 24Hz, the minimum refresh rate my TV supports, instead of 60Hz, bypassing the OS setting. :mad: Either this is a bug in the game, or it will be necessary to edit some .ini to force 60Hz. Don't care tbh, I have the remastered version with RT, which is what I find interesting.

(sorry for the crappy contrast, I didn't bother adjusting the brightness in the game while testing things out)


 
Gotta install the new patch later. Even more performance is always welcome.


Intel published a new graphics driver update that fixes memory frequency issues on Arc A770 GPUs. With the new driver, the Arc A770 now runs at its rated 2,187 MHz GDDR6 memory frequency for a total output of 560 GBps of memory bandwidth.

The issue was discovered earlier this week by VideoCardz, who found several users on Github reporting lower-than-expected memory clocks on their A770 GPUs. The clocks were supposed to run at 2,187 MHz, but for some reason, the affected users found their GPUs operating with a flat 2,000 MHz memory frequency instead.

That might not sound like much, but the 187 MHz drop is quite substantial in reality and results in a 9% loss in clock speed and a whopping 17% drop in memory bandwidth, from 560 GBps to 512 GBps. This could lead to a noticeable reduction in gaming performance for games that heavily utilize the GPU's memory bus.
 

Great news, although TomsHardware may want to check their math on the memory bandwidth loss. :)

2187 MHz / 2000 MHz and 560 GBps / 512 GBps both ≈ 1.09 (a ~9% drop in each)
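
If anyone wants to sanity-check it, the bandwidth falls straight out of the memory clock. A quick Python sketch, assuming the A770's 256-bit bus and GDDR6's 8x per-pin data rate (the rated 2,187 MHz rounds from 2,187.5):

```python
# Rough peak-bandwidth check, assuming a 256-bit bus and GDDR6's 8x
# per-pin data-rate multiplier (both are assumptions stated above).
def gddr6_bandwidth_gb_s(mem_clock_mhz, bus_width_bits=256, rate_mult=8):
    gbps_per_pin = mem_clock_mhz * rate_mult / 1000   # Gbps on each data pin
    return gbps_per_pin * bus_width_bits / 8          # GB/s across the whole bus

full   = gddr6_bandwidth_gb_s(2187.5)   # -> 560.0 GB/s (rated clock)
bugged = gddr6_bandwidth_gb_s(2000.0)   # -> 512.0 GB/s (bugged clock)
print(f"loss: {(1 - bugged / full) * 100:.1f}%")      # ~8.6%, nowhere near 17%
```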
 
Good article and all, I only disagree with him on his final recommendation. Sure, there is a greater wealth of knowledge about the RTX 3060, but if you are buying an RTX 3000 GPU you are losing out on features, 'cos DLSS 3 is a no-go, plus maybe some other improvements nVidia might include in the RTX 4000 series that will not be present in the RTX 3060. Why would you buy an older GPU lacking certain features? If he had said "wait for the 4050, 4060, or AMD RDNA 3", he would have a point. But recommending an RTX 3060 at this point is not a good idea, imho. Other than that, it's a decent article.
 
Have they managed to fix the idle power draw? That's the major thing preventing me from potentially getting one for my HTPC.

Regards,
SB

There were some adjustments to be made in the BIOS and Windows, but the card still has higher idle draw than it should, since the settings didn't work for one of the models. This is the last bit of news I saw about it:


Users will need to go into their PC's BIOS and configure a pair of advanced PCI Express power management settings—the "Native ASPM" (or active-state power management) setting should be enabled, and the "PCI Express root port ASPM" setting should be enabled and set to "L1 Substates." You'll also need to set the PCI Express Link State Power Management setting to "maximum power savings" in Windows' advanced power options settings.

...

Testing from Tom's Hardware shows that with the settings enabled, Arc A750 power consumption at idle dropped from 37.3 W to 15.5 W, a significant drop. The same settings didn't seem to have an effect on an Arc A770 card, though it's unclear whether this is a motherboard bug, a GPU hardware or firmware or driver problem, or something else.

Intel may be able to address the issue in the long-term with driver or firmware updates for the Arc A-series GPUs, but the troubleshooting article doesn't make it sound very likely. Intel says that the company "will be looking at making optimizations in future generations," which makes it sound like we'll need new hardware to address the issue decisively.
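
For the Windows part of the quoted steps, the link-state option can also be flipped from a script instead of digging through the power options UI. A small sketch (run from an elevated prompt) using powercfg's documented SUB_PCIEXPRESS/ASPM aliases, where 2 means "Maximum power savings"; the BIOS-side settings (Native ASPM, L1 Substates) still have to be changed in firmware setup:

```python
import subprocess

# Set "PCI Express -> Link State Power Management" to "Maximum power
# savings" (value 2) for both plugged-in and battery profiles of the
# active power scheme, then re-apply the scheme so it takes effect.
for flag in ("/setacvalueindex", "/setdcvalueindex"):
    subprocess.run(
        ["powercfg", flag, "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "2"],
        check=True,
    )
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)
```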
 
First two games tested with native XeSS.

Redout 2. 4K 60fps maxed out with XeSS on Balanced, Quality or Ultra Quality; I left it on Balanced. You don't see a single jaggie. Good job by Intel, as already seen in the Shadow of the Tomb Raider benchmarks: even when upscaling from 720p at some resolutions, they remove the jaggies from the image.

Ghostwire: Tokyo. Native 4K, ray tracing on High, no screen space reflections (thankfully), and no XeSS, FSR 2, TSR or FSR 1.0. The game runs at 20-25fps. Ghostwire: Tokyo supports FSR 1.0, XeSS, TSR and FSR 2.0.

With XeSS Performance, 4K 44fps, RT High. XeSS Balanced, 35fps or so.

With FSR2 Ultra Performance, 4k 60fps, RT High.

With TSR, 4K 35fps.

With FSR 1.0, I didn't take note; I tried it, but I don't really care about it.

A game that pushes this mid-tier GPU to its limit. With ray tracing on High it looks godlike at times; ray tracing reflects the animations of the displays on the buildings in real time, crazy stuff.
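
For context on what those XeSS presets mean in render-resolution terms, a small sketch using the per-axis scale factors Intel has published for XeSS (1.3 Ultra Quality, 1.5 Quality, 1.7 Balanced, 2.0 Performance); the exact factors a given game uses may differ:

```python
# Internal render resolution per XeSS preset, assuming Intel's published
# per-axis scale factors (a game may deviate from these).
XESS_SCALE = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def xess_render_res(out_w, out_h, preset):
    s = XESS_SCALE[preset]
    return round(out_w / s), round(out_h / s)

for preset in XESS_SCALE:
    print(preset, xess_render_res(3840, 2160, preset))
# Performance at 4K renders internally at ~1920x1080, Balanced at ~2259x1271.
```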

Video cinematic, not realtime, I think?



Realtime in-game cinematic, 4K, settings maxed out to 11, XeSS Balanced. This really impressed me; also, the lighting is much better in-game than in the video cinematics.



In-game gameplay, settings maxed out to 11, XeSS Balanced. The HDR-to-SDR conversion destroys the image here with that crazy blue glow, but anyway.



4K RT High, other settings: textures and view distance set to default, not some crazy value. FSR 2.0 Ultra Performance, the equivalent of FSR 1.0/XeSS/DLSS Performance mode. 60fps.



4K RT High, other settings. Textures and view distance set to default, XeSS Performance. 44fps.



4K RT Medium, XeSS Performance.



4K RT Low, XeSS Balanced.



4K RT High, FSR 2.0 Ultra Performance. Real-time reflections from the screens; it's impressive in motion.

 

Seems to support XeSS in CoD MW2.
It doesn't seem to be working, though. That being said, while I don't use screen capture at all, I can tell you that Intel Arc Control is an odd thing indeed.

I just use it to change settings and that's it. It works in the weirdest way: you launch it, a message about administrator rights appears, you click Yes, and then the program window appears.

However, it doesn't behave like a typical window at all; if you click outside of it, it just closes. So you click the Intel Arc Control icon again, get the same administrator rights message, rinse and repeat. The Intel overlay shows interesting info, but I barely use it 'cos of certain issues like the ones the guy showed.

Also, I am having the same problem on my TV, which is 4K 120Hz VRR capable, but VRR stays disabled. I have an HDMI 2.1 cable and use the HDMI4 input of my TV (the HDMI 2.1 one), but no VRR support via HDMI 2.1 for now.
 
According to some comments on the video, XeSS and VRR are controlled outside of the game in the Intel control panel application.

I don't know if that's true...
 