Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

Raja Koduri interview.


Are you going to be sticking to the roadmap for Battlemage and Celestial?

Raja Koduri:
Yes, absolutely.

Are GPU design cycles also around 24 to 30 months long, as with CPUs for Intel?

Raja Koduri:
Not really. Doing a new architecture is always very difficult. New architectures take 3-4 years but after that once you have a baseline, iterating on it is quite fast. Since we are coming from nothing, we want to iterate fast so that we can catch up to the competition in every segment.

You have stated that Arc has challenges with games that make excessive numbers of draw calls, and you'd said that driver updates might mitigate that. So what's the timeline on that? Is performance coming up to the mark with your expectations?

Raja Koduri:
Yes, absolutely. The two APIs that are the most challenging for draw calls are DirectX 9 and DirectX 11. The DX9 driver update should be happening relatively soon and DX11 shortly thereafter. They are imminent. There'll be some nice announcements. It will make a huge [difference]; we're not talking five or ten percent. In some cases, it'll be much, much larger.

So what are the major constraints that you face [in developing and popularising Arc]?

Raja Koduri:
On the gaming side, the install base of old games is amazing. DirectX 9 is an API that launched in 2002. It's a 20-year-old API and there are games that haven't been touched for more than a decade but are very, very popular. Some of them actually have bugs. It isn't just our driver; there were wrong uses of the API but they were bug-compatible with older AMD or NVIDIA drivers. So we have to make them work. The user doesn't really care about what an API is, right? They just plug in an Arc card and run a game and say it doesn't work. So it's our responsibility to make it work, no matter where in the stack [an incompatibility lies]. That's the long tail that we had to check, but we pretty much went through 95–96 percent of all those issues on our path to launch. Now we're on the last 1–3 percent and we have managed to hammer through these releases.

We're seeing modern GPUs consuming ridiculous amounts of power, even though manufacturers have moved to more efficient process nodes. 600W and 800W power supplies are becoming the norm now. Will Intel also follow this trend?

Raja Koduri:
Performance per Watt, or delivering higher performance at lower power, is my top priority. There will always be someone with some skill who can say “I'm going to give you more juice”, but my focus is lower power. The other issue I find with just increasing power and bragging about benchmarks is that while it's good from a marketing standpoint, [there is a limited] number of PC users who can just buy such a card and plug it in. It dramatically reduces your overall market, right?

It's incredible how complex PCs have become. I don't do as much DIY as I used to maybe 5-10 years ago, but recently I put two PCs together using both my hardware and the competition's hardware, and even just getting all these connectors in was like “Hmmmm!” I actually live this, and I found it hard. It was funny, I had to go look at YouTube videos! I really still love the DIY culture. That's what makes democratisation easy, and I'd love to find ways to continue that. I've been thinking about how to do that in a more modular fashion, and I think the PC needs a reboot.

When Xe first came out, I think a lot of us were excited that the integrated graphics on basic CPUs would get a significant bump, but we haven't seen that yet. Where do you see the baseline for entry-level integrated graphics, and why hasn't that progressed significantly recently?

Raja Koduri:
Great question! In fact, that was always the plan. Meteor Lake was always the plan, but what happened, and this is something we have publicly said, was delays in our Core CPU roadmap and the 10nm process. We stayed on 10nm for a couple more generations [than intended]. Advanced Xe graphics were on the next node, ready to go, but the Meteor Lake platform is the one that is going to ship that new Xe graphics with ray tracing and all those great features. So I can't wait for the world to see Meteor Lake, which will change the entire integrated graphics landscape.

Even for the low-end Pentiums and Celerons? For entry-level PCs and Chromebooks?

Raja Koduri:
Yes, you'll see that.
 
They did specify using native DX9 driver for some games now, though?

Yeah, it wasn't entirely clear, but it looks like it. The way they talk about it, they'll vary the implementation on a title-by-title basis, which makes sense: some games will use native DX9, some D3D9On12, and some DXVK. Counter-Strike may just be native DX9 at this point; maybe someone can load up RivaTuner and see exactly which API it's using on Arc.

As I mentioned when I was posting about Intel utilizing DXVK earlier, this may ultimately benefit everyone, since DXVK on Windows doesn't get much attention in terms of specific bug fixes compared to running under Linux. Intel now being invested in it may pay dividends for every NVIDIA/AMD user who occasionally uses it to get around wonky DX9/11 implementations in some games.
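On the "maybe someone can check" point, RivaTuner's overlay is the easy way, but here's a rough, unofficial sketch of the same idea in Python: look at which DLLs the running game has loaded and guess the D3D9 path from that. The process name and the heuristics (a local d3d9.dll plus vulkan-1.dll suggesting DXVK, d3d9on12.dll suggesting D3D9On12) are my assumptions, not anything Intel documents, so treat the result as a hint rather than proof.

```python
# Rough, unofficial sketch: peek at which DLLs a running game has mapped and
# guess the D3D9 path from that. Assumptions (mine, not Intel's): the process
# name, that psutil is installed ("pip install psutil"), and the heuristics
# themselves. Run from an elevated prompt if psutil reports AccessDenied.
import psutil

PROCESS_NAME = "csgo.exe"  # assumption: change to the game you want to inspect


def loaded_modules(proc: psutil.Process) -> list[str]:
    """Lowercased paths of all files mapped into the process."""
    return [m.path.lower() for m in proc.memory_maps() if m.path]


def guess_d3d9_path(modules: list[str]) -> str:
    """Very rough classification based on which helper DLLs are present."""
    d3d9_in_sys32 = any("system32\\d3d9.dll" in p for p in modules)
    d3d9_local    = any(p.endswith("\\d3d9.dll") and "system32" not in p for p in modules)
    has_vulkan    = any(p.endswith("\\vulkan-1.dll") for p in modules)
    has_9on12     = any(p.endswith("\\d3d9on12.dll") for p in modules)

    if has_9on12:
        return "looks like D3D9On12 (d3d9on12.dll is loaded)"
    if d3d9_local and has_vulkan:
        return "looks like a Vulkan translation layer such as DXVK"
    if d3d9_in_sys32:
        return "looks like native D3D9 (system d3d9.dll, no translation DLLs seen)"
    return "could not tell from the loaded modules"


if __name__ == "__main__":
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == PROCESS_NAME:
            print(PROCESS_NAME, "->", guess_d3d9_path(loaded_modules(proc)))
            break
    else:
        print(f"{PROCESS_NAME} is not running")
```

If the driver loads its own translation DLLs from the driver store rather than the game folder, this will misreport, so the overlay is still the more reliable check.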
 
MSI Afterburner doesn't work with CS:GO, for whatever reason. In the end, what I did was switch to windowed mode, and "Direct3D 9" appears at the top of the window, not that that really proves anything.



Talking of DXVK, I solved an issue in Bloodstained: Ritual of the Night, a game I purchased on PC Game Pass, by using DXVK. It's a 64-bit DX11 game, and after launching it, a PC Game Pass window appears with a button saying "Let's go". You can see the game running perfectly fine in the background of the popup window, but once you click "Let's go", the HDMI/DisplayPort signal is lost. The game keeps running and the music plays; it's just that the display goes pitch black, as if the computer weren't on.

Using DXVK, that doesn't happen and you can play the game normally.
 
I kind of like this move by Intel, using DXVK. It's kind of humbling for them in a way, acknowledging that they can't do better than that for now, and maybe they don't need to, since DXVK is already working pretty well.
 
It's just one of the three approaches they're now using, not the only one. They have a native DX9 driver for some games, use DXVK for some others, and D3D9On12 for the rest.
 
I suspect that they are using DXVK 2.0 now in games like CS:GO. As I mentioned, I had an issue with Bloodstained: Ritual of the Night, a DX11 game, and using DXVK 2.0 fixed it. The thing is, MSI Afterburner is exhibiting EXACTLY the same behaviour with Bloodstained: RotN and CS:GO: it stops working and doesn't show any stats whatsoever.

When I launch Bloodstained: Ritual of the Night without DXVK 2.0, in native DX11 mode, MSI Afterburner shows the performance stats just fine. But once I copy d3d11.dll and dxgi.dll (you need the latter too, at least for this game; without it DXVK won't work) into the game's main directory where the executable is, the stats are gone. :unsure: (IIRC, they worked when I used DXVK 1.0 to play many games, btw.)
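For anyone who wants to script that DLL copy rather than doing it by hand, here's a minimal sketch. Both paths are placeholders I made up, so point them at wherever your DXVK extract and game install actually live; deleting the two DLLs from the game folder reverts to the native DX11 path.

```python
# Minimal sketch of the manual DXVK setup described above: copy the 64-bit
# d3d11.dll and dxgi.dll from an extracted DXVK release into the game's main
# directory, next to the executable. Both paths below are placeholder
# assumptions -- adjust them for your own system.
import shutil
from pathlib import Path

DXVK_X64_DIR = Path(r"C:\tools\dxvk-2.0\x64")   # assumed DXVK extract location
GAME_DIR = Path(r"C:\Games\Bloodstained")       # assumed game install directory

for dll in ("d3d11.dll", "dxgi.dll"):           # this DX11 title needs both
    src = DXVK_X64_DIR / dll
    if not src.exists():
        raise FileNotFoundError(f"{src} not found -- check DXVK_X64_DIR")
    shutil.copy2(src, GAME_DIR / dll)
    print(f"copied {dll} -> {GAME_DIR}")
```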
 

I find I usually have to set the detection level of the process in RivaTuner to High to have it show up in games that use DXVK. If you haven't done that, try it.
 
Thanks for the help, I didn't know that. I barely use MSI Afterburner, but for non-Steam games or games without a built-in performance overlay, it's a must. With the process detection level set to High, the MSI Afterburner overlay appeared in Bloodstained: Ritual of the Night, showing it was running on Vulkan. No luck with CS:GO, though.
 
Raja Koduri says their audience just wants one power connector and that the sweet spot for GPUs is 200W to 225W.

https://www.tomshardware.com/news/intel-gpu-head-wants-one-power-connector

 
New drivers with support for The Witcher 3 Next-Gen Update and other games.

https://www.intel.com/content/www/u...tel-arc-graphics-windows-dch-driver-beta.html

GAMING HIGHLIGHTS:

Intel® Game On Driver support on Intel® Arc™ A-series Graphics for:

  • The Witcher 3: Wild Hunt Next-Gen Update*
  • High on Life*
  • Conqueror’s Blade*
PERFORMANCE HIGHLIGHTS:

Game performance improvements versus the Intel® 31.0.101.3959 WHQL software driver on Intel® Arc™ A770 Graphics Products for:

  • PUBG: Battlegrounds* (DirectX* 11, 1440p, Ultra preset): up to 4% uplift


DEVELOPER HIGHLIGHTS:

  • This Intel® Arc™ software package includes support for optimized Microsoft DirectStorage* 1.1 with GPU accelerated decompression for developer integration.
KNOWN ISSUES:
• The Witcher 3: Wild Hunt Next-Gen Update* (DX12) Hairworks feature is not currently supported on Intel® Arc™ Graphics Products.
• Diablo II: Resurrected* (DX12) may cause system instability or application crash.
• Warhammer 40,000: Darktide* (DX12) may experience application crash during character selection.
• A Plague Tale: Requiem* (DX12) may experience application freeze and crash during gameplay.
• Conqueror’s Blade* (DX11) may experience corruption in benchmark mode.
• Call of Duty: Vanguard* (DX12) may experience missing or corrupted shadows during the Submarine mission.
• Payday 2* (DX9) may exhibit flickering corruption on specific water surfaces.
• System may hang while waking up from sleep. May need to power cycle the system for recovery.
• GPU hardware acceleration may not be available for media playback and encode with some versions of Adobe Premiere Pro.
• Blender may exhibit corruption while using Nishita Sky texture node.

INTEL® ARC™ CONTROL KNOWN ISSUES:
• Windows UAC Admin is required to install and launch Arc Control.
• Some applications may exhibit a transparent or blank window when CMAA is set to “Force ON” globally.
• A 1440p resolution selection in Arc Control Studio Capture may be unavailable when the display native resolution is 4K.
• Arc Control Studio Camera overlay position may not retain desired position and size after a system restart.
• The Arc Control Studio Camera tab may take longer than expected to respond upon first navigation.
• Arc Control may report incorrect memory bandwidth value.

INTEL® ARC™ CONTROL PERFORMANCE TUNING (BETA):
• Intel® Arc™ Control Performance Tuning is currently in Beta. As such, performance and features may behave unexpectedly. Intel® will continue to refine the Performance Tuning software in future releases.
 
Neat find on the TPU forums, a way to bypass the driver-imposed power limits:

 
PC building is probably the easiest it's ever been, considering how platforms, software and hardware in general "just work" these days, everything comes in easy kits, and YouTube can solve all your problems in seconds.
 

PC has always been that easy.

The issue is there are too many standards, too many APIs... too many of everything.
 
Nah. The modern era started around 2004, I'd say. Prior to that, hardware and software were much more problematic, partly because a lot of it was cheap and broken, but also because everything was evolving so rapidly. Today the hardware is far more complex, yet it really does just work.

And I'm pretty sure you don't want anything proprietary gaining a foothold.
 