Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

I would expect the A-series to have the same problem with older CPUs. If not, that would mean the A and B series have completely different driver code bases.
Has anyone run these kinds of tests with the A750 and A770?

@Cyan: IF the A-series has the same problem, you would get quite a bit of a boost just by updating the rest of the system and keeping the A770.
 
From HUB's (Hardware Unboxed) B570 review:


Star Wars Outlaws actually runs better with the B570 and a slower CPU than with the B570 and the high-end CPU... the B580, again, stays almost exactly the same.
Either the test was somehow flawed, or the driver optimizations are a bit so-so and need work.
 
Gamers Nexus tested the B580, RTX 4060, RX 7600 and B570 with three different CPUs, and with their game selection the results (for the B580 and B570) were more consistent than HUB's. Only one game showed a clear performance drop with a slower CPU. That does not mean the overhead does not exist, but it does mean it depends on more variables than just the CPU.
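One way to make "depends on more variables than just the CPU" concrete: for each game, compare how much performance a card loses on the slower CPU against a reference card tested on the same CPU pair. A minimal sketch in Python; every FPS number below is invented for illustration and none of it is HUB's or Gamers Nexus' actual data:

```python
# Rough sketch of quantifying driver-related CPU scaling from review data.
# All FPS numbers are made up for illustration; they are NOT from HUB's
# or Gamers Nexus' actual results.

# fps[card][game] = (fps_with_fast_cpu, fps_with_slow_cpu)
fps = {
    "B580":    {"Game A": (112, 78), "Game B": (96, 90), "Game C": (104, 99)},
    "RTX4060": {"Game A": (108, 101), "Game B": (92, 89), "Game C": (100, 97)},
}

def cpu_scaling_loss(fast_fps: float, slow_fps: float) -> float:
    """Fraction of performance lost when moving to the slower CPU."""
    return 1.0 - slow_fps / fast_fps

for game in fps["B580"]:
    loss_b580 = cpu_scaling_loss(*fps["B580"][game])
    loss_ref = cpu_scaling_loss(*fps["RTX4060"][game])
    # Excess loss over the reference card hints at driver overhead rather
    # than a plain CPU bottleneck (which would hit both cards similarly).
    print(f"{game}: B580 loses {loss_b580:.0%}, RTX 4060 loses {loss_ref:.0%}, "
          f"excess {loss_b580 - loss_ref:+.0%}")
```

A per-game spread like this is also why a single title (such as the Outlaws result above) can look anomalous while the rest of the suite stays consistent.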

 
Not that I know of. But it seems the situation is the same for the A-series ARC GPUs, for whatever reason, as they explain in this Reddit post, which is two years old:


It's a phenomenon where the GPU's performance scales with CPU performance even in scenarios where the CPU is far from being a bottleneck.

This information has not spread widely, as it would stain Intel's reputation and make Arc's development more difficult if it did.

I think quite a few Redditors on this subreddit are aware of it and have verified the problem on their own systems. I have seen it being discussed here for quite a long time.


I'm really thinking of getting a cheap GPU, an RX 6600, RX 6600 XT, or Intel B570, which would be cool (Intel is my favourite company, so I favour them when I can), to put in this computer (Ryzen 3700X and Intel A770) and let my nephews play on it.

And then build the base of a modern computer with a more advanced CPU, put the Intel A770 there, and play on it until Celestial, Druid, or an RTX 6000 arrives, or AMD makes something interesting.

For now, what I do want is a monitor of 360 Hz or more (though 360 Hz is enough for me), and to play old and recent games at 360 fps. For games that can't run at that framerate, just use Lossless Scaling and enjoy (see the quick sketch below for the multiplier arithmetic). I've been spoiled too much by 165 Hz in all games to want anything other than high fps.

Motion clarity above all else. It's what has surprised me the most after many years of seeing everything (from the original 3D to RT, 4K gaming, 1440p, etc.).
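The arithmetic behind "just use Lossless Scaling" is picking the smallest frame-generation multiplier whose output reaches the target refresh rate. A minimal sketch; the fixed 2x/3x/4x modes mirror what Lossless Scaling offers, and the base framerates are made-up examples:

```python
# Hypothetical helper: pick the smallest frame-generation multiplier that
# brings a base framerate up to a target refresh rate. The 2x/3x/4x options
# mirror Lossless Scaling's fixed LSFG modes; 1 means no frame generation.

def pick_multiplier(base_fps: float, target_fps: float = 360.0,
                    modes=(1, 2, 3, 4)) -> int | None:
    for m in modes:
        if base_fps * m >= target_fps:
            return m
    return None  # even 4x can't reach the target

for base in (200, 120, 95, 60):
    m = pick_multiplier(base)
    out = f"{m}x -> {base * m:.0f} fps" if m else "not reachable with 4x"
    print(f"{base} fps base: {out}")
```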
 
Intel published a new driver patch these days; nothing out of the ordinary, apparently. That B570 result seems so random, and quite odd, especially after reading your most recent post.

I hope they figure it out for future Intel GPUs. Additionally, they must improve power consumption when the GPU is idle; it's a flaw of both the Alchemist and Battlemage video cards.

Intel is making better GPUs than CPUs these days, imho. It's been two years with the A770 and I'm very happy with it; it ranks really high (in fact, it's the best) among my favourite GPUs I've ever had.

From best to worst:

3DFX Voodoo Monster (by no means the best, but it had the biggest impact at the time)
A770 16GB
GTX 1080
RX 570
2x GTX 1060

One of my favourites is the Matrox G400 MAX 32 MB AGP, but it didn't perform very well, no matter how cool it was. I also had the Voodoo3 3000 16 MB PCI, and it performed better, although there were other, much better GPUs at the time; it's just that I was used to Voodoo and Glide.
 
I used to have a Matrox G400 16 MB single head. It was an absolute blast at its 160 euro price back then. I ran it on an AMD Athlon Slot A Thunderbird 700 MHz, which ensured enough CPU power to push it to its limits. A friend of mine with a Pentium III at 800 MHz and a GeForce 256 SDR stopped talking to me for a few weeks after seeing live how it ran. Not sure why, though. :) Loved that card. Too bad the Parhelia flopped; it was 10 months late and was clocked at 220 MHz instead of the designed 275 MHz.

Between the G400 and the 9700 non-Pro (which I got instead of Matrox's next gen), I had an ATI Radeon All-in-Wonder. That was definitely the worst-supported graphics card I have ever owned. The drivers were really bad: when the gaming side was okay-ish, the video side was broken, and vice versa. Even the S3 ViRGE / Diamond Stealth 3D 2000 was a lot better.

I am definitely looking at Intel as an upgrade path from the 3070 Ti. That needs the G31 to be released, though, with a max price of 2x the B580's Finnish retail price. :)
 
I was very disappointed that we didn't get any pointers from CES.

What I would like to see (although I'm not holding my breath):
A380 (possibly phase it out, as iGPUs are pretty competitive now)
B570 & B580
C750 & C770
We know Xe3 (Celestial) is coming; I think it's 2H, in Panther Lake. They made a strategic decision to release the Bxxx cards a lot later than the iGPU, so this time release the dGPU around the same time. They're in a much better position to do it now.

I wouldn't be surprised if Battlemage doesn't scale up affordably to what a B750 & B770 would need to be to swim in those shark-infested waters. We've seen many times that a chip doesn't scale up or down equally well, hence having a couple of them in the stack at the same time.
 
It would be very surprising to see a Celestial dGPU in 2025; more like H1/2026 is possible. The first variant of Battlemage (BMG-G10) was reported to be in lab testing in late August 2023. It is possible, though, that it missed the mark by quite a bit and the design got a major overhaul after that, as engineering samples of the G10 did not show up in shipping manifests anymore after Q1/2024. Or it was just a test mule for the architecture and was never planned to be released. Who knows.

A huge non-R&D BMG-G31 shipping manifest was published on the 3rd of December, and the listing was dated the 29th of October. So if it is or will be cancelled, the plug has been pulled at quite a late stage. Not as late as AMD, though, which seems to be adjusting the release date of its Navi 48 based cards again. According to Videocardz, the RX 9070 series will launch in March: https://videocardz.com/newz/amd-confirms-radeon-rx-9070-series-launching-in-march

There's plenty of time and space in the market for Intel to release the G31 during Q1 or even early Q2.

However, it is possible that Intel and AMD are holding things back to see the new U.S. administration's moves on tariffs. A percentage-based tariff (on the scale of Trump's stated 50-75%) would tip prices a lot in the U.S. market and affect sales significantly. But there's definitely something weird going on with all these delays.
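For a sense of scale, a quick back-of-the-envelope sketch; the $249 figure is the B580's U.S. launch MSRP, and applying the tariff to the full retail price is a deliberate simplification (real import duties apply to the declared import value, not the shelf price):

```python
# Back-of-the-envelope effect of a percentage tariff on U.S. retail price.
# Simplification: assumes the tariff is passed straight through to the
# full retail price.

base_price = 249.0  # B580 U.S. launch MSRP in dollars

for tariff in (0.50, 0.75):  # the 50-75% range mentioned above
    print(f"{tariff:.0%} tariff: ${base_price * (1 + tariff):.2f}")
```

Even under these rough assumptions, that pushes a $249 card into the $373-436 range, which would change its competitive position entirely.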
 