Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

It looks like they're not reporting it correctly in software, or it's only measuring the GPU die rather than full board power, like AMD used to do. Reviewers who directly measure the voltage and current going into the card (e.g. TPU and GN) are measuring about 150 W.
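
For reference, a minimal Python sketch of what that kind of external measurement amounts to: sample the voltage and current on every supply rail (PCIe slot plus each external connector) and sum V x I. The rail names and readings below are made-up illustrative values, not from any real card.

```python
# Illustrative rail readings (invented numbers, not a real card).
rails = {
    # rail name: (measured volts, measured amps)
    "pcie_slot_12v": (12.1, 4.2),
    "8pin_12v": (12.0, 8.1),
}

# Board power is the sum of V * I over every rail; software telemetry
# that only covers the GPU die will read lower than this total.
board_power_w = sum(v * a for v, a in rails.values())
print(f"Board power: {board_power_w:.1f} W")  # -> 148.0 W with these samples
```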

That sounds a bit more realistic; it's a tad higher than the 4060. Not that impressive then. Still OK taking the price of the card into account, just not impressive.
 
I would expect the A-series to have the same problem with older CPUs. If not, that would mean the A and B series have completely different driver code bases.
Has anyone run these kinds of tests with the A750 and A770?

@Cyan: IF the A-series has the same problem, you would get quite a big boost just by updating the rest of the system and keeping the A770.
 
From HUB's B570 review:

[attached chart: B570/B580 average fps with a slower vs. a high-end CPU]

Star Wars Outlaws actually runs better with the B570 and the slower CPU than with the B570 and the high-end CPU... the B580 again stays almost exactly the same.
Either the test was somehow flawed, or the driver optimizations are a bit so-so and need work.
 
Gamers Nexus tested the B580, RTX 4060, RX 7600 and B570 with three different CPUs, and with their game selection the results (for the B580 and B570) were more consistent than HUB's. Only one game showed a clear performance drop with the slower CPU. That does not mean the overhead doesn't exist, but it does mean it depends on more variables than just the CPU.
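
A hedged illustration of what that kind of cross-CPU comparison looks like; all the fps numbers below are invented (loosely mirroring the shape of GN's results, with one clear outlier), and the 10% threshold is arbitrary:

```python
# Made-up average fps for the same GPU paired with a fast and a slow CPU.
results = {
    # game: (fps on fast CPU, fps on slow CPU)
    "Game A": (120.0, 118.0),
    "Game B": (110.0, 78.0),
    "Game C": (95.0, 93.0),
}

for game, (fast, slow) in results.items():
    drop_pct = (fast - slow) / fast * 100
    flag = "  <- possible driver overhead" if drop_pct > 10 else ""
    print(f"{game}: -{drop_pct:.1f}%{flag}")
```

Only "Game B" gets flagged here, which is the point: one outlier doesn't rule overhead in or out, it just says the effect is game-dependent.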

 
I would expect the A-series to have the same problem with older CPUs. If not, that would mean the A and B series have completely different driver code bases.
Has anyone run these kinds of tests with the A750 and A770?

@Cyan: IF the A-series has the same problem, you would get quite a big boost just by updating the rest of the system and keeping the A770.
Not that I know of. But it seems the situation is the same for Arc GPUs, for whatever reason, as they explain in this Reddit post, which is two years old.


It's a phenomenon where the GPU's performance scales with CPU performance even in scenarios where the CPU is far from being bottlenecked.

This information has not been widely spread, as otherwise it would stain Intel's reputation and make Arc's development more difficult.

I think quite a few Redditors on this subreddit are aware of it and have verified the problem on their own systems. I have seen it being discussed here for quite a long time.


I'm really thinking of getting a cheap GPU (an RX 6600, RX 6600 XT or Intel B570, which is cool; Intel is my favourite company so I favour them if I can) to put in this computer (Ryzen 3700X and Intel A770) and let my nephews play on it.

And then build a modern computer around a more advanced CPU, put the Intel A770 in there, and play on it until Celestial, Druid, the RTX 6000 series, or whatever AMD makes, if it's interesting.

For now, what I do want to have is a monitor of 360 Hz or more (though 360 Hz is enough for me) and play old and recent games at 360 fps. For those that can't run at that framerate, just use Lossless Scaling and enjoy. I've been spoiled too much by 165 Hz in all games to want anything other than high fps.
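
The frame-time arithmetic behind that preference, as a quick sketch; the base framerate and the 3x factor are just example values (frame-generation multipliers vary by tool and version):

```python
# Time budget per frame at a few refresh rates.
for hz in (60, 165, 360):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")

# If a game only reaches 120 fps natively, a 3x frame-generation
# factor (e.g. in a tool like Lossless Scaling) would land on the
# 360 Hz target: 120 * 3 = 360 presented fps.
base_fps, factor = 120, 3
print(f"{base_fps} fps x {factor} = {base_fps * factor} presented fps")
```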

Motion clarity above all else. It's what has surprised me the most after many years of seeing everything, from the original 3D to RT, 4K gaming, 1440p, etc.
 
From HUB's B570 review:

[attached chart: B570/B580 average fps with a slower vs. a high-end CPU]

Star Wars Outlaws actually runs better with the B570 and the slower CPU than with the B570 and the high-end CPU... the B580 again stays almost exactly the same.
Either the test was somehow flawed, or the driver optimizations are a bit so-so and need work.
They published a new driver patch these days; nothing out of the ordinary, apparently. This seems to be so random, and quite odd, especially after reading your most recent post.

I hope they figure it out for future Intel GPUs. Additionally, they must improve power consumption when the GPU is idle; it's a flaw of both the Alchemist and Battlemage cards.
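
If anyone wants to check idle draw themselves on Linux, here's a rough sketch. It assumes the i915/xe hwmon interface exposes its cumulative energy counter (energy1_input, in microjoules), which depends on kernel and driver version; the glob path is illustrative and may need adjusting per system.

```python
import glob
import time

# Assumed hwmon path for an Intel GPU's cumulative energy counter;
# adjust cardN / hwmonN for your system.
path = glob.glob("/sys/class/drm/card*/device/hwmon/hwmon*/energy1_input")[0]

def read_uj():
    with open(path) as f:
        return int(f.read())

e0 = read_uj()
time.sleep(10)  # leave the machine idle during the sample window
e1 = read_uj()

watts = (e1 - e0) / 1e6 / 10  # microjoules -> joules, over 10 seconds
print(f"Average idle power: {watts:.1f} W")
```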

Intel is making better GPUs than CPUs these days, imho. It's been two years with the A770 and I'm very happy with it; it ranks really high (in fact it's the best) among my favourite GPUs I've ever had.

From best to worst:

3dfx Voodoo Monster 3D (by no means the best, but it had the biggest impact at the time)
A770 16GB
GTX 1080
RX 570
2x GTX 1060

One of my favourites is the Matrox G400 MAX 32MB AGP, but it didn't perform very well, no matter how cool it was. I also had the Voodoo3 3000 16MB PCI, and it performed better, although there were other much better GPUs at the time; it's just that I was used to Voodoo and Glide.
 