Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

Which sort of makes sense. As a new player in the market, resources are more easily shifted towards the future, not the past. The competition has lived through those earlier times, so they already have the knowledge for older CPUs.
This makes little sense imo. These are firmly budget GPUs; how many people buying budget GPUs are also using modern CPUs with modern chipsets? A good many budget buyers are getting by with older AM4 products.

If this were a high-end card, it's whatever, since nobody is trying to run Zen+ or Zen 2 with a 4090 (or equivalent), but I bet tons of people would be interested in trying to build a console-tier system with a 3600 and a B580-tier card (such as a 4060).
 
Very likely PR fluff but: https://www.theverge.com/2025/1/6/24337345/intel-discrete-gpu-ces-2025

I wonder what this means for G31, Celestial/Xe3 & Druid/Xe4? I remember that chiplet patent that was pointed out a few months ago and wonder if it's still on. I am not sure if G31 will still appear (especially vs. N48 and GB205/GB203), because financially it'll be hard to find a niche.
Given that they will need to continue shipping iGPUs with every CPU they sell, they certainly won't stop investing in developing next-gen IP. Productizing that IP into a discrete product still costs money, but if the IP is competitive, that cost should be relatively low. So I think we'll continue to see dGPUs from Intel, but perhaps only mid-range or lower.
 
Quite interesting. I wouldn't expect these new GPUs to be tuned for older CPUs, that makes no sense whatsoever, but even a 5600 isn't well suited for them compared to the RTX 4060, which is quite surprising and rules it out as a candidate for me (I have a 3700X). Since I already have an A770, I'm more interested in Celestial and Druid tbh.

I am quite interested to know if this also affects the B570 and the rumoured B770.
 
Probably much worse on faster cards.
The graph below illustrates the issue. Whether it's caused by the drivers (maybe the A770 should be re-reviewed, just in case it has the same problem) or by something else, the 4060 would be a much better buy for people like me with a CPU that's mediocre nowadays but was decent in its day, like my 3700X.

The 4060 remains very stable performance-wise, scaling down logically with the CPU but still holding up. However, on the Arc card there is almost a 3x performance gap between a Ryzen 3X00 CPU and the most advanced modern CPUs; it's unacceptable.

[Image: performance scaling across CPUs chart]
 
The B570 seems to have very low power consumption, which I really like.

That being said, this data seems odd. Are the games CPU-bound with this card when using lower-end CPUs?

I mean, if you compare native resolution vs. XeSS performance, you don't get a single frame of improvement from XeSS on the Ryzen 5600. What's happening here?

[Image: native vs. XeSS performance comparison]
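One way to read that (just a sketch, not something from the review): if turning on an upscaler like XeSS doesn't raise the frame rate at all, the GPU can't be the limiter, so the result looks CPU-bound. A rough rule of thumb, with made-up fps numbers:

```python
# Rough heuristic: if upscaling barely changes fps, the CPU (or engine) is the limiter.
# The fps values below are hypothetical, not taken from the chart.

def likely_cpu_bound(native_fps: float, upscaled_fps: float, threshold: float = 0.05) -> bool:
    """Return True if upscaling gained less than `threshold` (5%) over native."""
    gain = (upscaled_fps - native_fps) / native_fps
    return gain < threshold

# Hypothetical: B570 + Ryzen 5600
print(likely_cpu_bound(native_fps=72.0, upscaled_fps=73.0))   # True  -> looks CPU-limited
# Hypothetical: same card with a much faster CPU
print(likely_cpu_bound(native_fps=80.0, upscaled_fps=104.0))  # False -> GPU-limited
```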
 
What I find mind-blowing about the B570 is its power consumption.

It performs like the 7600 XT, but while the 7600 XT draws 160W in Senua's Saga: Hellblade 2 and in Doom Eternal, the B570 draws 70W. It's amazing.

In the new CoD, the power consumption is also very low while it matches the 7600 XT's performance. The 7600 XT clocks in at 160W and the Intel card at 70-80W, compared to the 4060's 110-120W (those numbers are consistent throughout the video).
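Taking the video's wattages at face value and assuming all three cards land at roughly the same frame rate (the exact fps aren't quoted here, so the number below is hypothetical and only the ratios matter), the perf-per-watt gap works out to roughly 2x in the B570's favour over the 7600 XT. A quick sketch:

```python
# Back-of-the-envelope perf/W, assuming the three cards reach roughly the same fps
# (as claimed for the CoD scene) and using the wattages quoted from the video.
fps = 100.0  # hypothetical common frame rate; only the ratios matter

watts = {"B570": 75.0, "RTX 4060": 115.0, "RX 7600 XT": 160.0}

for card, w in watts.items():
    print(f"{card}: {fps / w:.2f} fps/W")
# B570:       1.33 fps/W
# RTX 4060:   0.87 fps/W
# RX 7600 XT: 0.63 fps/W  -> roughly half the B570's efficiency at equal performance
```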


I'm a sucker for power efficiency, and if this graphics card were priced at €200 I'd buy it in a jiffy.
 
Ah, OK, that explains it.

On another note, with everything that's been happening to Intel lately, they seem to live in a very odd world. Their GPU department is doing much better than their CPU department.

I mean, Intel lives in some kind of utopia where their CPUs consume more than 300W but their GPUs consume 70W. Shrug.
 
Ah, OK, that explains it.

On another note, with everything that's been happening to Intel lately, they seem to live in a very odd world. Their GPU department is doing much better than their CPU department.

I mean, Intel lives in some kind of utopia where their CPUs consume more than 300W but their GPUs consume 70W. Shrug.
It's pretty unfortunate that these $200-$250 cards only perform like they do in reviews when paired with a $500 CPU. Not sure what reviewers should do about this, other than doing what HUB did and testing on multiple CPUs. But then what is the verdict? Any conclusions drawn on a 9800X3D are borderline pointless, since you don't know which games will take a huge hit on a CPU that would actually be used with these cards.
 
It's pretty unfortunate that the $200-$250 cards only perform like they do in reviews when paired with a $500 CPU. Not sure what reviewers should do about this, other than doing what HUB did and testing on multiple CPUs. But then what is the verdict? Any conclusions drawn on a 9800X3D are borderline pointless, since you don't know which games will take a huge hit on a normal CPU.
The ball is in Intel's court. Only they know what is happening.

Still, in the review I posted, with the B570 showing amazingly low power consumption and performing really well (quite a few times even better) against a 7600 XT 16GB and the RTX 4060 at 1080p and 1440p, while beating the competition to a pulp with an average power draw of 70-80W, the guy was using a Ryzen 5700, which is the generation after my 3700X, so relatively old and affordable these days.
 
What I find mind-blowing about the B570 is its power consumption.

It performs like the 7600 XT, but while the 7600 XT draws 160W in Senua's Saga: Hellblade 2 and in Doom Eternal, the B570 draws 70W. It's amazing.

In the new CoD, the power consumption is also very low while it matches the 7600 XT's performance. The 7600 XT clocks in at 160W and the Intel card at 70-80W, compared to the 4060's 110-120W (those numbers are consistent throughout the video).


I'm a sucker for power efficiency, and if this graphics card were priced at €200 I'd buy it in a jiffy.
That's impressive, but the real competition will be Navi 44.
 
The ball is in Intel's court. Only they know what is happening.

Still, in the review I posted, with the B570 showing amazingly low power consumption and performing really well (quite a few times even better) against a 7600 XT 16GB and the RTX 4060 at 1080p and 1440p, while beating the competition to a pulp with an average power draw of 70-80W, the guy was using a Ryzen 5700, which is the generation after my 3700X, so relatively old and affordable these days.
That's pretty impressive. I didn't look but did they show CPU power consumption as well? I'm wondering if the CPU is getting hit harder on the Intel GPUs, which could offset some of the observed power gains. Especially on a modern Intel CPU (space heater).
 
That's pretty impressive. I didn't look but did they show CPU power consumption as well? I'm wondering if the CPU is getting hit harder on the Intel GPUs, which could offset some of the observed power gains. Especially on a modern Intel CPU (space heater).
Yeah, they do show the CPU's power consumption as well. There doesn't seem to be an issue; the CPU's power consumption with the B570 is normal compared to the other cards. Still, that 110-120W average for the RTX 4060, the ~80W average for the B570, and the 160-170W average for the 7600 XT throughout the video are figures that other reviews are missing.

The B570 is the card on the far right.

[Images: GPU power consumption overlays from the video]


On top of that, the guy didn't remove the plastic from the card and even so it was running cool. 🤷‍♀️
 
What I find mind-blowing about the B570 is its power consumption.

It performs like the 7600 XT, but while the 7600 XT draws 160W in Senua's Saga: Hellblade 2 and in Doom Eternal, the B570 draws 70W. It's amazing.

In the new CoD, the power consumption is also very low while it matches the 7600 XT's performance. The 7600 XT clocks in at 160W and the Intel card at 70-80W, compared to the 4060's 110-120W (those numbers are consistent throughout the video).
It looks like they're not reporting it correctly in software. Or it's only measuring the GPU die rather than full board power, like AMD used to do. Reviewers that directly measure the voltage and current in and out of the GPU (e.g. TPU and GN) are measuring about 150 W.
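For reference, a minimal sketch of what direct measurement means: board power is just voltage × current summed over every input rail (the PCIe slot plus any external connectors), which is why it can diverge a lot from a software sensor that only covers the GPU die. The rail readings below are hypothetical examples, not B570 measurements:

```python
# Board power = sum of V * I over every input rail (PCIe slot + external connectors).
# The readings below are hypothetical, not measurements of the B570.
rails = {
    "pcie_slot_12v":  (12.1, 4.0),   # (volts, amps)
    "8pin_connector": (12.0, 8.5),
}

board_power = sum(v * a for v, a in rails.values())
print(f"{board_power:.0f} W")  # ~150 W, in line with TPU/GN-style measurements
```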

 
It looks like they're not reporting it correctly in software. Or it's only measuring the GPU die rather than full board power, like AMD used to do. Reviewers that directly measure the voltage and current in and out of the GPU (e.g. TPU and GN) are measuring about 150 W.

That sounds a bit more realistic; that's a tad higher than the 4060. Not that impressive then. Still OK considering the price of the card, just not impressive.
 