Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

Status
Not open for further replies.
I just hope Intel keeps at it. The more players we have for discrete GPU options, the better off we are as consumers. So I am hopeful that even if Intel can't get the top of the market figured out, they could at least be more competitive at the lower end. If they can make good $100-$300 boards, or GPUs for laptops, then sure, maybe gamers won't go out and pick this over AMD or NVIDIA. But perhaps enough people out there who are more casually into games will get better access and go with an Intel board.
 

[Image: Intel Arc A750 1440p Ultra DX12 benchmark chart, via VideoCardz]
 
I wanted Intel to do well with their GPU, I really did. I think I bought into the hype early because of the team they were putting together and all of the people I knew on it. I should have seen this coming over a year ago and didn't, which has gotten me angry with myself for falling for the hype, because this feels EXACTLY like the Vega launch did to me, and I'm seeing tons of similarities in the way they're rolling it out.

I don't want Intel to fail at GPUs; I want a third competitor. I just don't think Arc is going to do it, and I don't think Intel is going to put up with the shenanigans they pulled launching it and will nix the Battlemage series. :(
 
I don't want Intel to fail at GPUs, I want a third competitor, I just don't think Arc is going to do it.

I mean, it looks competitive with the 3060. So if it's price-competitive from their end, and they can profit at those price points, it should be fine.
 
If they can get their drivers over the LOL hump. :(
And I include XeSS with that.
We know of at least 2 games that have it implemented, The Riftbreaker & Dolmen, possibly 3, but I don't think it's been mentioned for Hitman, only RT-optimized.

Yet they've not allowed it to be enabled.
They could even leave out the DP4a library if they don't want it running on other GPUs yet, for marketing reasons.

I just think the software side especially is way behind.
 
Man, Intel fucked up way harder than I thought they would before diving into the trenches near Izyum.
This is worse than Vega.
I just hope intel keeps up with it.
Please no, Intel sunk lotta money into other useless vanity projects beforehand.
We need good Xeons, not shit GPUs.
they could at least be more competitive in the lower end
Navi 33 says hello!
Low-end is all about having the best raw PPA possible, and Intel has the worst out there (156 mm² G11 versus 203 mm² N33, hahahaha holy shit, same node too).
 
I mean it looks competitive with the 3060
Only in select DX12 titles; the rest is horrendously bad.

Also, you can't evaluate this product on performance alone. As many outlets have already discovered, frame pacing in games is bad, visual bugs and artifacts run rampant, and games refuse to even launch, not to mention the myriad of game-breaking bugs. RT performance is also not mentioned by Intel, implying a problem there as well.
 
Only in select DX12 titles, the rest is horrendously bad.

RT performance is also not mentioned by Intel, implying a problem there as well.
Select? Over 40 games is quite a bit more than just "select titles", and that included several games which haven't been properly optimized yet (according to Intel).
RT performance has been benchmarked already; at least in the low end the loss is smaller than AMD's, and NVIDIA doesn't have RT support in the same category to compare against. Scaling remains to be seen.
 
If they can get their drivers over the LOL hump. :(

This is worse than Vega.

We need good Xeons, not shit GPUs.

Low-end is all about having the best raw PPA possible, and Intel has the worst out there.

Only in select DX12 titles, the rest is horrendously bad.

I mean, drivers should get better over time. It's a new product for Intel, so there can be a lot of improvement in that regard.

As for the rest, it's a first release, and it doesn't mean newer products will be limited to low-end performance. Vega had a lot of issues, RDNA fixed a bunch, and then RDNA 2 got them back into the race with feature parity and performance parity, at least in traditional rasterization, only falling behind in ray tracing. So perhaps it could be the same with Intel here: this product gets them on the board and gives them time to fix drivers for their next release, which hopefully brings them close to feature and performance parity.

I'd rather have 3 players than 2 players, is my personal opinion.
 
Social media has not been great for hardware reporting; everything is hyperbole. As long as its power consumption is competitive with the RTX 3060 and the benchmarks are real, I really don't think this sets a bad tone. Obviously the delays and the way the A380 was released show some problems inside Intel, but a prophecy of doom is a bit much.

They are almost certainly saving XeSS for the real (western) launch, got to keep some marketing powder dry. No need to paint that as a pre-emptive sign of weakness.
 
over 40 games is quite a bit more than just "select titles" and it included several games which haven't been properly optimized (according to Intel)
They specifically talked about how they mainly optimized the most important 100 games (popular, used by reviewers for benchmarking, etc.). So yes, 40 DX12 games are a drop in the ocean, and they ARE cherry-picked, so "select" is the right word.
RT performance has been benchmarked already,
RT performance has never been shown in any Intel marketing material, whether for the low end or otherwise. Why do you think so?
 
So yes, 40 DX12 games are a drop in the ocean, and they ARE cherry-picked, so "select" is the right word.

RT performance has never been shown in any Intel marketing material, whether for the low end or otherwise.

I think that is the smart thing to do when you are releasing a brand-new product for your company. Valve did the same with the Steam Deck: they started off with a handful of fully compatible titles and slowly but surely added to them.

If these are Intel's best-performing titles, it's only a problem if performance in other games isn't improved. If over time they gain parity with, or perform better than, the cards they are compared to, I don't see an issue.

I have said before that this isn't a card a serious gamer is going to go out and buy, expecting to play titles from 20 years ago with performance and features similar to AMD's and NVIDIA's latest and greatest. But I can see someone like my cousin's 28-year-old, who last seriously played Skyrim and Stardew Valley, wants Starfield, and is looking to update her 6-year-old PC. Depending on its performance, what it costs, and how easy it is to get versus NVIDIA's or AMD's similarly priced cards, this could be a great deal for her.
 
Vega had a lot of issues , RDNA fixed a bunch and then RDNA 2 got them back into the race with feature parity and performance parity at least in traditional rasterization and only falling behind in ray tracing.
AMD had to do a complete management overhaul at Radeon, and they brought some Zen people in to get where they are now, and where they'll be by EOY.

Intel has no such luxury, their CPU roadmap is also on fire in places most undesirable.

Again, good Xeons are better than bad dGPUs.
Plain and simple.

I'd rather have 3 players than 2 players, is my personal opinion.
Obviously but Intel has a huge bowl of problems to solve and they're all kinda more important than being a 3rd GPU player.
 
AMD had to do a complete management overhaul at Radeon, and they brought some Zen people in to get where they are now, and where they'll be by EOY.
The Zen people did pretty much nothing except custom-fit the caches, to my understanding.
 
The Zen people did pretty much nothing except custom-fit the caches, to my understanding.
They did a whole lot of methodology work and cultural brainwashing (i.e. the same stuff Intel needs to do to get back on track).
Mrs. Plummer was the hypnotoad there iirc.
 