Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

Status
Not open for further replies.
Yeah, that's gotta be it. It's not that it's a tenuous economic climate for many in the midst of 'midrange' GPUs now costing ~$500-$700 instead of $200-$300 in years past.

It's that RT is 'being cancelled' by the 'less educated' masses. jfc🙄


A 3060 ain't that expensive and is the mainstream/most popular GPU according to Steam. The A770/3060 are not far from the $300 price range, just like the same performance class 'in the past'. It's just that the ceiling is much higher now.
 
Guys on a gfx enthusiast forum realize there is an outrage against costly RT.
Baffled.
Then posting comparison screenshots and discussing which pixel looks better in which image.

Cute ;D
 
Guys on a gfx enthusiast forum realize there is an outrage against costly RT.
Yeah, RT is costly, yet it runs on a 2060/3050/A750 and even potato-level hardware like the RX 6400/A380. It's the same repetitive BS logic for 4 years straight; this is getting tiresome.

This is a high-end tech forum for graphics enthusiasts, not a forum for second-hand cost-effective hardware; discussing, pixel peeping, and analyzing RT is what we do.
 
This is a high-end tech forum for graphics enthusiasts, not a forum for second-hand cost-effective hardware; discussing, pixel peeping, and analyzing RT is what we do.

Enabled by products. The effectiveness of that technology is determined in no small part by the price points at which it's actually achievable for the general public. Nobody would give a shit about any of this stuff if it weren't for enhancing games, and for RT to literally be a game-changer you need this technology to penetrate a sizeable enough userbase for developers to start designing for it from the outset instead of tacking it on.

All technology is only as meaningful as the people it can ultimately impact. Discussing/pixel peeping/analyzing RT is also what people are doing when they don't feel the benefit of it in most games warrants the cost in terms of outright price of the hardware necessary, and/or the sacrifices to performance/resolution they have to make for their budget.

It may have gotten out of hand, but it's understandable for discussion about the relative worth of certain technologies to grow out of a thread about a new product line that has had its price advantage serve as a key marketing tactic. Intel's approach is no doubt born of necessity, but they apparently feel that speaking to how costly this technology has become for many potential gamers is an effective approach for some reason.
 
I truly hope that the b3d mod team will separate console and PC discussions, since it is not cute anymore.
How do you mean this? I didn't notice a console vs. PC war?

Don't say you think console gamers are envious of advanced RT on PC, and that's what explains the outrage?

That's not what I think. Enthusiast PC was always ahead. More FPS, higher res, higher settings.

What's new is the massive cost increase, and it started with Turing, introducing RT and ML. Many can't afford it, and are afraid somebody will take their gaming away from them sooner or later.
Then there is a general anxiety about AI. AI takes our jobs. AI analyzes our data, so companies and their servers know more about our lives than we can remember ourselves, and they don't ask for our agreement to do this.
Some people are afraid AI-driven cars will kill their kids on the street.
People are also tired of pointless tech hype in general.
Those are quite a few reasons to explain why NV takes the largest hit.

Yeah, RT is costly, yet it runs on a 2060/3050/A750 and even a potato RX 6400/A380. It's the same BS logic for 4 years straight; this is getting tiresome.
It's tiresome, but it won't go away. We have to deal with it, no matter which side we take.
At least it's fun, no? We respect each other, which is what matters. But how often do you have a smile on your face when you post something tickling the 'enemy'?
I smile a lot. It's a bit lonely to work alone for years. But this forum really is fun, and I feel like there are some other nerds with the same stupid interests as I have.

This is a high-end tech forum for graphics enthusiasts, not a forum for second-hand cost-effective hardware; discussing, pixel peeping, and analyzing RT is what we do.
For whom?
It's pointless if people can no longer afford it.
Either we care about cost efficiency and enable awesome gfx on that HW, or it's game over.
Praise of path-traced demos on data center HW does not help anybody. It only increases the envy and feeds the fire.
It's about games, not gfx. The Romans called it 'bread and circuses'. Everybody should have both. It was never meant for a rich elite and won't work for one. Making games is too expensive.
 

I just personally think it's better to have console people discussing in the console sections and PC people in the PC sections... it's just never going to end up in healthy discussions otherwise.

Enthusiast would be 3080 Ti and beyond for Ampere, perhaps. You do not need such GPU products to enjoy ray tracing. A 4060 would be somewhere in RTX 3080 range, which is going to be 2020 high-end or close to enthusiast-level hardware.
DLSS 2, and especially 3, enable gamers to achieve even higher settings and fidelity. It's like all some are seeing is the 4080 and 4090 (high-end/enthusiast for Lovelace). It's not just about higher settings anymore but also the addition of features such as ray tracing and ML.

If people are so afraid, they shouldn't be looking at high-end or enthusiast products. A 3060 is going to give you console-level performance with the addition of ML and RT performance that outmatches them, and it's currently the most popular GPU. A 3060 Ti or even a 3070 is quite a bit faster than that. We shouldn't forget that a 4080 12GB (which some say should be a 4070) actually performs like a 3090 Ti with the added bonus of better RT and ML performance (DLSS3 etc.).
The average PC gamer isn't going to be buying 4090 GPUs; no, that will be a 4060 Ti/4070 perhaps.

It's tiresome, but it won't go away. We have to deal with it, no matter which side we take.
At least it's fun, no? We respect each other, which is what matters. But how often do you have a smile on your face when you post something tickling the 'enemy'?
I smile a lot. It's a bit lonely to work alone for years. But this forum really is fun, and I feel like there are some other nerds with the same stupid interests as I have.

It's fun as long as people don't attack each other personally, which I must say hasn't happened all that much in this discussion, so that's good :p

For whom?
It's pointless if people can no longer afford it.
Either we care about cost efficiency and enable awesome gfx on that HW, or it's game over.
Praise of path-traced demos on data center HW does not help anybody. It only increases the envy and feeds the fire.
It's about games, not gfx. The Romans called it 'bread and circuses'. Everybody should have both. It was never meant for a rich elite and won't work for one. Making games is too expensive.

You have GPUs for around $300 to $350 that are enough for most gamers. You're blindsided by all these 3090 and 4080/90 GPUs, it seems. You don't need data center HW (wtf lol) when a 3060 Ti is doing very, very well. Things scale up from there. The death-of-the-platform claim has been sung since the '80s. It's as old as the hills.

Anyway, why talk about all this here? I don't think this is the topic for it; create a topic around 'the death of PC gaming' and you will receive a lot of attention, I'm sure. That's my main complaint: a GPU is released, be it Intel or NV, and things spin off into discussions like this. Intel's GPUs are just that, quite affordable for reasonable performance, and then we talk about NV prices being too high in the Intel Arc topic :p
Otherwise I do respect you as a forum member, you never attack personally, and you seem to have some good technological knowledge as well. I think everyone should just get along better, but OK. Things have been unhealthy since the release of at least Turing with DLSS and RT, I think.
 
Dude... More than 15 mentions of Nvidia products in this post. One mention of Arc.
 
This is a high-end tech forum for graphics enthusiasts, not a forum for second-hand cost-effective hardware
Ouch. I'm so sorry, I'll leave. I don't even have resizable BAR in my antique, I'm unworthy of commenting on this sort of thing. :(

For what it's worth I wish I did so I could play around with the new ARC. Even if it's buggy it's new and different and I'd have fun getting annoyed by it.
 
Really, your pain is felt. Having a new Intel GPU has to be fun (sometimes frustrating too, maybe, but that's what makes it interesting, seeing how it improves over time).

As for Resizable BAR, maybe you can get that feature if you update the BIOS, but I wouldn't hold my breath.

On another note, it's kind of puzzling that the power consumption of the A750 is the same as that of both A770 models.

 
A 4060 would be somewhere in RTX 3080 range, which is going to be 2020 high-end or close to enthusiast-level hardware.
It's expected that next gen is faster than the previous gen. But it is new that the price now goes up linearly with performance. Power draw goes up the same way too, so you pay even more.
The price of electricity, fuel, and food goes up too. So people aim to pay less for their hobby, not more.

NV ignores this completely; Intel reacts as desired, but people are quite unsure about the risk of investing in a newcomer.
Which means there is no really good offer for many of them. After years of chip shortages and moon prices, still no offer.

That's the situation. And it really is easy to understand why people are disappointed and feel abandoned by an industry they have supported with their money over many years.
Regarding RT and ML, I feel like the resistance has settled. Initially many disliked it, but it seems to me those people have changed their minds.
I don't think it's about the technology, or whether people think it's needed or not. It really is just about the pricing.

The arguments that I read here are inconsistent and bent into shape as needed.
If I say RT is too expensive, I hear I can get a 3060 for a good, small price. (The same price I've spent on x80 models some years before.)
The same people claim that a 6700 is 'too slow' for proper ray tracing, that consoles can barely do it at all, etc.
Tech journalists think RT is not relevant at the midrange level of an A770. (Quite a statement!)

That's confusing, but the final conclusion is that RT is a high-end feature, and midrange is not good enough, while entry level is not even a topic yet.

At the same time, Jensen himself confirms GPU prices won't go down anymore. So they will only go up even further, besides trumpeting the newest ML and RT features for those who can pay a premium.

Connecting all those dots, I really ask how any of you can seriously wonder why anti-RT echo chambers form, or why people like me dare to doubt the new shiny god, which you declare to be the future.
Because if this is the only future, then mainstream gaming is literally declared to die out rather quickly. Am I wrong? It does not matter - because that's what many people think and experience.

Now you may respond with 'RT is optional. You do not need to have it.'
But at the same time you request and predict that games should and will finally be made entirely on top of RT, and that the dark past of affordable gaming should be buried asap.
Which means: some day soon I will have to have it, but then it will be even less affordable. So I will have to quit, even if I don't want to.

And that's not all.
We have had the same problems with games themselves for years, and it's growing. They go up from 60 to 70 to 80 bucks, while people are more and more disappointed about the content. Always the same recipes and franchises, released years before they're finished, just remakes from a better past, requiring always-online for a single-player game, MTX and loot boxes, etc., you name it. Political things aside.

It feels like the gaming industry is at an all-time low of creativity, and devs do not enjoy making games. They have to crunch, get the lowest pay, suffer harassment at work, management only cares about money, etc...

That's the current public impression, as we hear it. And no blindfolds are big enough to ignore it.

I think we have to figure out what people want, not what we want. We have to make better offers.
Regarding HW, Intel shows the way. It's the first time RT feels 'in reach' for real.
 
So, general takeaways for the A750 and A770:
  • Raster performance vs. price is generally decent in DX12; DX11 is a bit worse, and for <=DX10 games, or if you don't have ReBAR, get something else
  • Raster vs. die size and paper specs is significantly underperforming: the die is larger than GA104 and significantly larger than N22 (406mm2 vs. 393mm2 vs. 335mm2 respectively, 128 ROPs!), yet it gets outperformed by both in raster
  • RT is good: at worst it's not far off Ampere, being harsh; it's far better than RDNA2, and often goes from GA106/N23 raster levels to GA104 levels of performance with heavy RT, significantly beating N22 in heavier RT titles
  • Scales much better with resolution, pointing to utilisation issues at lower resolutions
  • Other features: the XMX units have very good specs, encoding is good, and XeSS preliminarily looks good, although more performance and quality data is needed
There are lots of potential driver issues to be resolved, and maybe significant performance improvements over the next few years, on top of Arc being far later than they would've liked, with competitors' next gens coming soon. With N33 we're expecting a good improvement: GA104/N22-level raster with much-improved RT vs. RDNA2 at half the die size of the A770 (smaller than N23), so they'll be licking their lips at the potential margins, I suspect. For Nvidia, who knows how much the 4080 6GB will be. Overall a mixed bag with the alchemical pipecleaner, but that's expected; it's a huge undertaking getting into a new segment against well-established players, and they won't get everything right straight out of the gate. Hopefully they make big strides improving on these foundations and we have 3 big competitors in 2025.
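For a quick sanity check of the die-size bullet, the quoted areas work out to the deltas below (a sketch using only the mm2 figures quoted above; "ACM-G10" as the name of the A770's die is my addition):

```python
# Die areas quoted above, in mm^2.
dies = {"ACM-G10 (A770)": 406, "GA104": 393, "N22": 335}

a770 = dies["ACM-G10 (A770)"]
for name, area in dies.items():
    if name.startswith("ACM"):
        continue  # skip comparing the A770 die with itself
    delta = (a770 - area) / area * 100
    print(f"A770 die is {delta:.1f}% larger than {name}")
# -> A770 die is 3.3% larger than GA104
# -> A770 die is 21.2% larger than N22
```

So the gap to GA104 is small in area terms; the larger gap is to N22, which is what makes the raster-per-mm2 comparison unflattering.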
 
It's expected that next gen is faster than the previous gen. But it is new that the price now goes up linearly with performance. Power draw goes up the same way too, so you pay even more.
The price of electricity, fuel, and food goes up too. So people aim to pay less for their hobby, not more.

Those are problems regardless of IHV or platform, even consoles. Anyway, this isn't the topic for all that.
 

If it didn't use so much power for its performance level, especially when idle, I'd be interested in it for an HTPC. I do wonder how good and fast the encoding engine is, and whether the GPU would be capable of significantly accelerating interpolation to transcode videos from 24/30 FPS to anywhere from 60 to 120 FPS.

One of the things that makes me really want to upgrade from the 1070 is to get a GPU that is more capable of assisting with interpolating videos to higher framerates so they don't look like arse (24/30 FPS video, yuck).

Regardless, at the price they are selling it at, if it proves to be really good at transcoding (interpolating) video to a higher framerate, I might be tempted to build a specialized PC with an ARC GPU just for those duties.
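As a point of reference for the interpolation idea, ffmpeg can already do motion-compensated frame interpolation on the CPU via its `minterpolate` filter; here is a minimal Python sketch that assembles such a command (the file names are placeholders, and a GPU's media engine would only accelerate the encode step, not this filter):

```python
# Build an ffmpeg command that motion-interpolates video up to 60 FPS.
# minterpolate with mi_mode=mci does motion-compensated interpolation;
# -c:a copy passes the audio through untouched.

def interpolate_cmd(src: str, dst: str, target_fps: int = 60) -> list[str]:
    vf = f"minterpolate=fps={target_fps}:mi_mode=mci:mc_mode=aobmc"
    return ["ffmpeg", "-i", src, "-vf", vf, "-c:a", "copy", dst]

print(" ".join(interpolate_cmd("input_24fps.mp4", "output_60fps.mp4")))
```

The resulting command can be run directly or via `subprocess.run`; expect it to be slow, which is exactly why hardware-assisted interpolation is attractive.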

Regards,
SB
 
I don't expect this to be true before 2026.
Let's see, PC games of 2022:

Elden Ring, announced to have RT
Dying Light 2: Has RT
Gotham Knights: has RT
Ghostwire Tokyo: has RT
Saints Row: has RT
Callisto Protocol: has RT
F1 2022: has RT
Warhammer 40K: Darktide: has RT
A Plague Tale: Requiem: has RT
Portal RTX: has RT
Marvel's Spider-Man Remastered: has RT
Marvel's Spider-Man: Miles Morales: has RT
Sackboy: A Big Adventure: has RT
Trail Out: has RT
Q.U.B.E. 10th Anniversary: has RT
Loopmancer: has RT
The Diofield Chronicle: has RT
Steelrising: has RT
Blind Fate Edo no Yami: has RT
Stray: has RT when running DX12

Modern Warfare 2: no RT
Overwatch 2: no RT
Sifu: no RT
Total War Warhammer 3: no RT
Rainbow Six: Extraction: no RT
Tiny Tina's Wonderlands: no RT
Marvel's Midnight Suns: no RT
Lost Judgment: no RT
Judgment: no RT
Scorn: no RT
Sniper Elite 5: no RT
Uncharted 4: no RT
God Of War: no RT
Shadow Warrior 3: no RT

I rest my case.
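Tallying that list as written (and counting Elden Ring's announced support as 'has RT'), the split can be checked in a couple of lines:

```python
# Counts taken directly from the 2022 list above: 20 titles with RT
# (including Elden Ring's announced support), 14 without.
with_rt, without_rt = 20, 14
total = with_rt + without_rt
share = with_rt / total * 100
print(f"{with_rt}/{total} listed 2022 PC games ship RT (~{share:.0f}%)")
# -> 20/34 listed 2022 PC games ship RT (~59%)
```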
 