Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

Control was launched 3 1/2 years ago and has every raytracing feature: Reflections, GI, AO, shadows.
It had all those features with RTX off too.
GI is globally static in both paths.
Reflections are traced in both paths, but SW misses dynamic objects.
Shadows are higher quality with RT.
So that's high settings, but no new features.

UE5 in contrast shows new geometry detail which was not possible before, and RT can't support this new feature.

So those are pretty weak points for HW RT. But thanks for supporting my point.
 
We would have got to that future 10 years earlier if HW RT had been gently but properly introduced 5 years later. That's what I'm trying to say.
Conjectures and speculations, no thanks, I prefer the present.
Good luck with expecting a future of 300M dollar games made to serve a rich niche of 10 psycho enthusiast echo chamber nerds sharing the same mindset.
Yeah, if it comes to it, I prefer that future, rather than waiting forever for some lazy ass developer to spoon feed me extra AF or some pathetic extra draw distance on my PC game (this has been the case since forever). If you need budget gaming, a console is your friend.

Or you can also stop playing with double standards and buy a budget GPU to play with console settings, if you are so keen on PC gaming that is.

As an enthusiast PC gamer, I am more excited than ever: now my PC game can have visual features leaps and bounds beyond what console games offer, and now my hardware gets to do some actual work.

UE5 in contrast shows new geometry detail which was not possible before, and RT can't support this new feature.
The Matrix demo on PC pretty much DESTROYED all of your conjectures right there: HW RT, great GI, reflections and area shadows. It's the perfect proof of concept that HW RT works and works well.
 
GI is globally static in both paths.
Reflections are traced in both paths, but SW misses dynamic objects.
Did you even play the game? GI is static, yes, but AO is not: it covers more areas and adds many more shadows with RT. Reflections are expanded with RT to handle many more static objects that are omitted without RT, all dynamic objects are included as well, and transparency reflections are also present.

Shadows are higher quality with RT.
And there are many more of them as well.

Please go play the game before spewing false second hand information.
 
Did you even play the game? GI is static, yes, but AO is not
I did, yes. And sorry, I forgot to respond to AO. I've ignored it because I never care about AO in general.
Fact is: if you still need AO, then your lighting solution is a very low frequency approximation or incomplete. So RT in this case cannot provide a complete lighting solution and still depends on hacks, just to mention.
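As a rough illustration of what I mean by AO being a hack (a minimal sketch of my own, with a made-up toy scene, not Control's or any engine's actual code): the whole technique produces one scalar from short, local occlusion rays and multiplies it onto a flat ambient term. It never redistributes light, which is exactly why a complete GI solution would make it redundant.

```cpp
// Minimal AO sketch (illustrative only, hypothetical toy scene): occlusion is
// estimated with short hemisphere rays against a single sphere occluder, and
// the result is just one scalar that darkens a flat ambient colour.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Toy occluder: a sphere of radius 0.5 hovering above the shading point.
static bool occluded(Vec3 o, Vec3 d, float maxDist) {
    const Vec3 c{0.0f, 1.0f, 0.0f};
    Vec3 oc{o.x - c.x, o.y - c.y, o.z - c.z};
    float b = dot(oc, d);
    float disc = b * b - (dot(oc, oc) - 0.25f);
    if (disc < 0.0f) return false;
    float t = -b - std::sqrt(disc);
    return t > 0.0f && t < maxDist;
}

// Cosine-weighted hemisphere sample around +Y (the toy surface normal).
static Vec3 sampleHemisphereUp(std::mt19937& rng) {
    std::uniform_real_distribution<float> u(0.0f, 1.0f);
    float r = std::sqrt(u(rng));
    float phi = 6.2831853f * u(rng);
    float x = r * std::cos(phi), z = r * std::sin(phi);
    return {x, std::sqrt(std::max(0.0f, 1.0f - x * x - z * z)), z};
}

int main() {
    const Vec3 p{0.0f, 0.0f, 0.0f};  // shading point on a ground plane
    const int samples = 256;
    const float radius = 1.0f;       // AO rays only look this far: purely local
    std::mt19937 rng(42);
    int blocked = 0;
    for (int i = 0; i < samples; ++i)
        if (occluded(p, sampleHemisphereUp(rng), radius)) ++blocked;
    float ao = 1.0f - float(blocked) / float(samples);
    // The entire result: one factor multiplied onto a constant ambient term.
    std::printf("AO factor %.3f, shaded ambient %.3f\n", ao, 0.25f * ao);
    return 0;
}
```

In a real renderer the occlusion trace would hit actual scene geometry (via RT hardware or a software structure), but the shape of the technique is the same: a local darkening factor layered on top of a low-frequency ambient approximation, not a lighting solution in itself.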

Please go play the game before spewing false second hand information.
I have almost finished it.
Regarding RT, I depend on sources found here, e.g. DF videos.

Anyway, it's nitpicking. We get into details, and as soon as somebody makes a claim, the other side can use the details against him.
I'm sorry for doing this on my side too. It's not ok to be an ass just because other people have different convictions.

My apologies, guys. We need to find better ways to deal with the conflict...
 
It had all those features with RTX off too.
GI is globally static in both paths.
Reflections are traced in both paths, but SW misses dynamic objects.
Shadows are higher quality with RT.
So that's high settings, but no new features.

UE5 in contrast shows new geometry detail which was not possible before, and RT can't support this new feature.

So those are pretty weak points for HW RT. But thanks for supporting my point.
What Control did you play? Without Raytracing there is a ton of graphical effects missing - from reflections everywhere to transparent reflections, much better AO, contact shadows for everything, and indirect diffuse lighting. These are all "new" features. Otherwise, what "new" features is UE5 bringing to the market? More geometry and better GI aren't really new either...
 
What Control did you play? Without Raytracing there is a ton of graphical effects missing - from reflections everywhere to transparent reflections, much better AO, contact shadows for everything, and indirect diffuse lighting. These are all "new" features. Otherwise, what "new" features is UE5 bringing to the market? More geometry and better GI aren't really new either...
Why are you so hostile? The claim was that we'd have nothing to talk about without HWRT. The counter-claim was that UE5 and Crytek have brought plenty to talk about even without it. It's not about which brings more to talk about.
 
Again, since you skipped it last time:
AMD already has both RT and ML acceleration. If they wanted to have their matrix cores in gaming GPUs, they would be there.
It would legitimately shock me if this wasn't on the roadmap. It may not be ready for, say, RDNA3 etc., but it's likely to arrive. I think most companies are coming to the same general conclusion that ML, once you get beyond the fixed costs, will be cheaper than brute forcing.
If you look on Twitter to see what people are actually saying, it's nothing to do with Nvidia, nor is it an insane reason.

People just don't think the visual difference justifies the massive performance drop.

It is and can be that simple of an explanation.
It's a bit like 4K: you had to wait until people got used to using it, and then try taking it away. RT represents the removal of an artificial ceiling that continues to plague our traditional T&L path.
On a longer time scale all games will be overwhelmingly RT, due to costs and visual fidelity. Quite simply, traditional methods will have hit their ceiling, while RT is just taking off, this being its 3rd generation of hardware versus a traditional rasterization path that has had decades of evolution and has stalled at compute shaders.

Gamers are welcome to push back at RT given it is still a luxury feature, but as the feature works its way into the mainstream, as we see with consoles, and as ML works its way into the mainstream, as we see with Arc and XeSS, the pushback will no longer be there.

The performance loss is just going to be an accepted part of it, and any esport title will just ignore RT as usual unless it's a fundamental part of the game experience or of fairness, which could happen one day.
 
Thanks for confirming your only interests are progress at any cost.
You call an affordable PC gaming platform a nebulous, unrealized fantasy?
You can't wait that long and prefer Moon prices gamers can no longer afford, just to gaze on some reflections.
Your ignorance of UE5 proving HW RT's issues is surely needed to keep your religion upright.
Good luck with expecting a future of 300M dollar games made to serve a rich niche of 10 psycho enthusiast echo chamber nerds sharing the same mindset.

Just...wow. I genuinely have no idea what any of that has to do with hardware raytracing in GPUs.

We would have got to that future 10 years earlier if HW RT had been gently but properly introduced 5 years later. That's what I'm trying to say.

Yes, you're saying that we would all enjoy some make-believe better future if it wasn't for the premature launch of RT. Pure speculation and fantasy. If this alternate timeline is so glorious, why did Nvidia, AMD, Intel, Microsoft and Sony all jump on the same bandwagon?
 
Intel Arc A770/A750 reviews


Finally caught up on Arc reviews. Not bad at all, looks like we have a viable 3rd player. I suspect Intel's deficiencies on DX11 won't hurt them that much short term and they'll muddle through until DX11 is no longer relevant in a few years.

Now I wait patiently for an HDMI 2.1 A380 to hit the US.
 
H/w RT is adding like less than 10% to GPU chip complexity while providing visual gains similar to those you'd get in 10 years time without h/w RT. I can't see how this is the reason for GPUs getting more expensive. If anything it's the reason they don't get a lot more expensive and instead are staying in more or less the same price range they've been for the last 10 years (yes GPUs below 4090 do exist and yes most people are actually buying them).
 
NV misusing and glorifying RT to push larger and more costly GPUs hurts gaming, imo. And I hope they fall flat on their face and this nonsense comes to an end. But that's just me and we will see.

3060, 3060Ti, 3070, A750, A770 and even RX6600XT, 6700XT... they all sport ray tracing. So will the 4050, 4060... And they all stomp all over the consoles' performance in both raster and RT/ML (the so-called baseline).

You call an affordable PC gaming platform a nebulous, unrealized fantasy?

Again, the RTX 3060 is the most popular GPU right now according to Steam. That GPU is in the ballpark of console raster performance, with more capable RT and ML performance. Probably a more generous framebuffer pool as well. That's a two-year-old Ampere GPU. The 3060Ti and 3070 aren't that expensive. The RTX 4060 will be offering 4080 capabilities in raster, probably more so in RT and ML.
The A770 16GB fits that category too. RDNA3 will probably have some nice options there as well.
It's affordable enough if you don't dive into the high-end and enthusiast range of GPUs.

Yeah, hardware RT. UE5 is exciting, but Lumen/Nanite hasn't shipped in a single game yet. The jury is still out on software Lumen quality and performance. It certainly isn't a reason to claim HWRT was launched too early. In fact, UE5 happened despite the "wasted" RT transistors. Win-win for us.

UE5, from what we have seen so far on consoles, is limited to barely 30fps, and that's in tech demos. It's too heavy for actual games, and if we want 60fps then that fidelity is going to drop hard.
Without dedicated acceleration things are going to be too slow, and quality takes a hit too.

Moore's Law is dead, so we simply have to wait a bit longer for progress. E.g. some years for the first UE5 game. You can't wait that long and prefer Moon prices gamers can no longer afford, just to gaze on some reflections.

It ain't dead. We still see nice pure raster improvements per watt, rather large improvements I'd say, looking at Ampere vs Lovelace. 'Waiting longer' is not how it works in the technology world; you have to start somewhere. If we had skipped pixel and vertex shaders etc. in the early 2000s and started with them five years later instead, things would not have advanced in the same way; we'd be lagging behind the facts.
You can't say 'ah, now RT is fast enough' and implement it just like that in one day. These things evolve slowly: things get faster and faster, and devs get used to the new tech and tools. Anyway, what's the complaint? Maxed RT games are quite playable, often achieving 30fps at the least. That's without the evil DLSS/XeSS techs. 25 years ago games could run WAY below 30fps to achieve some extra fidelity.
No idea where you get moon prices from again. A 3060 or A770 offers more than what the consoles do, and they're quite affordable GPUs. The 4060 won't be seeing 4090 or 4080 prices, that's for sure. Most PC gamers aren't on high-end hardware. Though things evolve rapidly: two years in and that bar rises again. The PC is a dynamic platform; the low end is going to surpass the baseline etc. etc.

Your ignorance of UE5 proving HW RT's issues is surely needed to keep your religion upright. Good luck with expecting a future of 300M dollar games made to serve a rich niche of 10 psycho enthusiast echo chamber nerds sharing the same mindset.

We would have got to that future 10 years earlier if HW RT had been gently but properly introduced 5 years later. That's what I'm trying to say.

If anything, UE5 is evidence that HWRT is needed to obtain viable performance levels on today's hardware, especially on consoles which have weak hardware (lower mid-range 2020 GPUs, OK-ish CPUs, no RT/ML HW in the sense that everyone else has, low bandwidth). Going brute force / using compute units to do RT is just not viable if we want the performance that was predicted. Remember also how heavy UE5 is on the CPU; something like DLSS3 is a life saver there.

Yeah, if it comes to it, I prefer that future, rather than waiting forever for some lazy ass developer to spoon feed me extra AF or some pathetic extra draw distance on my PC game (this has been the case since forever). If you need budget gaming, a console is your friend.

Or you can also stop playing with double standards and buy a budget GPU to play with console settings, if you are so keen on PC gaming that is.

That's why he prefers the console platform, and we have to respect his choice. So he has other choices.
On the consoles RT is practically non-existent in modern current-gen games and engines, bar some simple reflections (sometimes even checkerboarded, meh). Improvements will be solely in normal raster, higher settings etc.
And yeah, you can match that on PC with an RX6600XT or even a 5700XT if on a budget. Or get a 3060. That's the nice thing about PC gaming: you have the choice. All PS exclusives come to PC anyway.

The Matrix demo on PC pretty much DESTROYED all of your conjectures right there: HW RT, great GI, reflections and area shadows. It's the perfect proof of concept that HW RT works and works well.

Pretty much. UE5 practically shows at a glance that HWRT is needed. As well as monster CPUs if we want beyond 30fps in tech demos. There's a good reason Intel works so hard on a DLSS3 equivalent as well as dedicated RT acceleration, both to preserve raster performance (don't put the load on the compute units) and for speed (dedicated HW is faster).

It would legitimately shock me if this wasn't on the roadmap. It may not be ready for, say, RDNA3 etc., but it's likely to arrive. I think most companies are coming to the same general conclusion that ML, once you get beyond the fixed costs, will be cheaper than brute forcing.

There are rumors RDNA3 will sport HW ML acceleration. DF is under the assumption that dedicated RT hardware will follow for AMD in the future too. Intel was just faster than expected.

Just...wow. I genuinely have no idea what any of that has to do with hardware raytracing in GPUs.

That 'Your ignorance of UE5 proving HW RT's issues is surely needed to keep your religion upright' bit caught my attention; it's just not needed.
 
NV misusing and glorifying RT to push larger and more costly GPUs hurts gaming, imo. And I hope they fall flat on their face and this nonsense comes to an end. But that's just me and we will see.
Such a strange post... considering this thread is about a GPU architecture that is bringing very good RT performance to a very affordable price range... Just because the highest end keeps getting pricier doesn't mean affordable RT performance isn't also there.

I for one am very happy that Nvidia has pushed the entire industry down this road, and that new competitors like Intel are also pushing forward with it.
 
What Control did you play? Without Raytracing there is a ton of graphical effects missing - from reflections everywhere to transparent reflections, much better AO, contact shadows for everything, and indirect diffuse lighting. These are all "new" features. Otherwise, what "new" features is UE5 bringing to the market? More geometry and better GI aren't really new either...
Seems very subjective then.
To me, the RT extras of Control are nice but still subtle. No reason to upgrade, speaking as a gamer. (This applies to all RT games. Exodus convinces me the most.)

But the UE5 demo blew me away. The most impressive upgrade I have seen since Doom 3. It is something new, and I would not have expected this was possible. Speaking of detail, not lighting.

But tbh, I have serious concerns about UE5 being practical too, after running the Matrix demo on my PC.
I'm just the extreme opposite of many of you here: I expect that practical HW power might even go down, because people get tired of high costs and huge boxes. Switch, Steam Box, Series-S. This represents current gaming interests much better than bleeding-edge GPUs which require a 1200W PSU and a bigger case as well, I assume.
Currently, Arc feels like a saviour, keeping our PC platform afloat as is for some time. But I still think it will shift towards APUs over dGPUs. I think this progress is unavoidable, and it was not easy for me to accept. I'm a gfx enthusiast too.

When Absolute Biginner first told us about the Series-S here, my first thought was: Meh - another weak console to hold us back. Please don't build this.
But within a few days I quickly realized it is a good idea. Affordable, good enough, easy entry, sustained business.
So I warmed to the idea that scaling down is more important than scaling up.
And it turned out right, because the Series-S was always available at a good price, even at the peak of covid and the mining crisis, afaict.

Once you arrive at this mindset, the latest NV presentation of the 4090 feels like a threat, putting this reasonable low-power future in danger. Unaffordable bleeding-edge tech is sold in the same segment as mainstream, lifting people's expectations to unreachable levels, which will result in increasing dissatisfaction and envy across gaming. In a time of recession and growing critique / disrespect of games, that's the last thing we need. The GPU makers NV - and probably AMD as well - have lost their minds and work against us. That's my impression, and it won't change until they come back down to our power-limited planet Earth again.

Maybe they'll offer a reasonable midrange or even entry-level product next year, now that Intel has proven that's still possible. Late if so, and I doubt it, but still.
Maybe my worries are exaggerated, it won't be as bad as I think, and so I should cool my temper.
But maybe you should lower your expectations of offline CGI in realtime as well.

To settle the issue, I would propose some things:
HW vendors should differentiate mainstream and enthusiast products. Different names, and making clear that the latter is more reasonable for content creators than for gamers. Launching mainstream first would be nice.
Journalists should do the same. A $2000 GPU is not gaming HW, and this should be made clear very frequently. Don't present it as a 'must have' or 'best in gaming class'. Align with your audience, not with those sending you review samples.

In an ideal world we would not need such borders, but our world isn't ideal. And confusion, distraction and misinformation are an increasing issue.

[Attached image: 1665238135088.png]
It's a meme to normal people. RT may be the future, but this is not.
 
While I do think HW-RT is very important and I am glad Intel is fully invested in it, I must say I suspect UE5 will soften the blow on its importance in the long run quite significantly, as it will be the go-to engine for most developers. UE5 just has a very robust software implementation with its signed distance fields that looks very close to HW-RT in most instances. Of course, there are some big differences if you look directly at reflections or scenes with indirect lighting (you can refer to Alex's video on the Matrix demo for more details), and I've tested the difference between HW and SW Lumen extensively myself, but all in all, SW Lumen looks very close and performs similarly. HW Lumen also runs very similarly when GPU-limited on both RDNA2 and Ampere GPUs, even though Ampere has much more RT grunt, so I suspect that in UE5 games the superiority in RT performance of Intel ARC and Nvidia GPUs won't be that noticeable.
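To illustrate the distinction (my own sketch with a made-up analytic scene, not Epic's Lumen code): software distance-field tracing boils down to sphere tracing, where each step advances the ray by the distance to the nearest surface, so no triangle BVH or per-triangle tests are needed. The trade-off is that accuracy is bounded by the resolution of the distance field, which is why triangle-accurate HW-RT still wins on mirror reflections and fine indirect detail.

```cpp
// Minimal sphere-tracing sketch (illustrative only): marching a ray through a
// signed distance field instead of intersecting triangles. Each step advances
// by the distance to the nearest surface, so nothing can be missed, but the
// geometry is only as accurate as the distance field itself.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static float length3(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// Toy scene SDF: a unit sphere at (0,0,5) union a ground plane at y = -1.
static float sceneSDF(Vec3 p) {
    float sphere = length3({p.x, p.y, p.z - 5.0f}) - 1.0f;
    float plane  = p.y + 1.0f;
    return std::fmin(sphere, plane);
}

// Sphere tracing: returns the hit distance, or a negative value on miss.
static float sphereTrace(Vec3 origin, Vec3 dir, float maxDist) {
    float t = 0.0f;
    for (int i = 0; i < 128 && t < maxDist; ++i) {
        Vec3 p = {origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t};
        float d = sceneSDF(p);
        if (d < 1e-3f) return t;  // close enough to the surface: count as a hit
        t += d;                   // safe step: nothing can be closer than d
    }
    return -1.0f;
}

int main() {
    float hit = sphereTrace({0, 0, 0}, {0, 0, 1}, 100.0f);
    std::printf("ray hit at t = %.3f (expected ~4.0)\n", hit);
    return 0;
}
```

The 128-step cap and epsilon are arbitrary illustration values; the point is only that the cost per ray is a handful of cheap distance evaluations, while the geometry the ray "sees" is a blurred, merged version of the real scene.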
 
Seems very subjective then.
To me, the RT extras of Control are nice but still subtle. No reason to upgrade, speaking as a gamer. (This applies to all RT games. Exodus convinces me the most.)

That wasn't DF's take on Control's ray tracing. On the contrary, many seem to appreciate RT in that title. If that is subtle, then what about PS5 titles which have the most simplistic RT one can imagine?

But the UE5 demo blew me away. The most impressive upgrade I have seen since Doom 3. It is something new, and I would not have expected this was possible. Speaking of detail, not lighting.

It's a technology demonstrator, not a final game like Doom 3 was. Besides that, UE5 performance is higher on GPUs that sport HWRT.

I'm just the extreme opposite of many of you here: I expect that practical HW power might even go down, because people get tired of high costs and huge boxes. Switch, Steam Box, Series-S. This represents current gaming interests much better than bleeding-edge GPUs which require a 1200W PSU and a bigger case as well, I assume.
Currently, Arc feels like a saviour, keeping our PC platform afloat as is for some time. But I still think it will shift towards APUs over dGPUs. I think this progress is unavoidable, and it was not easy for me to accept. I'm a gfx enthusiast too.

'Tired of high prices and huge boxes' describes the PS5 as well in the console space, lol. A 3060 ain't a 1200W GPU, doesn't cost over a grand, and isn't a huge box either. A 6800, A770, 3070 etc. blow past the consoles, and they're quite interesting seeing as they are in quite many hands.
I understand the view from a console enthusiast; however, the views of 'APUs taking over', the eventual death of PC gaming etc. are all in your mind. DF Direct was on about this with APUs and the conclusion was that it sure wasn't the future for desktops. I wouldn't even want it, hell no.
It's not easy to accept the current situation, it's hard. But you'll get over it for sure. It's no longer 6th-gen-like with the consoles.

Once you arrive at this mindset, the latest NV presentation of the 4090 feels like a threat, putting this reasonable low-power future in danger. Unaffordable bleeding-edge tech is sold in the same segment as mainstream, lifting people's expectations to unreachable levels, which will result in increasing dissatisfaction and envy across gaming. In a time of recession and growing critique / disrespect of games, that's the last thing we need. The GPU makers NV - and probably AMD as well - have lost their minds and work against us. That's my impression, and it won't change until they come back down to our power-limited planet Earth again.

That's where the PlayStation is going looking at current trends; it can't get much better going forward. MS noted that the next generation of consoles is challenged by increasing chip/die costs. Things like GP and other such services will outlive consoles, whose only purpose is gaming after all. PCs will exist longer, that's for sure, and as long as the PC exists there will be games for it.

Maybe they'll offer a reasonable midrange or even entry-level product next year, now that Intel has proven that's still possible. Late if so, and I doubt it, but still.
Maybe my worries are exaggerated, it won't be as bad as I think, and so I should cool my temper.
But maybe you should lower your expectations of offline CGI in realtime as well.

There are mid-range offerings available if you want. You don't need halo products.


A $2000 GPU is not gaming HW

It is for those who have the money and the will for it. Though it's not like you need such a GPU. You're so far beyond the baseline it's not even funny; a 3060Ti will do, and so will an A770 or 6600XT for your mainstream user.

It's a meme to normal people. RT may be the future, but this is not.

It's a meme for console warriors in a system-wars topic, not here in the Intel Arc GPU/RT discussion. You run the chance that someone will counter with a meme against the PS5; they sure do exist, I can tell you that. The meme doesn't even make sense: the PS5 is much larger, and the card doesn't fit the '1200W GPU requirement'. It's reported.
 
It would legitimately shock me if this wasn't on the roadmap. It may not be ready for, say, RDNA3 etc., but it's likely to arrive. I think most companies are coming to the same general conclusion that ML, once you get beyond the fixed costs, will be cheaper than brute forcing.
Maybe they will, maybe not, but for now AMD has seen them as worth it only for CDNA. The units themselves should have been integratable into RDNA2 already had they so wanted (since CDNA2 was finalized with them long before, the IP for the blocks must have been finalized long before that).
It reminds me more of when HDR lighting first made its debut: huge performance drop, over-the-top brightness, and people saying it's not worth it.
Which of the several times HDR lighting has been introduced "for the first time" are you talking about in this case? Half-Life 2: Lost Coast, which many count as the first game with HDR lighting, had a notable performance drop from it, but I'm pretty sure it wasn't on the same scale as the RT performance loss is today (which of course differs from one game and card to the next).
 
Such a strange post... considering this thread is about a GPU architecture that is bringing very good RT performance to a very affordable price range... Just because the highest end keeps getting pricier doesn't mean affordable RT performance isn't also there.
Agreed. Idk why the discussion slipped to just RT again.
Personally, I accuse NV of misusing RT to push overspecced data-center ML HW to gamers, but that's subjective.
The (off)topic came up because people noticed anti-RT echo chambers on the internet. From my perspective, RT is just the scapegoat to explain / justify overspecced and overpriced 'gaming' HW.
Seems some others share this view. Obviously it comes from observing NV marketing and pricing.

Intel's RT seems fine to me. Although they follow the same path of fixed function + ML, their marketing and pricing position those features much better from the perspective of gamers.
AMD's RT is also fine. They introduced it gently, and full flexibility would have allowed further progress together with software devs. But unfortunately this door was already closed because NV was first.

It ain't dead.
Jensen says path tracing is reasonable, so it is reasonable.
Jensen says Moore's Law is dead, so it is dead. Thanks, Mr. Jensen. I've also said some things I regret, but I'm no CEO of a chip design company, lol.

Fact is: if it's not dead, then only because its definition does not include price or power draw. If we factor those in, it's dead.
We can still get progress from better architecture, better software. Smaller processes as well, but if price has to remain constant, it just does not give us much anymore. And this means things change, just not to our liking.

If anything, UE5 is evidence that HWRT is needed to obtain viable performance levels on today's hardware, especially on consoles which have weak hardware
I just disagree, and I assume those are assumptions based on second-hand knowledge, but I won't get into detail here. I've already spilled enough offtopic stuff. UE5 lighting is too much of a patchwork for general conclusions anyway, but together with Crytek's SW RT, all bases would be covered. Progress would have happened no matter what the HW industry contributes; it's just that people always wrongly assume all gfx progress comes with new HW / features.
 