Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

What about those other 8,789 games released for PC this year so far? (Steam releases alone)
Really? You are going to play this card and count the trash, run-of-the-mill games on Steam that number in the thousands each year?

One knows that when we talk about games in this forum, we mean AAA or AA, graphically advanced or well-made games, not games you can pretty much play on iGPUs with no problems. Most of these thousands of games are either 2D games, side-scrollers, or mobile games that nobody knows about. Most don't even list a specific minimum GPU, or are only validated on NVIDIA GPUs.
 
One knows that when we talk about games in this forum, we mean AAA or AA, graphically advanced or well-made games, not games you can pretty much play on iGPUs with no problems. Most of these thousands of games are either 2D games, side-scrollers, or mobile games that nobody knows about.
Speak for yourself. Many are interested in good games no matter the technology behind them.
 
This is a high-end tech forum for graphics enthusiasts, not a forum for second-hand, cost-effective hardware; discussing, pixel-peeping, and analyzing RT is what we do.
There are two aspects, as discussed by other posters. There are the technical investigations where money is no object, and then there's the 'business' side that is concerned with the usability of tech based on market penetration. Both topics are valid discussion - people just need to be clear about which topic they are debating and not step on each other's toes when someone wants to discuss a part of the overall GPU tech space.

If a thread is intended for tech and is getting swamped with market talk, or vice versa, please message the mods early and we can separate it into two discussions. But generally these discussions are a mixed-bag thread, and it's just better if people can accommodate the two debates side by side, marrying the tech with the commercial aspects, as there's inevitable crossover except for the most focused tech analysis.
 
I honestly don't remember the last AAA game I played that had MSAA. I think Crysis 3 maybe? I know Forza has it but I've never tried it there.

One example I've seen is Control. MSAA is just another way to increase the quality of the image.
 
Give me less blurriness, a more detailed world, more draw distance, etc. before RT, please.
All are already achieved in one go with UE5: massive draw distance, unlimited polygonal detail, and hardware ray tracing for GI and reflections. Bingo! It's not like RT is hindering our access to any of those features; it works on top of them.

Connecting all those dots, I really ask how any of you can seriously wonder why anti-RT echo chambers form, or why people like me dare to doubt the shiny new god you declare to be the future.
It IS the future. What you are arguing isn't new; hardware T&L faced the same animosity back in the day. I was there: a major company, 3dfx, was even against it, and for several years people argued against it too, citing increasing costs, complexity, and diminishing visual returns. But guess what? Hardware T&L dominated, and all those who opposed it lost. That stubbornness cost 3dfx its life, while NVIDIA, alongside ATi, survived and rose to the top because they were on the side of progress and technology; those who were not fell.

The same thing is repeating with ray tracing. However, 4 years since its inception, all major studios, APIs, engines, and games have adopted ray tracing. NVIDIA continues to accelerate ray tracing performance further and further, Intel hopped on the bandwagon, and AMD has no choice but to do the same. You were here 4 years ago arguing HARD against hardware ray tracing, but hardware won in the end; it's even in UE5 now. So the industry is moving forward, both at a hardware level and a software level, despite all the animosity and the unfounded, uneducated opinions of the ignorant masses. The future is clear from where I am standing, and I hope you and the others realize that. Because if you don't, well, it doesn't matter; we are here now and we are moving forward regardless.
 
It IS the future
It always was the future. As I try to plan ahead, my plans include compatibility with RT, and my approach uses some RT too. Just to make clear: I'm not against the concept.

But you cannot compare the current situation with the rise of T&L, pixel and vertex shaders, and finally compute, because such a comparison ignores that back then Moore's Law was alive and kicking.
Now it isn't anymore, so progress slows down. We have to adapt to this stagnation if we want to keep business as usual. And business was pretty good during the last gen.

So the simple question is, as I have asked since day one: Was it the right time to introduce HW RT, or did it come too early?
Does it do more damage or good to the games industry?
Was it even well thought out, or did it lack the flexibility necessary to allow progress in a time of HW stagnation?

From my perspective, all the answers are pretty clear.


all major studios, APIs, engines, and games have adopted ray tracing
Yes, but this does not strengthen your position, because:
I can still play all those AAA games very well on my Vega 56. They don't leave me behind or force me to upgrade, because they are well aware of the situation.
Of all the games on your long list, I only consider Exodus, Control, and CP2077 to take full advantage of RT. The rest have just reflections, or just shadows... nothing exciting.

So the truth is, for neither of us is the situation really that bad. We can both be satisfied with the current state, considering the circumstances.
RT is here, but it is still the future as well. It's adopted, but no revolution has happened.

We should stick with that and let it grow, slowly, if needed.
It takes as long as it needs to take.

NV misusing and glorifying RT to push larger and more costly GPUs hurts gaming, imo. I hope they fall on their face and this nonsense comes to an end. But that's just me, and we will see.
 
All are already achieved in one go with UE5: massive draw distance, unlimited polygonal detail, and hardware ray tracing for GI and reflections. Bingo! It's not like RT is hindering our access to any of those features; it works on top of them.
We've only seen tech demos. UE5 is nowhere right now, while RT is in lots of games.
 
massive draw distance, unlimited polygonal detail
And you believe it?
No wonder you can't wait for whatever snake oil the marketing divisions serve up and promise.
However, if everything is achieved already and you are fine with the state of the art, then there's no need for a 4090, no? They could just deliver 20% more efficiency at the same cost and power draw, to respect and factor in Moore's Law.
Sure, maybe NV would miss a buck or two because enthusiasts would no longer upgrade every gen? Though I doubt such a reasonable strategy would give them lower margins, and not only in the long run.
'Psycho' settings - yeah, that's the proper term.
 
I'm actually more hopeful about how DX9 game performance fares on Arc at this point than DX11, precisely because of the use of D3D9on12, which some took as a grave sign of how fucked Intel's drivers were; I think that's absolutely the right approach for a new architecture. For one, the layer is by MS themselves, so it's not solely on Intel to remedy potential issues; MS has an interest in keeping it as compatible and performant as possible too. Secondly, it's also open source now, and the more eyes on the code the better.

For DX11, it will be interesting to see how effective DXVK might be for some of those problem games too. As Rich noted in his DF review, AC:Unity was an absolute shitshow on Arc. However, last night I tested it to see if it worked with DXVK - and it does, with no glitches/artifacts whatsoever. The downside was GPU utilization - it was topping out at 75-80%, so definitely not at the native performance of DX11. This was not a CPU bottleneck though, as reducing the res to 1440p, keeping everything maxed except for dropping down to just FXAA, gave me a locked 60fps on my 3060. Certainly miles ahead of the performance DF got.
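For anyone who wants to repeat the experiment, here's a minimal sketch of the usual per-game DXVK setup on Windows (the general approach, not necessarily the exact steps I used); the DXVK_DIR and GAME_DIR paths and the install_dxvk helper below are placeholders for illustration only:

```python
# Minimal sketch: drop DXVK's 64-bit d3d11.dll and dxgi.dll next to a game's
# .exe so its D3D11/DXGI calls go through Vulkan instead of the native driver
# path. All paths here are placeholders, not real install locations.
import shutil
from pathlib import Path

DXVK_DIR = Path(r"C:\tools\dxvk\x64")   # extracted DXVK release (64-bit DLLs) - placeholder
GAME_DIR = Path(r"C:\Games\ACUnity")    # folder containing the game's .exe - placeholder

def install_dxvk(dlls=("d3d11.dll", "dxgi.dll")):
    """Copy the DXVK DLLs into the game folder; most games load DLLs placed
    next to the .exe ahead of the system DirectX runtime, so DXVK takes over."""
    for name in dlls:
        dst = GAME_DIR / name
        if dst.exists():
            shutil.copy2(dst, dst.with_suffix(".bak"))  # back up any existing DLL first
        shutil.copy2(DXVK_DIR / name, dst)
        print(f"installed {name} -> {dst}")

if __name__ == "__main__":
    install_dxvk()
    # Setting the DXVK_HUD=fps environment variable before launching the game
    # shows DXVK's overlay, confirming the translation layer is actually active.
```

Deleting the copied DLLs (or restoring the backups) reverts the game to the native D3D11/DXGI path, which makes before/after comparisons like this one easy.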

So we'll see with end-user reports; if the Red Dead Redemption & Doom Eternal results are any indication, Intel's Vulkan drivers may already be in good shape. I think the use of translation layers to modern APIs is a quicker path to remedying issues than hoping Intel gets a fully functional native DX11 driver.
 
I'm actually more hopeful about how DX9 game performance fares on Arc at this point than DX11, precisely because of the use of D3D9on12, which some took as a grave sign of how fucked Intel's drivers were; I think that's absolutely the right approach for a new architecture. For one, the layer is by MS themselves, so it's not solely on Intel to remedy potential issues; MS has an interest in keeping it as compatible and performant as possible too. Secondly, it's also open source now, and the more eyes on the code the better.

For DX11, it will be interesting to see how effective DXVK might be for some of those problem games too. As Rich noted in his DF review, AC:Unity was an absolute shitshow on Arc. However, last night I tested it to see if it worked with DXVK - and it does, with no glitches/artifacts whatsoever. The downside was GPU utilization - it was topping out at 75-80%, so definitely not at the native performance of DX11. This was not a CPU bottleneck though, as reducing the res to 1440p, keeping everything maxed except for dropping down to just FXAA, gave me a locked 60fps on my 3060. Certainly miles ahead of the performance DF got.

So we'll see with end-user reports; if the Red Dead Redemption & Doom Eternal results are any indication, Intel's Vulkan drivers may already be in good shape. I think the use of translation layers to modern APIs is a quicker path to remedying issues than hoping Intel gets a fully functional native DX11 driver.
They sure have a long way to go even for DX9 performance, though. Even the most played title on Steam - the #1 target you'd do any of your DX9 testing/optimization on - is still a dumpster fire at launch. From TechSpot's review:


Yikes. If they can't even get that to be halfway performant, I don't know how much hope I have for other older titles.
 

Attachment: CSGO_1.png (CS:GO benchmark chart)
They sure have a long way to go even for DX9 performance, though. Even the most played title on Steam - the #1 target you'd do any of your DX9 testing/optimization on - is still a dumpster fire at launch. From TechSpot's review:


Yikes. If they can't even get that to be halfway performant, I don't know how much hope I have for other older titles.
Or to think about it another way, you essentially have to drop down to an iGPU from a couple of years ago, or one of the saddest of the sad bottom-end Polaris-era cards that launched at $79 USD some 5 years ago, to get anywhere near those 70fps 1% lows in CS:GO.
 

Attachment: iGPU_CSGO.png (iGPU CS:GO benchmark comparison)
So the simple question is, as I have asked since day one: Was it the right time to introduce HW RT, or did it come too early?
Does it do more damage or good to the games industry?
Was it even well thought out, or did it lack the flexibility necessary to allow progress in a time of HW stagnation?

From my perspective, all the answers are pretty clear.

I agree. It’s crystal clear that if there was no RT today there wouldn’t be much of anything for us to talk about in here. The next revolution in shadow maps maybe?

It’s disingenuous to suggest that the real, tangible developments in RT and all of the great games that have benefited from that over the past 4 years should be discarded in favor of some nebulous, unrealized fantasy.
 
They sure have a long way to go even for DX9 performance, though. Even the most played title on Steam - the #1 target you'd do any of your DX9 testing/optimization on - is still a dumpster fire at launch. From TechSpot's review:


Yikes. If they can't even get that to be halfway performant, I don't know how much hope I have for other older titles.

Oh, I forgot about that benchmark - yikes is right. Wonder how that would do under DXVK.
 
I agree. It’s crystal clear that if there was no RT today there wouldn’t be much of anything for us to talk about in here. The next revolution in shadow maps maybe?

It’s disingenuous to suggest that the real, tangible developments in RT and all of the great games that have benefited from that over the past 4 years should be discarded in favor of some nebulous, unrealized fantasy.
I would claim that UE5 has shown plenty to talk about, Crytek too. (assuming RT means HWRT here)
 
I would claim that UE5 has shown plenty to talk about, Crytek too. (assuming RT means HWRT here)

Yeah, hardware RT. UE5 is exciting, but Lumen/Nanite hasn't shipped in a single game yet. The jury is still out on software Lumen quality and performance. It certainly isn't a reason to claim HWRT was launched too early. In fact, UE5 happened despite the "wasted" RT transistors. Win-win for us.

Lumen: Software Ray Tracing is the only performant option in scenes with many overlapping instances, while Hardware Ray Tracing is the only way to achieve high quality mirror reflections on surfaces.
 
I agree. It’s crystal clear that if there was no RT today there wouldn’t be much of anything for us to talk about in here.
Thanks for confirming that your only interest is progress at any cost, even if it kills your own business.
It’s disingenuous to suggest that the real, tangible developments in RT and all of the great games that have benefited from that over the past 4 years should be discarded in favor of some nebulous, unrealized fantasy.
You call an affordable PC gaming platform a nebulous, unrealized fantasy?
Confirming the confirmation. Thanks.
 
Yeah, hardware RT. UE5 is exciting, but Lumen/Nanite hasn't shipped in a single game yet. The jury is still out on software Lumen quality and performance. It certainly isn't a reason to claim HWRT was launched too early. In fact, UE5 happened despite the "wasted" RT transistors. Win-win for us.
Moore's Law is dead, so we simply have to wait a bit longer for progress, e.g. some years for the first UE5 game. You can't wait that long and prefer moon prices gamers can no longer afford, just to gaze at some reflections.

Ignoring how UE5 proves HW RT's issues is surely needed to keep your religion upright. Good luck with expecting a future of 300M-dollar games made to serve a rich niche of 10 psycho enthusiast echo-chamber nerds sharing the same mindset.

We would have got to that future 10 years earlier if HW RT had been gently but properly introduced 5 years later. That's what I'm trying to say.
 