Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

The difference between DLSS presets is fairly minor. If you're distracted by upscaling artifacts at DLSS Performance, you will most likely see the same artifacts in Balanced and Quality. The only tier which is always noticeably worse is Ultra Performance.
That's my point. The artifacts get even worse in lower modes.
You don't have to use DLSS in games which use RT lightly, as those tend not to take much of a performance hit either. Generally, DLSS+RT is a better choice from an image-quality point of view than native without RT.
I agree that RT+DLSS Quality is better, especially in games with awful, blurry TAA like Cyberpunk. RT+DLSS Balanced I can live with. Anything below that and I'd rather drop DLSS in favor of native res. Bonus points if TAA isn't built in or can be disabled. It's beyond me why this method has been favored over other AA methods when it's FXAA all over again with the blur. I get that MSAA and SSAA are punishing on performance, but they do wonders for jaggies and fine detail. TAA usually smears your screen with vaseline, so I prefer DLSS Quality over it a lot of the time.
 
It's 17 to 20 TF.
I tried to find some compute benches, because I expect games still suffer a lot from early drivers, while compute is much simpler.
But I could not find any, so I'm going back to these older, pre-review leaked results:
[attached image: leaked pre-review benchmark results]
The RX 6800 has 20 TF, the 3060 Ti has 16 TF, so the question is: why can't Intel compete here either?

The difference would again roughly match those lower base clocks. Good if so.
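For reference, the theoretical figure is just a product: FP32 TFLOPS = ALUs x 2 ops/clock (an FMA counts as two) x clock. A back-of-envelope sketch below; the 4096 ALU count is Intel's published A770 config, while the clock values are my own assumptions, just to show where a 17-vs-13 TF spread could come from:

```c
/* Back-of-envelope FP32 throughput: ALUs * 2 ops/clock (FMA) * clock.
 * 4096 ALUs is the published A770 shader count; the clocks below are
 * assumed values for illustration only. */
#include <stdio.h>

static double tflops(int alus, double ghz) { return alus * 2.0 * ghz / 1000.0; }

int main(void)
{
    printf("A770 @ 2.1 GHz: %.1f TF\n", tflops(4096, 2.1)); /* ~17.2 */
    printf("A770 @ 1.6 GHz: %.1f TF\n", tflops(4096, 1.6)); /* ~13.1 */
    return 0;
}
```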

The benches could require small, frequent GPU->CPU->GPU transfers to figure out the workload from previous results for the next dispatch. And maybe ReBar cannot really help to make those small transfers fast?
That's the most interesting question here for me. I'd be worried. Vulkan could avoid such bottlenecks using indirect dispatch, but we never know how such benchmarks are implemented.
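For what it's worth, here's a minimal sketch of what I mean by indirect dispatch, assuming a hypothetical two-pass setup (all handle names are made up; pipeline and descriptor setup omitted). Pass 1 writes the workgroup counts for pass 2 into GPU memory, so the CPU never reads intermediate results back to size the next dispatch:

```c
/* Hedged sketch of GPU-driven dispatch chaining in Vulkan. Pass 1 stores a
 * VkDispatchIndirectCommand {x, y, z} into indirectBuf; pass 2 consumes it
 * via vkCmdDispatchIndirect, avoiding any GPU->CPU->GPU round trip. */
#include <stddef.h>
#include <vulkan/vulkan.h>

void record_chained_dispatch(VkCommandBuffer cmd,
                             VkPipeline classifyPipeline,
                             VkPipeline workPipeline,
                             VkBuffer indirectBuf)
{
    /* Pass 1: a compute shader inspects its results and writes the next
     * pass's workgroup counts into indirectBuf (bound as a storage buffer). */
    vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_COMPUTE, classifyPipeline);
    vkCmdDispatch(cmd, 64, 1, 1);

    /* Make the shader write visible to the indirect-command read. */
    VkMemoryBarrier barrier = {
        .sType         = VK_STRUCTURE_TYPE_MEMORY_BARRIER,
        .srcAccessMask = VK_ACCESS_SHADER_WRITE_BIT,
        .dstAccessMask = VK_ACCESS_INDIRECT_COMMAND_READ_BIT,
    };
    vkCmdPipelineBarrier(cmd,
                         VK_PIPELINE_STAGE_COMPUTE_SHADER_BIT,
                         VK_PIPELINE_STAGE_DRAW_INDIRECT_BIT,
                         0, 1, &barrier, 0, NULL, 0, NULL);

    /* Pass 2: workgroup counts come straight from GPU memory. */
    vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_COMPUTE, workPipeline);
    vkCmdDispatchIndirect(cmd, indirectBuf, 0);
}
```

If a benchmark instead reads results back to the CPU to size the next dispatch, every iteration eats a bus round trip, which is exactly where a slow small-transfer path would show up.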

Maybe compilers are still not well optimized.

Or Arc has a bad TF-to-practical-performance ratio, and we can't expect big improvements from future drivers.

I'd also love to know how well async compute works, remembering that former Intel iGPUs could not do it at all, afaik.
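For the curious, the first thing I'd check is whether the driver even exposes a compute-only queue family, the usual prerequisite for async compute. A rough sketch below; note that exposing a separate family doesn't prove the hardware actually overlaps compute with graphics, that part only shows up in profiling:

```c
/* Hedged sketch: list queue families that are compute-capable but not
 * graphics-capable, i.e. candidates for an async compute queue. */
#include <vulkan/vulkan.h>
#include <stdio.h>
#include <stdlib.h>

void report_async_compute(VkPhysicalDevice gpu)
{
    uint32_t n = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &n, NULL);
    VkQueueFamilyProperties *fam = malloc(n * sizeof *fam);
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &n, fam);

    for (uint32_t i = 0; i < n; ++i) {
        VkQueueFlags f = fam[i].queueFlags;
        /* Compute-capable but not graphics-capable: a dedicated family. */
        if ((f & VK_QUEUE_COMPUTE_BIT) && !(f & VK_QUEUE_GRAPHICS_BIT))
            printf("compute-only family %u with %u queue(s)\n",
                   i, fam[i].queueCount);
    }
    free(fam);
}
```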

Too bad all those benchmarks never tell us anything useful : /
 
The benefit of having RT far outweighs any visual loss in fidelity from using DLSS Performance, as you can see in my imgsli comparison.
I think Control is definitely up there with respect to the immediate benefits that RT brings, but you should know by now that comparing static scenes is largely pointless for evaluating image quality with reconstruction tech, especially at its more aggressive scaling factors. That is how we got all those "Better than native!" comments when sites would show DLSS screenshots early on, until we started to really look at what was actually happening in motion.

There are just too many factors between games, output devices, and personal preference with respect to performance/image quality to make any truly definitive statements like "RT with DLSS is always better" at this point. There's simply too much variation between RT and DLSS implementations in games, not to mention that some gamers will simply be using vastly different displays - I'm sure if I were gaming on a 27" 1440p monitor I probably wouldn't notice much difference between DLSS modes, for example, but I game on a 55" 4K set - and in many games, there is a very noticeable difference between DLSS Performance and Quality. Better RT reflections/ambient occlusion in some games can't wholly compensate for repeatable moiré patterns on surfaces and sudden spurts of extremely aliased geometry because DLSS Performance mode shit the bed with a certain alpha-blended effect involving a lower-res buffer.

I agree that RT+DLSS Quality is better, especially in games with awful, blurry TAA like Cyberpunk. RT+DLSS Balanced I can live with. Anything below that and I'd rather drop DLSS in favor of native res. Bonus points if TAA isn't built in or can be disabled. It's beyond me why this method has been favored over other AA methods when it's FXAA all over again with the blur. I get that MSAA and SSAA are punishing on performance, but they do wonders for jaggies and fine detail. TAA usually smears your screen with vaseline, so I prefer DLSS Quality over it a lot of the time.

They do not though? First off, it's more a case of 'completely unplayable' performance - it's like saying "Yeah, I get that 8K is punishing on performance, but..." - that's effectively what SSAA is doing. MSAA doesn't have the same hit, but even where it can work, with modern engines it can have a prohibitive performance impact. Those two methods were off the table for the majority of gamers, so even if they were miraculous in terms of image quality they wouldn't be an option.

Secondly, MSAA does absolutely nothing for subpixel/specular aliasing. The amount of detail in modern games is completely out of the scope of what MSAA can handle. I mean, it's not hard to see - just look at Forza Horizon and all that horrible aliasing you had before TAA was implemented. It's nothing like FXAA. FXAA makes it blurrier without addressing the specular aliasing - look at what a mess Arkham Knight is with just FXAA.

There are no doubt poor TAA implementations out there, but it's readily apparent that temporal solutions are absolutely necessary to deal with the subpixel aliasing in modern engines. MSAA/SSAA are not substitutes even if their performance impacts were significantly lowered.
 
So yeah, overall, 'pretty huge hill to climb' would be an understatement. Promising results for the hardware, but as Richard said in DF's review, you're basically playing Russian roulette with older games, and back compatibility is a huge pull for the PC as a platform. What if your favourite DX11 game doesn't fall under Intel's purview as a 'problem' game?

At this point I'm wondering if it's almost better for Intel to leverage the work Valve has done with Proton and devote some resources to making DXVK work better on Windows. It works now for many games, and in some cases will even provide a better experience than native DX9/DX11. However, it's basically 'hey, if it works, cool' - using it on Windows instead of Linux isn't a supported use case and as such, the devs don't devote any time to fixing bugs that arise when using it on Windows.

Granted, that assumes Intel's Vulkan drivers are up to snuff as well. From the Gamers Nexus review there's some disparity with frametimes in some games on modern APIs too; it's not just DX11.
 
They do not though? First off, it's more a case of 'completely unplayable' performance - it's like saying "Yeah, I get that 8K is punishing on performance, but..." - that's effectively what SSAA is doing. MSAA doesn't have the same hit, but even where it can work, with modern engines it can have a prohibitive performance impact. Those two methods were off the table for the majority of gamers, so even if they were miraculous in terms of image quality they wouldn't be an option.

Secondly, MSAA does absolutely nothing for subpixel/specular aliasing. The amount of detail in modern games is completely out of the scope of what MSAA can handle. I mean, it's not hard to see - just look at Forza Horizon and all that horrible aliasing you had before TAA was implemented. It's nothing like FXAA. FXAA makes it blurrier without addressing the specular aliasing - look at what a mess Arkham Knight is with just FXAA.

There are no doubt poor TAA implementations out there, but it's readily apparent that temporal solutions are absolutely necessary to deal with the subpixel aliasing in modern engines. MSAA/SSAA are not substitutes even if their performance impacts were significantly lowered.
I honestly don't remember the last AAA game I played that had MSAA. I think Crysis 3 maybe? I know Forza has it but I've never tried it there.

One of the better implementations I recall is DOOM Eternal and whatever AA solution they use. TAA has been wildly inconsistent for me and more often than not, it makes the image look worse. The biggest offender is Cyberpunk 2077, where for some strange reason it makes the image look a lot softer. Turn it off? The image gets messed up. I think it looks good in the Spider-Man game though, if I remember. Not sure, as I haven't tried it for long.
 
I honestly don't remember the last AAA game I played that had MSAA. I think Crysis 3 maybe? I know Forza has it but I've never tried it there.

That's certainly one, and it exhibited exactly those problems I mentioned - tons of specular shimmering. Which, thankfully, Crysis 3 Remastered fixes with both its TAA and DLSS.

One of the better implementations I recall is DOOM Eternal and whatever AA solution they use.

That's TAA.

TAA has been wildly inconsistent for me and more often than not, it makes the image look worse.

Looks worse than what, though? There are many games, like I mentioned, where you can compare TAA to SMAA/MSAA - and I'm not aware of a single one that actually looks better with them vs TAA in motion. With a 'soft' TAA you can at least improve things for a negligible performance impact with sharpening; without a temporal solution there's basically nothing you can do to address the specular aliasing in any way that's performant.
 
Jesus Christ. Did he really not test the most interesting aspects of Arc - RT performance and XeSS?

This guy is such a terrible reviewer. Just incredible.
How hard a concept can it be?
There are people who don't care for RT enough for it to matter.
It's infinitely better that we have reviews catering to different interests rather than carbon copy reviews focusing on whatever manufacturer X decided to be important this time.
Those who value RT can skip the review; there's plenty of reviews catering to that interest too, read those. The fact that some cater to different interests takes nothing away from you.
Same for XeSS, obviously.
 
Can't remember where I got my 20 TF from, but I see it's only 13/11 TF.
40 W at idle is really bad. Needing PCIe 4 as well.
My excitement has been reduced a lot. But it looks not bad and still quite competitive.
PCIe 4.0 vs 3.0 was my main worry. Well, it turns out the performance difference between PCIe 4 and PCIe 3 when it comes to GPUs is really negligible, as Intel admitted themselves.

https://www.intel.com/content/www/us/en/gaming/resources/what-is-pcie-4-and-why-does-it-matter.html

The higher bandwidth of PCIe 4.0 and 5.0 may also benefit graphics cards, as higher throughput helps allow quicker transfer of data to VRAM. But while PCIe 4.0 setups outperform 3.0 in synthetic benchmarks, the real-world benefits for gaming are currently minor.

Some tests suggest that even running games in 4K with current graphics cards won’t saturate the bandwidth of a PCIe 3.0 x16 slot. There may be minor FPS advantages when comparing the same GPU running in PCIe 4.0 configuration against 3.0, but the differences are small enough to be unnoticeable.
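The raw numbers behind that claim are easy to sanity-check: PCIe 3.0 runs 8 GT/s per lane and 4.0 runs 16 GT/s, both with 128b/130b encoding, so an x16 slot tops out around 15.8 vs 31.5 GB/s per direction. A quick sketch:

```c
/* Back-of-envelope PCIe link bandwidth: GT/s per lane * 128/130 encoding
 * efficiency, divided by 8 bits per byte, times 16 lanes. */
#include <stdio.h>

static double lane_gbs(double gts) { return gts * (128.0 / 130.0) / 8.0; }

int main(void)
{
    printf("PCIe 3.0 x16: ~%.1f GB/s per direction\n", 16 * lane_gbs(8.0));
    printf("PCIe 4.0 x16: ~%.1f GB/s per direction\n", 16 * lane_gbs(16.0));
    return 0;
}
```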

After watching and reading the reviews - especially the DF one - I am going to buy the A770 16GB. It will be my first day-one GPU purchase.

I've saved the money for this moment - waiting for the new nVidia 4000 generation and RDNA3, but my GPU is dying and I like Intel - so I can retire my GTX 1080, which has run its course.
 
It's infinitely better that we have reviews catering to different interests rather than carbon copy reviews focusing on whatever manufacturer X decided to be important this time.
DXR is an industry standard that is featured in almost all new games nowadays and has been available for 4 years now. It's the next-level "Ultra" settings on PC; it's not "whatever feature manufacturer X decided to be important", it's literally the only Ultra settings in town right now, and it's the latest and greatest API on PC.

Not featuring any ray tracing tests in a review is like not benchmarking any DX12 or DX11 games 4 years after they are released, which is insane!
 

It's the next-level "Ultra" settings on PC; it's not "whatever feature manufacturer X decided to be important", it's literally the only Ultra settings in town right now.

There are still beyond-console raster settings on PC, often called Ultra or beyond. Then comes RT as well, which is the largest differentiator.
 
How hard a concept can it be?
There are people who don't care for RT enough for it to matter.
It's infinitely better that we have reviews catering to different interests rather than carbon copy reviews focusing on whatever manufacturer X decided to be important this time.
Those who value RT can skip the review; there's plenty of reviews catering to that interest too, read those. The fact that some cater to different interests takes nothing away from you.
Same for XeSS, obviously.
Can't agree with that. A major feature of these cards is their ray tracing capabilities. You can't review them and pretend that one of their greatest strengths doesn't exist. The A770 sometimes outperforms even the 2080 in RT. That definitely shouldn't be overlooked.
 
PCIe 4.0 vs 3.0 was my main worry. Well, the performance difference between PCIe 4 and PCIe 3 when it comes to GPUs is really negligible, as Intel admitted themselves.
Probably it should have said Resizable BAR, not PCIe 4.
Intel states that you basically must have the Resizable BAR feature on your platform. Without it, Arc performance is just too bad. Afaik the feature became common with PCIe 4 era platforms, but it requires a new enough CPU as well, and enabling it in the BIOS.
So checking your BIOS might be the easiest way in case you're not sure. On AMD platforms it might be called SAM, iirc. The 3700X has it, I think. I have a 2700, which does not.
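If you'd rather check from code than from the BIOS, one heuristic (an assumption of mine, not an official query) is to look at what Vulkan reports: with ReBar/SAM enabled, drivers typically expose a DEVICE_LOCAL + HOST_VISIBLE memory heap spanning nearly all of VRAM, while without it that host-visible window is usually just 256 MB:

```c
/* Hedged sketch: spot ReBar/SAM by looking for a device-local heap that is
 * also host-visible. A size well above 256 MB usually means ReBar is active;
 * heuristic only, and the same heap may be printed once per matching type. */
#include <vulkan/vulkan.h>
#include <stdio.h>

void report_host_visible_vram(VkPhysicalDevice gpu)
{
    VkPhysicalDeviceMemoryProperties mp;
    vkGetPhysicalDeviceMemoryProperties(gpu, &mp);

    for (uint32_t i = 0; i < mp.memoryTypeCount; ++i) {
        VkMemoryPropertyFlags f = mp.memoryTypes[i].propertyFlags;
        if ((f & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) &&
            (f & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT)) {
            VkDeviceSize heap = mp.memoryHeaps[mp.memoryTypes[i].heapIndex].size;
            printf("host-visible VRAM heap: %llu MB\n",
                   (unsigned long long)(heap >> 20));
        }
    }
}
```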
 
DXR is an industry standard that is featured in almost all new games nowadays and has been available for 4 years now. It's the next-level "Ultra" settings on PC; it's not "whatever feature manufacturer X decided to be important", it's literally the only Ultra settings in town right now.

Not featuring any ray tracing tests in a review is like not benchmarking any DX12 or DX11 games 4 years after they are released, which is insane!
At this time it's still a feature that for many costs too much for the gained benefits.
I repeat, some reviews catering to different needs take literally nothing away from you; you still have piles of essentially carbon copy reviews to choose from.
It's almost as if some here are personally offended that not everyone thinks the way they do about everything.

Can't agree with that. A major feature of these cards is their ray tracing capabilities. Can't review them and pretend that one of their greatest strengths doesn't exist. The A770 sometimes outperforms even the 2080 in RT. That definitely shouldn't be overlooked.
I repeat again, it takes nothing away from you. Even if it's a strong point of the card, some people literally don't care for RT at this time, but they still might consider the card for its other features.
 
Probably it should have said Resizable BAR, not PCIe 4.
Intel states that you basically must have the Resizable BAR feature on your platform. Without it, Arc performance is just too bad. Afaik the feature became common with PCIe 4 era platforms, but it requires a new enough CPU as well, and enabling it in the BIOS.
So checking your BIOS might be the easiest way in case you're not sure. On AMD platforms it might be called SAM, iirc. The 3700X has it, I think. I have a 2700, which does not.
My Ryzen 3700X didn't support Resizable BAR initially either; it's a mobo thing. My motherboard, the ASRock B450M Steel Legend, didn't support it at first, but I downloaded a recent BIOS update and it has supported Resizable BAR for quite a while now. I enabled it yesterday, btw. Dunno how it works, 'cos nVidia only enables it for the RTX 3000 series afaik and I have a GTX 1080.

AMD support for AM4 is quite remarkable.

Which motherboard do you have btw?

As you - and the DF review - mentioned, Resizable BAR is a must for Arc GPUs no matter what.

On a different note, afaik AMD allows their GPUs to use Resizable BAR in every game, and in the case of nVidia, you can also force it to be used in every game.

 
I repeat again, it takes nothing away from you. Even if it's a strong point of the card, some people literally don't care for RT at this time, but they still might consider the card for its other features.
I don't care for RT on cards with that performance profile so you're preaching to the choir. However, this is irrelevant. RT is very strong on this card and it's a reviewer's job to bench it. We're not talking about a 3050 with laughable capabilities that no one will use. RT is 100% usable on the A770/A750.

Some people also don't care for noise level, OC, or power consumption. A review without them is still incomplete.

Also, calling other reviews "carbon copies" because they review the full set of features is just bizarre. You shouldn't spin incomplete reviews into something positive.
 
DXR is an industry standard that is featured in almost all new games nowadays and has been available for 4 years now. It's the next-level "Ultra" settings on PC; it's not "whatever feature manufacturer X decided to be important", it's literally the only Ultra settings in town right now, and it's the latest and greatest API on PC.

Not featuring any ray tracing tests in a review is like not benchmarking any DX12 or DX11 games 4 years after they are released, which is insane!
Agree, but let's do some prediction...
Jensen says Moore's Law is dead. In other words, RTX 5000 will cost 50% more for 50% more perf. That's $3000, then.
A year later, $4000.
That's expensive. We have a two-class society in gaming, and many if not most people will be annoyed by ultra or psycho settings.
Then there is every reason for some reviewers to ignore some of this, depending on which class they associate with.
I would not be surprised if games listed ultra settings only when an ultra GPU is detected, to minimize the rising social war among gamers at least a bit.
If so, some games might even advertise with 'Psycho RTX off'.

Is this just a dystopian fantasy of mine, or a realistic expectation?
Is it reality already now?
 
At this time it's still a feature that for many costs too much for the gained benefits.
As have many other new features on PC since the dawn of time; that doesn't mean a reviewer should just discard new APIs because of "reasons".
you still have piles of essentially carbon copy reviews to choose from
No review is a carbon copy of the others; each review tests different games, scenes and settings. Choosing to test only 12 raster games (8 of which have ray tracing in them) and then disabling ray tracing completely while testing isn't catering to different needs, it's just lazy, half-assed reviewing.

They tested:
F1 2021
Watch Dogs Legion
Shadow of the Tomb Raider
Hitman 3
Far Cry 6
Cyberpunk 2077
Dying Light 2
Spider-Man Remastered

Never once switching the ray tracing options on in these games; hypocrisy at its best.
 
At this time it's still a feature that for many costs too much for the gained benefits.
I repeat, some reviews catering to different needs take literally nothing away from you; you still have piles of essentially carbon copy reviews to choose from.
It's almost as if some here are personally offended that not everyone thinks the way they do about everything.


I repeat again, it takes nothing away from you. Even if it's a strong point of the card, some people literally don't care for RT at this time, but they still might consider the card for its other features.
Agreed, I too don't understand why some just seem opposed to some sites/reviews existing, when they can just ignore them and choose the various other channels that are more to their liking.

Being able to choose is a good thing; since when did the expectation that PC gamers can look around for alternatives become so low? My preferred go-to site for detailed settings comparisons is Digital Foundry, but I have to resort to other sites that do less detailed comparisons but in turn have tested a lot more GPUs or CPUs.
 
At this time it's still a feature that for many costs too much for the gained benefits.
I repeat, some reviews catering to different needs take literally nothing away from you; you still have piles of essentially carbon copy reviews to choose from.
It's almost as if some here are personally offended that not everyone thinks the way they do about everything.

Agreed, I too don't understand why some just seem opposed to some sites/reviews existing, when they can just ignore them and choose the various other channels that are more to their liking.

Reviewing reviewers is part of the process too; people can have opinions on the relative competence that youtubers display. Of course there are many other channels, but we're going to debate their relative worth if they're going to be cited as an authoritative source.
 