Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

What a pathetic excuse.


Complete means: test with the latest API, that's the bare minimum effort for reviewing, I don't suppose you'd be content with a review that only tests DX11 games, right?
Yes, I would be content with a DX11-only review. I would be content with an OpenGL-only review too. Or an RT-only review. Just like any other. Personally I don't care for those much at all, but them existing doesn't take anything away from me and can cater to someone else's needs.

A review isn't about who cares for what! It's about testing the limits of the products in the current and future software environment.
Only in your very limited view. A review can focus on whatever its creators want it to focus on, and that doesn't take anything away from its credibility so long as the tests are conducted properly on whatever it covers.
 
You mean it requires HW acceleration? On minimum specs? If so - does it have a release date?
It does not require HW acceleration. It's using Software Raytracing on older cards, while the RT is hardware accelerated on modern GPUs for way better performance and probably quality too.

It was supposed to release next year, but it's now been moved to 2023-2024.
 
They are right though. Arc has far better RT capabilities and accelerates ML with matrix cores. It will age much better compared to RDNA2.

It's a no brainer really, Arc is the way to go.

In the timespan of how long the card will be relevant at its price point, I can't see that at all. You're basically crossing your fingers that a massive swath of available games will be fixed in a relatively rapid timeframe for the sake of your card performing better in RT games at a relatively low resolution.
 
According to rumors, AMD will follow suit with RT and ML acceleration like Intel and NV.
Again, since you skipped it last time:
AMD already has both RT and ML acceleration. If they wanted to have their matrix cores in gaming GPUs, they would be there.
 
In the timespan of how long the card will be relevant at its price point, I can't see that at all. You're basically crossing your fingers that a massive swath of available games will be fixed in a relatively rapid timeframe for the sake of your card performing better in RT games at a relatively low resolution.
The baseline of current generation games won't change, because the consoles are the lowest common denominator. These ARC cards will age very well at high settings, which will look way better than anything you've seen up till now, because by then these games will actually be made from the ground up for current-gen consoles instead of the PS4 and Xbox One.
 
Again, since you skipped it last time:
AMD already has both RT and ML acceleration. If they wanted to have their matrix cores in gaming GPUs, they would be there.

Like Intel and Nvidia currently have: that kind of RT and ML acceleration, with dedicated cores for them. Digital Foundry mentioned the exact same thing, that AMD will follow suit with what Intel and NV are doing. Heck, there are rumors that indicate RDNA3 will.

The baseline of current generation games won't change, because the consoles are the lowest common denominator. These ARC cards will age very well at high settings, which will look way better than anything you've seen up till now, because by then these games will actually be made from the ground up for current-gen hardware.

With these $350 Arc GPUs, you get somewhat above console rasterization capabilities (probably more when drivers are optimized; we're talking 17+ TF GPUs here), but with RT and ML capabilities that far outmatch them, teamed with fast 16 GB of VRAM at a very reasonable TDP. And that's early days.
Their RT and XeSS are very promising, and these GPUs sit right in the segment where most GPUs are sold: the mainstream.
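As a rough sanity check on that 17+ TF figure, here's a minimal sketch of the usual theoretical FP32 throughput formula; the 4096 shading units and ~2.1 GHz boost clock are the commonly quoted A770 specs, not numbers taken from this thread.

```python
# Theoretical FP32 throughput = shading units * 2 ops/clock (FMA) * clock (GHz) / 1000
def fp32_tflops(shading_units: int, boost_clock_ghz: float) -> float:
    return shading_units * 2 * boost_clock_ghz / 1000

# Commonly quoted Arc A770 figures (assumption): 32 Xe cores * 128 FP32 lanes, ~2.1 GHz boost
print(f"A770: ~{fp32_tflops(4096, 2.1):.1f} TFLOPS")  # -> ~17.2 TFLOPS
```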
 
So far TPU's reviews have been my favourite - the 3 highlights:

1) 40-45 W power draw at idle with fans spinning (yikes!)

2) Great RT implementation, every single game tested has a performance drop very close to that of Ampere, and much better than AMD, and at least with what TPU tested, no weird outliers or 'gotchas'

3) Wow do some games ever crater in performance without ReBar enabled - Guardians of the Galaxy on the A770 may legitimately be slower than on the hottest APU like a 6800HX. (double yikes!)
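To put points 2 and 3 in concrete terms, both comparisons boil down to relative FPS deltas. A minimal sketch of that arithmetic, with placeholder numbers rather than TPU's actual measurements:

```python
def percent_drop(fps_baseline: float, fps_after: float) -> float:
    """Share of performance lost going from the baseline run to the second run."""
    return (fps_baseline - fps_after) / fps_baseline * 100

# Placeholder figures for illustration only; substitute TPU's measured numbers.
rt_cost = percent_drop(fps_baseline=90.0, fps_after=55.0)     # raster vs. RT enabled
rebar_cost = percent_drop(fps_baseline=80.0, fps_after=30.0)  # ReBar on vs. ReBar off
print(f"RT cost: ~{rt_cost:.0f}% drop, ReBar disabled: ~{rebar_cost:.0f}% drop")
```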
 

Attachments

  • guardians-of-the-galaxy-2560-1440.png (59.1 KB)
The baseline of current generation games won't change, because the consoles are the lowest common denominator. These ARC cards will age very well at high settings, which will look way better than anything you've seen up till now, because by then these games will actually be made from the ground up for current-gen consoles instead of the PS4 and Xbox One.

I just can't imagine recommending a card based on potential hypothetical performance from unreleased games years in the future for use in a platform that has backwards compatibility as one of its key advantages. Disregarding the performance/compatibility of such a massive existing gaming library for future promises is absolutely insane to me.

Drivers are extremely complex, this was my #1 concern with Arc since the announcement, and it's been proven true. I desperately hope they can improve things in relatively short order, but there are definite reasons to fall on the skeptical side of that timeframe.

If you can wait for these RT-only games which will be coming in 2023/24/25(?) that Arc will excel at, you can wait a few months to at least see how Intel is progressing on their DX11 drivers. At this point, I think recommending this card to anyone over the equivalent AMD is just irresponsible given the issues your average user will encounter (issues again, that Ars's own review goes on to talk about at length right after that recommendation!).
 
I just can't imagine recommending a card based on potential hypothetical performance from unreleased games years in the future for use in a platform that has backwards compatibility as one of its key advantages. Disregarding the performance/compatibility of such a massive existing gaming library for future promises is absolutely insane to me.

Drivers are extremely complex, this was my #1 concern with Arc since the announcement, and it's been proven true. I desperately hope they can improve things in relatively short order, but there are definite reasons to fall on the skeptical side of that timeframe.

If you can wait for these RT-only games which will be coming in 2023/24/25(?) that Arc will excel at, you can wait a few months to at least see how Intel is progressing on their DX11 drivers. At this point, I think recommending this card to anyone over the equivalent AMD is just irresponsible given the issues your average user will encounter (issues again, that Ars's own review goes on to talk about at length right after that recommendation!).
To say nothing of the possibility that Battlemage may have its own separate driver stream or optimization paths even if the drivers are unified, and once it's released, Intel may suddenly care a lot less about going back and manually tweaking/optimizing the Arc series drivers.
 
As have many other new features on PC since the dawn of time; that doesn't mean a reviewer should just discard new APIs because of "reasons".

No review is a carbon copy of the others; each review tests different games, scenes and settings. Choosing to test only 12 raster games (8 of which have ray tracing in them) and then disabling ray tracing completely while testing isn't catering to different needs, it's just lazy, half-assed reviewing.

They tested:
F1 2021
Watch Dogs Legion
Shadow of the Tomb Raider
Hitman 3
Far Cry 6
Cyberpunk 2077
Dying Light 2
Spider-Man Remastered

Never once switching on the ray tracing options in these games; hypocrisy at its best.

I was waiting for someone to point it out to you, but you have missed one minor detail ... F1 2021 was tested with RT Medium ON, so the whole debate is moot. The reviewer did run an RT test and promised to revisit this topic with more tests in the future.

If you come to a video / article with a predefined bias, you might miss little things like that :)
I too would love to have all games with all GFX options tested by one outlet at product launch, but that is unrealistic. Thankfully we have multiple outlets doing testing and approaching that task from different angles :)
 
Didn't I read these chips went through a lot of steppings? I thought it was on here but I can't find the post. Maybe the ReBar thing is actually something they decided to leave broken. :D
 
I was waiting for someone to point it out to you, but you have missed one minor detail ... F1 2021 was tested with RT Medium ON, so the whole debate is moot. The reviewer did run an RT test and promised to revisit this topic with more tests in the future.

OK, I missed that too; credit to them for that at least. They also indicate they will be following it up with a separate video focused on RTX.

On a more positive note, I gotta say I really like the design of the card. I'm definitely in the understated camp when it comes to design; I like some RGB but really want simple, clean lines in most of my products. Just 2 fans, only 2 slots deep, not too long, and it can cool the card at a decent noise level.
 
OK, I missed that too; credit to them for that at least. They also indicate they will be following it up with a separate video focused on RTX.

On a more positive note, I gotta say I really like the design of the card. I'm definitely in the understated camp when it comes to design; I like some RGB but really want simple, clean lines in most of my products. Just 2 fans, only 2 slots deep, not too long, and it can cool the card at a decent noise level.

Yes, on one hand I want to own that piece of history in the making, as the last Intel graphics card I owned was the Intel i740 4MB, and it did really well in the flight sims I played at the time.
This card also does really well in a few titles, but fails in others. It will be fascinating to follow driver development and revisit these games in 3, 6 or 12 months' time.
 
So far TPU's reviews have been my favourite - the 3 highlights:

1) 40-45 W power draw at idle with fans spinning (yikes!)

2) Great RT implementation, every single game tested has a performance drop very close to that of Ampere, and much better than AMD, and at least with what TPU tested, no weird outliers or 'gotchas'

3) Wow do some games ever crater in performance without ReBar enabled - Guardians of the Galaxy on the A770 may legitimately be slower than on the hottest APU like a 6800HX. (double yikes!)
The 3rd review is one of my favourites. It explains in superb detail one of the things I feared the most: not having a PCIe 4.0 mobo. The performance difference between PCIe 3.0 and PCIe 4.0 is totally negligible, and there are even a few cases where PCIe 3.0 performs better, so yeah, it's a fine GPU for my rig.

Another reason why I like these GPUs is that they offer enough performance within a contained power budget, and they are within the limits of my 550 W power supply; the 225 W of the Arc A770 is not a big jump from the 170 W of my GTX 1080.
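For context on both points, a minimal sketch of the theoretical numbers involved. The PCIe per-lane rates and 128b/130b encoding are the published spec values; the 200 W "rest of system" figure in the headroom estimate is my own assumption.

```python
# Approximate usable bandwidth of a x16 slot per PCIe generation
# (transfer rate in GT/s per lane, 128b/130b line encoding for gen 3 and gen 4)
def pcie_x16_gb_per_s(gt_per_s: float) -> float:
    return gt_per_s * (128 / 130) / 8 * 16

print(f"PCIe 3.0 x16: ~{pcie_x16_gb_per_s(8):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: ~{pcie_x16_gb_per_s(16):.1f} GB/s")  # ~31.5 GB/s

# Rough PSU headroom check; 200 W for CPU + rest of system is an assumption
psu_watts, rest_of_system = 550, 200
for name, gpu_tdp in [("GTX 1080", 170), ("Arc A770", 225)]:
    print(f"{name}: ~{psu_watts - rest_of_system - gpu_tdp} W of headroom left")
```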
 
That's certainly one, and it exhibited exactly those problems I mentioned - tons of specular shimmering. Which, thankfully, Crysis 3 Remastered fixes with both its TAA and DLSS.

That's TAA.

Looks worse than what, though? There are many games like I mentioned where you can compare TAA to SMAA/MSAA, and I'm not aware of any that actually looks better with them vs TAA in motion. With a 'soft' TAA you can at least improve things for a negligible performance impact with sharpening; without a temporal solution there's basically nothing you can do to address the specular aliasing in any way that's performant.
Now that you mention it, I wonder which AA solution was used in Skyrim. I remember when I first played it on the Xbox 360, which ran games at native 720p most of the time, that the thing that struck me the most was how awed I felt by the total lack of jaggies and the cleanliness of the image.

While not the worst of its generation, let's say 99% of the games on the X360 were an aliasing/shimmering fest. But when I played Skyrim I couldn't notice any.
 
The difference between DLSS presets is fairly minor. If you're distracted by upscaling artifacts at DLSS Performance you will most likely see the same artifacts in Balanced and Quality. The only tier which is always noticeably worse is Ultra Performance.

You don't have to use DLSS in games which use RT lightly, as they tend not to hit performance much either. Generally, DLSS+RT is always a better choice from an image quality point of view than native without RT.
It isn't minor when you are at a lower resolution like 1080p.
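The gap is easier to see from the internal render resolutions each preset implies. The per-axis scale factors below are the commonly cited DLSS 2.x values, so treat them as an assumption rather than something stated in this thread:

```python
# Commonly cited DLSS 2.x per-axis scale factors (assumption, not from this thread)
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 0.333}

def internal_resolutions(width: int, height: int) -> None:
    for preset, scale in PRESETS.items():
        print(f"{width}x{height} {preset}: renders at {round(width * scale)}x{round(height * scale)}")

internal_resolutions(3840, 2160)  # at 4K, even Performance still renders at 1920x1080
internal_resolutions(1920, 1080)  # at 1080p, Performance drops to 960x540, where artifacts show up sooner
```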
 
Like Intel and Nvidia currently have: that kind of RT and ML acceleration, with dedicated cores for them.
Their matrix cores are as dedicated as tensor or XMX cores. They simply chose not to include them in gaming GPUs. They might change their opinion in the future, but the option was there for RDNA2 already if they wanted it.
 