Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

Jesus christ. Did he really not test the most interesting aspects about Arc - RT performance and XeSS?

This guy is such a terrible reviewer. Just incredible.
Note that many reviewers can't test everything for a single video on a short timeline. Gamers Nexus didn't review anything RT- or XeSS-related, and they're considered one of the best reviewers out there. It takes a lot of time to test these products, especially given the number of issues this GPU had, and a lot of these YT outfits want to have separate content pieces focusing on things like RT, for obvious reasons.
 
Which are?

Personally, I'm not even interested in a GPU's non-RT performance these days, as I'm quite sure it's "fine". Reviewing a new GPU while completely omitting its RT capabilities seems like a sure way of losing my interest.
I agree. I enable RT in every game there is (except for CP2077, as 60 fps is not possible there and at 30 fps I get VRAM issues), and I'm running a super weak laptop 2060. Personally, I don't get why anyone would want to play without RT. If I can get over 60 fps in nearly every game with the help of DLSS plus my tweaked settings, then desktop cards like a 3070 could easily get over 100 fps at the same settings, or a little below that with a higher DLSS quality level. In games with a good implementation, RT looks a lot better than maxed-out native ultra.

For example, here's Control on my 2060 laptop: one shot native, maxed out, no RT, and the other with DLSS Performance, tweaked high/medium settings and RT. Control with ray tracing on looks much better and runs nearly twice as fast, delivering a pretty stable 60 fps even in the corridor of doom.
 
Jesus christ. Did he really not test the most interesting aspects about Arc - RT performance and XeSS?

This guy is such a terrible reviewer. Just incredible.
They probably figure these GPUs are too weak to handle RT for it to truly matter. At 1440p, most games drop to the low 40s, and Cyberpunk is in the 20s. I personally wouldn't bother with RT on anything less than an RTX 3070/2080 Ti, but it's fair to ask for RT benchmarks nonetheless. It could also come as a separate benchmark piece. HU has been hard at work the past few weeks with a lot of content, so they might not be able to cover everything in one video.
 
Yeah, but this isn't anything new really. Same as the A380, it falls between AMD and Nvidia in RT. Certainly not above Nvidia though.
Well, in many heavy RT titles the A770 tends to be ahead of the 3060, by a 20%+ margin in several cases, in games like Metro, Control and Dying Light 2. Some games still have the 3060 slightly ahead, like Guardians of the Galaxy and Cyberpunk. In general the A770's lead over the 3060 in heavy ray tracing is solid; the 3060 Ti remains out of reach though.

However, in rasterization the A770's lead over the 3060 is less convincing: sometimes it's 10% ahead in DX12, sometimes it's slower. The problem is that frame times are not consistent, DX11 titles are bad, and DX9 titles run through emulation, losing half their performance or more with terrible frame pacing on top. Counter-Strike: GO has the A770 running with 3x worse frame times than even a 3050!
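For reference, frame-time comparisons like that are usually derived from a captured frame-time log, e.g. a PresentMon/CapFrameX export. Here's a rough sketch of the kind of stats involved, assuming a plain text file with one frame time in milliseconds per line; the file names are made up:

```python
# Rough sketch of frame-time consistency stats from a captured log.
# Assumes a plain text file with one frame time (milliseconds) per line.

def frame_time_stats(path: str) -> dict:
    with open(path) as f:
        times_ms = sorted(float(line) for line in f if line.strip())
    n = len(times_ms)
    avg_fps = 1000.0 * n / sum(times_ms)
    p99_ms = times_ms[min(n - 1, int(n * 0.99))]      # 99th-percentile frame time
    worst = times_ms[int(n * 0.99):]                  # slowest ~1% of frames
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)   # common "1% low" convention
    return {"avg_fps": avg_fps, "p99_frame_time_ms": p99_ms, "1pct_low_fps": low_1pct_fps}

# e.g. compare the same benchmark run captured on two cards:
# print(frame_time_stats("a770_csgo_frametimes.txt"))
# print(frame_time_stats("rtx3050_csgo_frametimes.txt"))
```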

 
Can't remember where I got my 20 TF from, but I see it's only 13/11 TF.
40 W at idle is really bad. It needs PCIe 4 as well.
My excitement has dropped a lot, but it still looks not bad and quite competitive.
 
Can't remember where I got my 20 TF from, but I see it's only 13/11 TF.
40 W at idle is really bad. It needs PCIe 4 as well.
My excitement has dropped a lot, but it still looks not bad and quite competitive.
20 is the theoretical max throughput at 2.4 GHz. 13 sounds like the ALUs running at 1.6 GHz, or driver issues.
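Quick back-of-the-envelope version of that, assuming the commonly listed 4096 FP32 ALUs for the A770 and an FMA counted as two ops (my numbers, not from the review):

```python
# Back-of-the-envelope FP32 math, assuming 4096 FP32 ALUs for the A770
# (32 Xe cores x 16 vector engines x 8 lanes) and an FMA counted as two ops.

ALUS = 4096
OPS_PER_CLOCK = 2  # fused multiply-add

def tflops(clock_ghz: float) -> float:
    """Theoretical FP32 TFLOPS at a given core clock."""
    return ALUS * OPS_PER_CLOCK * clock_ghz / 1000.0

def implied_clock_ghz(measured_tflops: float) -> float:
    """Core clock that would explain a measured TFLOPS figure."""
    return measured_tflops * 1000.0 / (ALUS * OPS_PER_CLOCK)

print(f"{tflops(2.4):.1f} TF at 2.4 GHz")   # ~19.7 TF, i.e. the '20 TF' figure
print(f"{tflops(1.6):.1f} TF at 1.6 GHz")   # ~13.1 TF
print(f"a 13 TF reading implies ~{implied_clock_ghz(13.0):.2f} GHz")
```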
 
They probably figure these GPUs are too weak to handle RT for it to truly matter.
Which is wrong, as I've said and demonstrated one post above you. Even a laptop 2060 can handle RT very well if tweaked correctly.

And of course these GPUs have low framerates at maxed settings and native resolutions. The key is to use XeSS/DLSS and sensible settings customized to the hardware at hand.
 
20 is the theoretical max throughput at 2.4 GHz. 13 sounds like the ALUs running at 1.6 GHz, or driver issues.
Aha! I saw those low numbers in an AIDA64 benchmark. Maybe that benchmark doesn't trigger boost clocks.
I have a similar case: running my GI stuff on a Vega 56, it stays at the desktop clock of 300 MHz.

Excitement re-established, almost. But my CPU is one gen too old, and I still have to get some RDNA2 hardware to estimate console perf anyway...
 
Which is wrong, as I've said and demonstrated one post above you. Even a laptop 2060 can handle RT very well if tweaked correctly.

And of course these GPUs have low framerates at maxed settings and native resolutions. The key is to use XeSS/DLSS and sensible settings customized to the hardware at hand.
You have to use DLSS performance which looks quite terrible compared to native res or even balanced/quality. I agree that they should have tested it because ultimately, it's up to the consumer to decide whether or not RT is worth the performance/resolution hit, not the reviewer.
 
You have to use DLSS performance which looks quite terrible compared to native res or even balanced/quality. I agree that they should have tested it because ultimately, it's up to the consumer to decide whether or not RT is worth the performance/resolution hit, not the reviewer.
The benefit of having RT far outweighs any loss in visual fidelity from using DLSS Performance, as you can see in my imgsli comparison. And given that my laptop 2060 is much weaker than a desktop 2060, you can use higher-quality DLSS settings on the desktop.

Also, I disagree that DLSS Performance looks bad. It actually comes quite close to native rendering when I compare them on my 1440p monitor, and with a tiny bit of sharpening it gets even closer.
 
The benefit of having RT far outweighs any loss in visual fidelity from using DLSS Performance, as you can see in my imgsli comparison. And given that my laptop 2060 is much weaker than a desktop 2060, you can use higher-quality DLSS settings on the desktop.
I don't agree with that. DLSS Performance looks a lot worse than native, and RT doesn't offset that. The image comparison is also flawed: DLSS really gets tested in motion, and the issues present in Quality mode are further exacerbated by the lower-resolution modes.
Also, I disagree that DLSS Performance looks bad. It actually comes quite close to native rendering when I compare them on my 1440p monitor, and with a tiny bit of sharpening it gets even closer.
It really doesn't look close to native, and there are myriad tests out there disproving this claim. A sharpening filter doesn't make problems such as ghosting and other artifacts go away.

Once again, I agree that he should have tested RT but let's not start pretending that DLSS performance is something desirable or almost as good as native. It really isn't.
 
DLSS Performance with RT is a lot better visually than native without RT.
Don't agree with that. The softer image with visual artifacts is omnipresent; RT isn't, and in many cases you barely notice it's even there, while you never fail to notice the blur or the image breaking up in motion. It varies on a game-to-game basis, but I never use anything less than DLSS Quality on a 3440x1440 monitor. The lower your native res, the worse it gets.
 
Yeah, that's unfortunate. Have there been other recent cards like this? It definitely reminds me of circa 2006-2008 GPUs though.

It seems to have only one clock state for memory, so it runs at full memory clocks even at idle, which is likely the main cause.

AMD cards have a similar issue to an extent, in that they need to run memory at full speed in many multi-monitor setups, making them idle that high as well.

Nvidia nowadays has the best behaviour here in terms of how low memory clocks drop in non-heavy workloads, although even they still need to ramp to full speed in some multi-monitor setups (typically when combining multiple high-refresh/high-res displays).
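If anyone wants to sanity-check this on their own card, here's a minimal sketch using the pynvml bindings (the nvidia-ml-py package; NVIDIA only, Arc would need vendor tooling instead) to watch memory/core clocks and power while sitting at the desktop:

```python
# Poll memory clock, core clock and board power for ~10 seconds at idle.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(10):
        mem_clk = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        gfx_clk = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        print(f"mem {mem_clk} MHz | core {gfx_clk} MHz | {power_w:.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Run it once at an idle desktop and once with a second high-refresh monitor attached to see whether the memory clock stays pinned at its maximum.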
 
Don't agree with that. The softer image with visual artifacts is omnipresent; RT isn't, and in many cases you barely notice it's even there, while you never fail to notice the blur or the image breaking up in motion. It varies on a game-to-game basis, but I never use anything less than DLSS Quality on a 3440x1440 monitor. The lower your native res, the worse it gets.
The difference between DLSS presets is fairly minor. If you're distracted by upscaling artifacts at DLSS Performance you will most likely see the same artifacts in Balanced and Quality. The only tier which is always noticeably worse is Ultra Performance.

You don't have to use DLSS in games that use RT lightly, as those tend not to hit performance much either. Generally, DLSS+RT is a better choice from an image quality PoV than native without RT.
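For context on how far apart the presets actually are, here's a quick sketch using NVIDIA's published per-axis scale factors (Quality ~66.7%, Balanced 58%, Performance 50%, Ultra Performance ~33.3%) to print the internal render resolutions at a couple of output resolutions:

```python
# Internal render resolution per DLSS preset; the output resolution stays
# fixed, only the internal resolution fed to the upscaler changes.

PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_res(out_w: int, out_h: int) -> None:
    print(f"Output {out_w}x{out_h}:")
    for name, scale in PRESETS.items():
        print(f"  {name:<17} -> {round(out_w * scale)}x{round(out_h * scale)}")

internal_res(3440, 1440)   # the ultrawide mentioned above
internal_res(1920, 1080)   # lower output res leaves far fewer pixels to work with
```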
 