Intel ARC GPUs, Xe Architecture for dGPUs [2022-]

Is it a decent quality cable? How long is it?
2 meters or so, purchased at a big mall in the videogames section (they don't place those where the other typical cables like HDMI 2.0 or Ethernet are). It wasn't cheap, tbh.

So after contacting Intel, the bottom line is this: the mere fact that games don't even launch when you use a TV connected to the card's "HDMI 2.1" output, yet run like a charm on a classic monitor via DisplayPort, says that something is amiss.

It turns out that the card's HDMI is not a native HDMI 2.1 port; it's converted from DisplayPort via an LSPCON.

This is why people aren't seeing the option to select 10/12-bit color, etc., in the Intel Graphics Command Center. Explained by Intel:

 
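Not Intel's wording, just a back-of-envelope sketch of why a DP-fed LSPCON can run out of headroom for 10/12-bit output at 4K120. The pixel clock is the standard CTA 4K timing and the link figures are the commonly quoted effective payload rates; everything is approximate and ignores DSC.

```python
# Rough link budget for 4K @ 120 Hz, RGB output. Numbers are approximate.
PIXEL_CLOCK_4K120 = 4400 * 2250 * 120        # CTA-861 timing incl. blanking, ~1.188 GHz

LINKS_GBPS = {
    "HDMI 2.0 (TMDS)": 14.4,                 # 18 Gbps raw, 8b/10b coding
    "DP 1.4 (HBR3)":   25.9,                 # 32.4 Gbps raw, 8b/10b coding
    "HDMI 2.1 (FRL)":  42.7,                 # 48 Gbps raw, 16b/18b coding
}

for bpc in (8, 10, 12):
    needed = PIXEL_CLOCK_4K120 * 3 * bpc / 1e9      # 3 color channels, in Gbps
    fits = [name for name, cap in LINKS_GBPS.items() if needed <= cap]
    print(f"{bpc} bpc RGB: ~{needed:.1f} Gbps -> {fits or 'needs DSC / chroma subsampling'}")
```

If the HDMI port is really a DP 1.4 stream repackaged by an LSPCON, anything over roughly 26 Gbps has to fall back to DSC or 4:2:2/4:2:0, which would line up with the missing 10/12-bit options.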
I would also consider the possibility of the TV being janky. You never know what to expect with TVs connected to a PC. I have a 4K HDR Vizio that always chroma subsamples unless I run a custom resolution.
Hahah, I hope not, I want it to last many years, 'cos I've sated my interest in TVs, and in the distant future I hope to get a new monitor, a native 4K one with a decent framerate and good HDR, but monitors are expensive and I'm not in a hurry. A NUC, a portable gaming PC a la Steam Deck, and VR, and that should be it.

VRR on the TV doesn't function either (on the monitor via DisplayPort, it behaves like an angel), and some people told me that a DisplayPort to HDMI 2.1 converter should do the job, but that's not guaranteed to work. Some got it working with a different TV model; I tried the same, but mine didn't want to cooperate.
 
I've read a bit about how monitor manufacturers play it fast and loose with VRR. They support it but don't necessarily make sure it works well. Nvidia brought this up when they started doing G-Sync Compatible mode for FreeSync monitors.
 
VRR is the only feature not enabled in Game Mode right now. I know it works 'cos I've read the reviews and seen many videos of people using it on my TV's model. As for FreeSync and so on, the quality varies. My monitor is FreeSync Premium, but the same monitor model in earlier iterations and firmwares was FreeSync 2.

Plus, I had a 240 Hz G-Sync Compatible monitor back in the day and it was far from being G-Sync Ultimate (those were expensive). FreeSync Premium generally works okay for me, save for huge disparities of 30-40 fps from one frame to the next, where you might notice a sudden stutter.
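For anyone curious about those big frame-to-frame swings, here is a toy model of the Low Framerate Compensation part of FreeSync: when the game drops below the panel's VRR floor, the driver scans each frame out more than once to stay inside the range. The 48-144 Hz window below is made up for the example, and real drivers also smooth these transitions.

```python
# Toy LFC model: pick a scan-out multiplier that keeps the effective
# refresh rate inside the panel's VRR window.
VRR_MIN_HZ, VRR_MAX_HZ = 48, 144             # hypothetical panel range

def lfc_refresh(frame_time_ms):
    fps = 1000.0 / frame_time_ms
    mult = 1
    while fps * mult < VRR_MIN_HZ and fps * (mult + 1) <= VRR_MAX_HZ:
        mult += 1                            # repeat the frame one more time
    return fps * mult, mult

for ft in (7.0, 16.7, 25.0, 33.3, 50.0):     # ~143, 60, 40, 30, 20 fps
    hz, m = lfc_refresh(ft)
    print(f"{1000 / ft:5.1f} fps -> panel runs at {hz:6.1f} Hz (frame shown {m}x)")
```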
 
Confirmed my suspicion regarding the HDMI 2.1 port. When the computer is connected to the TV via HDMI 2.1, any time I try to launch Resident Evil 0, Resident Evil HD Remake, Resident Evil 6, Ultimate Marvel vs. Capcom 3, or Resident Evil Revelations 1 and 2, the games won't launch at all. A very quick black splash screen and then nothing, gone.

Of course, once you connect the GPU to the monitor via DisplayPort, everything goes like clockwork. VRR is wonderful and those games are a sight to behold running smoothly.

In the end, I reported it to Intel and asked them whether there is a fix for the lack of VRR and the game incompatibilities over the HDMI 2.1 port, using an HDMI 2.1 cable connected to a native HDMI 2.1 port on the TV.
 
@Cyan - I think the world would like to know whether the A770 has relatively angle independent anisotropic filtering. :D
https://www.3dcenter.org/download/d3d-af-tester


RTX 2080 on default everything. (Quality filtering setting, AF sample optimization off, trilinear optimization on.)
[Screenshots: 2080-1.png, 2080-2.png]


(High Quality filtering, AF sample optimization off, trilinear optimization off.)
[Screenshot: 2080-5.png]
 
This is what you get. Are the settings right? Default settings (Tinted was the checked option in the Mipmaps section, but I checked Full Colored, as in your screengrab).

[Screenshot: Gp9FQoq.png]


High Quality settings, AF x 16.

[Screenshot: oU3j3aa.png]
 
That looks like superb AF and trilinear quality. Better than NV even if set to high quality in the NV control panel.
Looks pretty much identical to AMD (RDNA2/6800 XT, "High" Texture Filtering Quality in AMD Software; Surface Format Optimization "On", if it makes a difference).
[Screenshot: 1669218108775.png]

Edit: a quick Photoshop comparison says they're practically identical (overlaid on top of each other with Intel on top, using "Difference" blending).
[Screenshot: 1669218322769.png]

So for whatever reason, NVIDIA is the only one still skimping on AF.
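If anyone wants to repeat that comparison without Photoshop, here is a quick sketch of the same "Difference" blend; the filenames are just placeholders for the two screenshots.

```python
from PIL import Image, ImageChops

# Placeholder filenames for the two AF-tester screenshots being compared.
a = Image.open("arc_a770_af.png").convert("RGB")
b = Image.open("rdna2_6800xt_af.png").convert("RGB")
b = b.resize(a.size)                          # make sure both captures match in size

diff = ImageChops.difference(a, b)            # per-pixel |a - b|, same as the blend mode
print("max per-channel difference:", [hi for _, hi in diff.getextrema()])
diff.save("af_difference.png")                # a mostly black result = practically identical
```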
 
AF angle dependence, and also trilinear mipmap optimizations. The trilinear behavior can be improved in the NV control panel, but the AF angle dependence cannot. Though I'm not sure how visible any of this is in a game; it's still pretty high quality compared to what it once was.
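For reference, this is roughly how the anisotropy level and mip LOD are meant to be derived per pixel; it follows the approximation in the texture_filter_anisotropic extension spec. Angle-dependent hardware effectively underestimates the probe count for some footprint orientations, which is what draws the flower pattern in the AF tester.

```python
import math

def anisotropic_sample_params(dudx, dvdx, dudy, dvdy, max_aniso=16):
    # Lengths of the pixel footprint in texture space along screen x and y.
    px = math.hypot(dudx, dvdx)
    py = math.hypot(dudy, dvdy)
    p_max, p_min = max(px, py), min(px, py)
    # Probes taken along the major axis, capped by the selected AF level.
    n = min(math.ceil(p_max / max(p_min, 1e-8)), max_aniso)
    # LOD is based on the footprint divided by N, so more probes keep
    # sharper (lower) mip levels in play on stretched surfaces.
    lod = math.log2(max(p_max / n, 1e-8))
    return n, lod

# A surface stretched 12:1 in screen space, e.g. a floor at a grazing angle:
print(anisotropic_sample_params(12.0, 0.0, 0.0, 1.0))   # -> (12, 0.0)
```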
 
The most likely reason is that you don't see this "skimping" anywhere but in such colored synthetic scenes.
Most people don't really see the difference between 8x and 16x AF, even.
And in many modern games it can be hard to see much benefit above 4x because of the general lack of large flat surfaces.
So the good old "if it ain't broke, don't fix it" applies in full.
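Quick numbers behind that, under the rough assumption that a flat floor tilted by θ away from facing the camera stretches the pixel footprint by about 1/cos θ: the higher AF caps only come into play at near-grazing angles.

```python
import math

# Rough model: required anisotropy for a flat surface ~ 1 / cos(theta),
# where theta is the tilt away from facing the camera head-on.
for cap in (2, 4, 8, 16):
    theta = math.degrees(math.acos(1.0 / cap))
    print(f"AF {cap:>2}x only saturates beyond ~{theta:.1f} degrees of tilt")
```

That prints roughly 60, 75.5, 82.8, and 86.4 degrees, which is why the step from 8x to 16x tends to hide on the far, near-grazing parts of big flat surfaces.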
 
A game like Shadow of the Tomb Raider could be a good example of this. The developers recommend setting the game to High settings on capable computers, and at High, AF is set to x4 by default.

If you switch to the Highest settings, the game sets AF to x8 (x16 is available as an option, but it's never used in any of the default configurations).

Tbh, I didn't notice a difference with AF set to x4 after using High settings. I was pretty sensitive to it in the PS3/360 days, where textures appeared muddy relatively close to the player.

So yeah, use AF x16 with caution; maybe AF x8 is more than enough. Maybe in games like the future System Shock Remake, the more AF the merrier.

Judgment and Lost Judgment are also games where some extra AF helps, 'cos of the long, flat stretches of Tokyo's urban environment.
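For context on where those x4/x8/x16 numbers actually live: in the end the preset just sets a per-texture (or per-sampler) cap, which the driver control panel can then override. A hypothetical minimal sketch with PyOpenGL and glfw, not taken from any of these games:

```python
import glfw
from OpenGL.GL import glGenTextures, glBindTexture, glTexParameterf, GL_TEXTURE_2D
from OpenGL.GL.EXT.texture_filter_anisotropic import GL_TEXTURE_MAX_ANISOTROPY_EXT

glfw.init()
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)    # hidden window, just to get a GL context
win = glfw.create_window(64, 64, "af", None, None)
glfw.make_context_current(win)

tex = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, tex)
# Ask for up to 8 probes along the major axis; desktop GPUs typically allow up to 16.
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 8.0)
print(f"texture {tex} capped at 8x AF")

glfw.terminate()
```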
 

Kinda agree with the review overall. After more than a month with an Intel ARC GPU, I'd be lying if I said I'd recommend it to everybody, but I'd also be lying if I said I'm not quite happy with it.

For most people who might not need a new GPU until 2025-2026, waiting for Battlemage or Celestial, and also seeing what AMD and Nvidia have to offer by then... sounds reasonable.
 