Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

Status
Not open for further replies.
Grrrr, of course it doesn't have HDMI 2.1. It would've been an instant buy.
For some reason even a handful of newer monitors have DisplayPort and only HDMI 2.0, at least among the reasonably priced ones I looked at. The PC space is really dragging its feet on HDMI 2.1.
 

 
Here's an interesting thread started by a former Intel employee with some neat insights ...


He concludes that driver quality is tied to hardware design ...

Intel HW arguably doesn't start off with the best binding model, so they're already at a disadvantage. They make a strange tradeoff between register pressure and SIMD lane width, rather than between register pressure and occupancy as others do, which is why they have issues with games that assume their shaders will run at a specific SIMD lane width of 32; hence the software/driver interaction issues. Amusingly, they also recommend avoiding anisotropic filtering on sRGB textures, which happens in just about every game!
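As an aside, the "games assume a SIMD width of 32" failure mode can be sketched as a toy simulation. This is hypothetical illustration only, not real shader or driver code: it models a per-wave sum reduction where the shader hard-codes 32 lanes while the hardware actually executes 16-wide waves, so each "wave" reads lanes belonging to its neighbor.

```python
def wave_sums(values, hw_width, assumed_width):
    # Each hardware wave owns hw_width lanes, but the "shader" sums
    # assumed_width lanes starting at its wave base. When assumed_width
    # exceeds hw_width, the reduction oversteps into the next wave.
    sums = []
    for base in range(0, len(values), hw_width):
        lanes = values[base:base + assumed_width]
        sums.append(sum(lanes))
    return sums

data = list(range(64))

# Shader's assumption matches the hardware: correct per-wave sums.
correct = wave_sums(data, hw_width=16, assumed_width=16)

# Shader assumes 32 lanes on 16-wide hardware: lanes get double-counted.
broken = wave_sums(data, hw_width=16, assumed_width=32)

print(correct)  # [120, 376, 632, 888]
print(broken)   # [496, 1008, 1520, 888]
```

In the real world the driver has to paper over this mismatch (e.g. by compiling shaders for the width the game expects), which is one way hardware design leaks into driver complexity.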
 
I don't see where he said what's in the last paragraph of your message.

He's on the Linux side of things, yeah?

The last paragraph is just my explanation as to why Intel HW is not a sane design. The only relation to the thread I mentioned is that a lesser hardware design necessitates inferior driver quality ...
 
Confirmed XeSS support:
  1. Call of Duty: Modern Warfare II
  2. Arcadegeddon
  3. Ghostwire Tokyo
  4. Vampire Bloodhunt
  5. Ghostbusters Spirits Unleashed
  6. Naraka Bladepoint
  7. Super People
  8. Gotham Knights
  9. DioField Chronicles
  10. Dolmen
  11. Chivalry II
  12. Redout II
  13. The Settlers
  14. Death Stranding: Director’s Cut
  15. The Rift Breaker
  16. Hitman III
  17. CHORVS
  18. Shadow of The Tomb Raider
  19. Anvil Vault Breakers
 
Plus two Chinese(?) games whose logos are in the slide but aren't listed by VideoCardz.
 
Once the XeSS SDK is out, it would be nice if someone did a proper algorithm comparison with an open-source demo, instead of games made with embedded developers and outright payola.

Say 16K downscaled to 4K as the gold-standard reference, and 1920×1080 upscaled to 4K with XeSS, FSR 2, and DLSS 2. Then some video and frame comparisons, plus the VMAF perceptual quality metric relative to the gold standard for an objective measure.
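The "score each upscaler against a gold-standard reference" idea can be sketched in a few lines. VMAF needs the libvmaf tooling, so plain PSNR stands in here as the objective metric; the image, resolutions, and nearest-neighbor upscale are all stand-ins for illustration:

```python
import numpy as np

def psnr(ref, test, peak=255.0):
    # Peak signal-to-noise ratio in dB, relative to the reference frame.
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
# Stand-in for the gold-standard frame (tiny here for speed).
gold = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Naive "render at quarter resolution, then upscale" pipeline:
low = gold[::2, ::2]                                   # 2x2 decimation
up = np.repeat(np.repeat(low, 2, axis=0), 2, axis=1)   # nearest-neighbor upscale

print(f"PSNR vs gold standard: {psnr(gold, up):.2f} dB")
```

A real comparison would run the same scene through each upscaler, decode both videos, and feed frame pairs into libvmaf instead of this PSNR helper, but the reference-vs-candidate structure is the same.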
 
I forgot that they mentioned the dp4a path has different quality from the XMX version. In the video they specifically say it uses a different neural net, so it will be interesting to see the actual difference in IQ as well as the expected performance difference.

It would also be interesting to see how they both compare at different resolutions from 720p up to 4K, as well as to the competition.
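For context on the dp4a fallback path: DP4A is a four-way int8 dot product accumulated into an int32, which is what XeSS's network inference reduces to on GPUs without XMX matrix engines. A minimal sketch of the operation itself (illustration only, not XeSS code):

```python
def dp4a(a, b, acc):
    # DP4A: dot product of four signed 8-bit pairs, accumulated into
    # a 32-bit integer (acc). Matrix engines like XMX do many of these
    # per clock; dp4a hardware does one per instruction.
    assert len(a) == len(b) == 4
    return acc + sum(int(x) * int(y) for x, y in zip(a, b))

print(dp4a([1, 2, 3, 4], [5, 6, 7, 8], 0))  # 70
```

The throughput gap between one dot product per instruction and a full matrix engine is presumably why the dp4a build gets a smaller (and differently trained) network, hence the IQ difference they hint at.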
 