Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

Yes, but who will you believe, though?
Some Linus guy no one knows, who claims to have seen it live, or a gloriously compressed, low-bitrate video and its stills?
I mean, come on...

Pffft, c'mon. Intel will obviously deviate from all standard PR practice and show the first tease of their product off raw, under poor conditions. I mean, showing off only what you know beforehand will run perfectly smooth and flawless, who would do that in a demo???
 
What kind of upcoming industry events are there where it'd make sense for Intel to show the dGPU off further? I'm curious to see more.
 
Intel showed the first development desktop card for DG1. It's a sub-75 W card, and it was running Warframe at 1080p at an unstable framerate, which also makes the claim that Destiny 2 was running at a locked 1080p60 bogus: Destiny 2 is a far more taxing game than Warframe.

[Image: DG1 box]


https://www.anandtech.com/show/1536...ng-teased-at-ces-2020?utm_source=notification

That short demo didn't look stable, and it was set in a dark area where you couldn't see and judge image quality.
The aliasing is also horrendous. You don't get that from bad streaming; you get it when there is no AA and when the game runs at a lower resolution.
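For anyone curious why that's diagnostic: aliasing is a sampling artifact. Here's a minimal toy sketch (nothing to do with Intel's drivers) showing how point-sampling an edge gives hard stair-steps while supersampling averages coverage into a gradient:

```python
# Toy illustration of why "no AA + low resolution" shows up as jaggies.
# Point-sampling a diagonal edge (y = x) gives binary stair-steps;
# supersampling (SSAA) averages coverage into a gradient instead.

def pixel_value(px: int, py: int, samples: int) -> float:
    """Fraction of sample points in pixel (px, py) below the edge y = x."""
    hits = 0
    for sy in range(samples):
        for sx in range(samples):
            x = px + (sx + 0.5) / samples
            y = py + (sy + 0.5) / samples
            hits += y < x
    return hits / (samples * samples)

row = 3  # one row of an 8-pixel-wide image crossing the edge
print("1 sample/px:", [round(pixel_value(px, row, 1), 2) for px in range(8)])
print("4x4 SSAA:   ", [round(pixel_value(px, row, 4), 2) for px in range(8)])
```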

GamersNexus tested the card (in its laptop form) personally; they found Destiny 2 running at 1080p on low settings, with the framerate varying between 30 and 45 fps and horrendous input lag.
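Side note: a "locked 1080p60" claim is easy to sanity-check if you can get per-frame times out of the footage. A minimal sketch; the numbers below are invented, not GN's data:

```python
# Sanity-check a "locked 60 fps" claim from a list of frame times in ms.
# Assumes you've already extracted per-frame times from the footage with
# some capture/analysis tool; the sample numbers here are invented.

def is_locked_60(frame_times_ms: list[float], tolerance_ms: float = 1.0) -> bool:
    """True only if every frame lands within tolerance of the 16.67 ms budget."""
    budget = 1000.0 / 60.0
    return all(abs(t - budget) <= tolerance_ms for t in frame_times_ms)

def average_fps(frame_times_ms: list[float]) -> float:
    """Average fps over the whole sample."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

times = [16.7, 16.6, 33.1, 16.8, 24.9]  # invented frame times with stutter
print(f"avg {average_fps(times):.1f} fps, locked 60: {is_locked_60(times)}")
```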

 
I guess the point was to show that they have working silicon and that it can actually render games properly. Whether it's going to be competitive in the gaming space once they get to more final versions with better drivers? It will likely take a few generations to get to that stage.
 

We know they can already do that with their existing iGPU. I really don't get why they'd show it running like crap to the world. To the devs, sure: show them, send them kits, etc. But to "us"?
 
30 fps in Warframe is pretty bad; an MX250 can do that with settings maxed at 1080p and dynamic resolution scaling disabled.

Was that enabled in the demo? Because that would be even worse.
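For anyone unfamiliar, dynamic resolution scaling boils down to a feedback loop on frame time, something like this sketch (the generic idea, not Warframe's actual implementation):

```python
# Generic dynamic-resolution-scaling controller sketch (not Warframe's
# actual code): if frames take too long, render fewer pixels; if there's
# headroom, render more, clamped to a sane range.

TARGET_MS = 1000.0 / 60.0    # aim for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale: float, frame_ms: float) -> float:
    """Nudge the per-axis resolution scale toward the target frame time."""
    error = frame_ms / TARGET_MS   # > 1.0 means the frame was too slow
    scale /= error ** 0.5          # cost ~ pixel area, so adjust each axis by sqrt
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [16.0, 22.0, 25.0, 18.0, 15.0]:  # invented frame times
    scale = update_render_scale(scale, frame_ms)
    print(f"frame {frame_ms:5.1f} ms -> render at {scale:.2f}x of 1080p")
```

So if DRS was on and it still dipped to 30 fps, the card was already rendering below 1080p to get there.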

It sounds like it's at about the same level as Ice Lake's integrated graphics, the lowliest 32 EU variant.
 
You're right, and you know what, this is a good thing. It shows there is a lot going on under the hood beyond taking an IGP, duplicating some blocks, and wiring up some GDDR to it.

When everything is optimized it's gonna be way faster.
 
I thought I read that DG1 is rather weak. Better than an IGP but meh.

https://www.techpowerup.com/262404/intel-dg1-discrete-gpu-shows-up-with-96-execution-units
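96 EUs at least puts a rough ceiling on it. Back-of-the-envelope, assuming the usual 16 FP32 FLOPS per EU per clock for Gen-style EUs (two SIMD4 pipes, FMA counted as 2 ops, which matches Gen11's quoted numbers), and with the clocks below being pure guesses:

```python
# Rough theoretical FP32 throughput for Intel Gen-style GPUs.
# Assumes 16 FP32 FLOPS per EU per clock (two SIMD4 pipes, FMA = 2 ops);
# the clock speeds are guesses, not confirmed specs.

def fp32_tflops(eus: int, clock_ghz: float, flops_per_eu_clk: int = 16) -> float:
    """Theoretical peak FP32 TFLOPS = EUs * clock * FLOPS per EU per clock."""
    return eus * clock_ghz * flops_per_eu_clk / 1000.0

print(f"Ice Lake 32 EU @ ~1.1 GHz (guess): {fp32_tflops(32, 1.1):.2f} TFLOPS")
print(f"DG1 96 EU @ ~1.5 GHz (guess):      {fp32_tflops(96, 1.5):.2f} TFLOPS")
```

Raw throughput obviously isn't the whole story (drivers, bandwidth), but it fits the "better than an IGP but meh" read.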

It's called a development kit. Makes one wonder what the goal is for publicly demoing it. Maybe developers are supposed to want to play with it.

Intel is basically new at this. Someone saw this as a PR opportunity and took it. Most PR isn't "this is the greatest thing ever"; rather, the purpose is "hey, this exists!" so it sticks in people's heads. The long-term goal is probably that when Xe consumer products launch, quite a ways from now, it won't be a jarring surprise of "Wait, Intel makes graphics cards??????", at least not for as many people as they can manage.
 