Intel ARC GPUs, Xe Architecture for dGPUs [2018-2022]

H/w RT adds something like 10% or less to GPU chip complexity while providing visual gains similar to those you'd get in 10 years' time without h/w RT.

Perhaps if it weren't in the context of hybrid rendering, because it can get rid of a lot of legacy hurdles holding back content creation; but for hybrid rendering it's ultimately just an effect pass at this point. Fidelity of GI is just not that important to me; I'd sooner compromise on that than on, say, geometric fidelity and animation fidelity. In fact it has become an extra hurdle for content creation by holding back the innovation in better surface representations that was just getting going.

I'd much rather have seen shader-invoked AABB/triangle(/slab/ellipsoid) intersection instructions than what we have now.
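
(A minimal sketch of what such a shader-invoked intersection instruction boils down to: the classic ray/AABB slab test. Plain C++ purely for illustration; the function name and types here are made up for this example, not taken from any real GPU ISA or API.)

    // Slab test: intersect a ray against an axis-aligned bounding box.
    // This is roughly the operation a shader-invoked AABB intersection
    // instruction would expose to user code.
    #include <algorithm>

    struct Vec3 { float x, y, z; };

    // 'o' is the ray origin, 'invDir' the precomputed reciprocal direction
    // (1/d per component, trading divisions for multiplies).
    // Returns true if the ray hits the box [bmin, bmax] within [tMin, tMax].
    bool rayAabbSlab(const Vec3& o, const Vec3& invDir,
                     const Vec3& bmin, const Vec3& bmax,
                     float tMin, float tMax)
    {
        const float org[3] = { o.x, o.y, o.z };
        const float inv[3] = { invDir.x, invDir.y, invDir.z };
        const float lo[3]  = { bmin.x, bmin.y, bmin.z };
        const float hi[3]  = { bmax.x, bmax.y, bmax.z };

        for (int axis = 0; axis < 3; ++axis) {
            // Intersect against the pair of parallel planes ("slab") bounding
            // the box on this axis, then shrink the valid [tMin, tMax] interval.
            float t0 = (lo[axis] - org[axis]) * inv[axis];
            float t1 = (hi[axis] - org[axis]) * inv[axis];
            if (t0 > t1) std::swap(t0, t1);   // ray points the other way on this axis
            tMin = std::max(tMin, t0);
            tMax = std::min(tMax, t1);
            if (tMin > tMax) return false;    // slab intervals no longer overlap
        }
        return true;
    }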
 
Agreed. Idk why the discussion slipped back to just RT again.
Personally I accuse NV of misusing RT to push overspecced data center ML HW to gamers, but that's subjective.
The (off)topic came up because people noticed anti-RT echo chambers on the internet. From my perspective, RT is just the scapegoat to explain / justify overspecced and overpriced 'gaming' HW.
Seems some others share this view. Obviously it comes from observing NV marketing and pricing.

Intel's RT seems fine to me. Although they follow the same path of fixed function + ML, their marketing and pricing position those features much better from the perspective of gamers.
AMD's RT is also fine. They introduced it gently, and full flexibility would have allowed further progress together with software devs. But unfortunately that door was already closed because NV was first.
If it's as you say... the market will sort itself out. If there's a "right way" to introduce RT and a "wrong way", then consumers will respond to it as they will, and the same goes for developers. Every new technology has faced this type of dilemma where one side believes in one way, and the other believes in another way. Both have merit. Intel following Nvidia's lead with Arc gives merit to what Nvidia started.

And I believe that Nvidia is always first... because Nvidia actively pushes the industry forward. That's what leading companies do. That's the perk of being in that position: you get to influence the industry to go in the direction you see fit. If a direction is clearly wrong, then markets will eventually sort themselves out. If AMD and Nvidia switched positions... AMD would be doing the same thing... let's not kid ourselves.

I think the RT push has come at the right time. Now with Intel making a strong competitive push with Arc for the low-end/mid-range segments, it further validates the technology for all gamers.

I'm really interested in seeing how Arc evolves over time, as well as how they eventually compete in the high end.
 
That's completely wrong. Please check the real numbers yourself.
I did. Did you?
GTX Titan Z was launched 10 years ago at MSRP of $3000.

Perhaps if it weren't in the context of hybrid rendering, because it can get rid of a lot of legacy hurdles holding back content creation; but for hybrid rendering it's ultimately just an effect pass at this point. Fidelity of GI is just not that important to me; I'd sooner compromise on that than on, say, geometric fidelity and animation fidelity. In fact it has become an extra hurdle for content creation by holding back the innovation in better surface representations that was just getting going.

I'd much rather have seen shader-invoked AABB/triangle(/slab/ellipsoid) intersection instructions than what we have now.
I'm not sure how h/w RT is preventing any of what you're describing.
 
Which of the several times HDR lighting has been introduced "for the first time" are you talking about in this case? Half-Life 2: Lost Coast, which many credit as the first game with HDR lighting, had a notable performance drop from it, but I'm pretty sure it wasn't on the same scale as the RT performance loss today (which of course differs from one game and card to the next).

Far Cry 1.3 patch.

HDR could halve your GPU performance.
 
Well, I posted this just a week ago, but I'll research it again, using TechPowerUp.
You said constant prices over 10 years. But I'll look at gaming GPUs, not at Titan models, which are not relevant and no longer exist to compare against. (I wish they did.)

GTX 480 (2010): $500
GTX 680 (2012): $500 <- 10 years ago
GTX 980 (2014): $550
GTX 1080 (2016): $600
RTX 2080 (2018): $700 <- RT gave the biggest jump yet
RTX 3080 (2020): $700 <- No jump? Jensen knew it would cost much more on the street, so...
RTX 4080 (2022): $1200 <- we are here now. Biggest jump ever.

It's not constant. It's 2.4 times, to be precise.
Well above inflation, and well above the price increase on consoles.
And Moore's Law was alive during that time, while now it is dead, he says.
Which means the exponential curve is meant to continue in the future.
So prepare to change your mind, or to lose your job.
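
(A quick back-of-the-envelope check on that 2.4x figure, using the MSRPs from the list above. The ~2.5% yearly inflation rate below is my own rough assumption for illustration, not a figure from this thread or any official source.)

    // Sanity check of the price multiplier quoted above.
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const double msrp2012  = 500.0;   // GTX 680 launch MSRP (from the list above)
        const double msrp2022  = 1200.0;  // RTX 4080 launch MSRP (from the list above)
        const int    years     = 10;
        const double inflation = 0.025;   // assumption: ~2.5% per year

        const double nominalMultiple   = msrp2022 / msrp2012;                     // = 2.4
        const double inflationAdjusted = msrp2012 * std::pow(1.0 + inflation, years);

        std::printf("Nominal increase over %d years: %.1fx\n", years, nominalMultiple);
        std::printf("$%.0f in 2012 is roughly $%.0f in 2022 at %.1f%%/year inflation\n",
                    msrp2012, inflationAdjusted, inflation * 100.0);
        return 0;
    }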
 
But I quickly realized within a few days that it is a good idea. Affordable, good enough, easy entry, sustained business.
Then please stick to that; don't demand progress or higher quality graphics or anything else.

We enthusiast PC gamers are the opposite here: we want to run games at their highest fidelity, highest effects, and best fps. We even mod our games to add more graphics to old titles, we run old titles with extreme levels of AA or super resolution, we do it all, and we want more. We have hardware that is several orders of magnitude more powerful than consoles, and we want it to work to achieve that goal.

So what has been happening during the last 7 to 10 years? Stagnation. Developers have done nothing but deliver us console ports with the bare minimum of visual upgrades. 4K visuals? I had 4K on PC for years before any developers started branding their graphics 4K! Unlocked AF to 16X, pathetic increases in draw distance, pathetic increases in the resolution of shadows or lighting or reflections, all pretty substandard stuff, to the point that Ultra settings became a joke. Our hardware is sitting idle running games that don't look any different from consoles, games that are often fps locked or CPU limited in most scenes. Our massive GPUs and massive multi-core CPUs are doing practically nothing.

Ray tracing changes that. PC games can finally get those extra layers of immersion they so badly needed: reflections can go beyond the pathetic resolution bump, and can be dynamic, accurate, and scene-wide with diminished ugly screen space artifacts; shadows can now be numerous, with small objects casting and receiving shadows; lighting can be massively more dynamic, responding to the actions of the player, etc.

This isn't about cost; this is about the soul of PC gaming, a soul that is thriving more than ever now that the capabilities of the hardware are being put to use more effectively.

If the soul of PC gaming is not important to you, then consoles should serve you well enough.
 
GTX 480 (2010): $500
GTX 680 (2012): $500 <-
Let me stop you right there. Marketing names are completely irrelevant. Nowhere does it say that a "4080" must cost the same as a "480".
The only thing that matters is the range in which GPU products are available. And while you could argue that we've lost the lower end (to iGPUs, which are basically free), the mid and high ranges are very much the same as they have been for the last ten years.
 
Well, then I can't help you, guys.
I'll leave the enthusiast club now and agree this discussion should be moved to the ignorant and decadent sections... ;)
Intel does not deserve to be tainted by the mistakes of their rivals.
 
Personally I accuse NV of misusing RT to push overspecced data center ML HW to gamers, but that's subjective.
NVIDIA is the only manufacturer left who is always pushing the boundaries for PC gaming. It's for their own benefit of course, but they are the only ones left standing.

Intel has been dead in the water for a long time; AMD is content serving the console market and is happy shoving console ports down PC gamers' throats with the absolute bare minimum of visual improvements. The only one left who cares is NVIDIA.

Who introduced hardware T&L first? NVIDIA. Pixel shaders? Shader Model 3? Unified shaders? NVIDIA. GPGPU? Physics on GPUs? Better ambient occlusion? Better shadows? Multi-threaded micro-polygon processing? TXAA (the predecessor of TAA)? G-Sync? Faster VR? Ray tracing? AI upscaling? AI frame generation? You guessed it... NVIDIA.

They are doing it for themselves, and they are doing it expensively in many cases, but they are doing it nonetheless, while others are twiddling their thumbs.

So hate them all you want; they are pushing the boundaries their way, and you simply don't get to constantly criticise them for pushing the boundaries when the contributions of others are so low they are not even on the map for PC gamers.
 