AMD: RDNA 3 Speculation, Rumours and Discussion

True but sad at the same time; see all that ray tracing functionality in the consoles sitting unused for their entire existence. On PC there will most likely at least be indie/mod teams that use ray tracing, like Q2RTX, the Vice City ray tracing mod, etc., so users will see at least some applications using ray tracing.
Even sadder is that all hardware you buy today has ray tracing functionality in it, be it AMD, Intel, consoles, NV, etc. I'm already on a 2080 Ti; I might sell it and get a next-gen NV or AMD/Intel GPU, where they will most likely have abandoned ray tracing altogether in favour of normal rasterization.
 
Can any GPU do full path tracing for GTA3-level graphics? Even in Q2PT, framerates are kinda iffy with $1000+ GPUs (and it's heavily denoised etc., as far as I understood)

No idea. The idea that my PS5 and 2080 Ti have tech in them that will go unused isn't all that bright. They could have spent those resources on further improving normal rendering instead, perhaps.
 
Non-RT performance will be very much irrelevant in 2-3 years.

There’s always the 360Hz monitor crowd to think about.

6900XT is 58% faster than 6600XT at 1080p:

Sapphire Radeon RX 6600 XT Pulse OC Review - Average FPS | TechPowerUp

I really don't think AMD is going to put as many as 4096 SIMD ALU lanes in it. It doesn't make sense for 1080p at around 3GHz. Merely putting in twice as much Infinity Cache (i.e. 64MB for 7600XT) would cut that advantage for 6900XT substantially.

Basically, the triangles have too few pixels on them!

This is just based on Nvidia architectures, but it seems intra-frame workloads are both very bursty and also suffer from very low hardware utilization. So having “too many ALUs” is still helpful to quickly plow through those bursts of activity.

If games made more efficient use of GPU hardware we would need more cache and vram and much less of everything else.
 
Can any GPU do full path tracing for GTA3-level graphics? Even in Q2PT, framerates are kinda iffy with $1000+ GPUs (and it's heavily denoised etc., as far as I understood)
This is why I don't think RT performance is so important for games releasing in 2022 and 2023. The performance hit for using the RT that actually matters, and that can't be satisfactorily replaced by rasterization tricks, is too great even with heavy use of upsampling techniques (which have their own shortcomings).

The RT-less Kena looks infinitely better than RT Minecraft to 99.9% of the people.


To be pedantic, you mean games using ray tracing; they are not ray traced, they are rasterized with some ray tracing. Ray tracing isn't the sole method of rendering (not even the primary one).
To be pedantic^2, we mean games using DXR hardware acceleration. There are games/engines using software ray tracing:
https://docs.unrealengine.com/5.0/en-US/RenderingFeatures/Lumen/TechOverview/#softwareraytracing
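For reference, "software ray tracing" in that Lumen sense means tracing a simplified scene representation (signed distance fields) in regular compute rather than via DXR. Here is a minimal toy sketch of the core idea, sphere tracing an SDF; the scene, names and constants are made up purely for illustration, this is not Lumen's implementation.

```python
# Toy sphere tracing against a signed distance field (illustrative only).
import math

def scene_sdf(p):
    """Distance from point p to a unit sphere at the origin (stand-in for a real scene SDF)."""
    x, y, z = p
    return math.sqrt(x * x + y * y + z * z) - 1.0

def sphere_trace(origin, direction, max_steps=64, hit_eps=1e-3, max_dist=100.0):
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = scene_sdf(p)
        if d < hit_eps:
            return t          # hit: distance along the ray
        t += d                # safe step: nothing can be closer than the SDF value
        if t > max_dist:
            break
    return None               # miss

print(sphere_trace((0.0, 0.0, -3.0), (0.0, 0.0, 1.0)))   # ~2.0
```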
 
So having “too many ALUs” is still helpful to quickly plow through those bursts of activity.
Nope, it's exactly the opposite.
Having more threads in flight to feed more SIMDs won't do any good for burst workloads; burst workloads are usually small ones (few threads) and benefit from higher frequencies rather than from more ALUs.
Wider GPUs suffer from underutilization on narrow burst workloads with only a few threads.
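To put a toy number on it (RDNA-like wave32 and wave-slot figures assumed purely for illustration, not any specific SKU): a small burst dispatch can only ever fill a sliver of a wide GPU.

```python
# Toy estimate of how much of a GPU a single small dispatch can occupy
# (wave size, SIMD count and wave slots per SIMD are assumptions, not real specs).
def wave_slot_utilization(threads, wave_size=32, simds_per_cu=4, num_cus=80, waves_per_simd=16):
    waves_needed = -(-threads // wave_size)                  # ceil division
    total_wave_slots = num_cus * simds_per_cu * waves_per_simd
    return min(1.0, waves_needed / total_wave_slots)

# A "burst" dispatch of 2048 threads:
print(wave_slot_utilization(2048, num_cus=80))   # ~0.0125 -> ~1% of an 80 CU GPU's wave slots
print(wave_slot_utilization(2048, num_cus=28))   # ~0.036  -> still small, but ~3x better relative fill
```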

If games made more efficient use of GPU hardware we would need more cache and vram and much less of everything else.
Good renderers are architected to be math bound, not VRAM bound, because bandwidth is the scarcest resource (especially on consoles) and the one with the worst scaling.
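A quick roofline-style sanity check shows why (peak figures are hypothetical, not any particular card): the FLOP-per-byte ratio of a modern GPU is so high that low-arithmetic-intensity passes go bandwidth bound long before they go math bound.

```python
# Rough roofline check with hypothetical peak figures (not a specific SKU).
peak_tflops = 20.0      # FP32 TFLOP/s
peak_bw_gbs = 512.0     # GB/s of VRAM bandwidth

machine_balance = (peak_tflops * 1e12) / (peak_bw_gbs * 1e9)
print(f"machine balance: ~{machine_balance:.0f} FLOPs per byte")   # ~39

# A pass doing only 10 FLOPs per byte of traffic sits well below that,
# so it is bandwidth bound and leaves most of the ALUs idle:
kernel_intensity = 10.0
attained_tflops = min(peak_tflops, kernel_intensity * peak_bw_gbs / 1e3)
print(f"attainable: ~{attained_tflops:.1f} of {peak_tflops} TFLOP/s")  # ~5.1 of 20
```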

Basically, the triangles have too few pixels on them!
Should not be a problem with deferred shading, which dominates in modern games.
I would rather believe geometry scaling is the reason for worse occupancy on high-end GPUs at 1080p, plus many games are obviously CPU bound at 1080p on an RX 6800 XT.
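For context on the "too few pixels per triangle" point: pixel shading runs on 2x2 quads, so tiny triangles burn lanes on helper invocations during the raster/G-buffer pass even if the later deferred lighting pass is full-screen. A toy estimate, idealized and ignoring overlap and clipping:

```python
# Toy quad-overshading estimate: pixel shaders are launched per 2x2 quad,
# so partially covered quads waste lanes on helper invocations.
def quad_lane_efficiency(covered_pixels, quads_touched):
    """Useful pixel-shader lanes / total lanes launched for one triangle."""
    return covered_pixels / (quads_touched * 4)

print(quad_lane_efficiency(3, 3))     # 0.25  -> a ~3 pixel triangle spanning 3 quads: 75% helper lanes
print(quad_lane_efficiency(100, 30))  # ~0.83 -> a ~100 pixel triangle: mostly useful work
```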

Can any GPU do full path tracing for GTA3-level graphics?
There is the Marbles demo with path tracing and higher-poly assets than in modern games, but why do you need full path tracing right now in the first place?

The RT-less Kena looks infinitely better than RT Minecraft to 99.9% of the people.
Very flawed argument.
 
There is the Marbles demo with path tracing and higher-poly assets than in modern games, but why do you need full path tracing right now in the first place?
Because it's cool? Something to justify spending extra even if I don't really need it. As a consumer, I don't see any reason to sacrifice a substantial % of performance (and I'm one of the "280 Hz gaming crowd") for a barely noticeable effect that can be faked in a way that was 100% acceptable a few years ago. With current GPU prices, there's even less reason why we should subsidize NV/AMD/Intel/etc. efforts to go back in time and use fixed-function hardware to solve very narrow tasks.
 
Should not be a problem with deferred shading, which dominates in modern games.
It appears that deferred engines still suffer from small triangles, which is why Nanite exists and why there is a lot of work going into visibility buffer techniques. These articles are superb:

Visibility Buffer Rendering with Material Graphs – Filmic Worlds

I would rather believe geometry scaling is the reason for worse occupancy on high-end GPUs at 1080p, plus many games are obviously CPU bound at 1080p on an RX 6800 XT.
I would tend to agree, but I suspect that's really a problem with older games/engines.

Oh come on guys, please don't derail another thread with this "RT is the holy grail vs. RT is useless" nonsense. Thx
There's no derailing going on here. We're talking about future priorities in a GPU architecture.

I raise the point of ray tracing specifically because AMD is far behind and NVidia will not stand still. The "performance penalty" of ray tracing at 1080p for $400 is going to look fine in 1-2 years' time, at least with NVidia. 2080Ti++ performance at 1080p.
 
Because it's cool?
Well, such an explanation doesn't make a lot of sense.
You may want to add it if you want fully dynamic GI with a day-night cycle and destructible buildings/etc., if you want subpixel geometry, if you want a unified lighting solution without tweaking baked assets and scene lighting to death, if you want highly customizable area lights with variable-penumbra shadows, millions of light sources in a frame, etc., etc.
In fact, most of these things have already been implemented in engines with hybrid RT; such iterative development does make sense, unlike a sudden desire to move to path tracing just because it's "cool".
When devs are done replacing rasterization-based hacks and tricks with way more robust RT shadows/GI/etc. systems, then they will start moving to unified RT solutions such as PT.

It appears that deferred engines still suffer from small triangles
They don't suffer during the shading phase because deferred rendering does shading on a full-screen quad - no small fragments.

which is why Nanite exists and why there is a lot of work going into visibility buffer techniques
Nanite exists to accelerate the G-buffer fill phase, i.e. rasterizing pixel-sized triangles where HW rasterizers suffer; the shading phase happens later on and has nothing to do with Nanite since it's decoupled.
UE5 still uses deferred rendering to shade the majority of objects on screen; the visibility buffer is used to decouple draw calls from shaders so that Nanite geometry can be drawn with a few ExecuteIndirect calls, and the visibility buffer is also reused to shade custom materials later on.
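For anyone unfamiliar with the visibility-buffer idea, here is a minimal CPU-side sketch of the concept (not UE5's actual code; the bit packing and names are made up): the geometry pass only writes an ID per pixel, and all shading happens later in a decoupled full-screen pass.

```python
# Conceptual visibility-buffer sketch (toy CPU code, not UE5's implementation).
import numpy as np

WIDTH, HEIGHT = 4, 4
visbuffer = np.zeros((HEIGHT, WIDTH), dtype=np.uint32)

def pack(instance_id, triangle_id):
    return np.uint32((instance_id << 20) | triangle_id)   # hypothetical 12/20-bit split

# Pass 1: "rasterize" - pretend triangle 57 of instance 3 covers two pixels.
visbuffer[1, 1] = pack(3, 57)
visbuffer[1, 2] = pack(3, 57)

# Pass 2: shading never sees draw calls, only the full-screen ID buffer.
def shade(packed):
    if packed == 0:
        return (0, 0, 0)                            # background
    instance_id, triangle_id = int(packed) >> 20, int(packed) & 0xFFFFF
    # ...fetch vertex attributes / material for (instance_id, triangle_id) here...
    return (instance_id, triangle_id, 255)          # placeholder "material evaluation"

framebuffer = [[shade(visbuffer[y, x]) for x in range(WIDTH)] for y in range(HEIGHT)]
```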

I would tend to agree, but I suspect that's really a problem with older games/engines.
I guess it does not matter whether an engine is old or new: if it does tons of small draws and dispatches with 1x1x1 grids, it will work like ass on wide GPUs (on consoles that stuff is usually hidden behind async compute, but that's not always the case on PC).
 
Well, such an explanation doesn't make a lot of sense.
As you said yourself, there's "iterative development". I'd say let's wait until this "development" is fully fleshed out, because current games are definitely not 'ray traced', as the people who yet again derail this thread claim for the umpteenth time...

When devs are done replacing rasterization-based hacks and tricks with way more robust RT shadows/GI/etc. systems, then they will start moving to unified RT solutions such as PT.
Here we go back to the coolness factor: if you can get 99% of the fidelity of a fully ray-traced image with rasterization, why replace it with something that will require 100-1000x more processing power (and unrealistic memory bandwidth), unless you believe that such 'honest' rendering is way cooler than rasterization?
 
if you can get 99% of the fidelity of a fully ray-traced image with rasterization
You can't do that.
Rasterization tricks require prebaking to handle unsupported stuff, and prebaking limits what you can do with rasterization - prebaked data only works for deterministic systems, i.e. static environments: no destruction, no physics, etc., not even speaking of increased development time and costs.
So, aside from making games less realistic in terms of lighting (no area lights, limited number of lights, etc.), shadows (no area lights, no variable penumbra, fixed resolution, no transparency support, etc.) and GI configurations (very low-frequency lighting and light leaks with probes, static scenes, no destruction, etc.), rasterization also lengthens development cycles and limits what can be done to configure your scenes and lighting.

why replace it with something that will require 100-1000x more processing power (and unrealistic memory bandwidth), unless you believe that such 'honest' rendering is way cooler than rasterization?
Because it doesn't require 100-1000x more processing power, because there aren't thousands of alternative ways to render realistic graphics, and because Dennard scaling is dead, so the industry will support RT anyway.
RT is a well-studied and proven way of rendering photorealistic images. Besides the obvious benefit of improving graphics quality much further, it will help with development costs/cycles, maximize flexibility and simply enable games with more interactivity than ever.
 
Because it doesn't require 100-1000x more processing power
It does require well over 10x more processing power than whatever we can currently buy for less than $500, or >4x what the 9th-gen consoles can do (which always end up becoming the de facto baseline), which is why adoption is slowing down among AAA studios.
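To put rough numbers on that (all figures illustrative, the sustained throughput is a made-up assumption): even a 1080p/60 target adds up to a big ray budget once you ask for more than the heavily denoised 1-spp look.

```python
# Back-of-the-envelope ray budget for path tracing (all figures illustrative).
width, height, fps = 1920, 1080, 60
bounces = 4                              # path depth per sample (primary + secondary rays)

def grays_per_second(spp):
    """Billions of rays per second needed at the given samples per pixel."""
    return width * height * fps * spp * bounces / 1e9

print(grays_per_second(1))    # ~0.5  Grays/s - heavily denoised, Q2PT-style
print(grays_per_second(16))   # ~8.0  Grays/s - still noisy in many scenes
print(grays_per_second(100))  # ~50   Grays/s - approaching offline-quality sampling
# Compare with a hypothetical ~2 Grays/s sustained on real scenes, and that is
# before shading, BVH updates and everything else the frame still has to do.
```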


it will help with development costs/cycles
Nah.
Maybe on the handful of titles where Nvidia subsidizes the implementation with their own man-months, which is probably what happened with Cyberpunk, Metro and some others. It's still a small sample though.
 