AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

@DavidGraham So they're just using the RT hardware to accelerate what are typically screen space shadows? It's kind of curious. I'm guessing that keeps the ray tracing very cache friendly and minimizes the BVH, because you'd only really have to build a BVH for what's in the player's view frustum. I'll check the video out.

Meaning build the BVH after culling geometry? That doesn’t sound right and would result in more than just artifacts. Some shadows would be completely missing.

Godfall uses a very minimalistic approach to RT, where the shadows are ray traced in screen space and not off-screen. The end result is artifacts whenever the object casting a shadow is hidden from view; in short, it's acting like ReShade's RT post-processing.
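
For anyone unfamiliar with the term: a screen space shadow trace typically just marches from a pixel toward the light through the depth buffer and checks whether any on-screen surface blocks the path. The sketch below is a rough C++ illustration of that idea, not Godfall's actual code; the buffer layout, projection and helper names are all invented for the example, but it shows why an occluder that falls outside the screen can never cast a shadow this way.

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

struct DepthBuffer {
    int width, height;
    std::vector<float> viewZ;            // linear view-space depth per pixel
    float at(int x, int y) const { return viewZ[y * width + x]; }
};

struct Projected { int px, py; float z; bool onScreen; };

// Project a view-space point to pixel coordinates plus its view-space depth.
// Assumes a simple pinhole camera looking down +z; focalPx is the focal
// length expressed in pixels.
Projected project(const Vec3& p, const DepthBuffer& db, float focalPx) {
    Projected out{};
    out.z  = p.z;
    out.px = int(db.width  * 0.5f + focalPx * p.x / p.z);
    out.py = int(db.height * 0.5f - focalPx * p.y / p.z);
    out.onScreen = p.z > 0.0f &&
                   out.px >= 0 && out.px < db.width &&
                   out.py >= 0 && out.py < db.height;
    return out;
}

// March from a surface point toward the light, testing the depth buffer at
// each step. Key limitation: an occluder that projects outside the screen can
// never register as a blocker, which is why such shadows pop in and out at
// the screen edges.
bool screenSpaceShadowed(const Vec3& p, const Vec3& toLight,
                         const DepthBuffer& db, float focalPx,
                         int steps = 16, float stepLen = 0.1f,
                         float thickness = 0.05f) {
    for (int i = 1; i <= steps; ++i) {
        Vec3 s{ p.x + toLight.x * stepLen * i,
                p.y + toLight.y * stepLen * i,
                p.z + toLight.z * stepLen * i };
        Projected q = project(s, db, focalPx);
        if (!q.onScreen) return false;           // off-screen: nothing can occlude
        float sceneZ = db.at(q.px, q.py);
        if (sceneZ < q.z && q.z - sceneZ < thickness)
            return true;                         // a closer surface blocks the ray
    }
    return false;
}
```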


My first reaction was whoa WCCF actually produces unique content!

Kidding aside, it's not clear what's happening. Hopefully it's just a driver bug. It's hard to imagine the developers would use an incomplete version of the scene to generate ray-traced shadows; that would defeat the entire point of using RT in the first place.
 
In professional RT apps, AMD is achieving comparatively modest gains that are nowhere near as high as NVIDIA's. On average AMD is seeing about a 2x speedup, whether comparing hardware acceleration on vs. off on a 6800 XT, or comparing the 6800 XT against the 5700 XT. In general the 6800 XT is vastly more powerful than the 5700 XT, yet it achieves only a 2x speedup with the acceleration.

 
The missing shadows are from geometry that isn't occluded and is still on screen, so the screen space theory doesn't hold.

You can see shadows from grass and rocks disappearing as the object pans off the edge of the screen. It does look something like a screen space effect.

Edit: There is a possibility that there are a number of shadowing techniques at play, and the shadows I'm seeing pop in at the edges of the screen are not where they're using RT for shadows. Those could be screen space shadows and the DXR shadows could be elsewhere in the scene.
 
You can see shadows from grass and rocks disappearing as the object pans off the edge of the screen. It does look something like a screen space effect.

Edit: There is a possibility that there are a number of shadowing techniques at play, and the shadows I'm seeing pop in at the edges of the screen are not where they're using RT for shadows. Those could be screen space shadows and the DXR shadows could be elsewhere in the scene.

Classic shadow mapping is done in world space, just from the perspective of the light and not the camera. I’m not sure what screen space shadows even means.
 
Classic shadow mapping is done in world space, just from the perspective of the light and not the camera. I’m not sure what screen space shadows even means.

Well, they look like a screen space effect in that the shadow for an object will not appear until the object appears on screen, and the shadow for an object will disappear if the object moves off screen. Seems to have the same issue as screen space reflections. In terms of how they're doing it, I don't know.
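
To contrast with that screen space behaviour: a classic shadow map is rendered from the light's point of view over world space geometry, so an occluder that is off the camera's screen still ends up in the map and still darkens the pixels it should. Below is a minimal sketch of the lookup side, again with invented types rather than any engine's real code.

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

struct ShadowMap {
    int size;
    std::vector<float> depth;   // closest surface distance as seen from the light
    float at(int x, int y) const { return depth[y * size + x]; }
};

// Transform a world-space point into the light's space. A real renderer would
// use the light's view-projection matrix; here a directional light looking
// down -z is faked with a simple orthographic mapping for brevity.
struct LightSpace { float u, v, z; };   // u,v in [0,1], z = distance from light

LightSpace toLightSpace(const Vec3& worldPos, float sceneRadius) {
    return { worldPos.x / (2.0f * sceneRadius) + 0.5f,
             worldPos.y / (2.0f * sceneRadius) + 0.5f,
             -worldPos.z };
}

// The point is shadowed if something was closer to the light at that texel
// when the map was rendered, regardless of what the camera currently sees.
bool shadowMapped(const Vec3& worldPos, const ShadowMap& sm,
                  float sceneRadius, float bias = 0.005f) {
    LightSpace ls = toLightSpace(worldPos, sceneRadius);
    int x = std::clamp(int(ls.u * sm.size), 0, sm.size - 1);
    int y = std::clamp(int(ls.v * sm.size), 0, sm.size - 1);
    return sm.at(x, y) + bias < ls.z;
}
```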
 
It isn't though, take a look. Those shadows are damned smooth, at least as good as Modern Warfare's before it. All at 60 fps, and often at 1800p or above, on the consoles.

Could the API difference be as huge as that? If you scale by the relative compute power of a 6800 XT versus a PS5, the former should be averaging a hundred fps at 1440p, or more.

It's far more likely that there are settings differences between the two which account for the delta and they may not be restricted to RT. Check out the difference individual settings can make in Godfall for example:

https://www.overclock3d.net/reviews/software/godfall_pc_performance_review_and_optimisation_guide/6

And consoles can and often do scale some settings below the PC's lowest setting (see the DF Watch Dogs face-off), so direct comparisons to PC presets don't necessarily help here either.

While I don't disagree with the comments above about optimisation, we've seen time and time again in face-offs that PC GPUs which are still current from a driver support perspective tend to perform very comparably to consoles for a given spec. There are corner cases of course (in both directions), but I don't expect this to be one of them. Hopefully DF will do a face-off and we'll learn the truth.
 

DXR 1.1 on RDNA2 by @Rys
I started recording this around the time DX9 was finalised because I knew it would take me that long to record a take without my dogs barking. For one of the takes Ted, my shoutiest, was sat on my lap to help him ssshh, but the powers that be said nah because I was wearing the wrong t-shirt. Could have murdered, it was a solid take and Ted was cute.
 
I started recording this around the time DX9 was finalised because I knew it would take me that long to record a take without my dogs barking. For one of the takes Ted, my shoutiest, was sat on my lap to help him ssshh, but the powers that be said nah because I was wearing the wrong t-shirt. Could have murdered, it was a solid take and Ted was cute.
Great video, helped me understand some of the new features. But now, please, more about Ted....
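
For anyone who hasn't watched the video yet, one of the headline DXR 1.1 additions is inline ray tracing, where the shader itself steps a ray query through the acceleration structure rather than going through DXR 1.0's separate raygen/hit/miss shader dispatch. The snippet below is only a conceptual C++ sketch of that caller-driven traversal pattern over a toy BVH; it is not the real HLSL RayQuery API, and the node layout is invented for illustration.

```cpp
#include <algorithm>
#include <array>
#include <utility>
#include <vector>

struct Vec3 {
    float x, y, z;
    float operator[](int i) const { return i == 0 ? x : (i == 1 ? y : z); }
};

struct Aabb { Vec3 lo, hi; };

struct BvhNode {
    Aabb bounds;
    int  left  = -1;       // child node indices, -1 if absent
    int  right = -1;
    int  primitive = -1;   // >= 0 marks a leaf holding one primitive
};

struct Ray { Vec3 origin, dir; float tMax; };

// Standard slab test: does the ray enter this box within [0, tMax]?
bool hitAabb(const Ray& r, const Aabb& b) {
    float t0 = 0.0f, t1 = r.tMax;
    for (int axis = 0; axis < 3; ++axis) {
        float inv   = 1.0f / r.dir[axis];
        float tNear = (b.lo[axis] - r.origin[axis]) * inv;
        float tFar  = (b.hi[axis] - r.origin[axis]) * inv;
        if (inv < 0.0f) std::swap(tNear, tFar);
        t0 = std::max(t0, tNear);
        t1 = std::min(t1, tFar);
        if (t0 > t1) return false;
    }
    return true;
}

// An "any hit" style shadow query: stop as soon as any occluder is found.
// A real implementation would intersect the leaf's actual triangle; the
// caller-driven loop with an early out is the pattern being illustrated.
bool shadowRayOccluded(const std::vector<BvhNode>& nodes, const Ray& ray) {
    std::array<int, 64> stack{};
    int top = 0;
    stack[top++] = 0;                            // start at the root node
    while (top > 0) {
        const BvhNode& node = nodes[stack[--top]];
        if (!hitAabb(ray, node.bounds)) continue;
        if (node.primitive >= 0) return true;    // leaf reached: in shadow
        if (node.left  >= 0 && top < 64) stack[top++] = node.left;
        if (node.right >= 0 && top < 64) stack[top++] = node.right;
    }
    return false;                                // nothing blocks the light
}
```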
 
I think this video summarizes pretty well how RT will evolve with today's cards and future games.


Nah, I think the opposite is the case. He is basically saying you won't be able to use Raytracing on a 3080 in 2 years, even on the lowest settings, which couldn't be more wrong and is so misinformed. Raytracing on medium and low settings will get a lot more optimized in the future thanks to the consoles, the DXR 1.1 API and scalable hybrid techniques like RTXGI, and it will still look amazing. Theoretically, you should be fine on a 2060 Super at Xbox Series X graphics with Raytracing for the whole generation, as the 2060 Super is even a little more capable, as proven by Digital Foundry; it goes without saying that a 3080 is a lot, lot more capable than that. And of course that is without DLSS, so with it even lower-end RTX cards should have a very long lifespan.

This part of the video was just complete nonsense.
 
You are assuming that games won't be making any graphical improvements in other areas (which is the main point of the argument). Time will tell who was right, but I don't see a 2-year-old GPU running a new game maxed out with RT (which we have to assume will have a lot more effects than today) enabled. If devs get better at optimizing RT, they will simply add more effects while maintaining the same performance.

We will see.
 
You are assuming that games won't be making any graphical improvements in other areas (which is the main point of the argument). Time will tell who was right, but I don't see a 2-year-old GPU running a new game maxed out with RT (which we have to assume will have a lot more effects than today) enabled. If devs get better at optimizing RT, they will simply add more effects while maintaining the same performance.

We will see.

I can see the argument that the consoles will last six years and the relative performance of the 6800 and 3080 won't change during those six years. These cards should easily last for six years if you play at console settings, which will include ray tracing. You just can't expect to run at ultra settings for six years.
 
You are assuming that games won't be making any graphical improvements in other areas (which is the main point of the argument). Time will tell who was right, but I don't see a 2-year-old GPU running a new game maxed out with RT (which we have to assume will have a lot more effects than today) enabled. If devs get better at optimizing RT, they will simply add more effects while maintaining the same performance.

We will see.
I'm not assuming games won't get more demanding in the future, but I do think the next generational jump will only partly happen through an increase in raw performance. Software innovations like Sampler Feedback, VRS, Mesh Shaders and AI reconstruction (which get completely ignored by HBU once again) will certainly be a huge part of that, as each one of these features allows for much higher performance and fidelity at the same time. The consoles were clearly built with these in mind, as their performance profiles indicate. We really don't have a 5x GPU performance increase anymore like in past generations.

Do you really think it makes sense to crank up Raytracing on the lowest settings far beyond the consoles? Remember, the consoles will be the baseline in the next generation, and thanks to the DX12U API, code can be easily shared between Xbox and PC, so if it runs great on the consoles it will run much faster on PC GPUs. Yes, devs will certainly use optimizations to further enhance image quality, but that is more likely to be reserved for ultra rather than low. Keep in mind that many more people will own an RTX 3050 than an RTX 3080 next year, that much is certain, and they certainly want some Raytracing goodness too.

But that is just my take, we will see indeed.
 
The number of waves per SIMD is something that can be modified by the architecture, although I don't see a direct link from a change like that to other functions. The number of waves per SIMD doesn't directly inform what each shader does, and could also be a side effect of optimizing for the higher clock range of the architecture.
Yes, I can agree that waves per SIMD alone doesn't show a direct link - that's why I mentioned multiple elements suggesting the Packers are post scan conversion. There are 32 Packers for Navi21, and packing fragments from 1x2 strips up to 2x2 quad sizes (4x32=128) means the maximum fragments in flight matches the 128 ROPs. The 4-triangle rasterisation per clock has not changed from Navi10, but peak fragments in flight has doubled, so why are you expecting the Packers to be pre scan conversion?
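
Spelling out the arithmetic in that argument as a quick sanity check (the Navi10 ROP count below is my own figure for the comparison, not something stated above):

```cpp
// Figures from the post: 32 Packers, each packing anything from a 1x2 strip
// up to a full 2x2 quad (4 fragments). The Navi10 ROP count is an added
// assumption for the comparison.
constexpr int packers            = 32;
constexpr int fragmentsPerPacker = 4;    // one 2x2 quad
constexpr int navi21Rops         = 128;
constexpr int navi10Rops         = 64;   // assumed for the comparison

// Peak packed fragments per clock matches the Navi21 ROP count...
static_assert(packers * fragmentsPerPacker == navi21Rops, "4x32 = 128");
// ...which is double Navi10, even though both rasterise 4 triangles/clock.
static_assert(navi21Rops == 2 * navi10Rops, "fragments in flight doubled");
```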
 
I can see the argument that the consoles will last six years and the relative performance of the 6800 and 3080 won't change during those six years. These cards should easily last for six years if you play at console settings, which will include ray tracing. You just can't expect to run at ultra settings for six years.

The argument is quite simple. WDL gets into the 50s in FPS on a 3080 with DLSS. The "today's cards will get higher FPS with RT in the future as the tech gets optimized" narrative could only be true if games did not progress in any way, which we know won't be the case. Games like WDL and CP2077 will be the baseline for next-gen RT performance.

I'm not assuming games won't get more demanding in the future, but I do think the next generational jump will only partly happen through an increase in raw performance. Software innovations like Sampler Feedback, VRS, Mesh Shaders and AI reconstruction (which get completely ignored by HBU once again) will certainly be a huge part of that, as each one of these features allows for much higher performance and fidelity at the same time. The consoles were clearly built with these in mind, as their performance profiles indicate. We really don't have a 5x GPU performance increase anymore like in past generations.

Do you really think it makes sense to crank up Raytracing on the lowest settings far beyond the consoles? Remember, the consoles will be the baseline in the next generation, and thanks to the DX12U API, code can be easily shared between Xbox and PC, so if it runs great on the consoles it will run much faster on PC GPUs. Yes, devs will certainly use optimizations to further enhance image quality, but that is more likely to be reserved for ultra rather than low. Keep in mind that many more people will own an RTX 3050 than an RTX 3080 next year, that much is certain, and they certainly want some Raytracing goodness too.

But that is just my take, we will see indeed.

You said that consoles will be the baseline and that tech will be developed for them, but you talk about tech that those consoles don't have... All of those technologies are in fact quite impressive, but we don't have them yet. How many games use DLSS 2.0?

I think you're missing the point here. A 3080 will be good in 2 years to play games with rasterization, but RT will get more and more complex, with more and more effects. Even with DLSS you won't have the performance to use most of those effects. So the "future proof" RT won't happen.

Also, one has to ask whether using RT at low settings is actually worth the performance cost. Looking at today's games I think it is, and maybe in the future it still will be... but I doubt it.
 