AMD Radeon RDNA2 Navi (RX 6500, 6600, 6700, 6800, 6900 XT)

Current or previous gen?
Current gen will get more DXR-enabled games as time goes on.
Previous gen without DXR will fare much better on RDNA2.
I think it's more likely RT sees minimal use. I suspect shadows will be the most common application.
 

That would be ideal for AMD, as their DXR solution doesn't really perform well (usably) outside of DXR shadows.
But NVIDIA will push for more...and you can, as some games have already implemented, have various DXR fidelity settings.

This kinda reminds me of the tessellation debate a few years ago...here's hoping we don't get driver toggles that reduce image quality, but that games will implement DXR settings instead, so it is up to the end-user.
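That kind of end-user DXR setting is easy to picture: a handful of presets that toggle effects and scale the ray budget. A minimal sketch in Python (the preset names and numbers are made up for illustration, not taken from any shipping game):

```python
from dataclasses import dataclass

@dataclass
class RTSettings:
    """Hypothetical per-tier DXR fidelity settings a game might expose."""
    shadows: bool
    reflections: bool
    gi: bool
    rays_per_pixel: int

# Illustrative presets only -- a real game would tune these per title.
RT_PRESETS = {
    "off":    RTSettings(False, False, False, 0),
    "low":    RTSettings(True,  False, False, 1),  # shadows only (console-like)
    "medium": RTSettings(True,  True,  False, 1),
    "ultra":  RTSettings(True,  True,  True,  2),  # PC headroom
}

def rays_per_frame(preset: str, width: int, height: int) -> int:
    """Rough per-frame ray budget for a preset, ignoring denoising reuse."""
    s = RT_PRESETS[preset]
    active_effects = sum([s.shadows, s.reflections, s.gi])
    return width * height * s.rays_per_pixel * active_effects

print(rays_per_frame("low", 1920, 1080))    # 2073600 (shadows only)
print(rays_per_frame("ultra", 1920, 1080))  # 12441600 (all effects, more rays)
```

The point of a scheme like this is that the same engine path serves everything from a console-class "low" to a high-end-PC "ultra"; only the budget changes.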
 
Yeah, there will be the rare game where Nvidia pushes extra RT effects. Consoles aren't performant enough for devs to devote much to RT, IMO.
 
I expect that it will be a layered/hybrid approach.
So I expect RT to be used on more than just shadows.

So you could benchmark with the settings dialled down to make it a totally apples-to-apples comparison, sure, but I don't think that's fair or useful either.
Unlike HairWorks etc., it's not proprietary, so it's fair to benchmark with it on. So I'm also expecting more work from benchmarkers in the short term, with RT effects on and off.

When is RDNA3 due? I expect RT to be more performant on it.

The thing consoles have done is they've made it necessary to have a wide range of flexibility and settings in this respect, from the XSS to the RTX 3080.
 
I think it's very unlikely there is widespread use of RT in the PC space above and beyond what the console version incorporates. You'll be able to dial up the resolution/ray count with ultra settings, but expecting more is likely setting yourself up for disappointment. Fifteen years of multiplatform games, with only the super-rare Nvidia-funded GameWorks title having extra tech above the console versions. Game development costs are only going up. There's no reason for this trend to change.
 

I hope consoles will not hold back progress for the next 6-7 years.
Cyberpunk 2077 is doing the opposite, though...maximum fidelity on the PC...scaled-down fidelity on consoles...consoles seem "obsolete" at launch this time.

Again, a simple in-game DXR setting would solve the problem of the consoles' poor DXR performance.

I wonder which will be an issue first...10 vs. 16 GB of GPU RAM...or DXR performance.
 
Consoles don't hold back progress, and haven't for ages. Cyberpunk is Nvidia-funded.
 
Not true. The Witcher 3 suffered due to consoles...they even stated so publicly:
CD Projekt tackles The Witcher 3 downgrade issue head on • Eurogamer.net

I know it is "dangerous" to criticize consoles, but they do not add to progress the way their fans would like them to, IMHO.
Quite the opposite, as I see it.

PC-exclusive games exist and look vastly worse than console games. Consoles and their large user base are what allow so much money to be poured into AAA games and the visuals they deliver. The addressable market with a system more powerful than these new consoles will be minuscule for several years. Witcher 3 would never have looked like the early footage even if it had been a PC exclusive; it likely would have looked far worse than what it ended up as. No developer is going to design a game around $700+ GPUs. That ended with Crysis.
 

From what I heard from a dev, last gen was not as powerful as devs expected, and some games were over-specced at the beginning. This is one reason Unreal Engine 4's first demo had SVOGI, and why AC Unity was too ambitious for the Jaguar CPU.

This time, I heard that before getting devkits, the target for some devs was a Vega 64 and a Zen CPU. They are happy with the performance of the new consoles from a CPU, GPU, and RAM point of view. If devs decide PS5 and XSX are the minimum requirement for games, that will leave many people needing to upgrade their PCs.

But this is a good thing; it shows the consoles don't have a very weak component like the Jaguar CPU in the PS4 and XB1.

EDIT:
The console and PC "war" is ridiculous. AAA games sell a lot on consoles, but people who want more power can play on PC.
 

We're already seeing RT used for shadows and reflections in cross-gen console games. There's no reason to think both effects won't be widespread in next-gen exclusives.

Also, once you're already doing all the work to build the RT pipeline and BVH for shadows, the incremental work to cast reflection or GI rays shouldn't be that daunting. I fully expect RT to see widespread use on consoles, and expanded usage on PC hardware, especially as the PC upgrade cycle rolls on.
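To make that concrete, here is a toy Python sketch: a hypothetical brute-force sphere scene stands in for a real BVH, and the point is that shadow and reflection rays reuse exactly the same traversal routine; only the ray generation differs.

```python
import math

# Toy scene: a blocker sphere above a large "ground" sphere (surface at y = -1).
# The brute-force loop stands in for a BVH traversal -- one acceleration
# structure and one trace routine serve every ray type.
SPHERES = [((0.0, 2.0, -5.0), 1.0),       # blocker hovering above the ground
           ((0.0, -101.0, -5.0), 100.0)]  # ground sphere

def trace(origin, direction):
    """Nearest-hit query shared by shadow, reflection, and GI rays."""
    nearest = None
    for center, radius in SPHERES:
        # Ray/sphere intersection: t^2 + b*t + c = 0 (unit-length direction).
        oc = tuple(o - c for o, c in zip(origin, center))
        b = 2.0 * sum(d * e for d, e in zip(direction, oc))
        c = sum(e * e for e in oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            continue
        t = (-b - math.sqrt(disc)) / 2.0  # nearest root
        if t > 1e-4 and (nearest is None or t < nearest):
            nearest = t
    return nearest

# Shadow ray: from a point on the ground, straight up toward an overhead light.
in_shadow = trace((0.0, -1.0, -5.0), (0.0, 1.0, 0.0)) is not None
# Reflection ray: from the blocker's underside, bounced down toward the floor.
reflection_t = trace((0.0, 1.0, -5.0), (0.0, -1.0, 0.0))
print(in_shadow, reflection_t)  # True 2.0
```

In a real DXR engine the equivalent is that the same TLAS/BLAS built for shadow rays is also traversed by reflection and GI rays; the extra cost is mostly in shading the new hit points, not in new infrastructure.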
 

I expect GI to use other methods, like in Unreal Engine 5 and CryEngine (SVOGI) or Demon's Souls (a probe-based froxel approach, equivalent to RTXGI), and to use HW acceleration for non-triangle-based RT. I think most games will use RT shadows or reflections, or no triangle-based RT at all, which is why it's logical that AMD did not dedicate the same die area as Nvidia to triangle RT.

The reason is simple: performance.

EDIT: Reflections will stay in games set in a city, for example.
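For what it's worth, the probe idea is cheap to sketch: irradiance is computed at sparse probe positions (by a few rays or cone traces) and shading just interpolates between them. A 1D toy in Python (real systems interpolate trilinearly over a 3D grid and weight probes by visibility; all values here are made up for illustration):

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by fraction t in [0, 1]."""
    return a + (b - a) * t

def sample_probes_1d(probes, x):
    """Interpolate irradiance between probes spaced 1 unit apart along x.
    Clamps to the grid so shading never extrapolates past the last probe."""
    x = max(0.0, min(x, len(probes) - 1.0))
    i = min(int(x), len(probes) - 2)
    return lerp(probes[i], probes[i + 1], x - i)

# Irradiance updated at probe positions 0..3, e.g. by a handful of sparse
# rays or SVOGI cone traces per frame (values chosen arbitrarily here).
probes = [0.0, 1.0, 4.0, 9.0]
print(sample_probes_1d(probes, 1.5))  # halfway between probes 1 and 2 -> 2.5
```

The appeal is that the expensive ray work scales with the probe count, not with screen resolution, which is why this style of GI suits hardware with modest triangle-RT throughput.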
 

Yeah I’ll believe that when I see it. We’ve been hearing about SVOGI for 8 years now with no uptake. RT has seen much faster adoption in comparison.
 

CryEngine uses SVOGI; this is not news. And SVOGI was not adopted more widely because of the last-generation consoles.
 
A quick scan through that article also highlights that there wouldn't even be a game without a console version.
They also talk about other things that needed to be changed, and not just because of consoles.

It's easy to forget that PC is the ultra-high end, but it's also the ultra-low end and everything in between.
So PCs are just as capable of holding themselves back, especially if a studio needs a large enough market to sell to.
We don't even know what the performance of Intel's RT implementation is yet.

I'm just so happy that the consoles are a lot more balanced this time around, and that it's the graphics we are talking about having to make compromises and scale on.
That's where engines need to be made to scale anyway, from low-performing RT hardware on PC to high.
 