GPU Ray Tracing Performance Comparisons [2021-2022]

8 pixel triangles sure look like overtessellation in some cases. Especially in a game released 10 years ago.
8px triangles in a game released 10 years ago would actually be pretty big considering the most widespread display resolution at the time.
One could argue that 8px triangles aren't optimal for a GCN GPU though, from any point of view.
But that's again down to GPU architectures and not "overtessellation" or some such bs.
 
Transparent reflections are a small part of everything a game renders. Improving them significantly does not necessarily guarantee a large improvement to the net visual experience. No one sane expects RT to fix everything; that's our point. There is an extremely vocal group here who get extremely upset when someone doesn't agree with their "RT is the be-all and end-all" views. You also can't ever really separate out the performance aspect until performance is no longer a concern.

Yeah, just like volumetrics and particles are each individually a small part. By "everything" I assume you're referring to geometry and textures, because those are the most prominent components of a render. Clearly there aren't reflections in every scene. However, when they are present they can be very prominent and completely transform the look of a game. If the bar is that an effect has to be in your face at all times, then very few things will meet that standard. Sunny outdoor scenes with a single light source don't even need GI to look really good.
 
8 pixel triangles sure look like overtessellation in some cases. Especially in a game released 10 years ago.
Why would there be 8-pixel triangles in concrete slabs?

[attached screenshot of the tessellation settings, showing a "min triangle size" option]

The setting says “min” triangle size. I assume that means many triangles are bigger than 8px.
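
For context, screen-space adaptive tessellation schemes typically pick a factor per patch edge from its projected size and clamp it, so a "min triangle size" of 8px is a floor rather than a typical size. A minimal sketch of the idea, with hypothetical names (this is not the game's actual code):

```cpp
#include <algorithm>

// Minimal sketch of screen-space adaptive tessellation (hypothetical names,
// not the game's actual code). The factor for a patch edge is chosen so the
// resulting sub-edges project to roughly the target size in pixels; the
// "min triangle size" setting acts as a floor on that target, capping how
// finely the patch can be subdivided.
float edgeTessFactor(float edgeWorldLength,   // patch edge length in world units
                     float distanceToCamera,  // eye-space distance to the edge
                     float pixelsPerRadian,   // ~ screen height in px / vertical FOV
                     float minTrianglePx)     // e.g. 8.0f for "min triangle size = 8px"
{
    // Small-angle approximation of the edge's projected length in pixels.
    float projectedPx = edgeWorldLength / distanceToCamera * pixelsPerRadian;

    // Subdivide until each sub-edge is about minTrianglePx on screen;
    // a larger minimum means fewer, bigger triangles.
    float factor = projectedPx / minTrianglePx;

    // Hardware tessellation factors are clamped (64 is the D3D11 maximum).
    return std::clamp(factor, 1.0f, 64.0f);
}
```

Since the setting only caps how finely patches get subdivided, most triangles in a frame should indeed come out larger than 8px.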
 
Yeah, just like volumetrics and particles are each individually a small part. By "everything" I assume you're referring to geometry and textures, because those are the most prominent components of a render. Clearly there aren't reflections in every scene. However, when they are present they can be very prominent and completely transform the look of a game. If the bar is that an effect has to be in your face at all times, then very few things will meet that standard. Sunny outdoor scenes with a single light source don't even need GI to look really good.
I don't think ratcheting up particles or volumetrics to an ultra setting transforms a game's visual experience either. It refines the image. Everything includes geometry, textures, lighting, materials, animation, etc. There is no specific bar; it's subjective. My opinion, and that of others, is that in most of today's games the incredibly steep performance cost of RT isn't justified by the visual return. I think shadows and reflections are a suboptimal use of RT given today's visual limitations and the performance level of current tech.
 
I don't think ratcheting up particles or volumetrics to an ultra setting transforms a game's visual experience either. It refines the image. Everything includes geometry, textures, lighting, materials, animation, etc. There is no specific bar; it's subjective. My opinion, and that of others, is that in most of today's games the incredibly steep performance cost of RT isn't justified by the visual return. I think shadows and reflections are a suboptimal use of RT given today's visual limitations and the performance level of current tech.

Fair enough. Fortunately RT can be turned off in nearly every game that uses it so the tech can continue to advance while giving people the option to turn it off where they don’t see the value.
 

As expected, RDNA is one gen behind Nvidia in RT titles. Not great but not bad either. I can enjoy some RT games in their full glory on my 6800XT at 1440p, maybe apart from CP2077, where I need to go down to 1080p for fluid gameplay. My RTX3080 is faster there by a decent margin. Wonder what next gen will bring WRT RT, but I expect substantial jumps from both sides ... oh, and we will have a 3rd player flexing their RT muscle by then too :D
 
This French site does an extensive RT analysis using Turing, Ampere and RDNA2 GPUs. Averaged over 11 RT games (including Godfall and RE8), the 2080Ti roughly matches the 6900XT, and the 3090 is 56% faster than the 6900XT.

https://www.comptoir-hardware.com/a...-test-nvidia-geforce-rtx-3070-ti.html?start=5

What's interesting, however, is how much faster Ampere appears to be than Turing in path-traced games: at 4K the 3070 is ~30% faster than the 2080Ti in Minecraft, and 25% faster in Quake 2 RTX.

https://www.comptoir-hardware.com/a...-test-nvidia-geforce-rtx-3070-ti.html?start=5
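
As an aside on how such figures are read: "X% faster" is the ratio of average frame rates minus one. A quick sketch with made-up fps numbers (not data from the linked review):

```cpp
#include <cstdio>

// How "GPU A is X% faster than GPU B" figures are normally derived from
// average frame rates. The fps values below are made up for illustration;
// they are not numbers from the linked review.
int main()
{
    double fpsA = 78.0;  // hypothetical faster-GPU average fps
    double fpsB = 50.0;  // hypothetical slower-GPU average fps

    double speedupPct = (fpsA / fpsB - 1.0) * 100.0;  // 78/50 -> 56% faster
    std::printf("A is %.0f%% faster than B\n", speedupPct);
    return 0;
}
```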
 
Fair enough. Fortunately RT can be turned off in nearly every game that uses it so the tech can continue to advance while giving people the option to turn it off where they don’t see the value.
Just a question of how long it remains that way. I suspect in the future Raytracing might play a more integral role in engine lighting, replacing techniques like SSR and regular GI as a standard.

At some point, the benefit of decreased development time and cost, as well as the overall improvements to visual fidelity, will outweigh the complaints of people running older hardware. Especially since the PS5, Xbox Series and modern HW-RT capable GPUs like Turing, RDNA2 and Ampere are gaining more and more traction. Plus, you can still support older hardware by running software RT, so nobody gets left behind.

This is the strategy I expect the industry to follow rather soon. In fact, we already have a confirmed title that does exactly this: the next-gen exclusive Avatar game. Assuming the release date remains 2022, that would be next year already, which is sooner than some people might have expected.
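
To make the "software RT fallback" idea concrete, here is a minimal sketch of selecting a ray-traversal backend at startup. All names are hypothetical; this is not code from any particular engine:

```cpp
// Sketch of the strategy described above: one lighting pipeline with the ray
// traversal backend chosen at startup. All names are hypothetical; this is
// not code from any particular engine.
enum class RayBackend { Hardware, Software };

struct DeviceCaps {
    bool supportsHardwareRT;  // e.g. a DXR tier above 0, or the Vulkan RT extensions
};

RayBackend pickRayBackend(const DeviceCaps& caps)
{
    // Turing/RDNA2/Ampere and the new consoles take the hardware path;
    // everything older falls back to software RT (e.g. compute-shader
    // traversal of a simplified scene), so nobody gets left behind.
    return caps.supportsHardwareRT ? RayBackend::Hardware
                                   : RayBackend::Software;
}
```

The lighting pipeline stays the same either way; only the traversal backend differs, which is what would let RT-based lighting become the standard without dropping older hardware.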
 
I think even Turing compares favourably against RDNA2 when RT is used extensively.
I'm not entirely convinced, as future games will keep RDNA2's stronger and weaker points in mind.
We did that investigation some time ago in Quake 2 RT, and there are parts of the RT pipe where AMD is just as good, and others where it lags even behind Turing. That is in a pure RT game, so almost a worst-case scenario, but of course you can build a game that extensively uses effects particularly slow on AMD and show a disproportionate advantage that way.

Anyway, the games I currently care about, like Metro, Quake and Control, all run just fine on a high-end overclocked Navi21, and for CP2077 I can always switch to the 3080 ;)
 
I'm not entirely convinced, as future games will keep RDNA2's stronger and weaker points in mind.
We did that investigation some time ago in Quake 2 RT, and there are parts of the RT pipe where AMD is just as good, and others where it lags even behind Turing. That is in a pure RT game, so almost a worst-case scenario, but of course you can build a game that extensively uses effects particularly slow on AMD and show a disproportionate advantage that way.

Anyway, the games I currently care about, like Metro, Quake and Control, all run just fine on a high-end overclocked Navi21, and for CP2077 I can always switch to the 3080 ;)

Turing is a very competitive GPU arch compared to RDNA2, at least. I would say that, indeed, RDNA2 can match Turing, but when there's a difference it's Turing that generally pulls ahead (in RT perf).
If we're talking high-end RDNA2 like the 6800/XT and upwards, you've got enough power to play with, indeed. Anyway, to be honest, I do think it's worth noting that RDNA3+ will absolutely fare better in ray tracing, as it will be AMD's second iteration, which will obviously see an improved implementation (akin to Intel and NV, perhaps).
 
RDNA3 could pull far ahead of Ampere's ray tracing with its MCM design. It's going to get really interesting! The potential is huge.
 
Turing is a very competitive GPU arch compared to RDNA2, at least. I would say that, indeed, RDNA2 can match Turing, but when there's a difference it's Turing that generally pulls ahead (in RT perf).
If we're talking high-end RDNA2 like the 6800/XT and upwards, you've got enough power to play with, indeed. Anyway, to be honest, I do think it's worth noting that RDNA3+ will absolutely fare better in ray tracing, as it will be AMD's second iteration, which will obviously see an improved implementation (akin to Intel and NV, perhaps).
Meanwhile, in the real world, RDNA2 is giving Ampere, not Turing, a run for its money in everything but RT.
 
Meanwhile, in the real world, RDNA2 is giving Ampere, not Turing, a run for its money in everything but RT.
Actually, in the real world RDNA2 is still lagging behind Ampere in professional applications, and possibly behind Turing too.
 
Just a question of how long it remains that way. I suspect in the future Raytracing might play a more integral role in engine lighting, replacing techniques like SSR and regular GI as a standard.

It would be great if SSR were relegated to "low" settings, with RT reflections taking over at medium or higher, but that's probably not happening until the next console generation.
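
A sketch of what such a settings ladder could look like (tiers and names purely hypothetical):

```cpp
// Hypothetical settings ladder for the idea above: SSR survives only as the
// "Low" option, and ray-traced reflections take over from Medium upwards.
enum class ReflectionQuality { Low, Medium, High };
enum class ReflectionTech { SSR, RayTracedHalfRes, RayTracedFullRes };

ReflectionTech reflectionTechFor(ReflectionQuality q)
{
    switch (q) {
        case ReflectionQuality::Low:    return ReflectionTech::SSR;  // cheap screen-space path
        case ReflectionQuality::Medium: return ReflectionTech::RayTracedHalfRes;
        default:                        return ReflectionTech::RayTracedFullRes;
    }
}
```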

This is the strategy I expect the industry to follow rather soon. In fact, we already have a confirmed title that does exactly this: the next-gen exclusive Avatar game. Assuming the release date remains 2022, that would be next year already, which is sooner than some people might have expected.

Yeah hopefully Avatar pushes the boundaries of what the new consoles can do. It’s not clear how much more rendering juice developers will be able to squeeze out of the new consoles in the next few years.
 