GPU Ray Tracing Performance Comparisons [2021-2022]

At 1440p the 4090 is twice as fast as the 7900XTX, and the 4080 is 46% faster.
Witcher 3, max RT settings, 1440p native:

7900XTX: 30 fps
3090Ti: 33 fps
4080: 44 fps
4090: 60 fps
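
For reference, a quick arithmetic check of those ratios against the fps table (plain Python; the only inputs are the four figures quoted above):

```python
# Relative performance from the fps figures above, 7900XTX as baseline.
fps = {"7900XTX": 30, "3090Ti": 33, "4080": 44, "4090": 60}

baseline = fps["7900XTX"]
for gpu, value in fps.items():
    speedup = value / baseline
    print(f"{gpu}: {speedup:.2f}x the 7900XTX ({(speedup - 1) * 100:+.0f}%)")
# The 4090 comes out exactly 2.00x; the 4080 is ~1.47x, i.e. ~47% faster,
# which matches the quoted 46% within rounding of the fps figures.
```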

 
Has anyone done a breakdown of the performance impact of each RT effect separately?

I'm sure that's coming from Alex. What I want to know is if the RT performs reasonably at lower settings (e.g. the console equivalents).

I have no problem with settings being available that crucify performance if they bring a big visual return, provided it's still possible to get a great presentation at lower settings on much more mainstream devices. And the nominal (but not only) measure of that for me would be whether the console-level settings perform similarly to the consoles on similar hardware. As long as that holds, I want the devs to scale up meaningful settings as far as they can, even if that does require a 4090 to run them well.

For me that's the whole point of PC gaming. If you could max everything out on the 3060 then that'd be boring.
 
Problem is that in scenes which are very lightly threaded, your GPU isn't doing a lot and performance still tanks.

But yeah, I hope DF can take the time to test every scenario.
 
^^ There are no lower settings. It's either on or off for ray tracing. The GI is the most expensive effect; the others are much cheaper. It's also the most visually impressive, depending on the area of course. After GI, shadows make the largest visual impact. AO has a near-zero performance hit from what I saw. I would say that outside of cities, in nature, GI and shadows make the most visual impact. In the cities AO might be more noticeable, but I didn't bother to check it individually.

The performance in Novigrad and other cities is atrocious.



[Comparison screenshots: AO vs. no AO, and no shadows]


I would not replay this game unless you have a 4090; it just runs too badly.
 
Akin to the Crysis days :)
 
RTXGI in Witcher and Warhammer doesn't scale at all, which is a big problem. And in W3 you cannot turn reflections on without RTXGI either.

Something needs to be done about RTXGI; it's that bad. It sure does look better, but a frame drop from 154 fps to 65 fps is unheard of for any single graphical feature. And let's keep in mind there are other RT GI solutions that are far more efficient than this, like Lumen and the one in Metro Exodus.

This is bad programming from Nvidia, full stop. Probably intentional, to sell their Ada GPUs, since you become dependent on frame generation. If you look at RTXGI implementations from before Ada released, it performs far, far, FAR better in those. This hurts PC gaming as a whole. This could get as bad as the stutter struggle if we do not act now.

Someone needs to call them out for this. I wish it would be @Dictator
 
1.) If you have to enable all ray tracing features at once on The Witcher 3, how do you know what the performance hit of RTXGI alone is?
2.) How do you know that the performance hit is on the GPU rather than CPU side?
 
There's no need for evil Nvidia conspiracy theories. The lack of RTXGI quality options may simply be due to the fact that this is a free update. A free update to a 7-year-old game isn't going to "hurt PC gaming".

It would be nice to understand why it's so heavy, though. Did CDPR amp up probe density, ray count, or update frequency for some reason, or is the latest version of the RTXGI SDK heavier than what 4A used in Exodus?
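
A rough way to frame that question: a probe-based GI solution's ray cost scales with how many probes are updated per frame and how many rays each probe traces. A back-of-envelope sketch, with purely illustrative numbers (not CDPR's or 4A's actual settings), shows how quickly those knobs multiply the ray budget:

```python
# Back-of-envelope DDGI-style ray budget. The probe counts and rays-per-probe
# values below are illustrative assumptions, NOT The Witcher 3's real settings.
def rays_per_frame(probes_updated: int, rays_per_probe: int) -> int:
    # Total probe-update rays traced per frame (ignores shading/denoising cost).
    return probes_updated * rays_per_probe

modest = rays_per_frame(probes_updated=4_096, rays_per_probe=144)
heavy = rays_per_frame(probes_updated=16_384, rays_per_probe=288)

print(f"modest config: {modest / 1e6:.2f}M rays/frame")  # 0.59M
print(f"heavy config:  {heavy / 1e6:.2f}M rays/frame")   # 4.72M
# An 8x jump in ray budget from denser probes plus more rays per probe would
# plausibly account for a much larger hit than earlier RTXGI titles showed.
```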
 
There are way too many variables in play to be banging the conspiracy drum about this. It's an old engine that's effectively deprecated/EOL'd at this point, it's a remaster that was being handled by Saber Interactive at one point, it's a bolt-on global RT feature that has to interact with assets and LODs that were never intended for this, it's a multi-platform (i.e. console) release just before the holidays, etc. Really, everything about it makes it unlikely to be a well-engineered, tightly optimized PC release.
 
1.) If you have to enable all ray tracing features at once on The Witcher 3, how do you know what the performance hit of RTXGI alone is?
2.) How do you know that the performance hit is on the GPU rather than CPU side?
1. That is not correct. You just have to enable RTXGI to use any of the other features; you can run with RTXGI alone and leave the others disabled.
2. Because Phantom's screenshots show 100% GPU usage when RT is enabled, so this is a GPU bottleneck. Without RT the game is CPU-limited at around 160 fps. I can confirm that as well.
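
That's the standard diagnosis: pegged GPU utilization plus fps that scales with GPU load points at a GPU bottleneck, while fps that barely moves with resolution points at the CPU. A minimal sketch of that heuristic (the thresholds are my own assumptions, not from the thread):

```python
# Rough bottleneck classifier from monitoring data. The utilization and
# scaling thresholds here are illustrative assumptions, not hard rules.
def classify_bottleneck(gpu_util: float, fps_low_res: float, fps_high_res: float) -> str:
    # GPU pegged and fps drops meaningfully at higher resolution -> GPU-bound.
    if gpu_util >= 97 and fps_high_res < 0.9 * fps_low_res:
        return "GPU-bound"
    # fps barely moves with resolution -> the CPU is the ceiling.
    if abs(fps_low_res - fps_high_res) / fps_low_res < 0.05:
        return "CPU-bound"
    return "mixed / inconclusive"

print(classify_bottleneck(gpu_util=100, fps_low_res=90, fps_high_res=60))   # GPU-bound (RT on)
print(classify_bottleneck(gpu_util=70, fps_low_res=162, fps_high_res=158))  # CPU-bound (RT off)
```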
There's no need for evil Nvidia conspiracy theories. The lack of RTXGI quality options may simply be due to the fact that this is a free update. A free update to a 7-year-old game isn't going to "hurt PC gaming".

It would be nice to understand why it's so heavy, though. Did CDPR amp up probe density, ray count, or update frequency for some reason, or is the latest version of the RTXGI SDK heavier than what 4A used in Exodus?

Sure, Witcher 3 alone would not be enough evidence to get the Nvidia conspiracy train going. But the thing is, in Warhammer, which is a new release, RTXGI behaves EXACTLY the same: it completely destroys performance beyond repair, even on its low setting.

BTW, the ray tracing performance is apparently working as intended according to CDPR, because CDPR is NOT investigating the RT performance issues on Nvidia GPUs at all.


There are way too many variables in play to be banging the conspiracy drum about this. It's an old engine that's effectively deprecated/EOL'd at this point, it's a remaster that was being handled by Saber Interactive at one point, it's a bolt-on global RT feature that has to interact with assets and LODs that were never intended for this, it's a multi-platform (i.e. console) release just before the holidays, etc. Really, everything about it makes it unlikely to be a well-engineered, tightly optimized PC release.
As I've said above, RTXGI behaves exactly the same in Warhammer.
 
So if you underclock the CPU, performance doesn't change?
 
Warhammer got an update today. It runs a little better. The performance drop with ray tracing is in the normal range for GI and reflections: around 40%.
Is there any noticeable change to the visuals? I wonder if they just reduced the RT quality (ray count/resolution) as opposed to making what they had run better. Any idea if the frame drops improved? I was originally using RT on low for GI and Reflections, and it managed to stay around 60 fps with small hordes, but once a big one came the framerate dropped heavily and it was basically unplayable.
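
One footnote on comparing these percentage drops: the added frame-time cost in milliseconds is a fairer measure of a feature's expense than a raw fps percentage, since the same cost eats a bigger percentage at higher base framerates. A quick conversion (the 154 -> 65 figure is from earlier in this thread; the 100 -> 60 pair is just an illustrative 40% drop):

```python
# Convert an fps before/after pair into the added frame-time cost in ms.
def added_cost_ms(fps_before: float, fps_after: float) -> float:
    return 1000 / fps_after - 1000 / fps_before

print(f"154 -> 65 fps: +{added_cost_ms(154, 65):.1f} ms")  # ~+8.9 ms (the W3 RTXGI drop above)
print(f"100 -> 60 fps: +{added_cost_ms(100, 60):.1f} ms")  # ~+6.7 ms (a 'normal' 40% drop)
```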
 