GPU Ray Tracing Performance Comparisons [2021-2022]

What I don't understand is why some PC gamers are celebrating the lack of console+ settings in this game.


You got it wrong.


Some people here are not "celebrating" the lack of console+ settings of this game. Some people here are contesting the use of this fact (lack of settings) as valid evidence of politics, by the ones in here who are trigger happy in their conclusions (if said conclusions side against a very particular IHV).


It's about being fair and correct, to keep the discussion grounded in facts instead of conjecture.
 
The point is that one IHV right now seems to be, if not straight up blocking implementations of advanced features, then content with PC versions of multiplatform games staying at console graphical levels, because this is the way they can showcase their h/w the best: the fewer optimizations and options there are, the better.

The point is that there is no proof of any of what you just accused a certain IHV of doing. You can say it's your opinion based on what you "feel", and that is it.

I completely fail to understand how, when two new games appear with very good RT performance on AMD, all the attention is directed at an RE8 + AMD conspiracy. What about the other game? What conspiracy is there to explain Metro Exodus AMD RT performance?

This is what I want to know.
 
In that particular bench it cuts the performance on AMD GPUs in half. It gets worse depending on the scene. I think the RT in RE Village is certainly better than this.

Better in what way though? Because the RT settings in Tomb Raider scale in terms of implementation and performance hit. So if you're looking for a mild visual improvement for a mild performance hit, you can use the low setting. If you want a more dramatic visual improvement and are happy to sacrifice the performance, then you can choose the highest setting. I still don't see what the problem is with providing the option. Surely PC gaming is all about pushing the limits of the hardware. If you're saying it has a more significant visual impact then I can't really comment without having seen some more on/off comparisons for both games. Certainly the pics posted above for Tomb Raider looked like an obvious and fairly dramatic improvement to me.
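For illustration, a tiered preset ladder might look something like this (a minimal sketch with made-up names and numbers, not Tomb Raider's actual settings); the point is just the cost/quality trade each tier makes:

```cpp
#include <cstdio>

// Purely illustrative preset table: the values are invented, not taken
// from any shipping game. Each tier buys more visual improvement for a
// larger performance hit, which is the scaling choice described above.
struct RtPreset {
    const char* name;
    int   raysPerPixel;     // more rays per pixel -> cleaner, costlier result
    float reflectionScale;  // fraction of full resolution reflections trace at
    bool  shadowRays;       // whether RT shadows are layered on top
};

int main() {
    const RtPreset presets[] = {
        {"Low",    1, 0.25f, false},  // mild uplift, mild hit
        {"Medium", 1, 0.50f, false},
        {"High",   2, 0.50f, true },
        {"Ultra",  4, 1.00f, true },  // dramatic uplift, big hit
    };
    for (const RtPreset& p : presets)
        std::printf("%-6s rays/px=%d reflections=%.0f%% shadows=%s\n",
                    p.name, p.raysPerPixel, p.reflectionScale * 100.0f,
                    p.shadowRays ? "yes" : "no");
    return 0;
}
```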

You got it wrong.

Some people here are not "celebrating" the lack of console+ settings of this game. Some people here are contesting the use of this fact (lack of settings) as valid evidence of politics, by the ones in here who are trigger happy in their conclusions (if said conclusions side against a very particular IHV).

So we all agree that games giving additional scalable graphics settings and quality options, which the user can choose whether or not to activate, is a good thing? Even if those settings perform better on one IHV than the other?
 
Better in what way though? Because the RT settings in Tomb Raider scale in terms of implementation and performance hit. So if you're looking for a mild visual improvement for a mild performance hit, you can use the low setting. If you want a more dramatic visual improvement and are happy to sacrifice the performance, then you can choose the highest setting. I still don't see what the problem is with providing the option. Surely PC gaming is all about pushing the limits of the hardware. If you're saying it has a more significant visual impact then I can't really comment without having seen some more on/off comparisons for both games. Certainly the pics posted above for Tomb Raider looked like an obvious and fairly dramatic improvement to me.



So we all agree that games giving additional scalable graphics settings and quality options, which the user can choose whether or not to activate, is a good thing? Even if those settings perform better on one IHV than the other?

Better visual return on performance and better visual return in general. Tomb Raider's RT below the ultra setting does very little, so it's not really a situation where you can just scale the RT and get a roughly equivalent visual uplift relative to the performance required. And even at the Ultra setting you are paying such a huge price while the difference doesn't come close to being substantial. A few cherry-picked screens do not represent the general uplift. It’s not a matter of AMD being poor at ray tracing compared to Nvidia. Many of us genuinely just haven’t found many RTX implementations worth the performance hit. In most cases they just refine the image while cratering performance. Hopefully we get more Metro Exodus-type implementations.
 
Many of us genuinely just haven’t found many RTX implementations worth the performance hit.

But does that mean you think we'd be better off if the developer hadn't included those implementations in the first place? Or perhaps just enabling the low/medium settings and removing the high/ultra settings as options?
 
So we all agree that games giving additional scalable graphics settings and quality options, which the user can choose whether or not to activate, is a good thing? Even if those settings perform better on one IHV than the other?

Is that the discussion we were having? That was not the discussion we were having.

And look at that, it has the "likes" from the entire NV team of this forum. As if your comment was what was being discussed. It was not.
 
But does that mean you think we'd be better off if the developer hadn't included those implementations in the first place?

It doesn't mean any of what you are talking about. That was not the subject.


The subject, and what I want to know, is how the RE8 RT options came under fire as a justification for AMD's RT performance. Meanwhile nobody talks about Metro, which is also performing great.
 
So we all agree that games giving additional scalable graphics settings and quality options, which the user can choose whether or not to activate, is a good thing? Even if those settings perform better on one IHV than the other?
Surely everybody agrees on this.

I see two questions arising here:

1. Is there a pattern of disappointing RT support in AMD-sponsored titles?
If so, we get new questions from that:
Is AMD HW too weak at HW RT, so they tone the feature down to keep fps up?
Is AMD's software R&D weaker than NV's? Do game developers rely on this at all (considering it's their task, not that of a HW vendor)? Can we expect strong software R&D from any IHV just because NV has it?

On the HW side, their HW is quite behind at the moment, but they can easily support any future API changes. So the HW might age well, which is what's needed for a console generation?
On the SW side I don't request anything from them other than API extensions to expose flexibility. Super-resolution is fine for those who stick their nose right up to a 4K screen; at a comfortable distance this whole thing feels bogus to me.

2. Does RT add a new dimension to the 'lazy console port' debate?
Can it be scaled up to high-end PC easily?

In the case of RE8 I'd say not really, because reflections don't matter much for a dull and dark horror setting like this, and mixed baked/dynamic GI is pointless anyway IMO.
I expect similar situations in many games which implement only a small number of RT effects. But hey, this is still mostly cross-gen time. Not that impressive yet, but it's something.
 
I’m not sure if all of RT can be scaled up. It really depends on how they are leveraging RT to augment the lighting pipeline. If it is as simple as increasing rays or increasing denoising math, perhaps there is a valid reason to talk about scaling. But if they are using RT rays to resolve or augment traditional lighting methods, then adding or removing probes may not really increase fidelity at the cost of significantly more load.
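To make that distinction concrete, here is a toy sketch (all names, formulas, and constants are mine, invented for illustration, not from any real renderer): per-pixel tracing where quality tracks the ray budget, versus a probe-augmented hybrid where extra rays mostly converge the probes faster and the probe grid density caps the achievable fidelity:

```cpp
#include <algorithm>
#include <cstdio>
#include <initializer_list>

// Pure per-pixel tracing: quality scales fairly directly with the budget.
float perPixelQuality(int raysPerPixel) {
    return static_cast<float>(raysPerPixel);
}

// Probe-augmented hybrid: extra rays only help the probes converge
// (less noise and lag); the grid density is the fidelity ceiling.
float probeHybridQuality(int probeGridDensity, int raysPerProbe) {
    float convergence = std::min(1.0f, raysPerProbe / 32.0f);  // toy constant
    return probeGridDensity * convergence;
}

int main() {
    for (int rays : {1, 4, 16, 64})
        std::printf("rays=%2d  per-pixel quality=%4.0f  hybrid quality=%4.1f (grid=8)\n",
                    rays, perPixelQuality(rays), probeHybridQuality(8, rays));
    return 0;
}
```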

I don’t see this issue as much different from seeing some animation systems stuck at 30fps even though a game can render much higher.
 
It doesn't mean any of what you are talking about. That was not the subject.
The IHV issue was *your* subject, but as is the nature of internet threads there was another related discussion that got intertwined when @techuse stated that some people may prefer the cost/benefit ratio of RT in RE:Village to the range of cost/benefit ratios offered by other games.
 
Maybe unrelated, but about RE8: I think Alex said in the last DF video that the PC version has problems with temporal stuff? Maybe it's impacting RT quality too?
 
Better visual return on performance and better visual return in general.
Two of the above shots can be obtained with Medium RT.

Better visual return in general where? The reflections don't look better than SSR, which defeats the purpose of RT reflections in the first place. GI is only noticeable in certain places and is quite laggy.

Once more, these are nothing but low console RT settings disguised as High; there is nothing performant about it.

Is there a pattern of disappointing RT support in AMD-sponsored titles?
Yep there is: Dirt 5, Godfall, and now RE8.
What about the other game? What conspiracy is there to explain Metro Exodus AMD RT performance?
What about Metro? Let me see .. scalability? check, massive visual quality upgrade? check, upscaling tech? check.

And about the performance .. at Ultra the 3090 is 50% faster, so not sure what you are celebrating.
 
The IHV issue was *your* subject, but as is the nature of internet threads there was another related discussion that got intertwined when @techuse stated that some people may prefer the cost/benefit ratio of RT in RE:Village to the range of cost/benefit ratios offered by other games.

Just making sure the unanimous approval of the unrelated subject everybody agrees on (settings should be available) was not used to somehow justify their blaming of RE8 settings on the IHV, because that comment was made in direct response to my attempt to dismantle that theory.
 
What about Metro? Let me see .. scalability? check, massive visual quality upgrade? check, upscaling tech? check.

And about the performance .. at Ultra the 3090 is 50% faster, so not sure what you are celebrating.


So Metro, only being 50% faster on Ampere and with Turing being beaten, in what everybody expected to be a bloodbath for AMD, does not deserve scrutiny to understand the performance.

But RE8, being only 20% faster on Ampere, is an issue deserving of a witch hunt. A witch hunt not on the developer, but on an IHV.

So, no, no celebration here; everybody loses in this discussion. But thanks for the choice of words.
 
But RE8, being only 20% faster on Ampere, is an issue deserving of a witch hunt. A witch hunt not on the developer, but on an IHV.
Lack of scalability and the consistent use of low quality RT is the core issue here, not anything else.
So Metro, only being 50% faster on Ampere and with Turing being beaten,
This was the case in the first Metro, by the way; if anything it has become worse for AMD with the new Metro. The performance difference has expanded in favor of Ampere.
 
Lack of scalability and the consistent use of low quality RT is the core issue here, not anything else.

No, that was not the issue. The issue is how RE8 settings are being blamed on an IHV forcing its politics. Don't spin this.

This was the case in the first Metro, by the way; if anything it has become worse for AMD with the new Metro. The performance difference has expanded in favor of Ampere.

Again, not the issue being discussed.

I want to know why the better-than-expected performance on RE8 is being scrutinized, but not the one on Metro. Is there a cutoff anywhere between 20% and 50% where AMD is off the hook if performance doesn't surpass that line? Because being 50% down on Ampere is a hell of a result and no less deserving of scrutiny than RE8.
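For scale, my own back-of-the-envelope arithmetic (reading "X% faster" as an fps ratio): a 20% Ampere lead leaves AMD at roughly 83% of Ampere's fps, a 50% lead at roughly 67%:

\[
\text{AMD fps as a share of Ampere} = \frac{1}{1+\Delta}, \qquad
\Delta = 0.2 \Rightarrow \frac{1}{1.2} \approx 83\%, \qquad
\Delta = 0.5 \Rightarrow \frac{1}{1.5} \approx 67\%
\]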
 
No, that was not the issue. The issue is how RE8 settings are being blamed on an IHV forcing its politics. Don't spin this.
Yes, the consistent use of low and/or limited RT settings in games backed by that IHV. What is so hard to understand?

I want to know why the better-than-expected performance on RE8 is being scrutinized, but not the one on Metro.
For the damned third time: Metro has scalable quality RT, use of upscaling, and good use of RT acceleration hardware. RE8 has none of that. Just like Godfall, Dirt 5, WoW, etc.

Nobody cares about the performance here except you. And what is better than expected? AMD loses twice as much fps here using console RT settings.
 
No, that was not the issue. The issue is how RE8 settings are being blamed on an IHV forcing its politics.

I think we can drop the "politics" thing. It was just between you and one other poster and the thread moved on a long time ago anyway.

The more interesting and relevant topic IMO is whether we should be content with middling IQ in exchange for a middling performance impact. I recall when we used to celebrate developers like DICE and Crytek who would push hardware to the limit. As long as developers provide scaling options, I don't see why anyone should have anything to complain about. The main issue with RE8 is that there is an obvious wasted opportunity for higher IQ at the cost of performance.
 
The problem here ain't RDNA2 PC GPUs either, as they perform quite OK compared to Turing. It's the consoles that once again get brought up, with their middling IQ, for having extremely limited RT going on.

I doubt 6800 XT users are very disappointed in RE performance when enabling the subtle RT that's there. See, for such a user, they don't net much visual return by enabling RT, but they don't lose much performance either.

It's a game that has limited RT; whether due to the consoles or not doesn't matter. The vast majority of games released so far actually scale very well towards higher-end hardware and aren't held back by the 10 TF bandwidth-limited GPUs found in the PlayStation.
It's more of an outlier than an example.

I'm sure UE5 will scale pretty well, and we'll see what Metro and later CP2077 show us.
 
The problem here ain't RDNA2 PC GPUs either, as they perform quite OK compared to Turing. It's the consoles that once again get brought up, with their middling IQ, for having extremely limited RT going on.

I doubt 6800 XT users are very disappointed in RE performance when enabling the subtle RT that's there. See, for such a user, they don't net much visual return by enabling RT, but they don't lose much performance either.

It's a game that has limited RT; whether due to the consoles or not doesn't matter. The vast majority of games released so far actually scale very well towards higher-end hardware and aren't held back by the 10 TF bandwidth-limited GPUs found in the PlayStation.
It's more of an outlier than an example.

I'm sure UE5 will scale pretty well, and we'll see what Metro and later CP2077 show us.

Wasn't Nvidia saying lately that only something like 10% of their current market share is RT-enabled? A ton of old PC HW is holding things back more than the new consoles. At least consoles allow for some level of RT, SSD, etc. to be used.
 