GPU Ray Tracing Performance Comparisons [2021-2022]

Thanks. Really a strange artifact. I expect it takes about a second, so I'd call this 'dynamic', but not 'realtime'. But I'd assume an exponential moving average would give smooth transitions and no such artifacts.
Maybe they use no moving average to halve memory costs, so when a texel in a probe changes, the image changes abruptly too. :/
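For what it's worth, here's a minimal sketch of the kind of exponential moving average blending I mean (all names and the blend factor are hypothetical, not anything from the actual engine): each newly traced probe texel is lerped into a persistent history buffer, which smooths transitions but requires keeping that history resident, i.e. the extra memory they may have wanted to avoid.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical irradiance probe texel, linear RGB.
struct Texel { float r, g, b; };

// Exponential moving average: blend the freshly traced sample into the
// persistent history. Small alpha = smooth but slow to react; alpha = 1
// means no history at all (the abrupt pop described above).
Texel emaBlend(Texel history, Texel fresh, float alpha)
{
    alpha = std::clamp(alpha, 0.0f, 1.0f);
    return { history.r + alpha * (fresh.r - history.r),
             history.g + alpha * (fresh.g - history.g),
             history.b + alpha * (fresh.b - history.b) };
}

int main()
{
    Texel history{0.2f, 0.2f, 0.2f};  // previous probe state (dim room)
    Texel fresh{1.0f, 0.9f, 0.8f};    // new trace result (light switched on)
    for (int frame = 0; frame < 5; ++frame) {
        history = emaBlend(history, fresh, 0.15f);
        std::printf("frame %d: %.3f %.3f %.3f\n", frame, history.r, history.g, history.b);
    }
    return 0;
}
```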
 
I played the Resident Evil 8 demo, and in the Village part the RTGI is the worst implementation I have seen in games. It is on par with Control's GI.
As I said previously, pretty limited.
I don't understand why there aren't more options on the PC.
We all know why there aren't any higher options on PC: it's a political thing. Even with max Ray Tracing, reflections are of low quality.
It's the same reason why the game lacks DLSS, despite being upscaled to hell and beyond on consoles.
 
We all know why there aren't any higher options on PC: it's a political thing. Even with max Ray Tracing, reflections are of low quality.
It's the same reason why the game lacks DLSS, despite being upscaled to hell and beyond on consoles.
It's funny that you bring up politics in this case, as if this were somehow different from the rest.
Adding heavier RT than the cards can realistically carry at decent resolutions can be seen as politics just as much as this, just like one could argue politics from every single design choice that favors one vendor over another.
As for DLSS, do you know the inner workings of RE Engine well enough to call it politics? It's not plug'n'play or even plug'n'pray regardless of engine.
 
Can you name one AMD partnered game which has DLSS?
And? Can you name a non-NVIDIA-partnered game/engine which has DLSS? No, you can't. NVIDIA is directly involved with every single DLSS game/engine out there. You can choose to view it as politics, but then it's politics both ways.
 
It's funny that you bring up politics in this case, as if this were somehow different from the rest.
Adding heavier RT than the cards can realistically carry at decent resolutions can be seen as politics just as much as this, just like one could argue politics from every single design choice that favors one vendor over another.

So, better image quality is now "politics"?
 
Sure I can. F1 2020 isn't partnered with NV on PC. Marvel's Avengers was partnered with Intel, which didn't stop them from adding DLSS. Now let's get back to my question and you answering it.
Oh, you're right. It's possible NVIDIA wasn't directly involved with them, which could also explain why F1 is such a blurfest with it and Avengers has strange quality issues too.
So, those two exceptions where NVIDIA might not have been involved aside, NVIDIA is behind every other incarnation of it.
My point being you can't just cherry-pick; the road goes both ways. Not implementing it can be seen as politics just like implementing it could, but then an awful lot of things suddenly become politics on the same basis.

Also, no-one answered whether the engine is even capable of supporting DLSS without major reworking of the internals.

So, better image quality is now "politics"?
No-one said better IQ is politics. We have plenty of dirty history of features being "abused" to harm the competition.
 
Also, no-one answered whether the engine is even capable of supporting DLSS without major reworking of the internals.
My guess is that anything using TAA can 'easily' replace it with DLSS, assuming it also uses sub-pixel grid jitter for temporal reconstruction at a higher resolution, which was mentioned to be the case with DLSS 2.0 IIRC.
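To illustrate the sub-pixel jitter part (a generic sketch of the Halton(2,3) pattern many TAA implementations use; nothing here is specific to RE Engine or DLSS itself): the camera's projection is nudged by a different sub-pixel offset each frame, so the temporal history accumulates distinct sample positions that a reconstruction pass (TAA or DLSS) can resolve to a higher effective resolution.

```cpp
#include <cstdio>

// Radical inverse in a given base: the standard Halton low-discrepancy sequence.
static float radicalInverse(int index, int base)
{
    float result = 0.0f;
    float invBase = 1.0f / base;
    float fraction = invBase;
    while (index > 0) {
        result += (index % base) * fraction;
        index /= base;
        fraction *= invBase;
    }
    return result;
}

int main()
{
    const int width = 1920, height = 1080;
    // Typical TAA-style setup: a short repeating sequence of sub-pixel offsets
    // in [-0.5, 0.5), applied as a translation of the projection matrix each
    // frame so every pixel sees slightly different sample positions over time.
    for (int frame = 0; frame < 8; ++frame) {
        float jx = radicalInverse(frame + 1, 2) - 0.5f;
        float jy = radicalInverse(frame + 1, 3) - 0.5f;
        // Convert pixel-space jitter to clip-space offsets.
        float clipX = 2.0f * jx / width;
        float clipY = 2.0f * jy / height;
        std::printf("frame %d: jitter (%.3f, %.3f) px, clip offset (%.6f, %.6f)\n",
                    frame, jx, jy, clipX, clipY);
    }
    return 0;
}
```

If an engine already feeds that jitter plus motion vectors into its TAA resolve, the argument is that swapping the resolve step for DLSS is mostly plumbing rather than a rewrite; whether that holds for RE Engine specifically is exactly the open question.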

However, I don't think it's fair to assume anybody basically has to adopt every available proprietary feature. To justify that expectation, NV would need to open source DLSS.
As is, NV clearly has to push adoption of DLSS for every title.
I wouldn't be surprised if it comes with a later patch, if the AMD partnership allows this and NV wants it to happen.
 
There are a lot of Nvidia partnered games which use FidelityFX as they see fit. What "road" are you talking about?
The road where partnerships with different companies affect features being or not being implemented in a game. The exact thing that could be seen as politics if one so chooses.
FidelityFX is open source, freely moddable and runs on everything. DLSS is vendor-specific and closed, supposedly to the point of being a "black box" for most devs (which would suggest NVIDIA involvement even in F1 and Avengers). Not sure why you're trying to pull them into the same conversation.
 
Adding heavier RT than the cards can realistically carry at decent resolutions can be seen as politics just as much as this, just like one could argue politics from every single design choice that favors one vendor over another.
Realistic as in no better than screen space reflections? Realistic as in reflections being shitty and pixelated as hell? As in shadows being screen space or not even noticeable at all? Is that what you call realistic these days? Wasn't AMD the one calling Turing out for having weak RT performance, and for adding no worthy image quality improvements?

Now they have weaker performance than even Turing and have relegated their entire RT support (and RDNA2 optimizations) for games to microscopic RT additions that are hardly noticeable! We even have something of an outlier here in RE8, where the RT implementation leans heavily on the CPU and thus has a fixed cost whose relative impact shrinks as resolution goes up! All in an effort to bypass their weaker RT hardware.
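Just to make the "fixed cost" point concrete with purely made-up numbers (nothing measured from RE8): if part of the RT work is resolution-independent, its share of the total frame time shrinks as the per-pixel GPU work grows with resolution.

```cpp
#include <cstdio>

int main()
{
    // Hypothetical per-frame costs in milliseconds, purely illustrative.
    const double fixedRtMs = 2.0;      // resolution-independent (e.g. CPU/BVH-side) RT work
    const double gpuMsAt1080p = 8.0;   // per-pixel shading cost at 1920x1080

    const struct { const char* name; double pixelScale; } resolutions[] = {
        {"1080p", 1.0}, {"1440p", 1.78}, {"2160p", 4.0},
    };

    for (const auto& r : resolutions) {
        double gpuMs = gpuMsAt1080p * r.pixelScale;  // scales with pixel count
        double totalMs = gpuMs + fixedRtMs;
        std::printf("%s: total %.1f ms, fixed RT share %.0f%%\n",
                    r.name, totalMs, 100.0 * fixedRtMs / totalMs);
    }
    return 0;
}
```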
the road goes both ways
NVIDIA-sponsored games have no trouble implementing FidelityFX features (even Cyberpunk has them), but when it comes to AMD titles, they are barred from having DLSS even when they rely on Unreal Engine.
And? Can you name a non-NVIDIA-partnered game/engine which has DLSS? No, you can't. NVIDIA is directly involved with every single DLSS game/engine out there.
Nope, there are several indie games that implemented DLSS without NVIDIA's involvement (through the UE integration).
 
The road where partnerships with different companies affect features being or not being implemented in a game. The exact thing that could be seen as politics if one so chooses.
FidelityFX is open source, freely moddable and runs on everything. DLSS is vendor-specific and closed, supposedly to the point of being a "black box" for most devs (which would suggest NVIDIA involvement even in F1 and Avengers). Not sure why you're trying to pull them into the same conversation.
It’s the same three users in every thread doing all sorts of mental gymnastics to pump up Nvidia/DLSS/RTX. You’re never going to convince them to see any sort of reason, or analyse the state of the industry in any objective manner.

As for REVIII: It’s nice to see a game with RT which doesn’t absolutely tank performance. Hopefully it’ll be the first of many, especially since high end GPUs are rarer than hens’ teeth.
 
As for REVIII: It’s nice to see a game with RT which doesn’t absolutely tank performance. Hopefully it’ll be the first of many, especially since high end GPUs are rarer than hens’ teeth.
I don't see how any neutral observer could realistically object to having more options for customising ray tracing performance by allowing higher ray counts that reduce visual artifacts. No one is forcing you to use the higher settings if you have a less capable card.
 
One game where ray tracing is kept to a minimum, and it still performs below what an equivalent NV GPU does, and some people are all over the map. Just calm down a bit.
 
We all know why there aren't any higher options on PC: it's a political thing. Even with max Ray Tracing, reflections are of low quality.
It's the same reason why the game lacks DLSS, despite being upscaled to hell and beyond on consoles.
Ehhh... perhaps a touch unfair here.

You're going to see all games tread this path going forward, as the main audience for development will be consoles, and they do have parity clauses in there. At the end of the day, nvidia will perform better, but ultimately the most important thing is that these enhanced RT hybrid rendering paths run on the lowest of systems, like the Series S. And for those consumers who are more than willing to pay the premium, or have the luck to get an nvidia card, more power to them. If running games at 60-120fps with RT on is exactly the type of thing players are looking for, then Nvidia offers that.

But for everyone else, if you're lucky enough to get a device, whether GPU or console, that can even run it, it is sufficient. I don't think there is a need at this point in time to crank the RT profile all the way up to requiring Nvidia hardware to run it. Once again, whether it's CBR, TAA, DLSS, or using RT to augment existing technologies, the goal here isn't to stress hardware needlessly. We should be applauding developers who can get away with a particular look and feel without bombing your system. And to that point, we should be hoping more of these games can run on lower and lower specs instead of higher and higher specs. Your investment lasts longer, and the number of players that can experience it increases.

Sure, it may be inferior to a different algorithm that requires more muscle to run smoothly, but we're already going to be head and shoulders beyond where we used to be. I think Metro proves that, and it will run on consoles (not with absolutely everything on), but enough to make a generational difference in graphics.

With DLSS, that one is a tougher subject to crack. Not all engines may have an easy time supporting it quite yet, and perhaps it's just a matter of time before that happens. It really depends on where the developers want to put their resources, and a company like Capcom has its roots in console development, so it's understandable that all their solutions are developed for that space. When we talk about low to ultra settings, those are all really just whatever the developer sees fit. It's not like Ultra is a universal measure of graphics prowess from one title to the next. Console games are designed specifically for the hardware, so they may not think a lot about this ultra-settings concept.

And there could be technologies we don't know about still, and some developers may be opting to curate their engines for technologies that are vendor agnostic (and therefore more widespread), and are just waiting for the time when those features are announced so that they can enable them. It may be inferior to nvidia's solution (honestly, with respect to my profession, I think there is a high possibility that DLSS models will not be surpassed by another vendor), but that does not mean there can't exist a compromise solution that provides fairly good upscaling while still holding a decent performance advantage over TAA or native. Nor does that imply that the Tensor Cores will never be leveraged just because there isn't an nvidia model in there. Quite the contrary with DirectML.

I get that nvidia owners are peeved that games are going to be designed for the lowest common denominator, and not really take advantage of the power on tap in nvidia hardware; but at the same time, that should really draw out the longevity of the hardware. This sort of paradigm is happening in the console space as well; it seems to me that if you design a game for PS5 as your base, XSX more or less falls into parity with PS5. It can't really make enough headway to make a worthwhile difference. But if you design a game for XSX, then perhaps that gap can be expanded further. Most developers will choose parity in this case, perhaps rightly so, considering that's often the result they want and it's likely cheaper labour-wise.

Just something that XSX owners will need to accept: a large number of titles will likely end in parity. And that's something nvidia owners will need to accept as well: a lot of the RT algorithm development will be based around adhering to the weakest system and ensuring that it will hit 30-60fps. And that's okay, honestly, because no one wants to rebuy their 3080/3090 more often than every 7-8 years. These GPUs aren't cheap anymore, so I don't really look forward to anyone having to rebuy hardware that quickly.
 
Realistic as in no better than screen space reflections? Realistic as in reflections being shitty and pixelated as hell? As in shadows being screen space or not even noticeable at all? Is that what you call realistic these days? Wasn't AMD the one calling Turing out for having weak RT performance, and for adding no worthy image quality improvements?
No-one said anything about what kind of graphics are or are not realistic.
Now they have weaker performance than even Turing and have relegated their entire RT support (and RDNA2 optimizations) for games to microscopic RT additions that are hardly noticeable! We even have something of an outlier here in RE8, where the RT implementation leans heavily on the CPU and thus has a fixed cost whose relative impact shrinks as resolution goes up! All in an effort to bypass their weaker RT hardware.
:rolleyes:
NVIDIA-sponsored games have no trouble implementing FidelityFX features (even Cyberpunk has them), but when it comes to AMD titles, they are barred from having DLSS even when they rely on Unreal Engine.
FidelityFX is free, open source, moddable code which runs on anything and which you can optimize to hell and back for anything. It doesn't belong in the same conversation as DLSS.
Nope, there are several indie games that implemented DLSS without NVIDIA's involvement (through the UE integration).
Which is why I said direct involvement with the game/engine. NVIDIA was very much involved in implementing DLSS in Unreal Engine.
 
But for everyone else, if you're lucky enough to get a device, whether GPU or console, that can even run it, it is sufficient.

And that's where scaling comes in; already last generation we had games scaling all the way from high-end hardware down to the Switch while still looking the best of the generation. Especially now with the 4TF XSS, scaling becomes even more important. Most titles so far do scale pretty well; RE is more of an outlier, so we can't draw conclusions from it, I think.

I get that nvidia owners are peeved that games are going to be designed for the lowest common denominator

Doubt that, regarding NV GPU owners. Just about any RT game released so far delivers quite a lot of extra performance on their hardware. Consoles aren't holding back hardware so much anymore, thanks to scaling these days being in a much better state than in previous generations.
Let's not forget that the RDNA2 GPUs on PC are also much more capable than what's found in consoles.

Not all games will scale equally well, RE being an example, but even in that case there's more performance to be found on the PC side of things.
 