GPU Ray Tracing Performance Comparisons [2021-2022]

Most drivers are never going to reach perfection, so by keeping driver design simpler you'll have fewer bugs and less-than-ideal behaviour in API implementations. Even though outdated drivers won't ever get new features/functionality or bug fixes, at least most of the existing functionality that's already been implemented can be expected to work. On Android, where many devices ship with outdated drivers and never receive updates despite Google introducing a mechanism for it, new APIs like Vulkan will remain a massive deal for the foreseeable future. If Google's plan of delivering driver updates to Android devices ultimately fails, then the only real backup plan left is to hope that, as a vendor's Vulkan implementation matures, there'll eventually come a point where outdated drivers are stable enough that targeting them won't be as bad for developers as it was with past APIs ...
Sorry, but I totally disagree with that in the case of DX11 vs DX12.
DX12 has been the most horrendous API launch ever, with the vast majority of titles performing worse on DX12 than on DX11. This is what happens when you transfer part of the 3D pipeline into the hands of developers cursed by unrealistic deadlines, multi-GB day-one patches and a barely-beta public state when games hit the shelves, instead of leaving it with the experienced IHV gurus who know everything about their hardware and improve their driver over time for all titles. Even today, six years later (sic!), most DX12 titles are still a stutter fest at launch, with so many bugs that any fix is a multi-page novel. Close-to-metal APIs are a nightmare :cry::cry::cry:
 

That's what makes D3D12 so easy to maintain! Some vendors don't have to make a fast driver anymore; all they have to do is make a stable, working driver. Making things run fast isn't the vendor's problem, it's the application developer's problem. No more per-title performance hacks, or at the very least they're discouraged. That, plus a more modern graphics pipeline design that's arguably a better match for modern hardware, makes driver development simpler ...
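The split in responsibility being described can be sketched in plain C++ (hypothetical types and names, not real D3D calls): in a D3D11-style model the driver has to track every resource's last known state and decide on its own when a hazard barrier is needed, while in a D3D12-style model the application records explicit transitions and the driver just replays them.

```cpp
#include <cassert>
#include <unordered_map>
#include <vector>

// Hypothetical resource states, standing in for something like
// D3D12_RESOURCE_STATES. Purely illustrative.
enum class State { RenderTarget, ShaderResource };

struct Barrier { int resource; State before, after; };

// D3D11-style: the driver owns the bookkeeping. On every bind it must
// look up the resource's last known state and emit a barrier if it changed.
class ImplicitDriver {
    std::unordered_map<int, State> tracked_;
public:
    std::vector<Barrier> barriers;
    void bind(int resource, State wanted) {
        auto it = tracked_.find(resource);
        if (it != tracked_.end() && it->second != wanted)
            barriers.push_back({resource, it->second, wanted});
        tracked_[resource] = wanted;
    }
};

// D3D12-style: the application states the transition up front; the driver
// just records it. No per-bind lookups, no hidden heuristics.
class ExplicitDriver {
public:
    std::vector<Barrier> barriers;
    void transition(int resource, State before, State after) {
        barriers.push_back({resource, before, after});
    }
};
```

In the implicit model the tracking cost, and any guess the driver makes about when a transition is safe, is paid inside the driver on every bind; in the explicit model a wrong or missing transition is the application's bug. That's exactly the trade-off being argued over in this thread.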
 
D3D11 is already deprecated, since virtually all new AAA games are moving to either D3D12 or Vulkan. Microsoft stopped providing bug fixes for FXC, their old HLSL compiler, and D3D11 isn't compatible with the new DXC compiler either, so it's pretty much the end of the road for it. Once it falls out of wider use it'll go untested and fall into disrepair, so eventually every developer out there, including indies, will be forced onto the new APIs because they're more stable ...
The fact that there are no new features being added (and hence no updates needed to DXBC) doesn't mean that the API is deprecated. Windows itself is still using it, and there are still more DX11 titles being released now than DX12/VK ones. While this is changing, and will keep changing for high-profile games shipping on consoles, I don't think we're anywhere close to a point at which DX11 could be considered "deprecated".

D3D11 isn't even a good fit for some architectures, so it might as well be considered legacy depending on the vendor ...
That's a vendor issue, not the API one.

It is too complicated for AMD and Intel, and for the others as well, for what it's worth.
That's completely different from saying that it's just too complicated.

Most drivers are never going to reach the status of perfection so by keeping driver design more simple you'll have less bugs or less than ideal behaviour going on with API implementations.
Yeah, well, the problem with that is that D3D12 drivers in practice are hardly "more simple" than D3D11 ones.

Even though outdated drivers won't ever get new features/functionality or bug fixes, at least most of the existing functionality that's already been implemented can be expected to work.
What functionality doesn't work in D3D11 drivers right now? And going forward I see a lot more issues with future D3D12 support than with future D3D11 support, as D3D12 is a lot more app/implementation dependent, meaning it will be a lot harder to provide OS/driver-side support on future platforms once D3D12 itself becomes "deprecated".

there'll eventually come a point where outdated drivers will become stable and it won't be as bad for developers to target outdated Vulkan drivers compared to the past APIs
From what I see of how things are on Android wrt Vulkan support, it seems to be the exact opposite so far...

That's what makes D3D12 so easy to maintain! Some vendors don't have to make a fast driver anymore and all they have to do is make a stable and working driver. Making the driver run fast isn't the vendor's problem and is the application developer's problem. No more introducing performance hacks or at the very least they're discouraged.
So it's a good thing that the application runs badly and it's impossible to fix that on the OS/IHV side? Because why? Because all h/w vendors are equally bad now? I don't get the reasoning from the POV of a regular user.
 
The fact that there are no new features being added (and hence no updates needed to DXBC) doesn't mean that the API is deprecated. Windows itself is still using it, and there are still more DX11 titles being released now than DX12/VK ones. While this is changing, and will keep changing for high-profile games shipping on consoles, I don't think we're anywhere close to a point at which DX11 could be considered "deprecated".

No new updates/features/maintenance = deprecated

The only reason for Microsoft and the vendors to keep it around is compatibility with old applications. They discourage future use of it regardless ...

That's a vendor issue, not the API one.

The many iterations of D3D would seem to disagree. We're on D3D12 now ...

Yeah, well, the problem with that is that D3D12 drivers in practice are hardly "more simple" than D3D11 ones.

If AMDVLK is anything to go by, the number of invasive performance hacks is relatively tame in comparison to D3D11, and there's the fact that HW vendors offer many more driver extensions in D3D11 that don't have a native equivalent in the API itself. D3D12 exposing an explicit mGPU API is certainly pleasant, given that mGPU has been a failure in prior API iterations ...

What functionality doesn't work in D3D11 drivers right now? And going forward I see a lot more issues with future D3D12 support than with future D3D11 support, as D3D12 is a lot more app/implementation dependent, meaning it will be a lot harder to provide OS/driver-side support on future platforms once D3D12 itself becomes "deprecated".

Well, for one thing, the "deferred contexts" feature in D3D11 is straight-up flawed in design, along with the previously mentioned FXC bugs that are never going to be fixed. D3D12 performance is more dependent on the application, but other than that the drivers are much simpler over there ...
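The design problem with D3D11 deferred contexts can be modelled in a few lines of plain C++ (a hypothetical model, not the actual runtime): multiple threads can record command lists in parallel, but the immediate context still has to walk every recorded command serially at submission, which is where a typical driver patches state and resolves hazards — so much of the cost is moved rather than removed.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical model of a recorded D3D11 command list.
struct CommandList { std::vector<int> commands; };

// Recording can happen on any thread (cheap, just appends commands).
CommandList record(int count) {
    CommandList cl;
    for (int i = 0; i < count; ++i) cl.commands.push_back(i);
    return cl;
}

// The "immediate context": replays every recorded command on one thread.
// The serial work here scales with the total command count, no matter
// how many threads did the recording.
std::size_t execute(const std::vector<CommandList>& lists) {
    std::size_t replayed = 0;
    for (const auto& cl : lists) replayed += cl.commands.size();
    return replayed;
}
```

The single serial replay loop is the sketch's stand-in for the driver-side patching step that kept deferred contexts from delivering real multithreaded scaling on most implementations.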

From what I see of how things are on Android wrt Vulkan support, it seems to be the exact opposite so far...

The Android Vulkan drivers are getting better, but you won't be able to observe this on older devices, since their drivers are never updated. On new devices that ship with very recent drivers, the Vulkan implementation is starting to get decent, so if vendors keep it up, outdated drivers eventually won't be as much of a problem ...

So it's a good thing that the application runs badly and it's impossible to fix that on the OS/IHV side? Because why? Because all h/w vendors are equally bad now? I don't get the reasoning from the POV of a regular user.

Well, there's no reason for an end user to really care. It's mostly a benefit that developers and vendors experience ...
 

https://www.computerbase.de/2021-10...t/2/#abschnitt_die_performance_von_raytracing

Marvel's Guardians of the Galaxy, on the other hand, shows how to do it right. Yes, there are tons of RT reflections in this game too, but at least here they fit the picture: alien worlds are allowed to look alien. More importantly, although there are the mirror-like reflections you would expect, they are a rarity. Mostly you see diffuse reflections, because most surfaces aren't mirrors but rather matte. Despite the many RT reflections, RT seems anything but exaggerated in this title.

In short, Marvel's Guardians of the Galaxy offers by far the best integration of ray-traced reflections the editors have seen in a game to date. If you don't use ray tracing, you lose quite a bit of graphical quality in this title despite proper screen-space reflections.
The quality level of the RT reflections is not that important visually. All effects offer a very good result even at the High setting; "Very High" and "Ultra" primarily turn up the RT resolution (a quarter, a third and half of the rendering resolution appear to be behind the levels). In addition, the rays seem to "bounce" a little more often. The latter is only visible on a few objects anyway and is practically never noticeable; the former also becomes a marginal issue thanks to the matte reflections. Conclusion: you don't really need more than RT on "High".

If you have GPU power to spare, the Very High ray-traced reflections are still useful. The Ultra setting, on the other hand, is quite a waste of resources: even with clear reflections, hardly anything changes.

In contrast, the transparent RT reflections, which enable ray tracing on transparent surfaces, are much more important than the higher RT settings. This is an effect that is rarely seen, but when it appears it often makes a huge difference: whether nothing of the surroundings is reflected in a large window, or almost everything, is very noticeable. This option should therefore be active regardless of the RT level.
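The resolution fractions quoted above translate directly into per-frame ray counts. A quick sketch (assuming a 2560x1440 render target and one reflection ray per RT-pass pixel — both assumptions of this example, not figures from the review) shows how the High/Very High/Ultra steps scale:

```cpp
#include <cassert>
#include <cstdint>

// Reflection rays per frame when the RT pass runs at 1/divisor of the
// render resolution (one reflection ray per RT-pass pixel assumed).
std::uint64_t reflectionRays(std::uint64_t width, std::uint64_t height,
                             std::uint64_t divisor) {
    return width * height / divisor;
}
```

At 2560x1440 that works out to 921,600 rays at a quarter resolution, 1,228,800 at a third and 1,843,200 at half — so stepping from the quarter-resolution setting to the half-resolution one doubles the ray budget, which lines up with the "waste of resources" verdict on Ultra above.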
 
So in Guardians of the Galaxy, the 3080 is anywhere between 38% and 51% ahead, depending on whether you use Very High or Ultra RT.

Considering the huge visual gains RT adds to this game, its presence is of the utmost importance.
They preface it by stating there's little reason to use more than High on any GPU.
 
"Conclusion: You don't really need more than RT on “High”"
Skill Up says RT Ultra kills performance disproportionately to the IQ advantage it gives over High and Very High, at least on his 2080 Ti.
 
Conclusion: You don't really need more than RT on “High”
Conclusion: RT adds a lot to this game, and even at its lowest preset the 3080 is 25% faster than the 6800 XT, which is still a considerable margin. You can bet that margin increases if the scene contains extensive reflections.
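A percentage lead like the ones being traded in this thread is just a ratio of average frame rates, which is easy to get wrong depending on which card you pick as the baseline. A minimal sketch (the fps figures below are made up for illustration, not ComputerBase's data):

```cpp
#include <cassert>

// Percentage by which card A leads card B, given each card's average fps.
// Note the baseline: this is (A/B - 1), not (1 - B/A).
double leadPercent(double fpsA, double fpsB) {
    return (fpsA / fpsB - 1.0) * 100.0;
}
```

With hypothetical numbers, leadPercent(60.0, 48.0) gives 25.0 ("A is 25% faster"), while quoting the same data the other way round gives "B is 20% slower" — which is why a margin like "25% faster" only means something once the baseline card is stated.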
 
The art style in Guardians of the Galaxy really isn’t doing it for me. All the colors are blending together and it seems overly busy. Maybe it’s better when you’re actually playing and not watching videos.
 
That Marvel game looks sooo extremely good visually. It's probably one of the best-looking PC games ever released.

I don't know why nobody talks about it; at least the tech is certainly worth discussing.
 