GPU Ray Tracing Performance Comparisons [2021-2022]

This thread is for discussing RT technology, not for marketing it to others. Ray tracing will only move mainstream once the consoles have multiple AAA releases featuring fully traced lighting and sound. Until then, these titles are forced and niche. Nobody who plays Battlefield has ever cared how realistic the reflections in puddles are; gamers instinctively turned that stuff off because of the performance cost.

If you are a content creator, then RTX is your only choice. Even people with $4k Quadro cards were snapping up 3090s for $3k because they offered better enterprise performance. But if we are strictly speaking about gaming performance, then all RT games will perform horribly unless you have a $1,499* dGPU.



No need to overstate RT's importance to buying decisions when we are all discussing its poor performance and comparing behavior across the different game engines and hardware. Not even the best dGPU is enough to turn RT on in any e-sport.



Gamer Fact: Raster is king. No need for the RT guys to feel inferior... RT's time will come. My $900 RTX 2080 will never have the power to run RT in games. Ever.

Yes, framerate is king in esports/PVP titles, so not only does the RT setting get switched off, but a lot of other IQ-related settings get turned down or off if necessary to achieve the highest possible fps.

However, just because almost everyone loves playing PVP titles and spends a lot of time on them doesn't mean that's all they play, or that RT is somehow irrelevant to their buying decisions. Devs wouldn't spend billions of dollars a year developing SP games if no one was buying them. IQ matters far more in that part of the market, and RT is relevant to plenty of gamers who enjoy that space.

"Once the consoles have multiple AAA releases & feature a fully traced lighting & soundfield, is when ray tracing will move mainstream."

GPU features and performance are a product of evolution, not of revolutionary hardware magically appearing out of nowhere. In reality, to get to fully traced lighting and soundfield games, gamers have to care about and invest in ever-evolving RT hardware and software.
 
However, just because almost everyone loves playing PVP titles and spends a lot of time on them.
Never play these. I have no idea what the fuck he's mumbling about. 15 fps is fine by me if it results in some incredible, unbelievable graphics and doesn't ask me for twitch-input actions. I'll get to 30 from there by adding DLSS or something and will be fine.
 
DegustatoR - you have no idea that multiplayer games are popular - seriously
Oh I know that they are popular.
I also know that single player games are even more popular, especially when you compare the number of releases in both categories.
And I also know that RT in itself doesn't mean you suddenly can't get 500+ fps in some MP title - it all depends on the implementation, and on the fact that most MP gamers are using 1080p displays and are usually more CPU-limited than GPU-limited.
So still, no idea what the hell he's even trying to say.
 
In regards to the article above, the reason why there's no hardware acceleration for AMD is that their drivers do not support D3D11 and Vulkan interop. Interop between gfx APIs is cursed stuff from a driver implementation perspective, so the subpar performance seen on other vendors can mostly be chalked up to a flawed design in the application. The only way to access hardware-accelerated ray tracing on other vendors is for games to use D3D12 or Vulkan in its entirety; otherwise the Crysis Trilogy remains a faulty point of comparison ...
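To make the interop idea concrete: on Windows, cross-API sharing typically means one API exports a GPU allocation as an OS handle (e.g. Vulkan's vkGetMemoryWin32HandleKHR) and the other imports it (e.g. ID3D11Device1::OpenSharedResource1). The sketch below is a hypothetical stdlib-only model of that handle-table bookkeeping a driver has to maintain - the DriverModel class and its method names are illustrative, not any vendor's actual driver code.

```cpp
#include <cstdint>
#include <map>
#include <string>

// Hypothetical model of the shared-allocation table a driver keeps so that
// a resource created by one API can be opened by another via an OS handle.
struct GpuAllocation {
    uint64_t bytes;          // size of the allocation
    std::string owner_api;   // which API created it
};

class DriverModel {
    std::map<int, GpuAllocation> shared_;  // handle -> allocation
    int next_handle_ = 1;
public:
    // Stands in for vkGetMemoryWin32HandleKHR: export device memory as a handle.
    int export_handle(uint64_t bytes) {
        shared_[next_handle_] = {bytes, "vulkan"};
        return next_handle_++;
    }
    // Stands in for ID3D11Device1::OpenSharedResource1: import by handle.
    // Returns nullptr if the handle was never exported.
    const GpuAllocation* import_handle(int h) const {
        auto it = shared_.find(h);
        return it == shared_.end() ? nullptr : &it->second;
    }
};
```

The bookkeeping itself is simple; the hard part the posts above describe is everything around it - synchronization, format and layout compatibility, and residency rules that both API runtimes must agree on.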
 
Interop: Does that mean Crysis is using D3D and Vulkan at the same time?

Yes, and it's a massive challenge from a driver design standpoint to implement this feature. It would take months of work and lots of hacks or workarounds just to get one feature working in a couple of applications, which most vendors wouldn't find worthwhile. It'd be easier for other drivers if games just stuck to using gfx APIs that offer the functionality natively. The remastered trilogy's graphics code could easily be described as "Frankenstein", since it's repulsive stuff that no vendor should ever have to implement ...

The only proper fix is to rewrite the entire renderer so that others can potentially reach feature parity. Relying on gfx interop in the drivers is a bad idea in general ...
 
Yes, and it's a massive challenge from a driver design standpoint to implement this feature. It would take months of work and lots of hacks or workarounds just to get one feature working in a couple of applications, which most vendors wouldn't find worthwhile. It'd be easier for other drivers if games just stuck to using gfx APIs that offer the functionality natively. The remastered trilogy's graphics code could easily be described as "Frankenstein", since it's repulsive stuff that no vendor should ever have to implement ...

The only proper fix is to rewrite the entire renderer so that others can potentially reach feature parity. Relying on gfx interop in the drivers is a bad idea in general ...
So an ISV has to rewrite the entire renderer for remasters of three old games because an IHV can't implement a feature in the driver?
I mean, did it take Nv months of work to implement the feature? I dunno, but they did - and the results are in the benchmarks.
Could Crytek/Saber port the games to D3D12 and make use of DXR? This probably wasn't in their budget; otherwise they would've done it.
In any case, the results are in the wild, and saying that some IHV can't implement the performance feature because it would require months of work won't make the results any different.
 
Question: why would an IHV implement such a feature in the first place? It seems to me like a bizarre feature. Does Nvidia also support a game being rendered with D3D and OpenGL simultaneously?
 
Question: why would an IHV implement such a feature in the first place? It seems to me like a bizarre feature. Does Nvidia also support a game being rendered with D3D and OpenGL simultaneously?
The obvious answer is obvious - to provide better performance in the three games which use the feature.
From a strategic perspective this makes little sense, of course, because it's highly unlikely that there will ever be more than these three games using RT h/w this way.

And I'm not entirely sure that this feature is that difficult to implement tbh.
If a renderer uses two different APIs to render frames, and both APIs are present in the system and supported by the GPU driver, then what else is there to implement?
 
In regards to the article above, the reason why there's no hardware acceleration for AMD is that their drivers do not support D3D11 and Vulkan interop. Interop between gfx APIs is cursed stuff from a driver implementation perspective, so the subpar performance seen on other vendors can mostly be chalked up to a flawed design in the application. The only way to access hardware-accelerated ray tracing on other vendors is for games to use D3D12 or Vulkan in its entirety; otherwise the Crysis Trilogy remains a faulty point of comparison ...
It's not a case of "AMD drivers not supporting" it. It's Crytek using NVIDIA's API, which allows such interoperability.
 
The obvious answer is obvious - to provide better performance in the three games which use the feature.
I'm pretty sure any kind of interop will add some performance cost/overhead due to the additional calls etc.


And I'm not entirely sure that this feature is that difficult to implement tbh.
If a renderer uses two different APIs to render frames and both APIs are present in the system and supported by the GPU driver then what else is there to implement?
If you want to do it the proper way, you use DX12 with DXR, or Vulkan with the corresponding RT extensions :)

For some reason, Crytek kept their "current" DX11 rendering path and implemented/glued on interop with the vendor-locked VK_NV_ray_tracing extension plus some proprietary intrinsics.

I still don't understand why they didn't use the Khronos-standardized, IHV-agnostic VK_KHR_ray_tracing_pipeline extension instead.
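The choice between the two extensions mentioned above usually comes down to a simple preference order at device creation: take the Khronos-standard extension when the driver reports it, and only fall back to the vendor-locked one. The helper below is a hypothetical stdlib-only sketch of that selection logic (pick_rt_extension is an illustrative name; the extension name strings are the real Vulkan identifiers, but the function is not Crytek's actual code).

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical sketch: given the extension names a device reports, prefer the
// IHV-agnostic VK_KHR_ray_tracing_pipeline over NVIDIA's older VK_NV_ray_tracing.
std::string pick_rt_extension(const std::vector<std::string>& supported) {
    auto has = [&](const char* name) {
        return std::find(supported.begin(), supported.end(), name) != supported.end();
    };
    if (has("VK_KHR_ray_tracing_pipeline"))
        return "VK_KHR_ray_tracing_pipeline";  // works on any conforming vendor
    if (has("VK_NV_ray_tracing"))
        return "VK_NV_ray_tracing";            // NVIDIA-only fallback
    return "";                                 // no hardware RT path available
}
```

In a real renderer the `supported` list would come from vkEnumerateDeviceExtensionProperties; building only against VK_NV_ray_tracing, as discussed above, silently leaves other vendors on the empty-string path.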
 
Question: has Nvidia patented this interop functionality?

Most probably not, but it sure makes achieving compatibility with non-Nvidia graphics cards a whole lot more complicated.

Which may or may not have been the goal of using such a setup from the start.
 