GPU Ray Tracing Performance Comparisons [2021-2022]

I'm pretty sure the goal was adding RT to an old-ass game in the easiest way possible.

Crytek developed their own software RT solution back in 2019, built on DX11. Hardware acceleration was then added through the easiest pathway available at the time: NVIDIA's proprietary Vulkan extension, which shipped years before Khronos or AMD finalized their Vulkan RT support.
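(For reference, the extension in question is VK_NV_ray_tracing, which predates VK_KHR_ray_tracing_pipeline by roughly two years. Purely as an illustration - hypothetical helper, real API calls, not Crytek's actual code - gating a hardware path on it would look something like this:)

```cpp
// Sketch: probe whether the installed driver exposes NVIDIA's pre-Khronos
// RT extension. Only NVIDIA drivers ever advertised it, which is why a
// renderer built on this path is inherently vendor-locked.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool deviceHasNvRayTracing(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, VK_NV_RAY_TRACING_EXTENSION_NAME) == 0)
            return true;  // safe to enable the VK_NV_ray_tracing device path
    return false;         // fall back to the pure software RT path
}
```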
 
So an ISV has to rewrite the entire renderer for remasters of three old games because an IHV can't implement a feature in the driver?
I mean, did it take Nv months of work to implement the feature? I dunno, but they did it - and the results are in the benchmarks.
Could Crytek/Saber have ported the games to D3D12 and made use of DXR? That probably wasn't in their budget, otherwise they would've done it.
In any case, the results are out in the wild, and saying that some IHV can't implement the performance feature because it would require months of work won't make those results any different.

Considering NV is the only vendor with a massive driver stack that also offers many cursed legacy features for the CAD market, they're the only one who could realistically afford to implement this feature in their drivers. Supporting interop between different gfx APIs would add a significant amount of bloat to the driver architecture, which is arguably too complicated for most vendors to attempt ... (hardly any other vendor tolerates the idea behind interop)

There's a very good reason why the industry outright reinvents the graphics pipeline on a constant basis instead of extending it: successive gfx APIs were never designed with compatibility in mind for the graphics pipelines of their predecessors. We see this all the time - each major release of D3D breaks compatibility with its older iterations (9 vs 10/11 vs 12), and the Khronos Group wanted a clean break too (GL vs VK). No game developer should ever have to rely on an arcane concept like interop, because vendors shouldn't be constrained by the idiosyncrasies of legacy design decisions. The industry as a whole loses out if vendors have to implement this feature: drivers become less consistent, less stable, and more expensive to maintain, and old problems get dragged back from the grave where they should never show up again ...

We should be thankful that most developers are sane enough to never consider the idea in the first place, because otherwise driver support longevity and quality would regress badly. It's a big achievement for the industry in itself that D3D12/Vulkan drivers are more usable (compared to past APIs) and that vendors keep working towards this outcome even after support has ended for older product lines. Abusing interop between different gfx APIs would ruin everything the industry has built to make driver development more sustainable and viable for long-term usage. You'd be lucky to even get 2 or 3 years of driver support out of a vendor, and less than a year after that, outdated drivers would be rendered virtually useless in new applications since they'd exhibit a totally different set of bugs from the weird interactions between different gfx APIs ...
 
Considering NV is the only vendor with a massive driver stack that also offers many cursed legacy features for the CAD market, they're the only one who could realistically afford to implement this feature in their drivers. Supporting interop between different gfx APIs would add a significant amount of bloat to the driver architecture, which is arguably too complicated for most vendors to attempt ... (hardly any other vendor tolerates the idea behind interop)

Is this speculation or has Nvidia shared details on how it works and how complicated it is? Maybe it’s just a few hooks to share pointers similar to CUDA and OpenGL/DirectX interop.

There is no standard for how this should be done and Nvidia did it first so it’s perfectly understandable that AMD wouldn’t bother. It’s just 3 old games that won’t change their competitive position.
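For comparison, the CUDA/D3D11 hooks I'm referring to amount to only a handful of application-side calls; a minimal sketch (hypothetical helper names, real API calls, error handling omitted):

```cpp
// Register a D3D11 texture with CUDA once, then map/unmap it around each
// batch of kernel work. The driver handles the cross-API ownership handoff.
#include <d3d11.h>
#include <cuda_runtime.h>
#include <cuda_d3d11_interop.h>

static cudaGraphicsResource* g_cudaRes = nullptr;

void registerOnce(ID3D11Texture2D* tex) {
    cudaGraphicsD3D11RegisterResource(&g_cudaRes, tex,
                                      cudaGraphicsRegisterFlagsNone);
}

void runCudaPass(cudaStream_t stream) {
    cudaGraphicsMapResources(1, &g_cudaRes, stream);      // acquire for CUDA
    cudaArray_t arr = nullptr;
    cudaGraphicsSubResourceGetMappedArray(&arr, g_cudaRes, 0, 0);
    // ... launch kernels that read/write 'arr' through a surface object ...
    cudaGraphicsUnmapResources(1, &g_cudaRes, stream);    // release to D3D11
}
```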
 
Is this speculation or has Nvidia shared details on how it works and how complicated it is? Maybe it’s just a few hooks to share pointers similar to CUDA and OpenGL/DirectX interop.

There is no standard for how this should be done and Nvidia did it first so it’s perfectly understandable that AMD wouldn’t bother. It’s just 3 old games that won’t change their competitive position.

It really isn't as simple as you imagine it to be. There's likely some proprietary NVAPI extension going on, and threading/barriers/image layouts are handled very differently between the two APIs. To top it all off, you now have more potential sources of bugs by using more APIs, so these issues quickly become intractable for other implementations ...
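Even the standardized route - Khronos external memory, which is presumably not what the NVAPI path uses - only covers the resource-sharing half of the problem. A rough sketch of that public path (hypothetical helper, real API; assumes Vulkan 1.1 with VK_KHR_external_memory_win32, a texture created with D3D11_RESOURCE_MISC_SHARED_NTHANDLE, error handling omitted):

```cpp
#include <d3d11_1.h>
#include <dxgi1_2.h>
#include <vulkan/vulkan.h>
#include <vulkan/vulkan_win32.h>

VkDeviceMemory importD3D11Texture(VkDevice dev, ID3D11Texture2D* tex,
                                  VkImage vkImage, VkDeviceSize size,
                                  uint32_t memTypeIndex) {
    // 1) Get a shareable NT handle for the D3D11 texture from DXGI.
    IDXGIResource1* dxgiRes = nullptr;
    tex->QueryInterface(__uuidof(IDXGIResource1),
                        reinterpret_cast<void**>(&dxgiRes));
    HANDLE shared = nullptr;
    dxgiRes->CreateSharedHandle(
        nullptr, DXGI_SHARED_RESOURCE_READ | DXGI_SHARED_RESOURCE_WRITE,
        nullptr, &shared);
    dxgiRes->Release();

    // 2) Import the handle as Vulkan device memory. D3D11 texture imports
    //    are dedicated allocations tied to the VkImage that will alias the
    //    texture (that image needs VkExternalMemoryImageCreateInfo at
    //    creation time, not shown here).
    VkImportMemoryWin32HandleInfoKHR importInfo{};
    importInfo.sType = VK_STRUCTURE_TYPE_IMPORT_MEMORY_WIN32_HANDLE_INFO_KHR;
    importInfo.handleType = VK_EXTERNAL_MEMORY_HANDLE_TYPE_D3D11_TEXTURE_BIT;
    importInfo.handle = shared;

    VkMemoryDedicatedAllocateInfo dedicated{};
    dedicated.sType = VK_STRUCTURE_TYPE_MEMORY_DEDICATED_ALLOCATE_INFO;
    dedicated.pNext = &importInfo;
    dedicated.image = vkImage;

    VkMemoryAllocateInfo alloc{};
    alloc.sType = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO;
    alloc.pNext = &dedicated;
    alloc.allocationSize = size;          // from vkGetImageMemoryRequirements
    alloc.memoryTypeIndex = memTypeIndex; // caller-selected importable type

    VkDeviceMemory mem = VK_NULL_HANDLE;
    vkAllocateMemory(dev, &alloc, nullptr, &mem);
    return mem; // then vkBindImageMemory(dev, vkImage, mem, 0)
}
```

And none of that says anything about keeping the two APIs' queues, barriers, and image layouts coherent, which is where the real driver-side pain lives.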

The remastered trilogy shouldn't be included in benchmarks, because it's straight up not a level playing field between vendors, so there's no fair comparison to be made ...
 
Send Crytek an e-mail and ask them about their project priorities instead of assuming a conspiracy?

I happen to know that the reason the game does not have native next-gen versions is that they would have had to switch to DX12 and the low-level PS5 APIs, which they did not have the budget or scope to do. (Native XSX requires DX12, for example, unlike X1X or X1S, which can do DX11.)
 
I happen to know that the reason the game does not have native next-gen versions is that they would have had to switch to DX12 and the low-level PS5 APIs, which they did not have the budget or scope to do. (Native XSX requires DX12, for example, unlike X1X or X1S, which can do DX11.)
Thanks for the clarification. I'm not sure I follow, though.
They launched the Trilogy Remaster for PC + PS5 + Series S/X. You said the Series X/S version requires DX12, so they had to do the DX11 -> DX12 port anyway, right?
Or is the Series X/S version using BC mode?



Just…wow. I guess providing actual evidence of shenanigans is a lot harder than baseless fear mongering.
Wow, it's like I didn't even provide links in my previous post.
 
Considering NV is the only vendor with a massive driver stack that also offers many cursed legacy features for the CAD market, they're the only one who could realistically afford to implement this feature in their drivers.
I don't see how the latter is connected to the former. It's like saying that since MS is the only vendor with a lot of legacy support in Windows, they're the only one who can keep providing such legacy support going forward.
And that's disregarding the fact that providing Vulkan interop for a D3D11 context isn't exactly "legacy", as neither of these two APIs is deprecated right now.

I think you're seeing a bit more in this than there is to it. It's highly likely that Nv was the only vendor Crytek/Saber approached back when the Crysis Remastered trilogy was being planned, simply because they were the only vendor with RT h/w that could provide some form of acceleration for Crytek's s/w RT solution.
Maybe it was the other way around, and it was Nv who approached Crytek after the Neon Noir demo.
In any case, this is most likely just a result of Nv being first to market with RT h/w, nothing more.

I doubt that said interop is so complex that Intel or AMD don't have the s/w expertise to implement it in their drivers.
They simply don't care because it's not something which will be used outside of that one project which happened to have started back when they didn't have any RT h/w to accelerate it.

It's a big achievement for the industry in itself that D3D12/Vulkan drivers are more usable (compared to past APIs) and that vendors keep working towards this outcome even after support has ended for older product lines.
I'm not sure how D3D12/Vulkan drivers are "more usable" than D3D11 ones. Care to elaborate?
 
Btw I've just remembered another D3D11 game which Saber is supposedly enhancing with RT - The Witcher 3.

But I'll be surprised if they use the interop again, as this version is supposed to have RT on the new consoles, which means they'll pretty much have to port it to D3D12.
 
The Witcher 3 current-gen revision was pushed to Q2 2022, if not later. It's scheduled for after Cyberpunk 2077.
 
I don't see how the latter is connected to the former. It's like saying that since MS is the only vendor with a lot of legacy support in Windows, they're the only one who can keep providing such legacy support going forward.
And that's disregarding the fact that providing Vulkan interop for a D3D11 context isn't exactly "legacy", as neither of these two APIs is deprecated right now.

D3D11 already is deprecated in practice, since virtually all new AAA games are moving on to either D3D12 or Vulkan. Microsoft stopped providing bug fixes for FXC, their old HLSL compiler, and D3D11 isn't compatible with the new DXC compiler either, so it's pretty much the end of the road for it. Once it falls out of wider use, it'll go untested and fall into disrepair, so eventually every developer out there, including indies, will be forced onto the new APIs because they're more stable ...
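To make the compiler split concrete: the legacy FXC path (the D3DCompile API below) tops out at Shader Model 5.x, while SM 6.x targets exist only in DXC and emit DXIL, which D3D11 can't load. A minimal sketch (hypothetical wrapper, real API):

```cpp
// D3D11-era shader compilation goes through FXC / D3DCompile and caps out
// at Shader Model 5.x; asking this path for "ps_6_0" simply fails, because
// SM 6.x lives exclusively in the newer DXC toolchain targeting D3D12.
#include <d3dcompiler.h>  // link against d3dcompiler.lib

HRESULT compileForD3D11(const char* hlslSource, size_t sourceLen,
                        ID3DBlob** outBytecode, ID3DBlob** outErrors) {
    return D3DCompile(hlslSource, sourceLen, "shader.hlsl",
                      nullptr, nullptr,   // no macros, no include handler
                      "main", "ps_5_0",   // SM 5.0: the ceiling for D3D11
                      D3DCOMPILE_OPTIMIZATION_LEVEL3, 0,
                      outBytecode, outErrors);
}
```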

D3D11 isn't even a good fit for some architectures, so it might as well be considered legacy depending on the vendor ...

I doubt that said interop is so complex that Intel or AMD don't have the s/w expertise to implement it in their drivers.
They simply don't care because it's not something which will be used outside of that one project which happened to have started back when they didn't have any RT h/w to accelerate it.

It is too complicated for AMD or Intel, and for the others as well for what it's worth. That's why the industry reinvents the graphics pipeline: so that new hardware won't have to be compatible with old ideas or concepts ...

I'm not sure how D3D12/Vulkan drivers are "more usable" than D3D11 ones. Care to elaborate?

Most drivers are never going to reach perfection, so by keeping driver design simpler you get fewer bugs and less non-ideal behaviour in the API implementations. Even though outdated drivers will never get new features or bug fixes, at least most of the functionality that's already been implemented can be expected to work. On Android, where many devices ship with outdated drivers and never receive updates despite Google introducing an update mechanism, newer APIs like Vulkan will be a massive deal for the foreseeable future. If Google's plan of delivering driver updates to Android devices ultimately fails, the only real backup plan is to hope that as a vendor's Vulkan implementation matures, outdated drivers will eventually become stable enough that targeting them won't be as bad for developers as it was with past APIs ...
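In practice that means apps targeting Android end up pinning behaviour to whatever the device reports instead of assuming current-spec drivers; something like this hypothetical probe (real API calls):

```cpp
// Sketch: read back the device's frozen-in-time driver info so per-device
// workaround lists can be keyed off (vendorID, deviceID, driverVersion).
// Instance/device creation boilerplate omitted.
#include <vulkan/vulkan.h>
#include <cstdio>

void reportDriver(VkPhysicalDevice gpu) {
    VkPhysicalDeviceProperties props{};
    vkGetPhysicalDeviceProperties(gpu, &props);
    std::printf("Vulkan %u.%u.%u, vendor 0x%x, driver build 0x%x\n",
                VK_API_VERSION_MAJOR(props.apiVersion),
                VK_API_VERSION_MINOR(props.apiVersion),
                VK_API_VERSION_PATCH(props.apiVersion),
                props.vendorID, props.driverVersion);
    // On a device that never gets driver updates, these values never change,
    // so any bug you work around here stays worked-around for its lifetime.
}
```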
 