NVIDIA to create the RTX API with Microsoft

It seems NVIDIA has a new GameWorks library for Volta GPUs. The library is exclusive to NVIDIA and handles ray tracing of certain effects such as AO, reflections, and area shadows. It was co-developed with Microsoft to integrate with the DirectX Raytracing API, and is being adopted by these engines and developers: Unreal, Unity, Frostbite, 4A (Metro Exodus), and Remedy (Northlight engine?).

[Image: NVIDIA RTX Technology]


https://videocardz.com/newz/nvidia-to-announce-rtx-technology
https://wccftech.com/nvidia-gameworks-ray-tracing-api-microsoft/
 
That level of co-development may have implications going forward. Microsoft wouldn't go in on this if it were to remain niche.

Does this have similar implications for Volta derivatives in the market? If this was for DX-based games and engines, it would be really niche if it only covered the Volta products we know of now.
 
Does this have similar implications for Volta derivatives in the market? If this was for DX-based games and engines, it would be really niche if it only covered the Volta products we know of now.
It seems to imply that next gen is going to be Volta, not Ampere or Turing.
Also, the wording doesn't make RTX exclusive to Volta; it just states that it's heavily optimized for it.
 
Likely limited to Volta for 12 months after release of the gaming line then magically optimized for Pascal after that.
 
That level of co-development may have implications going forward. Microsoft wouldn't go in on this if it were to remain niche.

Does this have similar implications for Volta derivatives in the market? If this was for DX-based games and engines, it would be really niche if it only covered the Volta products we know of now.
IIRC, many DX12.1+ features were only available on Nvidia GPUs for quite some time, until Vega? MS's relationship with Nvidia made a lot of those custom features part of the DX12 feature set.
 
IIRC, many DX12.1+ features were only available on Nvidia GPUs for quite some time, until Vega? MS's relationship with Nvidia made a lot of those custom features part of the DX12 feature set.

None of that stuff was custom. Intel had some of those features much earlier, and all DX12.1 features were already in Skylake.

A DirectX Raytracing API means it's vendor agnostic, which sounds good. Probably all vendors are involved, but maybe parts are taken from NV's own efforts, like DX12 took parts from Mantle.
 
A DirectX Raytracing API means it's vendor agnostic, which sounds good. Probably all vendors are involved, but maybe parts are taken from NV's own efforts, like DX12 took parts from Mantle.
Highly unlikely it's vendor agnostic: it's called RTX and is an effort by NVIDIA. They are the ones announcing it at GDC.
 
But hardware acceleration (RTX) of it is only available on Volta and later GPUs.
So it would seem that "primitive operations" such as:

Meanwhile DXR will introduce multiple new shader types to handle ray processing, including ray-generation, closest-hit, any-hit, and miss shaders.

from https://www.anandtech.com/show/12547/expanding-directx-12-microsoft-announces-directx-raytracing

present the opportunity to accelerate in hardware.
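To make those DXR shader stages concrete, here is a minimal CPU sketch in Python of how ray-generation, closest-hit, and miss stages fit together. This is only a conceptual analogy of the pipeline the article describes, not the D3D12/HLSL API; the function names, the single-sphere scene, and the shading are all invented for illustration.

```python
# Conceptual CPU analogy of the DXR shader stages: ray-generation,
# closest-hit, and miss (any-hit is omitted for brevity).
# All names and the toy sphere scene are invented for illustration.
from dataclasses import dataclass
import math

@dataclass
class Ray:
    origin: tuple
    direction: tuple  # assumed normalized

def intersect_sphere(ray, center, radius):
    """Return hit distance t along the ray, or None on a miss."""
    oc = tuple(o - c for o, c in zip(ray.origin, center))
    b = 2.0 * sum(d * e for d, e in zip(ray.direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def closest_hit_shader(t):
    # Shade the nearest intersection (here: a distance-based grey).
    return max(0.0, 1.0 - t / 10.0)

def miss_shader():
    # Background value when the ray hits nothing.
    return 0.2

def ray_generation_shader(x, y, width, height):
    # One primary ray per pixel, fired straight down -z.
    u = (x + 0.5) / width - 0.5
    v = (y + 0.5) / height - 0.5
    return Ray((u, v, 0.0), (0.0, 0.0, -1.0))

def trace(ray):
    # The "TraceRay" dispatch: route to closest-hit or miss.
    t = intersect_sphere(ray, (0.0, 0.0, -5.0), 0.4)
    return closest_hit_shader(t) if t is not None else miss_shader()

image = [[trace(ray_generation_shader(x, y, 8, 8)) for x in range(8)]
         for y in range(8)]
```

The point of the split is that the fixed-function part (traversal/intersection, the `trace` dispatch here) is exactly what a vendor could pull into hardware, leaving the hit/miss shading programmable.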

The age-old problem with *-tracing is the coherency of the memory accesses it performs, not the math. So a particular kind of memory acceleration/hierarchy might be the cornerstone of actual acceleration. It might be a particular kind of coalescing (which is one way to think about delta colour compression).
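One common coalescing idea, sketched below under assumptions of my own: bin scattered secondary rays by direction octant before traversal, so that each batch walks similar parts of the acceleration structure and touches the same cache lines. The ray data and the octant key are invented for illustration.

```python
# Illustrative sketch of ray binning for memory coherence:
# group rays by the sign pattern of their direction (8 octants),
# then traverse one bin at a time. Invented data, conceptual only.
import random

def octant(direction):
    """Key a ray by the sign pattern of its direction (up to 8 bins)."""
    return tuple(d >= 0 for d in direction)

random.seed(1)
rays = [(random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        for _ in range(64)]

bins = {}
for ray in rays:
    bins.setdefault(octant(ray), []).append(ray)

# Each batch now holds rays pointing into the same octant, so their
# traversal of a spatial hierarchy is far more coherent than the
# original scattered order.
batches = list(bins.values())
```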

There have been attempts at hardware implementations in the past which have promised astonishingly high performance, but they seem to have failed. The implication, though, is that hardware acceleration can be transformative.

The other side of the coin may well be that this uses ray-tracing for a subset of rendering tasks - so we're still years away from purely *-traced games. The original promise of *-tracing is that it massively simplifies graphics rendering, since modern realtime graphics is just a very very very long chain of physically plausible kludges. So this is going to be another kludge in the toolchest.

I personally find it worrying that NVidia talks about de-noising. One can argue that temporal anti-aliasing is a kind of denoising and that works great. Except, well, it doesn't in my opinion, but there's plenty of time to have that argument.
 
Noise is the Achilles heel of ray-tracing/path-tracing; it's not solved in the offline world, so we shouldn't expect it to be solved in real time. However, there are smart people spending a lot of time on denoising, including our very own Nao AKA Marco Salvi (though he hasn't posted at B3D for a long time). It's usually spatio-temporal in class, and deep learning is being applied to help find the best result from the lowest number of samples.
Noise and AA are two sides of the same coin: how to get the best visuals from as few samples as you can.
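The noise-vs-samples trade-off is easy to demonstrate: a Monte Carlo estimate of a pixel's brightness gets less noisy roughly as 1/sqrt(N). The toy "light" function below is invented for illustration; real renderers integrate much harder functions, but the scaling is the same.

```python
# Minimal Monte Carlo noise demo: repeated estimates of a pixel value
# spread less as the per-pixel sample count grows (~1/sqrt(N)).
# The sample_light function is an invented stand-in for a real integrand.
import random
import statistics

def sample_light(rng):
    # Pretend each ray either finds the light (1.0) or not (0.0);
    # the true pixel value is 0.3.
    return 1.0 if rng.random() < 0.3 else 0.0

def render_pixel(n_samples, rng):
    return sum(sample_light(rng) for _ in range(n_samples)) / n_samples

rng = random.Random(0)
# "Noise" here = spread of 500 repeated estimates at each sample count.
noise_4 = statistics.pstdev(render_pixel(4, rng) for _ in range(500))
noise_64 = statistics.pstdev(render_pixel(64, rng) for _ in range(500))
```

Sixteen times the samples buys roughly a quarter of the noise, which is why denoisers that recover a clean image from very few samples are so attractive for real time.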
 
I personally find it worrying that NVidia talks about de-noising. One can argue that temporal anti-aliasing is a kind of denoising and that works great. Except, well, it doesn't in my opinion, but there's plenty of time to have that argument.
They say it's there to prevent time sapping. So maybe not that integral in the grand scheme of things?
Note: a ray-tracing denoiser module is coming to the GameWorks SDK, which will enable developers to remove film grain-like noise without any additional time-sapping development work
https://www.geforce.com/whats-new/articles/nvidia-rtx-real-time-game-ray-tracing
 
They say it's there to prevent time sapping. So maybe not that integral in the grand scheme of things?

https://www.geforce.com/whats-new/articles/nvidia-rtx-real-time-game-ray-tracing
I think you'll find it's the opposite. De-noising is essential for real time ray tracing, because it produces a huge amount of noise.

Of course, the GameWorks implementation is there to maximise the number of games that ship with only NVidia-optimised de-noising, so other hardware is maximally disadvantaged at the same time. Because lazy devs don't like their time to be sapped by pesky details like de-noising.
 
In short, DXR (DX Ray Tracing) is indeed supported on all GPUs, as it's based on DX12. But hardware acceleration (RTX) of it is only available on Volta and later GPUs. Some games will already ship with RTX/DXR this year.
https://www.anandtech.com/show/1254...tracing-acceleration-for-volta-gpus-and-later
https://www.anandtech.com/show/12547/expanding-directx-12-microsoft-announces-directx-raytracing
Little late to the party, but to my understanding, DXR can (and surely will) be hardware accelerated. RTX works with DXR to enable AI-based denoising which utilizes Volta hardware (probably tensor cores), but RTX is not part of DXR per se, nor is it "DXR hardware acceleration"; it's really no different from GameWorks modules. Or then I've understood something really wrong.
 
Little late to the party, but to my understanding, DXR can (and surely will) be hardware accelerated. RTX works with DXR to enable AI-based denoising which utilizes Volta hardware (probably tensor cores), but RTX is not part of DXR per se, nor is it "DXR hardware acceleration"; it's really no different from GameWorks modules. Or then I've understood something really wrong.

See here:
This March Microsoft announced DirectX Raytracing (DXR), an extension to DirectX 12 API - DXR will provide a standard API for hardware and software accelerated ray tracing under DirectX. Microsoft also released D3D12 Raytracing Fallback Layer, a library that emulates the DirectX Raytracing API on devices without native driver/hardware support. The Fallback Layer is a thin layer on top of DirectX shared as an open source library via Microsoft's official GitHub repo. The Fallback Layer uses Compute Shaders to implement DXR functionalities.
https://forum.beyond3d.com/posts/2031581/
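The Fallback Layer described in that quote boils down to a capability check: use the native driver path when it exists, otherwise run the same API on top of compute. A hedged sketch of that dispatch pattern, with all names invented (the real check goes through D3D12 feature queries, not a dict):

```python
# Invented sketch of the fallback-layer idea: one API surface,
# two implementations chosen by a driver capability check.
def driver_supports_dxr(adapter):
    """Stand-in for a D3D12 feature query; 'dxr_tier' is invented."""
    return adapter.get("dxr_tier", 0) >= 1

def trace_rays(adapter, rays):
    if driver_supports_dxr(adapter):
        return f"hardware path: traced {len(rays)} rays"
    # Fallback Layer: emulate the same entry point with compute shaders.
    return f"compute fallback: traced {len(rays)} rays"
```

Either way the application calls the same `trace_rays` entry point, which is why games can ship DXR support before hardware acceleration is widespread.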
 