Nvidia Turing Architecture [2018]

Devs don't "prefer" anything here; they simply can't use primitive shaders. There is no way to use them, no API, nothing (that's what Sebastian Aaltonen said on Twitter a few months ago when I asked him whether primitive shaders were faster than his GPU compute culling code).

Yes, there's no API. To my knowledge AMD never released any SDK for primitive shaders, and last but not least it isn't working on GFX9 hardware anyway, so why bother?

Nvidia's version seems rather more complex than AMD's, and I think it needs very specific software support and, on top of that, new API support. I think both companies are working hard on that with Microsoft and Khronos as well.
 
NVIDIA Stabilizes Its Vulkan/OpenGL Ray-Tracing Extension
November 1, 2018
It was just in mid-September that NVIDIA introduced its ray-tracing extension for Vulkan as VK_NVX_raytracing, debuting as an "experimental" feature along with OpenGL/GLSL functionality. Already they seem happy enough with the design that it's being promoted to stable.

While adding a few last-minute capabilities, the NVX extension has been promoted and is now firmed up as NV_ray_tracing. It's still a vendor-specific extension until there is firm consensus among working group members to promote it to an official extension, but in dropping "NVX" it's considered by NVIDIA to be stable.

This includes SPV_NV_ray_tracing, VK_NV_ray_tracing, and GL_NV_ray_tracing. I am actually surprised they so quickly settled on the stable extension considering it's been less than two months and not all game developers have had time to evaluate it or even get their hands on new NVIDIA RTX hardware.

Presumably NVIDIA will be pushing out new Windows/Linux graphics driver releases soon that feature this stabilized ray-tracing support.
https://www.phoronix.com/scan.php?page=news_item&px=NVIDIA-NV_ray_tracing

 
I am actually surprised they so quickly settled on the stable extension considering it's been less than two months and not all game developers have had time to evaluate it or even get their hands on new NVIDIA RTX hardware.

I'm not surprised that it's been finalized so fast. Remember, it takes *years* to go from initial design to final production for a GPU, so Nvidia would have been working on the API long before now. In fact, RTX/DXR is pretty much a clone of Nvidia's own OptiX API, which has been around for close to a decade now. It's probably best to consider "experimental" as Nvidia's way of asking whether anyone wants tweaks made before the final version, particularly since whatever they decide now will probably shape these APIs for many years to come; they need to get it right.
 
Did you read the next sentence? :p
I did at least, and I'm wondering why you would say "RTX/DXR". To my understanding they're quite different things: one is NV's 'middleware' between their cards and DXR, and DXR isn't a clone of OptiX (or any other competing API, for that matter) any more than DX12 is a clone of Mantle (which it obviously isn't, even if it did borrow a lot of ideas and even implementations).
 
Accelerated Ray Tracing in One Weekend in CUDA
November 5, 2018
But what if you’re curious about how ray tracing actually works? One way to learn is to code your own ray tracing engine. Would you like to build a ray tracer that runs on your GPU using CUDA? If so, this post is for you! You’ll learn more about CUDA programming as well as ray tracing in one fell swoop.
...
Peter Shirley has written a series of fantastic ebooks about Ray Tracing starting from coding the very basics in one weekend to deep topics to spend your life investigating. You can find out more about these books at http://in1weekend.blogspot.com/. The books are now free or pay-what-you-wish and 50% of the proceeds go towards not-for-profit programming education organizations. They are also available on Amazon as a Kindle download.

You should sit down and read Ray Tracing in One Weekend before diving into the rest of this post. Each section of this post corresponds to one of the chapters from the book. Even if you don’t sit down and write your own ray tracer in C++, the core concepts should get you started with a GPU-based engine using CUDA.
https://devblogs.nvidia.com/accelerated-ray-tracing-cuda/
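
For a flavor of what the post builds up, here's a minimal CUDA sketch in the spirit of its opening chapters (not the blog's exact code; the vector helpers and kernel names are illustrative): one thread per pixel shoots a ray, shades a simple sky gradient, and writes the result into a unified-memory framebuffer that the host dumps as a PPM image.

Code:
// Minimal "first chapters" style ray tracer: one CUDA thread per pixel,
// background gradient only. Helper names are illustrative, not the blog's.
#include <cstdio>
#include <cuda_runtime.h>

struct vec3 { float x, y, z; };

__device__ vec3 add(vec3 a, vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
__device__ vec3 mul(vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
__device__ vec3 unit(vec3 a) {
    float len = sqrtf(a.x * a.x + a.y * a.y + a.z * a.z);
    return mul(a, 1.0f / len);
}

// Blend white and blue based on the ray's vertical direction ("sky" gradient).
__device__ vec3 sky_color(vec3 dir) {
    float t = 0.5f * (unit(dir).y + 1.0f);
    vec3 white = {1.0f, 1.0f, 1.0f};
    vec3 blue  = {0.5f, 0.7f, 1.0f};
    return add(mul(white, 1.0f - t), mul(blue, t));
}

// One thread per pixel: build the ray direction and store the color.
__global__ void render(vec3 *fb, int nx, int ny) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    int j = blockIdx.y * blockDim.y + threadIdx.y;
    if (i >= nx || j >= ny) return;
    vec3 lower_left = {-2.0f, -1.0f, -1.0f};
    vec3 horizontal = { 4.0f,  0.0f,  0.0f};
    vec3 vertical   = { 0.0f,  2.0f,  0.0f};
    float u = float(i) / float(nx), v = float(j) / float(ny);
    vec3 dir = add(lower_left, add(mul(horizontal, u), mul(vertical, v)));
    fb[j * nx + i] = sky_color(dir);
}

int main() {
    int nx = 1200, ny = 600;
    vec3 *fb;
    cudaMallocManaged(&fb, nx * ny * sizeof(vec3));   // unified memory: host reads it back
    dim3 threads(8, 8);
    dim3 blocks((nx + 7) / 8, (ny + 7) / 8);
    render<<<blocks, threads>>>(fb, nx, ny);
    cudaDeviceSynchronize();
    printf("P3\n%d %d\n255\n", nx, ny);               // plain-text PPM on stdout
    for (int j = ny - 1; j >= 0; j--)
        for (int i = 0; i < nx; i++) {
            vec3 c = fb[j * nx + i];
            printf("%d %d %d\n", int(255.99f * c.x), int(255.99f * c.y), int(255.99f * c.z));
        }
    cudaFree(fb);
    return 0;
}

The book's later chapters (spheres, materials, the camera) slot into sky_color and render; the CUDA-specific parts (managed framebuffer, launch dimensions, synchronizing before reading results) stay essentially the same.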
 
TechReport tested the Adaptive Shading patch. The game now has an option called NVIDIA Adaptive Shading with several presets: Performance, Balanced, and Quality. And there are no visual differences between any of them.

[Chart: Wolfenstein II, 3840x2160 average fps]


https://techreport.com/review/34269...hading-with-wolfenstein-ii-the-new-colossus/3
 
Correction: the reviewer said he noticed "practically no difference", which pretty much means "there's a difference, but you probably won't notice it".
Actually no, if you continue reading, he clearly states: "If there's a catch to having CAS on at this resolution, I didn't see one".
The developer and NVIDIA also state the tech doesn't affect visual fidelity:

Added support for NVIDIA Adaptive Shading on NVIDIA RTX series GPUs. (Improves frame rate by dynamically adjusting the shading resolution in different areas of the screen, without affecting fidelity).
https://steamcommunity.com/app/612880/discussions/0/3104564981118189993/?ctp=2
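
To make the preset discussion concrete, here's a toy CUDA sketch of the general idea behind content-adaptive shading: estimate how much detail each 16x16 screen tile contains and drop to a coarser shading rate only where the estimated error stays below a per-preset threshold. This is not the game's or NVIDIA's actual implementation; the tile size, available rates, variance metric, and threshold values are assumptions for illustration only.

Code:
// Toy content-adaptive shading rate selection. NOT the Wolfenstein II / NVIDIA
// implementation; thresholds, rates, and the variance metric are illustrative.
#include <cstdint>
#include <cuda_runtime.h>

enum ShadingRate : uint8_t { RATE_1X1 = 0, RATE_2X1 = 1, RATE_2X2 = 2 };

// Hypothetical per-preset error tolerance: Quality, Balanced, Performance.
__constant__ float kErrorThreshold[3] = { 0.010f, 0.020f, 0.040f };

// One 16x16 thread block per screen tile of the previous frame's luminance.
__global__ void select_shading_rate(const float *luma, int width, int height,
                                    uint8_t *rate_image, int preset) {
    __shared__ float sum, sum_sq;
    if (threadIdx.x == 0 && threadIdx.y == 0) { sum = 0.0f; sum_sq = 0.0f; }
    __syncthreads();

    int x = blockIdx.x * 16 + threadIdx.x;
    int y = blockIdx.y * 16 + threadIdx.y;
    if (x < width && y < height) {
        float l = luma[y * width + x];
        atomicAdd(&sum, l);
        atomicAdd(&sum_sq, l * l);
    }
    __syncthreads();

    if (threadIdx.x == 0 && threadIdx.y == 0) {
        // Luminance variance as a crude proxy for how much detail a
        // reduced-rate shading pass would lose in this tile.
        float n = 256.0f;
        float mean = sum / n;
        float variance = sum_sq / n - mean * mean;
        float threshold = kErrorThreshold[preset];

        uint8_t rate = RATE_1X1;                    // full rate by default
        if (variance < threshold)        rate = RATE_2X1;
        if (variance < 0.5f * threshold) rate = RATE_2X2;
        rate_image[blockIdx.y * gridDim.x + blockIdx.x] = rate;
    }
}

int main() {
    const int W = 3840, H = 2160;                   // 4K, divisible by 16
    const int preset = 1;                           // "Balanced" in this toy model
    float *luma; uint8_t *rates;
    cudaMallocManaged(&luma, W * H * sizeof(float));
    cudaMallocManaged(&rates, (W / 16) * (H / 16));
    cudaMemset(luma, 0, W * H * sizeof(float));     // real input: previous frame's luma
    select_shading_rate<<<dim3(W / 16, H / 16), dim3(16, 16)>>>(luma, W, H, rates, preset);
    cudaDeviceSynchronize();
    cudaFree(luma); cudaFree(rates);
    return 0;
}

The point of the sketch: in a scheme like this the presets only move the error threshold, so if most tiles in a given scene already fall under the strictest threshold (plausible at 4K), the presets can end up selecting the same rates and producing no visible difference, while still diverging in other scenes or at lower resolutions.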
 
Actually no, if you continue reading, he clearly states: "If there's a catch to having CAS on at this resolution, I didn't see one".
The developer and NVIDIA also state the tech doesn't affect visual fidelity:


https://steamcommunity.com/app/612880/discussions/0/3104564981118189993/?ctp=2
That quote doesn't actually say there isn't a difference (whereas his earlier statement in the same review says there is one), just that for him the differences are so minute they don't affect the experience.
PR-speak is PR-speak; there's a difference regardless of whether one notices it or not, unless you're suggesting the devs actually hand-pick shading fidelity pixel by pixel so it genuinely doesn't change a thing on screen. And if that were the case, why would there be several different options to choose from?
 
PR-speak is PR-speak; there's a difference regardless of whether one notices it or not, unless you're suggesting the devs actually hand-pick shading fidelity pixel by pixel so it genuinely doesn't change a thing on screen. And if that were the case, why would there be several different options to choose from?

The setting must control the instruction counts; the higher options execute more NOPs. :yep2:
 
That quote doesn't actually say there isn't a difference (whereas his earlier statement in the same review says there is one), just that for him the differences are so minute they don't affect the experience.
Again, no. If there was a difference he would have shown it in screenshots, or talked about it in detail. You are literally changing the meaning of his words and putting new words in his mouth. The guy unequivocally denies spotting any difference.
PR-speak is PR-speak; there's a difference regardless of whether one notices it or not, unless you're suggesting the devs actually hand-pick shading fidelity pixel by pixel so it genuinely doesn't change a thing on screen.
I would trust the developer to know more about their tech than anyone else.
why would there be several different options to choose from?
Maybe there could be subtle differences in other scenes and under certain conditions (lower resolution for example), just not in the one review @4K that we have now.
 
Riddle me this: why would a reviewer say "I saw practically no difference in image quality" if he means "I saw no difference in image quality" or "there is no difference in image quality"? I don't think I'm the one twisting his words here; that's the only direct quote on image quality from him.

The fact is, if there were no quality loss there wouldn't be different quality settings; heck, there wouldn't be any settings at all, it would just be always on.
 
I don't think I'm the one twisting his words here, that's the only direct quote on image quality from him.
You definitely are taking his words out of context; he had two quotes on the matter:
I saw practically no difference in image quality at 4K when moving between each preset.
If there's a catch to having CAS on at this resolution, I didn't see one

He literally confirms the absence of any difference twice. Add to that the developer's confirmation about maintaining fidelity and you have a pretty solid case of: NO, there isn't a difference.
Added support for NVIDIA Adaptive Shading on NVIDIA RTX series GPUs. (Improves frame rate by dynamically adjusting the shading resolution in different areas of the screen, without affecting fidelity).

The fact is, if there were no quality loss there wouldn't be different quality settings; heck, there wouldn't be any settings at all, it would just be always on.
Again, there could be differences under specific conditions and at lower resolutions or detail levels; the presets could easily be contingencies for running the game at less than max settings and/or at low fps. Without input from the developer, you can't jump to conclusions about this, especially when there is a mountain of statements denying the presence of a difference.
 
Nope. Words have meanings. In particular, "practically".
 