Next gen lighting technologies - voxelised, traced, and everything else *spawn*

What's the possibility of a hybrid approach to reflections? Where building and terrain geometry is static, is it feasible to apply a high-quality cube map to a window, for example, then overlay ray-traced dynamic objects like animated models on top of it? I imagine it would be hard to get looking right; you would need the ability for rays to ignore certain pixels based on geometry, and I have no idea whether the increase in performance would be significant enough to be worth it.
I wouldn't be surprised to see something like that. Developers already mix static cubemaps with dynamic SSR.
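A minimal sketch of that resolve step, assuming a per-pixel mask marking where ray-traced dynamic geometry covers the reflection. The function and buffer names are hypothetical, and real code would live in a shader rather than Python:

```python
# Hypothetical hybrid reflection resolve: per pixel, take the ray-traced
# result where a dynamic object was hit, otherwise fall back to the
# prefiltered static cubemap. All names here are illustrative.

def resolve_reflections(cubemap_color, traced_color, dynamic_mask):
    """Select the traced reflection on dynamic-object pixels,
    the cubemap lookup everywhere else."""
    out = []
    for cube, traced, is_dynamic in zip(cubemap_color, traced_color, dynamic_mask):
        out.append(traced if is_dynamic else cube)
    return out

# Example: a 4-pixel strip; pixels 1 and 2 are covered by an animated model.
cube   = [(0.2, 0.3, 0.8)] * 4          # static cubemap lookup
traced = [(0.9, 0.1, 0.1)] * 4          # ray-traced dynamic reflection
mask   = [False, True, True, False]
print(resolve_reflections(cube, traced, mask))
```

In practice the mask could come from a stencil written while rasterizing the dynamic objects, which is one way rays could "ignore certain pixels based on geometry".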

Texture space will not do much for reflections, unfortunately.
Sharp reflections are the easy case, and it's hard to see why they have such a massive impact on performance.
Nvidia really needs to rethink their GPU architecture if they are serious about raytracing in games.
It will be interesting to see what Intel comes up with, given their previous Larrabee raytracing experience.
Or DICE should rethink their rendering pipeline, which they are already doing as exemplified by their Halcyon engine.
 
On the topic of laggy reflections, we've already seen them in action in the NVIDIA demo called SOL:


It's not that bad.

We've also seen the spatial-only denoiser in this other demo:


Pretty good.

Here's a presentation that goes into some detail on the different denoisers NVIDIA uses right now. The reflections part starts at minute 18 (including a listing of problems with current rasterization techniques):

http://on-demand.gputechconf.com/si...u-low-sample-count-ray-tracing-denoisers.html

My favorite is actually the path tracing part :devilish:
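The lag in those reflections is largely a property of temporal accumulation itself: the denoiser blends each noisy frame into a running history, so a sudden change in the reflected scene takes several frames to settle. A minimal sketch (the blend weight alpha is an assumed value; real denoisers adapt it per pixel):

```python
# Why temporally accumulated reflections lag: an exponential moving
# average blends each new noisy sample with the history buffer, so the
# on-screen reflection converges to a changed scene only gradually.

def temporal_accumulate(history, new_sample, alpha=0.1):
    """Blend one new frame into the history with weight alpha."""
    return (1.0 - alpha) * history + alpha * new_sample

value = 0.0                  # reflection was dark
frames = []
for _ in range(10):          # reflected object suddenly becomes bright (1.0)
    value = temporal_accumulate(value, 1.0)
    frames.append(round(value, 3))
print(frames)                # slow convergence toward 1.0 = visible lag
```

After ten frames the history has only reached about 0.65 of the new value, which shows up on screen as reflections trailing behind the motion.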
 
Nvidia has presented us a lot of videos to show off how good and spectacular the new raytracing hardware is.
In the past there have been lots of breakthroughs where GPUs enabled new levels of graphics realism.
Each time such a breakthrough GPU generation was released, spectacular demos were made available for graphics enthusiasts to run and enjoy on their own brand-new GPU and be amazed by the new level of real-time rendering quality.
Nvidia provides a list of all those generations and accompanying demos (which can all still be downloaded):
https://www.nvidia.com/coolstuff/demos

So why are all those impressive new raytracing demos not available now to download and run on the available RTX GeForce cards?
 
Because nVidia has learned that the backlash isn't worth the release.
That's the result when consumers have stopped caring about better graphics and are more interested in more pixels.
 
Texture space will not do much for reflections, unfortunately.
Sharp reflections are the easy case, and it's hard to see why they have such a massive impact on performance.
Nvidia really needs to rethink their GPU architecture if they are serious about raytracing in games.
It will be interesting to see what Intel comes up with, given their previous Larrabee raytracing experience.
Maybe NV should have invested first in a redesigned memory subsystem before throwing billions of transistors at dedicated DNN/RT logic in overpriced consumer SKUs?
Yet another iteration of the old GDDR memory is not cutting it, it seems, and HBM costs are not coming down fast enough.
 
Because nVidia has learned that the backlash isn't worth the release.
That's the result when consumers have stopped caring about better graphics and are more interested in more pixels.
Backlash of what? What backlash would there be from RTX owners getting to run these demos on their new, expensive GPUs?

As for not caring about quality, the results here so far show that plenty of PC gamers prefer quality visuals. Pixel counting isn't a priority.
 
Because nVidia has learned that the backlash isn't worth the release.
That's the result when consumers have stopped caring about better graphics and are more interested in more pixels.
If consumers didn't care, there would be no point in showing those videos; the demo videos are still posted here by insiders like @OCASM.
The demos are here, so why not release them to the public?
In the past I would run to the shop to buy the latest high-end cards just to marvel at those demos:
https://www.nvidia.com/coolstuff/demos
(Also for AMD cards, BTW.)
Now, instead of at least being impressed by actual downloadable tech demos, people are disappointed by the first games implementing raytracing technology.
 
Backlash of what? What backlash would there be from RTX owners getting to run these demos on their new, expensive GPUs?

As for not caring about quality, the results here so far show that plenty of PC gamers prefer quality visuals. Pixel counting isn't a priority.

Haven't you heard about Battlefield 5? And now think about demos which can only run on an RTX 2080 Ti at 1080p/30fps at best. I can already see the clickbait YouTube videos claiming that Turing is too slow for raytracing in games.

And those polls don't say anything. You now have the option to play Battlefield 5 at 1080p/60fps with DXR "Low" or at 2160p/60fps without it on an RTX 2070. Let's see how many people would choose better-looking reflections over 4K...
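For scale, the arithmetic behind that tradeoff: 2160p pushes exactly four times the pixels of 1080p, so DXR has to justify giving up a 4x raw resolution budget.

```python
# Pixel counts behind the "1080p with DXR vs 2160p without" choice.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_2160p = 3840 * 2160   # 8,294,400 pixels
print(pixels_2160p // pixels_1080p)  # 4
```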
 
Nvidia has presented us a lot of videos to show off how good and spectacular the new raytracing hardware is.
In the past there have been lots of breakthroughs where GPUs enabled new levels of graphics realism.
Each time such a breakthrough GPU generation was released, spectacular demos were made available for graphics enthusiasts to run and enjoy on their own brand-new GPU and be amazed by the new level of real-time rendering quality.
Nvidia provides a list of all those generations and accompanying demos (which can all still be downloaded):
https://www.nvidia.com/coolstuff/demos

So why are all those impressive new raytracing demos not available now to download and run on the available RTX GeForce cards?
Who knows? Why don't you ask them?

If consumers didn't care, there would be no point in showing those videos; the demo videos are still posted here by insiders like @OCASM.
The demos are here, so why not release them to the public?
In the past I would run to the shop to buy the latest high-end cards just to marvel at those demos:
https://www.nvidia.com/coolstuff/demos
(Also for AMD cards, BTW.)
Now, instead of at least being impressed by actual downloadable tech demos, people are disappointed by the first games implementing raytracing technology.
I'm not disappointed, I'm actually impressed. Those disappointed are people who just want console-level graphics running at a higher framerate/resolution.
 
Who knows? Why don't you ask them?

I'm not disappointed, I'm actually impressed. Those disappointed are people who just want console-level graphics running at a higher framerate/resolution.

Any suggestion whom to address?
I looked for a suitable raytracing entry on
https://forums.geforce.com/
but all I see is complaints about broken RTX cards and about poor BFV raytracing performance.
 

I disagree with this being creative or cool. This is total bullshit, and my main reason for criticizing RTX.

Remember when people started to use pixel shaders for other purposes? Remember the horrible, inefficient workarounds they had to use just to utilize GPU power?

It's already happening again... this is why some NV folks say 'raytracing is the new compute' - pah! We are back to the stone age of GPUs, and of course NV likes to see that, because they took the lead in sending us back in time, and they are the only ones who profit from it.


All this guy is doing is searching for workarounds to utilize the traversal hardware. Don't you think he could build better algorithms if he had direct access to this feature, beyond the RT restrictions and black boxes?


Q: Why is NV boycotting its own invention, namely GPGPU? They have new options to generate work on GPUs for raytracing and procedural mesh generation. Why do they not expose this to general-purpose compute? Why lock up a fundamental building block like tree traversal?
A: Because they do not want us to be software developers or inventors. They want us to become their USERS, limited and tied to their products.
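For context, here is roughly what "tree traversal" means in software today: a sketch of the stack-based BVH walk that a compute shader has to implement by hand, and exactly the loop that exposed traversal hardware could accelerate. The node layout and names are illustrative, not any real API:

```python
# Illustrative software BVH traversal: collect the leaf boxes a ray
# enters. This is the kind of loop RT cores run in fixed function.

def ray_hits_aabb(origin, direction, lo, hi):
    """Standard slab test for a ray against an axis-aligned box."""
    tmin, tmax = 0.0, float("inf")
    for o, d, l, h in zip(origin, direction, lo, hi):
        if d == 0.0:
            if o < l or o > h:      # ray parallel to and outside the slab
                return False
            continue
        t1, t2 = (l - o) / d, (h - o) / d
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(nodes, origin, direction):
    """Stack-based walk; each node is (lo, hi, left, right),
    with left/right = None marking a leaf."""
    hits, stack = [], [0]
    while stack:
        i = stack.pop()
        lo, hi, left, right = nodes[i]
        if not ray_hits_aabb(origin, direction, lo, hi):
            continue
        if left is None:            # leaf box
            hits.append(i)
        else:
            stack += [left, right]
    return hits

# Tiny two-leaf tree: the root box spans both children.
nodes = [
    ((0, 0, 0), (2, 1, 1), 1, 2),        # root
    ((0, 0, 0), (1, 1, 1), None, None),  # leaf 1
    ((1, 0, 0), (2, 1, 1), None, None),  # leaf 2
]
print(traverse(nodes, (0.5, 0.5, -1.0), (0.0, 0.0, 1.0)))  # [1]
```

With DXR this loop sits behind a black box; the post's complaint is that compute shaders cannot feed custom node formats, LOD decisions, or non-ray queries into it.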
 
It'll then be up to Intel and AMD to come up with powerful enough general purpose hardware that's better.
 
It'll then be up to Intel and AMD to come up with powerful enough general purpose hardware that's better.
It is far from proven that these RT tricks would be any faster/better than existing GPGPU algorithms of course :)
 
Is real-time RT fundamentally compatible (or even plausible) with today's mostly latency-tolerant graphics architectures, or would it require too much redesigning of the GPU concept to be cost-efficient while still being generally programmable?
 
Is real-time RT fundamentally compatible (or even plausible) with today's mostly latency-tolerant graphics architectures, or would it require too much redesigning of the GPU concept to be cost-efficient while still being generally programmable?

I'm no hardware designer, but I have no doubt that exposing traversal hardware and work generation to compute would be very easy. (It would likely reveal a bit of NV's secret sauce here, so I totally understand their course; it's just not good for us.)
Going a step further, e.g. being able to implement custom trees or bounding-box lookups instead of ray intersections, custom geometry, or merging LOD with traversal and geometry like voxels, might be another story, with the RT cores already built for decades-old 'classical raytracing'.
 