Next gen lighting technologies - voxelised, traced, and everything else *spawn*

Why are results from the same people far lower than the OptiX 5 results?

Previous Benchmark, OptiX 5.
Titan V gets 108 M samples/s.
1080Ti gets 55.

fermat_bathroom.png



Latest benchmark, OptiX 6.
Titan V gets 67 M samples/s.
1080Ti gets 27.

fermat_bathroom.png


The only difference I can see is 'custom settings : 512 pass' whatever that means.
Because, as I posted above, the RTX path in OptiX 6, which Nvidia now recommends as the "optimal path" even for non-Turing GPUs, is actually the opposite... not "optimal" at all..
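For what it's worth, the size of the regression is easy to quantify (a quick sketch; the M samples/s figures are the ones quoted from the two benchmark runs above):

```python
# Reported throughput in M samples/s for the Fermat bathroom scene.
optix5 = {"Titan V": 108, "1080 Ti": 55}
optix6 = {"Titan V": 67, "1080 Ti": 27}

for gpu in optix5:
    drop = 1 - optix6[gpu] / optix5[gpu]
    print(f"{gpu}: {drop:.0%} slower under OptiX 6")
```

That is roughly a 38% loss on the Titan V and 51% on the 1080 Ti, assuming the settings are otherwise comparable.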
 
Because, as I posted above, the RTX path in OptiX 6, which Nvidia now recommends as the "optimal path" even for non-Turing GPUs, is actually the opposite... not "optimal" at all..
Did you even watch the video? The Titan V is faster with RTX ON, and the 1080 Ti is either faster with it on or the same speed.
 
? OptiX 6 with the RTX path on non-Turing GPUs is slower than OptiX 5, and than OptiX 6 with the RTX path off, for the same end result.. What's strange here.. I wonder
Again, that's not what the video shows. The Titan V is faster with RTX ON, and the 1080 Ti is either faster with it on or the same speed.

Why are results from the same people far lower than the OptiX 5 results?
The only difference I can see is 'custom settings : 512 pass' whatever that means.
512 passes is, I presume, lower quality than standard.
 
The thing I find most shocking is none of the RTX content using raytracing is looking that good. This is on a 750Ti:

"It runs at 40fps, 1920x1080 on my GTX 750 ti, with 3 coherent and 2 incoherent rays per-pixel (so about 300 million rays a second)."

No amount of RTX BVH hardware in a 750Ti would get results like this. It does seem a more game-friendly approach in terms of what's possible. Raytracing is the ideal for offline where quality is priority one, but if the volumetric approaches can look this good while being this fast, they seem a better starting point, working to solve their limitations in future hardware, than trying to get raytracing fast enough.
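The quoted ray rate roughly checks out (a back-of-envelope sketch; it assumes all 5 rays are traced for every pixel every frame, so it is an upper bound on the quoted ~300 million):

```python
width, height = 1920, 1080
rays_per_pixel = 3 + 2          # 3 coherent + 2 incoherent, per the quote
fps = 40

rays_per_second = width * height * rays_per_pixel * fps
print(f"~{rays_per_second / 1e6:.0f} M rays/s")  # upper bound
```

That gives ~415 M rays/s at the limit; early-terminated or skipped rays would bring the effective number down toward the quoted ~300 M.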
 
The thing I find most shocking is none of the RTX content using raytracing is looking that good.
But you could use RTX to make stuff that looks that good, ofc. And it would be faster.
On the other hand: You could also make this stuff faster using clever denoising, caching, whatever...
In the end the question is: how much RT work do we really need per frame, and how much chip area is it worth dedicating to the speed-up? Even if DXR/RTX were to my liking, it would still raise this question.
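To put rough numbers on that question, here is a sketch taking NVIDIA's marketed ~10 Giga rays/s for the RTX 2080 Ti at face value (an assumption here, and a peak figure; shading-heavy workloads land well below it):

```python
peak_rays_per_second = 10e9   # NVIDIA's peak claim for the RTX 2080 Ti
fps = 60

for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    budget = peak_rays_per_second / (fps * w * h)
    print(f"{name}: ~{budget:.0f} rays per pixel per frame")
```

So a best-case budget of ~80 rays/pixel at 1080p60 and ~20 at 4K60, before any shading or divergence costs eat into it.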
 
The thing I find most shocking is none of the RTX content using raytracing is looking that good. This is on a 750Ti:

"It runs at 40fps, 1920x1080 on my GTX 750 ti, with 3 coherent and 2 incoherent rays per-pixel (so about 300 million rays a second)."

No amount of RTX BVH hardware in a 750Ti would get results like this. It does seem a more game-friendly approach in terms of what's possible. Raytracing is the ideal for offline where quality is priority one, but if the volumetric approaches can look this good while being this fast, they seem a better starting point, working to solve their limitations in future hardware, than trying to get raytracing fast enough.

The question then becomes, why don’t we see it in any game other than TTC? And even that didn’t look quite this good, save for the noise?
 
It's an interesting discussion. Back through this or the Turing impact on consoles thread, we found one game using volume-tracing, and it ended up really slow in games versus early prototypes. But it was years old and there wasn't much (any?) momentum to develop and explore this tech. Also, it's just rendering the scene and not running a game with surface shaders etc. The 750 Ti released 5 years ago, so the hardware for this exact demo isn't new; it's just been waiting for people to develop it. Which makes you wonder how much better at volume-tracing modern and future hardware will be.
 
The thing I find most shocking is none of the RTX content using raytracing is looking that good. This is on a 750Ti:
The StarWars demo and Dancing Robot have higher quality. Imagine what you can do on a 2080Ti with BVH acceleration.

That's why I think games developed for ray tracing from the ground up are going to offer striking visuals at very high performance. We are limited to games with mostly rasterization elements for now.
 
The question then becomes, why don’t we see it in any game other than TTC? And even that didn’t look quite this good, save for the noise?
Hahaha, I know why:

Because of fixed-function raster hardware, devs are so focused on using it as efficiently as possible that they just accept its restrictions after some time. No time to go beyond - deadlines, risk of failure...
And now history repeats. We have learned nothing.

The StarWars demo and Dancing Robot have higher quality.
No. They use precomputed GI and RT only for reflections. The higher quality comes from artist work and presentation.
That's why I think games developed for ray tracing from the ground up are going to offer striking visuals at very high performance.
No. RT alone is no solution for anything. I don't want games that show perfect reflections but still rely on baked GI, while the GPU already struggles with small scenes. The raster elements are no limitation - they help make it barely fast enough.
Why do you see raster as 'limited', although RTX is limited at least as much too?
If you want unlimited stuff, then you are welcome to join my philosophy ;)
 
No. They use precomputed GI and RT only for reflections. The higher quality comes from artist work and presentation.
But yeah, they aren't doing path-traced GI lighting
Dancing Robot uses RT GI.


Sol runs in real-time while being fully ray-traced. Lights. Reflections. AO. GI. Everything is ray-traced.

The purpose of the demo was first and foremost to show real-time ray tracing. As mentioned, everything is ray-traced. Lights, AO and so on, runs in real-time while being raytraced on a single NVIDIA RTX 2080Ti.
https://www.allegorithmic.com/blog/...uture-nvidia-s-project-sol-textured-substance

Why do you see raster as 'limited', although RTX is limited at least as much too?
It's limiting the amount of optimization possible for ray tracing, because developers optimize for rasterization only.
 
It's an interesting discussion. Back through this or the Turing impact on consoles thread, we found one game using volume-tracing, and it ended up really slow in games versus early prototypes. But it was years old and there wasn't much (any?) momentum to develop and explore this tech. Also, it's just rendering the scene and not running a game with surface shaders etc. The 750 Ti released 5 years ago, so the hardware for this exact demo isn't new; it's just been waiting for people to develop it. Which makes you wonder how much better at volume-tracing modern and future hardware will be.
Games are very different beasts from a tech demo like this.

I mean, the population is giving hell to BioWare for loading screens. Lol.

There is only so much you can do in a scene; there are all sorts of bottlenecks limiting what you can do per second.
 
Dancing Robot uses RT GI.
Oh, ok, I see it. It updates as fast as my own stuff, if I run the CPU version of it. :)

It's limiting the amount of optimization possible for ray tracing, developers optimizing for rasterization only.
Devs cannot optimize fixed-function hardware, so what do you consider 'optimizations'? Removing smaller dynamic objects? Limiting to a single light and bounce? Mixing with SSR?
Those are no RT optimizations! They only optimize the application of it.
 
Just because the CEO of a company says his product is awesome, it doesn't mean it is awesome.
Even if Jensen said he used alien technology recovered from the Area 51 crash site, the images speak for themselves.

MOD: Please don't spam image dumps. Present arguments with images as evidence.

(several comparison screenshots attached)
 
...ok (?)
Contains some valid points, but... RTX on does not mean everything can be assumed to be correct, because RT is only used for certain effects.
E.g. Metro uses RT only for the skylight, no other light sources, and smaller dynamic objects are not raytraced at all - thus the red circles.
The UE4 demo uses RT only for reflections, I guess, so shadows do not change.

Some very nice voxel AO; they show GI / reflections too on their webpage:
 