Next gen lighting technologies - voxelised, traced, and everything else *spawn*

BTW, regarding the VK q2 PT video, some interesting facts from my conversation with Christoph:
1. It turns out there is little overhead to ray tracing instead of rasterization. According to Christoph, tracing primary visibility costs around 2ms at 1440p on an RTX 2080 Ti.

My ancient Quake renderer does primary visibility (including filtered texturing and lightmap/material blending) at 3ms for 4K, not even using a GPU :)
That is, pure software CPU rendering. The bottleneck is not even the CPU (8 cores) but the PCIe bandwidth.
http://users.skynet.be/fquake/
Just to indicate how ridiculously simple the geometry and how small the textures are: everything fits in on-chip caches.
 

creator description:
All settings are maxed out. FPS is below 30, but with normal graphics settings and without recording it's actually playable at up to 50-70 fps. Running on a GTX 970 with an i7 4790K and 16GB DDR3 RAM. Considering this old PC, it runs very well!

Nvidia RTX cards (RTX 2060, RTX 2070, RTX 2080, RTX 2080 Ti) will perform as well as non-RTX cards (GTX 1060, GTX 1070, GTX 1080, GTX 1080 Ti).
The shader is not using any of the new RT cores.

This Shader includes:
- Fully Pathtraced Global Illumination (with infinite bounces)
- Fully Raytraced reflections (+roughness), no screenspace
- Glass Block refractions
- Support for PBR textures

No shader download link, this is an experimental version. Support Sonic Ether's awesome work on Patreon! https://www.patreon.com/sonicether (Gold Rank Supporters have access to the Experimental library) Version used in the video: SEUS PTGI E5
 
That Minecraft lighting shows the difference RT should be bringing to games and how valuable it'll be in the long run. The Minecraft voxels are of course a pretty (perfectly?) ideal case for raytracing!

An RTX optimised version will be a good reference point.
 
That Minecraft lighting shows the difference RT should be bringing to games and how valuable it'll be in the long run. The Minecraft voxels are of course a pretty (perfectly?) ideal case for raytracing!

An RTX optimised version will be a good reference point.
Yeah looks fantastic, really impressive stuff especially with no RTX.
 
Beautiful!
Found some details here: https://www.patreon.com/posts/seus-ptgi-update-22356734
He mentions surface caching in the voxels to get free infinite bounces, and nice shots to show the difference.
He also mentions flickering, which likely means oscillations in the resulting iterative solve. IIRC this is more likely with the Jacobi method but rare with Gauss-Seidel; stochastic updates help as well. I remember similar problems, even explosions in extremely bright scenes. Quite fun to see such issues in gfx; it's typically expected in physics engines, for example.
No longer fakery - real simulation :)
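
The oscillation point above is easy to demonstrate on a toy linear system. This is a hedged sketch, not the PTGI solver itself (which isn't public): Jacobi updates every unknown from the previous iterate and can overshoot and ring around the fixed point, while Gauss-Seidel reuses fresh values within a sweep and damps the error.

```python
def jacobi_step(x, A, b):
    # Update every unknown from the OLD iterate only.
    return [(b[i] - sum(A[i][j] * x[j] for j in range(len(x)) if j != i)) / A[i][i]
            for i in range(len(x))]

def gauss_seidel_step(x, A, b):
    # Update in place, so later unknowns see the fresh values.
    x = list(x)
    for i in range(len(x)):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(len(x)) if j != i)) / A[i][i]
    return x

A = [[2.0, 1.0], [1.0, 2.0]]   # exact solution of A x = b is x = [1, 1]
b = [3.0, 3.0]

xj, errs_j = [0.0, 0.0], []
for _ in range(6):
    xj = jacobi_step(xj, A, b)
    errs_j.append(xj[0] - 1.0)  # signed error flips sign: the visible "flicker"

xg = [0.0, 0.0]
for _ in range(6):
    xg = gauss_seidel_step(xg, A, b)

print(errs_j)               # alternates +/- around zero
print(abs(xg[0] - 1.0))     # much smaller after the same number of sweeps
```

Stochastic update orders help for the same reason: they break the synchronized all-at-once overshoot that Jacobi exhibits.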
 
Sonic Ether is the same guy who created SEGI, a Unity asset for real-time dynamic GI. It voxelized the whole scene every frame, cone-traced it, and temporally filtered the results. He later abandoned that project as, according to him, its complexity ballooned beyond his ability to cope. I hope he can someday approach traditional (non block-based) scenes again, with the lessons learned from his Minecraft mod. His SEGI source was made public anyway, and others have attempted to work on it since then.
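
The "temporally filtered" part is usually just an exponential moving average of the per-pixel result across frames; SEGI's exact filter and weights are my assumption here, so treat this as a generic sketch:

```python
def temporal_filter(history, current, alpha=0.1):
    """Blend a small fraction of the new noisy frame into the history buffer."""
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

# One pixel whose raw GI estimate flickers between 0 and 1 (true value: 0.5).
history = [0.0]
for frame in range(200):
    noisy = [float(frame % 2)]
    history = temporal_filter(history, noisy)
print(history[0])   # settles near 0.5 instead of flickering
```

The cost is ghosting: a low alpha smooths noise well but makes lighting lag behind fast scene changes, which is why such filters get paired with reprojection and history rejection.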
 
Sonic Ether is the same guy who created SEGI, a Unity asset for real-time dynamic GI. It voxelized the whole scene every frame, cone-traced it, and temporally filtered the results. He later abandoned that project as, according to him, its complexity ballooned beyond his ability to cope. I hope he can someday approach traditional (non block-based) scenes again, with the lessons learned from his Minecraft mod. His SEGI source was made public anyway, and others have attempted to work on it since then.

There are quite a few people working on real-time dynamic GI and RT solutions for Unity at the moment (Unity is also working on something.. more @ GDC..). Lexi Dostal's HXGI is starting to look mighty impressive (running on a 970M laptop GPU):


Raphael Ernaelsten's Aura 2 plugin now features Volumetric GI:
 
That Minecraft lighting shows the difference RT should be bringing to games and how valuable it'll be in the long run. The Minecraft voxels are of course a pretty (perfectly?) ideal case for raytracing!

An RTX optimised version will be a good reference point.
Yea, I didn't exactly post my opinion on it. But what I thought was the hidden awesomeness is that this is just a mod. And it occurred to me that seeing so many of these old games being bolted with RT wasn't some sort of coincidence: it would appear that if the render pipeline is straightforward enough, RT can be put in with relative ease (as there are no hacks to work around), compared to what we saw with Battlefield and some of the new AAA titles that are having issues integrating RT quickly (other discussion thread).

I also felt, in terms of influence, that we could have some really amazing quality graphics from indie teams, because the effect of lighting can be so dramatic even against graphical cubes à la Minecraft.

It reinforces my opinion on how costs for graphics can come down dramatically if they are able to fully commit to an RT pipeline in the future. Looking at Minecraft, for instance, and comparing it to AC Unity with baked GI: one is made by one person, dynamically, for the whole world; the other requires a massive team to pull off.

As for the BVH acceleration: I looked into 1990+ research papers from Microsoft. I found 2 papers on RT (only 2, sadly), and their summaries were about how to dramatically accelerate RT without hardware. The answer was to create a data structure to parallelize rays as much as possible, and its description just came across as being a BVH structure, which Nvidia accelerated in Turing.
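
For what it's worth, the structure described does sound like a BVH: primitive bounds grouped into a tree so a ray (here degenerated to a point query, just to keep the sketch short) can reject whole subtrees with one box test. A minimal median-split version, purely illustrative and nothing like Turing's actual hardware format:

```python
from collections import namedtuple

Box = namedtuple("Box", "lo hi")   # 1D intervals keep the sketch short

def union(a, b):
    return Box(min(a.lo, b.lo), max(a.hi, b.hi))

def bounds(node):
    return node[1]

def build(boxes):
    """Median-split BVH: sort by position, split in half, recurse."""
    if len(boxes) == 1:
        return ("leaf", boxes[0])
    boxes = sorted(boxes, key=lambda bx: bx.lo)
    mid = len(boxes) // 2
    left, right = build(boxes[:mid]), build(boxes[mid:])
    return ("node", union(bounds(left), bounds(right)), left, right)

def query(node, p, out, stats):
    """Collect boxes containing p, counting box tests along the way."""
    stats[0] += 1
    box = bounds(node)
    if not (box.lo <= p <= box.hi):
        return                      # prune the whole subtree
    if node[0] == "leaf":
        out.append(box)
    else:
        query(node[2], p, out, stats)
        query(node[3], p, out, stats)

boxes = [Box(i, i + 0.5) for i in range(16)]
tree = build(boxes)
found, stats = [], [0]
query(tree, 3.25, found, stats)
print(found, stats[0])   # one hit, with far fewer than 16 box tests
```

The "parallelize rays" part falls out naturally: every ray traverses the same read-only tree, so thousands can run independently.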

If Minecraft can run that well without BVH acceleration, one needs to consider the points made by the devs earlier on this board that the engines need to be re-written to get great performance out of RT.
 
Impressive, nice to see RT taking off. Can't wait to see it in action with HW RT cores :p

The RT cores might not add much benefit in this special case of a world voxelized into huge voxels.
No need for a BVH to accelerate tracing, as such a coarse voxel grid can be directly raymarched pretty fast.
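
A coarse grid is raymarched with a 3D DDA (in the spirit of Amanatides & Woo): step exactly one cell boundary at a time by tracking the ray parameter at which the next x, y, or z wall is crossed. A minimal sketch, assuming non-negative cell coordinates:

```python
def march(solid, origin, direction, max_steps=64):
    """Return the first solid voxel hit by the ray, or None.
    solid: set of (x, y, z) integer cell coordinates that are occupied."""
    x, y, z = (int(c) for c in origin)
    step, t_max, t_delta = [], [], []
    for o, d, v in zip(origin, direction, (x, y, z)):
        if d > 0:
            step.append(1);  t_max.append((v + 1 - o) / d); t_delta.append(1 / d)
        elif d < 0:
            step.append(-1); t_max.append((v - o) / d);     t_delta.append(-1 / d)
        else:
            step.append(0);  t_max.append(float("inf"));    t_delta.append(float("inf"))
    for _ in range(max_steps):
        if (x, y, z) in solid:
            return (x, y, z)
        axis = t_max.index(min(t_max))   # which wall does the ray cross first?
        if axis == 0:   x += step[0]
        elif axis == 1: y += step[1]
        else:           z += step[2]
        t_max[axis] += t_delta[axis]
    return None

print(march({(5, 3, 0)}, (0.5, 0.5, 0.5), (1.0, 0.5, 0.0)))  # → (5, 3, 0)
```

No tree build, no per-frame refit: the grid itself is the acceleration structure, which is why huge Minecraft-style voxels trace so cheaply.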
 
It reinforces my opinion on how costs for graphics can come down dramatically if they are able to fully commit to an RT pipeline in the future. Looking at Minecraft, for instance, and comparing it to AC Unity with baked GI: one is made by one person, dynamically, for the whole world; the other requires a massive team to pull off.

This isn't how it really works.. only a handful of devs work on lighting, another handful on animation, another small team on shaders, etc. RT is not a magical solution at all, and you don't light a full game or a movie scene just by using one light source or an HDR map. Every VFX studio moved to RT (Pixar being the last one) and it didn't decrease workforce or even make things "easier". The main point of RT was to bring physically accurate lighting instead of hacking your way around with crap like AO. You still have to place your point/spot/area/etc. lights just as you did before. The only tangible advantage for real-time RT is the time saved by not baking lightmaps (which is also getting faster, as this can now be GPU accelerated in some engines) if for some odd reason the whole game's main light source is constantly moving & fully dynamic (which is ridiculous; we bake GI in most VFX scenes for non-dynamic content to save rendering time because there's no need at all to recalculate it for each frame).
 
It reinforces my opinion on how costs for graphics can come down dramatically if they are able to fully commit to an RT pipeline in the future. Looking at Minecraft, for instance, and comparing it to AC Unity with baked GI: one is made by one person, dynamically, for the whole world; the other requires a massive team to pull off.
I don't think so. The costs are mainly in content creation, and nothing will change here. (... has already been said, I see)

The RT cores might not add much benefit in this special case of a world voxelized into huge voxels.
No need for a BVH to accelerate tracing, as such a coarse voxel grid can be directly raymarched pretty fast.
I also think RT cores won't help here. An example is VCT, where reflections take 1-2 ms to trace (one cone per pixel). Also, the ray-triangle test is not necessary, so no benefit.
Minecraft makes things very easy here; a global parametrization for the caching is also not necessary.
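
For readers unfamiliar with VCT: a cone trace marches along the ray and, as the cone radius grows with distance, samples ever-coarser prefiltered mip levels instead of many individual voxels, so one cone per pixel stays cheap. A 1D sketch with made-up data (the real thing works on a 3D mip pyramid):

```python
import math

def build_mips(opacity):
    """Average-pool a base opacity row into a prefiltered mip chain."""
    mips = [opacity]
    while len(mips[-1]) > 1:
        prev = mips[-1]
        mips.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev), 2)])
    return mips

def cone_trace(mips, start, aperture=0.1, step=1.0, max_t=64.0):
    """Accumulate occlusion front-to-back along a widening cone."""
    alpha, t = 0.0, step
    while t < max_t and alpha < 0.99:
        radius = max(aperture * t, 1.0)
        level = min(int(math.log2(radius)), len(mips) - 1)
        cell = int(start + t) >> level        # coarser cells at higher levels
        if cell >= len(mips[level]):
            break
        a = mips[level][cell]
        alpha += (1 - alpha) * a
        t += max(radius, step)                # bigger steps as the cone widens
    return alpha

occ = [0.0] * 64
for i in range(30, 34):
    occ[i] = 1.0                              # a solid wall a few cells away
mips = build_mips(occ)
print(cone_trace(mips, 0.0))                  # near 1: the cone hits the wall
print(cone_trace(mips, 40.0))                 # near 0: open space past it
```

Note there is no ray-triangle test anywhere, which is the point made above: RT cores accelerate ray-triangle intersection and BVH traversal, neither of which appears in this pipeline.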
 
This isn't how it really works.. only a handful of devs work on lighting, another handful on animation, another small team on shaders, etc. RT is not a magical solution at all, and you don't light a full game or a movie scene just by using one light source or an HDR map. Every VFX studio moved to RT (Pixar being the last one) and it didn't decrease workforce or even make things "easier". The main point of RT was to bring physically accurate lighting instead of hacking your way around with crap like AO. You still have to place your point/spot/area/etc. lights just as you did before. The only tangible advantage for real-time RT is the time saved by not baking lightmaps (which is also getting faster, as this can now be GPU accelerated in some engines) if for some odd reason the whole game's main light source is constantly moving & fully dynamic (which is ridiculous; we bake GI in most VFX scenes for non-dynamic content to save rendering time because there's no need at all to recalculate it for each frame).
The best tangible advantage for RTRT is not having to choose between a dynamic world and high quality lighting. We can have both at the same time now.
 
If a 970M can do it, why aren't we seeing more modern games with RT? Is BF5 doing something else here? Yes, BF5 RT looks better than Minecraft raytracing, but they're completely different games. Minecraft looks worse than Quake, so it's not fair; it's like a retro game fitted with RT.
 
If a 970M can do it, why aren't we seeing more modern games with RT? Is BF5 doing something else here? Yes, BF5 RT looks better than Minecraft raytracing, but they're completely different games. Minecraft looks worse than Quake, so it's not fair; it's like a retro game fitted with RT.
Minecraft is a regular grid with a single material; BFV is irregular triangle meshes with multiple materials. But as I said earlier, we would have approached realtime RT in any case, also without FF hardware.

You can look at this for something in between:

This used triangles binned to a regular grid as acceleration structure. It's realtime, but low FPS.

Edit: Neither Quake nor Minecraft has geometric complexity comparable to BFV.
 
This isn't how it really works..only a handful of devs work on lighting, another handful on animation, another small team on shaders etc . RT is not a magical solution at all and you don't light a full game or a movie scene just by using one light source or an HDR map. Every VFX studio moved to RT ( Pixar being the last one) and it didn't decrease workforce or even make thing's "easier". The main point of RT was to bring physically accurate lighting instead of hacking your way around with crap like AO. You still have to place your point/spot/area/etc lights just as you did before. The only tangible advantage for real-time RT is the time saved by not baking lightmaps (which are also getting faster as this can now be GPU accelerated in some engines) if for some odd reason the whole game's main light source is constantly moving & fully dynamic (which is ridiculous, we bake GI in most VFX scenes for non dynamic content to save rendering time because there's no need at all to recalculate it for each frame).
Agreed.
Workloads aren't equal so it's a bit much to make that a basis point.
But baking light maps should be a lot of work for video games; the scale of open-world games makes for a lot of content to process, especially with all the changes that can happen, and the amount of re-work involved is massive.

And then we get into dynamic lights/moving light sources, which is also a thing.
 
Agreed.
Workloads aren't equal so it's a bit much to make that a basis point.
But baking light maps should be a lot of work for video games; the scale of open-world games makes for a lot of content to process, especially with all the changes that can happen, and the amount of re-work involved is massive.

And then we get into dynamic lights/moving light sources, which is also a thing.
Baking lightmaps for static environments isn't that time consuming or complicated (and they are well suited for ToD scenarios etc.). RT can be used for dynamic objects and assets that have to be updated frequently or each frame. Having your whole scene and all assets fully RT'd each frame is quite ridiculous and only good for showing off in tech demos or PR materials. In the Star Wars Reflections demo the world/set lighting was baked; only reflections were RT (and area light shadows in the early first part of the GDC demo, where you only have Stormtrooper models in an empty room).
 
The only tangible advantage for real-time RT is the time saved by not baking lightmaps (which are also getting faster as this can now be GPU accelerated in some engines) if for some odd reason the whole game's main light source is constantly moving & fully dynamic (which is ridiculous, we bake GI in most VFX scenes for non dynamic content to save rendering time because there's no need at all to recalculate it for each frame).

Like the sun moving in any outdoor open world game with dynamic TOD. Like so many modern games. Not so ridiculous actually.
 
Like the sun moving in any outdoor open world game with dynamic TOD. Like so many modern games. Not so ridiculous actually.
Which is something that has been successfully done for years using light probes, without the gigantic performance hit of pure RT...
Crappy compressed videos of a 3-year-old game:



or tons of RDR2 time lapse videos on YouTube
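
The probe approach for ToD can be as simple as baking lighting at a few key times and blending between the two nearest keys at runtime. A sketch with invented key times and colors, just to show the idea:

```python
def tod_irradiance(keys, t):
    """keys: sorted (hour, (r, g, b)) pairs; t is clamped to the keyed range."""
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, c0), (t1, c1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return tuple((1 - w) * a + w * b for a, b in zip(c0, c1))

keys = [(6,  (0.2, 0.2, 0.3)),   # dawn
        (12, (1.0, 1.0, 1.0)),   # noon
        (18, (0.8, 0.5, 0.3))]   # dusk
print(tod_irradiance(keys, 9))   # halfway between the dawn and noon keys
```

Real engines do this per probe over a whole grid (usually with SH coefficients rather than a single RGB), but the runtime cost is a blend, not a trace.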
 