Next gen lighting technologies - voxelised, traced, and everything else *spawn*

Indeed. That's the difference in appearance between old-style games and what we'd expect in a next-gen look. Although not too dissimilar from The Tomorrow Children - that game really was ahead of the curve!
The materials could use tweaking though. The plasticky look of the 'grassy' ground is unwarranted.
 
Indeed. That's the difference in appearance between old-style games and what we'd expect in a next-gen look. Although not too dissimilar from The Tomorrow Children - that game really was ahead of the curve!
The materials could use tweaking though. The plasticky look of the 'grassy' ground is unwarranted.
It works in a very different way from anything I've asked about so far, though!
By the way - anyone have any easily digestible questions for the author of that Minecraft RT shader pack? We are doing a more light-hearted video on it.
 
anyone have any easily digestible questions
I'd like to know these things:
At what resolution relative to a voxel is the GI calculated, and does it decrease with distance? (Likely 'GI' is a better term than 'irradiance' or 'incoming light'? Although the term GI is only loosely defined, it's mostly reduced to diffuse interreflection.)
Does specular denoising happen in screen space or texture space?

I think you should mention the advantages of 'infinite bounces' and 'any light source contributes / casts shadows' (if true).
The bounces thing is the most remarkable part here - I've never seen this in real time before. People should know this to develop a better understanding of GI.
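To make that concrete, here's a rough back-of-the-envelope sketch of my own (not anything from the shader pack itself): if you treat each diffuse bounce as reflecting roughly an albedo-sized fraction of the previous one, the indirect light forms a geometric series, so stopping after one bounce throws away a large chunk of it.

```cpp
// Rough illustration of why 'infinite bounces' matter. The albedo value is a
// hypothetical average; real scenes vary per surface and per wavelength.
#include <cstdio>

int main() {
    const double albedo = 0.6;        // hypothetical average surface albedo
    double bounce = albedo;           // energy contributed by the first bounce
    double total = 0.0;
    for (int i = 1; i <= 8; ++i) {    // a handful of bounces already approach the limit
        total += bounce;
        std::printf("after %d bounce(s): %.3f of the direct light\n", i, total);
        bounce *= albedo;
    }
    // Closed form for infinitely many bounces: albedo / (1 - albedo)
    std::printf("infinite bounces:   %.3f of the direct light\n", albedo / (1.0 - albedo));
}
```

With an albedo of 0.6 the first bounce alone is only 40% of the full indirect term, which is why multi-bounce GI reads so differently on screen from the usual single-bounce approximations.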
 
Shadow maps are mostly very basic in games. In my opinion, ray-traced shadows are the only performant solution that can fix this. Achieving this quality and depth with shadow maps would be much more expensive.

The problem is that with shadow maps, you basically have to render a depth cubemap (if you want the light to cast shadows in all directions - sunlight doesn't need this) for every single light source. There really just isn't any good way to do this while maintaining both quality and speed, not to mention memory consumption.

Ray tracing has the opposite problem in that for a given pixel, you must cast a ray toward every light source that might cast a shadow there. Though at least for this case, you can do culling and only look for shadows from a small set of nearby or very bright lights.

Shadow maps have no such culling. Yes, you can avoid shadow map lookups for distant lights when shading, but you still have to generate the darn things for every single light that can cast shadows anywhere in the visible scene.

At some level of detail, ray tracing passes shadow maps in terms of performance.
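Here's a minimal sketch of that culling idea (hypothetical types and thresholds, not any engine's actual API): keep only the lights that can plausibly affect the shading point, then spend one shadow ray on each survivor.

```cpp
// Sketch: per-pixel light culling before casting shadow rays.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3  { float x, y, z; };
struct Light { Vec3 pos; float intensity; float radius; };   // radius = falloff range

static float dist(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Placeholder for the actual ray-traced visibility query (TraceRay in DXR terms).
static bool shadowRayOccluded(const Vec3& /*from*/, const Vec3& /*to*/) { return false; }

int main() {
    Vec3 shadingPoint{0, 0, 0};
    std::vector<Light> lights = {
        {{ 1, 2,  1}, 50.0f, 10.0f},   // nearby and bright: keep
        {{90, 5, 90}, 50.0f, 10.0f},   // far outside its falloff range: cull
        {{ 3, 1, -2},  0.1f, 10.0f},   // too dim to cast a visible shadow: cull
    };

    float visible = 0.0f;
    for (const Light& l : lights) {
        float d = dist(shadingPoint, l.pos);
        if (d > l.radius) continue;                          // range cull
        float contribution = l.intensity / (d * d + 1.0f);   // crude falloff estimate
        if (contribution < 0.05f) continue;                  // importance cull
        if (!shadowRayOccluded(shadingPoint, l.pos))         // one shadow ray per surviving light
            visible += contribution;
    }
    std::printf("lighting after culling + shadow rays: %.2f\n", visible);
}
```

In a real renderer the visibility query would be the ray-traced any-hit test; the point is just that the ray count scales with the culled light set, whereas shadow-map generation scales with every shadow-casting light in the scene.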
 

The problem is that with shadow maps, you basically have to render a depth cubemap (if you want the light to cast shadows in all directions - sunlight doesn't need this) for every single light source. There really just isn't any good way to do this while maintaining both quality and speed, not to mention memory consumption.

Ray tracing has the opposite problem in that for a given pixel, you must cast a ray toward every light source that might cast a shadow there. Though at least for this case, you can do culling and only look for shadows from a small set of nearby or very bright lights.

Shadow maps have no such culling. Yes, you can avoid shadow map lookups for distant lights when shading, but you still have to generate the darn things for every single light that can cast shadows anywhere in the visible scene.

At some level of detail, ray tracing passes shadow maps in terms of performance.
And that's without taking softness into consideration.
 
Shadow maps have no such culling.

Generally, but there is nothing really stopping you with irregular z-buffers ... just don't transform those pixels to the light's point of view.

Irregular z-buffer rendering is just a form of parallel raytracing and, in ideal circumstances, a more optimal algorithm for parallel or diverging rays than more traditional forms of parallel raytracing (you throw the BVH and tris at the rays, instead of throwing the rays at the BVH and tris). The main limitation is the lack of fine-grained control over the command buffer to allow GPU-side occlusion culling (and on-the-fly animation and tessellation).
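For anyone unfamiliar with the technique, here's a tiny CPU-side sketch of the idea (all types and numbers are my own illustration): the receiver samples are the visible pixels projected into light space and binned into a grid wherever they happen to land, and the occluder triangles are then rasterised against those irregular sample positions instead of against a regular depth map.

```cpp
// Sketch of the irregular z-buffer: bin receiver samples in light space, then
// "throw the triangles at the rays" by testing each occluder against the samples.
#include <cstdio>
#include <vector>

struct Vec2   { float x, y; };
struct Sample { Vec2 lightUV; float lightDepth; bool inShadow; };
struct Tri    { Vec2 a, b, c; float depth; };   // occluder already projected into light space

// Signed area test used for point-in-triangle coverage.
static float edgeFn(const Vec2& p, const Vec2& q, const Vec2& r) {
    return (q.x - p.x) * (r.y - p.y) - (q.y - p.y) * (r.x - p.x);
}

static bool covers(const Tri& t, const Vec2& p) {
    float e0 = edgeFn(t.a, t.b, p), e1 = edgeFn(t.b, t.c, p), e2 = edgeFn(t.c, t.a, p);
    return (e0 >= 0 && e1 >= 0 && e2 >= 0) || (e0 <= 0 && e1 <= 0 && e2 <= 0);
}

int main() {
    const int GRID = 8;                                    // light-space binning grid
    std::vector<std::vector<int>> cells(GRID * GRID);

    // Receiver samples: in a real renderer these are the visible pixels already
    // transformed into the light's view (position in [0,1]^2 plus depth from the light).
    std::vector<Sample> samples = {
        {{0.20f, 0.30f}, 0.8f, false},
        {{0.55f, 0.52f}, 0.9f, false},
        {{0.80f, 0.75f}, 0.6f, false},
    };
    for (int i = 0; i < (int)samples.size(); ++i) {
        int cx = (int)(samples[i].lightUV.x * GRID);
        int cy = (int)(samples[i].lightUV.y * GRID);
        cells[cy * GRID + cx].push_back(i);                // "irregular": samples sit wherever they project
    }

    // Rasterise each occluder over the grid and depth-test only the stored samples.
    std::vector<Tri> occluders = { {{0.1f, 0.1f}, {0.7f, 0.2f}, {0.4f, 0.7f}, 0.5f} };
    for (const Tri& t : occluders)
        for (int cell = 0; cell < GRID * GRID; ++cell)     // a real version visits only covered cells
            for (int idx : cells[cell]) {
                Sample& s = samples[idx];
                if (covers(t, s.lightUV) && t.depth < s.lightDepth)
                    s.inShadow = true;                     // occluder sits between the light and the sample
            }

    for (const Sample& s : samples)
        std::printf("sample (%.2f, %.2f): %s\n", s.lightUV.x, s.lightUV.y,
                    s.inShadow ? "shadowed" : "lit");
}
```

A real implementation walks only the grid cells a triangle actually covers and runs on the GPU, but the inversion is the same: triangles get thrown at the (ray-like) samples rather than rays at the triangles.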
 
More Shadow of the Tomb Raider RTX comparisons; the shots with more shadows are RTX On.


https://i.postimg.cc/ZqQPwYM6/Webp-net-gifmaker.gif
https://i.postimg.cc/VL6VL4Fz/Webp-net-gifmaker-2.gif
https://i.postimg.cc/NjgxMzX2/Webp-net-gifmaker-1.gif
https://i.postimg.cc/BbD8jf92/Webp-net-gifmaker-2.gif
https://i.postimg.cc/YCt4CPXm/Webp-net-gifmaker-3.gif
https://i.postimg.cc/QC32Wc5t/Webp-net-gifmaker-2.gif
https://i.postimg.cc/j2yJMYLc/Webp-net-gifmaker-5.gif
https://i.postimg.cc/QMxW0z1r/Webp-net-gifmaker-7.gif
https://i.postimg.cc/c1Vw0XyQ/Webp-net-gifmaker-8.gif
On the other hand you get these horrible "translucent shadows" which look like they could be from some PS1-era game when there's overlap: http://images.nvidia.com/geforce-co...traced-translucent-shadows-003-on-vs-off.html
 
On the other hand you get these horrible "translucent shadows" which look like they could be from some PS1-era game when there's overlap: http://images.nvidia.com/geforce-co...traced-translucent-shadows-003-on-vs-off.html
I found that how good they look depends very much on the actual asset... and partly on the time-of-day setting, distance, and type of light hitting the asset. Since the shadows attenuate across the umbra/penumbra, they honestly look a bit awkward in moments where there is scarcely any difference between umbra and penumbra - just hard, overlapping, simplistic transparency shadows (a rough penumbra estimate after the screenshots below shows why). Maybe that scene has some strange time-of-day setting making them hard-edged? Other scenes they look quite great:

Shadow maps:
shadowmapsetkm1.jpg


Ultra transparency shadows:
transparency3u0j7r.jpg


Shadow maps:
shadowmaps21pj87.jpg


Ultra transparency shadows:
transparency1akjj3.jpg


Shadow maps:
shadowmaps346k4s.jpg


Ultra transparency shadows:
transparency4snks7.jpg


Once again - for every one instance where something can look less than perfect, there are many more where it does in fact look quite a bit better.
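On the umbra/penumbra point above, a quick similar-triangles estimate (a simple area-light model with numbers picked purely for illustration) shows why the occluder-to-receiver gap decides whether a shadow reads as hard or soft:

```cpp
// Penumbra width from similar triangles: lightDiameter * (receiverDist - occluderDist) / occluderDist.
#include <cstdio>
#include <initializer_list>

int main() {
    const double lightDiameter = 0.5;     // hypothetical area-light size in metres
    const double occluderDist  = 10.0;    // distance from the light to the occluder
    for (double gap : {0.05, 0.5, 2.0, 10.0}) {   // occluder -> receiver distance
        double receiverDist = occluderDist + gap;
        double penumbra = lightDiameter * (receiverDist - occluderDist) / occluderDist;
        std::printf("gap %5.2f m -> penumbra width %.3f m\n", gap, penumbra);
    }
}
```

With the sun the same relation holds via its angular size - roughly a centimetre of penumbra per metre of gap - so contact shadows stay razor sharp while distant ones blur out.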
 
Again, that's properly next-gen.
On the other hand you get these horrible "translucent shadows" which look like they could be from some PS1-era game when there's overlap: http://images.nvidia.com/geforce-co...traced-translucent-shadows-003-on-vs-off.html
The gains vastly outweigh the losses. Object composition is significantly more grounded all round. The proper shadowing in this example fixes the 'gamey' look, which is caused by inconsistent lighting breaking expectations.
https://i.postimg.cc/QMxW0z1r/Webp-net-gifmaker-7.gif
 
Perhaps it would be less of a problem once devs keep those errant cases in mind, although... more dev time needed then?

Bug Number: “remove offending objects entirely” :V
For a bolt-on? After the title shipped? It's pretty nuts they went as far as they did, I suppose. Once games are built from the ground up with DXR in mind, the discussion is going to be interesting.

On a related topic: BioWare and Frostbite indicated it took 24 hours to bake lighting. If they focus on building everything with DXR and only bake everything at the end, I think there are some cycles that can be saved. It has to be better than baking, then later changing things and baking again, etc.
 
On a related topic: BioWare and Frostbite indicated it took 24 hours to bake lighting. If they focus on building everything with DXR and only bake everything at the end, I think there are some cycles that can be saved. It has to be better than baking, then later changing things and baking again, etc.

It's not a little amusing that baking time was something that also plagued Destiny's iteration times (in the context of BioWare supposedly not even analyzing Destiny. >_> )

:|
 
If 24 hours is too long to bake lighting for a semi-open-world map, then why wasn't that identified early, and why wasn't that part of the tooling optimized? Baking lights is typically an hours-long process, from what I understand. It can't be the case that the tools can't be optimized. If baking lights takes astronomically longer than industry standards from Unreal or other studio engines, then why not fix it early, when your design concept revolves around a large planet that could be explored (the original concept)? If I paid someone to build me a house and it ended up being a huge piece of shit, and I called out the contractors for the poor job they did, their explanation better not be that they didn't have the right tools.
 
That's really impressive; it is indeed next-gen. It's resource hungry though - fps from 60+ down to 30+ in your case. What GPU are you running?
Can't wait for games built with DXR from the ground up!
 
NVIDIA gave PCGH an updated version of their Star Wars demo that uses DX12 and runs on Pascal too. PCGH tested Pascal GPUs @1440p native versus Turing GPUs @2160p/DLSS, which means Turing is at a moderate disadvantage here, as DLSS still incurs a performance cost even though it actually renders @1440p.

TitanXP: 11 fps
Titan RTX: 36 fps

1080Ti: 11 fps
2060: 17 fps
2070: 21 fps
2080: 27 fps
2080Ti: 35 fps

http://www.pcgameshardware.de/Nvidi...pecials/Pascal-Raytracing-Benchmarks-1278592/
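For a sense of scale on that resolution gap, here's the raw pixel arithmetic (assuming, as stated above, that DLSS's internal render resolution at a 2160p output is 1440p):

```cpp
// Raw pixel counts behind the comparison above.
#include <cstdio>

int main() {
    const long long px1440 = 2560LL * 1440;   // rendered natively by Pascal, and internally by Turing+DLSS
    const long long px2160 = 3840LL * 2160;   // the output resolution Turing reconstructs via DLSS
    std::printf("1440p: %lld pixels\n", px1440);                      // 3,686,400
    std::printf("2160p: %lld pixels\n", px2160);                      // 8,294,400
    std::printf("ratio: %.2fx\n", (double)px2160 / (double)px1440);   // ~2.25x
}
```

Both sides shade the same 1440p pixel count, so the extra work on the Turing side is the DLSS pass that reconstructs the remaining ~2.25x pixels - the "moderate disadvantage" mentioned above.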
 