Next gen lighting technologies - voxelised, traced, and everything else *spawn*

If they are in fast motion, there isn't enough duration to capture the shadows.
Not plausible; temporal accumulation spans only a split second.
It's just that most lighting in this game isn't from actual light sources, but pure ambient. So in most scenes, there simply isn't a light source to cast a shadow in the first place.

Also, the occasional grain from the non-denoised bits is oddly endearing.
Right, surfaces with noisy normals don't interpolate properly. What @JoeJ already predicted for high-frequency geometry also applies to plain normal mapping.
 
Not plausible; temporal accumulation spans only a split second.
It's just that most lighting in this game isn't from actual light sources, but pure ambient. So in most scenes, there simply isn't a light source to cast a shadow in the first place.

Yeah, but that's just one more factor - likely some smaller soft and contact shadows get washed away by the filter too.

Reminds me that I argued a lot about this with another dev today. He was not impressed and marked a lot of 'errors' in screenshots.
The better the tech becomes, the harder it is to classify results as right or wrong and pin down the reasons... but that's progress as well.
 
It's just that most lighting in this game isn't from actual light sources, but pure ambient. So in most scenes, there simply isn't a light source to cast a shadow in the first place.
Yeah, I can testify to that. If the enemy is close to a light source his shadow is present, even if he is far away from the camera.
 
I guess there is a console command to turn off textures. Maybe it still works in this demo, and only the lighting would remain visible.
 
Reminds me that I argued a lot about this with another dev today. He was not impressed and marked a lot of 'errors' in screenshots.
Yeah, it's not at all consistent. When it's good, it's good, but there are lots of flat-shaded and disconnected situations. If, as Ext3h suggests, a lot of the lighting is just 'ambient', then the lighting would break. You can't get a good, solid GI without at least secondary bounces. Ideally a path-traced game will do away with ambient light and have light sources. That's going to be what makes the difference between RT games and what we have now.
 
as Ext3h suggests, a lot of the lighting is just 'ambient', then the lighting would break. You can't get a good, solid GI without at least secondary bounces.
The demo version is mostly corridors with lots of neon lamps, but there is an outside area devoid of artificial light sources, relying on ambient lighting from a distant skybox. This is the area with the ramp.

I noticed that at 1440p the far shadows in the corridors are really thin and soft, and while moving, the denoising washes large parts of them away.
 
You can't get a good, solid GI without at least secondary bounces.
For interiors you really need at least 5, but often 10. This is really the argument for surface caching to get infinite bounces for free. But caching adds a spatial discretization.
It is also why 'classic' path tracing alone still sounds like a very bad idea to me for games. We need to combine multiple techniques. The promise that RT would make things easier will not hold.
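As a toy illustration of why interiors need so many bounces (assuming a single uniform albedo for every surface in the scene, which is a simplification for illustration only): each bounce multiplies the remaining energy by the albedo, so the indirect light captured after n bounces is a truncated geometric series.

```python
# Toy model: fraction of total indirect energy captured after n bounces,
# assuming one uniform surface albedo for the whole scene (an assumption
# for illustration, not how any real renderer behaves).
def captured_fraction(albedo: float, bounces: int) -> float:
    total = albedo / (1.0 - albedo)  # infinite series: a + a^2 + a^3 + ...
    partial = sum(albedo ** i for i in range(1, bounces + 1))
    return partial / total           # simplifies to 1 - albedo**bounces

# For a bright interior (albedo ~0.7), 2 bounces capture only about half
# of the indirect energy, while 10 bounces capture over 97%.
print(captured_fraction(0.7, 2))   # 0.51
print(captured_fraction(0.7, 10))  # ~0.97
```

Darker scenes converge in fewer bounces, which is one reason the 5-vs-10 figure depends on the environment.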

The demo version is mostly corridors with lots of neon lamps, but there is an outside area devoid of artificial light sources, relying on ambient lighting from a distant skybox.
The game is quite bright for just one bounce - maybe they add a constant ambient term as well? Also, looking at indirect light from explosions, I'm not sure they don't use 2 bounces. (Or they use an extra bounce for explosion lights, which would explain the mentioned perf. drop.)
 
It’s understandable not to be impressed at first that a 20-year-old game is struggling to hit 60 FPS on the latest hardware, but as Schied highlights, the limiting factor of path tracing is not primarily ray tracing or geometric complexity. Instead, it’s the number of indirect light scattering computations and light sources that accounts for the high computational cost.

“Quake II was already designed with many light sources when it was first released, in that sense it is still quite a modern game. Also, the number of light scattering events does not depend on scene complexity.”
https://www.techspot.com/news/78346-quake-ii-looks-better-than-ever-ray-tracing.html
 
Question regarding the number of rays required in games. Imagination Tech stated that for a fully ray-traced game you would need to budget between 3-3.5 GRays at 1080p, 60 fps (counter 27:43). Based on some statements in the thread, it sounds like that would be too few, and I wondered why their lower limit might not be an accurate approximation for use in games.
RT discussion begins at counter 18:00
 
Question regarding the number of rays required in games. Imagination Tech stated that for a fully ray-traced game you would need to budget between 3-3.5 GRays at 1080p, 60 fps (counter 27:43). Based on some statements in the thread, it sounds like that would be too few, and I wondered why their lower limit might not be an accurate approximation for use in games.
If I'm right, that's roughly 25 rays per pixel per frame? Much more than we see now, with BFV using less than one and Q2 using 4. But still nothing in comparison to offline CGI.
Seems they just sum up the numbers given in their chart, which is an example that does not match any of the current approaches using RTX very well, but it depends on your goals. (They would sacrifice most performance to support lens effects like DOF?)
But in any case, that question is always 'What can we do with the hardware as is?', not 'How much performance is required to be useful?', so I could only dodge if you asked me what's necessary.

I like how they show how the HW works. I really like it! :)

When i saw this video:

... I wondered how they do denoising. Is this what the 'frame accumulator' is good for, a kind of naive HW solution? I think in 2014 the idea of real-time denoising did not really exist yet. (An example of software progress that would not have happened with fixed-function HW? ;) )
In that case their stated number of necessary rays would no longer be accurate now, considering how many rays clever denoising can save.

The game is quite bright for just one bounce - maybe they add a constant ambient term as well?
Nope - totally dark in the other video. I was confused by skylight.
 
... I wondered how they do denoising. Is this what the 'frame accumulator' is good for, a kind of naive HW solution? I think in 2014 the idea of real-time denoising did not really exist yet. (An example of software progress that would not have happened with fixed-function HW? ;) )
In that case their stated number of necessary rays would no longer be accurate now, considering how many rays clever denoising can save.
PowerVR's biggest problem with RT (which they started already in 2010) was the fact that they had mobile chips, which bring their own limitations.
 
Question regarding the number of rays required in games. Imagination Tech stated that for a fully ray-traced game you would need to budget between 3-3.5 GRays at 1080p, 60 fps (counter 27:43). Based on some statements in the thread, it sounds like that would be too few, and I wondered why their lower limit might not be an accurate approximation for use in games.
RT discussion begins at counter 18:00
On a more serious note, what are they tracing for lights? Their demos look like one sunlight source, so single-ray-per-pixel lighting. Once you hit area lights, you need to cast multiple rays per light source, the more the better for quality. 3 light sources at 5 samples per pixel is 15 light rays per pixel. Add in true GI, so trace all 15 rays for every sampled surface for bounce lighting, and you're up to 225 rays per pixel with quite a lot of noise...

The number of rays you need is thus determined by what quality you are after, how many light sources there are, and what shortcuts and optimisations you can make. Still, 1080p60 is about 124 million pixels per second, so 3-3.5 gigarays is roughly 24-28 rays per pixel, not an insignificant number. Brute-force tracing everything just needs more rays than is feasible. The only sane option is caching results, tracing against the cache, and generating updates.
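The back-of-the-envelope numbers above are easy to check (figures taken from this thread, not from Imagination's slides):

```python
# Rays per pixel implied by a 3-3.5 GRays/s budget at 1080p60.
pixels_per_second = 1920 * 1080 * 60  # 124,416,000
for grays_per_second in (3.0e9, 3.5e9):
    rpp = grays_per_second / pixels_per_second
    print(f"{grays_per_second / 1e9:.1f} GRays/s -> {rpp:.1f} rays/pixel")
# -> 24.1 and 28.1 rays per pixel

# The area-light sampling cost sketched above:
lights, samples_per_light = 3, 5
direct_rays = lights * samples_per_light  # 15 shadow rays per pixel
bounce_rays = direct_rays * direct_rays   # re-sample all lights at each of the 15 bounce hits
print(direct_rays, bounce_rays)           # 15 225
```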
 
Would love to see results with longer ray paths.

It seems this would be very easy to do:
In this shader: https://github.com/cschied/q2vkpt/blob/master/src/refresh/vkpt/shader/path_tracer.h

change
#define NUM_BOUNCES 2
to a higher number

For non-programmers (assuming the project compiles out of the box without issues), you'd likely need to install Microsoft Visual Studio and the Vulkan SDK, download the project from Git, make the change, compile in release mode, figure out where the exe is, and replace it.
... but likely it's harder than it sounds.
Ooops, no, dumb me - shaders have their own compiler, glslang.exe, which comes with the VK SDK. There might be a bat file which executes it to recompile all shaders to *.spv, and maybe replacing those files in the demo dir would work then... better NOT to waste your time if you're not a programmer :|


In the same file there seems to be the code that stores albedo in a screen buffer. I would change this to a constant grey to enjoy the beauty of GI without distracting textures :)
 
3 light sources at 5 samples per pixel is 15 light rays per pixel.
It would be possible to pick only one light per frame randomly and accumulate over time.
I blindly assumed this is the case in the Q2 demo, but wanted to be sure, so I checked the above file. I can't figure it out quickly, though. If they iterate over all lights in the list, there would be many more rays per pixel than 4(!)
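A minimal sketch of that one-light-per-frame idea (hypothetical names, not taken from the actual Q2VKPT shaders): sample one light uniformly at random, multiply by the light count to compensate for the sampling probability, and blend the result into a running per-pixel average.

```python
import random

def sample_one_light(lights, contribution):
    # Pick one light uniformly at random; multiplying by len(lights)
    # (i.e. dividing by the pdf 1/len) keeps the estimator unbiased.
    light = random.choice(lights)
    return contribution(light) * len(lights)

def accumulate(history, sample, frame_count):
    # Running average over frames - the temporal half of the denoiser.
    return history + (sample - history) / frame_count

# Over many frames this converges to the sum over all lights,
# at the cost of one shadow ray per frame instead of one per light.
lights = [0.5, 1.0, 2.5]
estimate = 0.0
for frame in range(1, 10001):
    estimate = accumulate(estimate, sample_one_light(lights, lambda l: l), frame)
print(estimate)  # converges toward 4.0
```

In a real renderer the running average is kept per pixel and reprojected under camera motion, which is exactly where the ghosting/washing artifacts discussed above come from.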
 
It seems this would be very easy to do:
In this shader: https://github.com/cschied/q2vkpt/blob/master/src/refresh/vkpt/shader/path_tracer.h

change
#define NUM_BOUNCES 2
to a higher number

For non-programmers (assuming the project compiles out of the box without issues), you'd likely need to install Microsoft Visual Studio and the Vulkan SDK, download the project from Git, make the change, compile in release mode, figure out where the exe is, and replace it.
... but likely it's harder than it sounds.
Ooops, no, dumb me - shaders have their own compiler, glslang.exe, which comes with the VK SDK. There might be a bat file which executes it to recompile all shaders to *.spv, and maybe replacing those files in the demo dir would work then... better NOT to waste your time if you're not a programmer :|


In the same file there seems to be the code that stores albedo in a screen buffer. I would change this to a constant grey to enjoy the beauty of GI without distracting textures :)
Yup, not a programmer, and currently no proper computer. (Old laptop with a DX10-class GPU...)

I do love tweaking and testing things when it's relatively/very easy. (UE4 overscan screen-space reflections etc.)
 