Recently I've been doing a few experiments with indirect lighting.
Initially, this started as a per-vertex occlusion baker.
To compute the occlusion I'm using some rather funky projection magic: getting the GPU to render a view from the surface of each triangle (with a near-180-degree FOV), where the background is 1 and all geometry is 0, then taking the properly weighted average to get the linear occlusion term for that triangle.
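The post doesn't give the exact projection or weighting used, but the weighted average can be sketched like this: assuming a single wide-FOV perspective render straight along the triangle normal, each texel gets a weight combining its solid angle (which shrinks towards the edges of a planar projection) with the diffuse cosine term. Everything here other than the 0/1 visibility convention is my assumption.

```python
import numpy as np

def bake_occlusion(visibility, fov_deg=170.0):
    """Weighted average of one wide-FOV visibility render.

    visibility: 2D array, 1.0 where the background is visible,
    0.0 where geometry blocks it (matching the render described above).
    """
    h, w = visibility.shape
    t = np.tan(np.radians(fov_deg) / 2.0)
    # Direction through each texel centre, on an image plane at z = 1
    ys, xs = np.mgrid[0:h, 0:w]
    x = (2.0 * (xs + 0.5) / w - 1.0) * t
    y = (2.0 * (ys + 0.5) / h - 1.0) * t
    # cos of the angle between the texel direction and the surface normal
    cos_a = 1.0 / np.sqrt(1.0 + x * x + y * y)
    # Texel solid angle falls off as cos^3 on a planar projection;
    # multiply by one more cos for the diffuse (Lambert) weighting
    weight = cos_a ** 4
    return float((visibility * weight).sum() / weight.sum())
```

A fully unoccluded render averages to 1.0 and a fully blocked one to 0.0, matching the linear occlusion term described above.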
This was fairly simple, but worked surprisingly well; so I started to experiment.
Starting with an occlusion value for linear light...
An easy extension was to generate the average direction of this incoming light. This is a worldspace vector, but the interesting thing is that the length of the vector represents how 'diffuse' the incoming light is. If properly averaged, a length of 1 represents light coming from a single direction, while a length of 2/3 represents light coming evenly over the entire hemisphere.
This vector isn't especially useful during normal rendering (perhaps ambient specular modification?).
However, it got me thinking about reusing this data in a second pass;
So more experiments!
I set up a second pass to render the per-triangle step again - this time the projection wouldn't render a black model on a white background; it'd render the results of the first pass (average incoming light direction) and the albedo of the model. This is (once again) averaged up.
Before you know it, I had the average direction, colour, intensity and 'diffuseness' of 1st pass bounce lighting for that triangle.
So I did what you naturally do next; I reversed the projection. I did the exact same thing for 'subsurface' light. Applying an exponential distance falloff, I then had a nice approximation to subsurface scattering.
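The falloff constant isn't given in the post, so here's a hedged sketch of what "exponential distance falloff" could look like on the gather side: each sample seen through the reversed projection is weighted by exp(-d / falloff) before averaging, where `falloff` is a hypothetical scatter distance in model units.

```python
import numpy as np

def scatter_weighted_average(colours, directions, distances, falloff=0.1):
    """Average gathered back-face samples with exponential distance falloff.

    colours: (n, 3) sample colours; directions: (n, 3) sample directions;
    distances: (n,) distances through the surface; falloff is a
    hypothetical scatter distance (the post doesn't give the constant).
    """
    w = np.exp(-np.asarray(distances, dtype=float) / falloff)[:, None]
    avg_colour = (np.asarray(colours) * w).sum(axis=0) / w.sum()
    avg_dir = (np.asarray(directions) * w).sum(axis=0) / w.sum()
    return avg_colour, avg_dir
```

Nearby samples dominate and distant ones contribute almost nothing, which is roughly the behaviour you want from a cheap subsurface approximation.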
So, basically; I've been messing around with a tool that generates per-vertex linear occlusion, indirect bounce lighting colour+direction and scatter colour+direction for a model. And it works surprisingly well, and is really cheap at runtime (but rather slow to compute).
[Image: the ambient occlusion term + the indirect terms]
It comes down to 15 bytes of data per vertex;
2x sbyte4 direction vectors (xyzw)
2x byte3 colour values
1x linear occlusion
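The exact encoding isn't given in the post; a minimal sketch assuming plain snorm/unorm byte quantisation (2×4 + 2×3 + 1 = 15 bytes), with the direction float4s in [-1, 1] and the colours and occlusion in [0, 1]:

```python
import numpy as np

def sbyte4(v):
    # Signed-normalised: one signed byte per component, [-1, 1] -> [-127, 127]
    return np.clip(np.round(np.asarray(v) * 127.0), -127, 127).astype(np.int8).tobytes()

def byte3(v):
    # Unsigned-normalised: [0, 1] -> [0, 255]
    return np.clip(np.round(np.asarray(v) * 255.0), 0, 255).astype(np.uint8).tobytes()

def pack_vertex(indirect_dir, indirect_col, scatter_dir, scatter_col, occlusion):
    """Pack one vertex's bake results into the 15-byte layout above."""
    occ = int(np.clip(round(occlusion * 255.0), 0, 255))
    data = (sbyte4(indirect_dir) + sbyte4(scatter_dir)
            + byte3(indirect_col) + byte3(scatter_col) + bytes([occ]))
    assert len(data) == 15
    return data
```

The GPU's snorm/unorm vertex formats would decode these back to floats for free in the vertex shader.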
And the runtime cost is bugger all;
Something like this in a vertex program:
Code:
float3 gComputeIndirectLight(float4 indirectDirection, float3 indirectColour,
                             float4 scatterDirection, float3 scatterColour,
                             float3 lightDirection, float3 lightColour)
{
    // The w components of the baked directions act as a constant bias
    // term via the dot with float4(lightDirection, 1)
    float3 light = max(dot(indirectDirection, float4(lightDirection, 1)), 0) * indirectColour;
    light += max(dot(scatterDirection, float4(lightDirection, 1)), 0) * scatterColour;
    return light * lightColour;
}
Basically, I've found it's more costly getting the various directions into the same space than actually computing the contribution.
Pretty simple really. While it's certainly nowhere near accurate, it's 'good enough' because it's so subtle. Any errors (and there are many) are hidden by the general ambient / direct lighting intensity (provided you are gamma correcting / tone mapping - and actually have ambient light).
It's good enough that even dramatic animation doesn't really matter much. It still looks plausible.
I'm currently working on sorting out a proper computation tool. (Instead of the hacked-up mess I have now.)
Anywho. Thought I'd share