I have been tuning tone mapping for our forthcoming game lately. It's the first game I have developed that has real tone mapping. In earlier games we just used 2x or 4x dynamic range to get better bloom quality, plus a very simple linear tone mapping (to get some kind of eye-iris simulation).
I have encountered some problems with the tonemapping algorithm I use, mostly color desaturation and losing the contrast of the dark and bright areas.
1. I calculate the average screen brightness like this:
For each pixel:
pixel.brightness = log(pixel.r * 0.2125 + pixel.g * 0.7154 + pixel.b * 0.0721);
Calculate average by downsampling (blending 4x4 pixels together) until the final result is one pixel:
averageBrightness = exp(pixel1x1Brightness);
2. Iris closing and opening speed is set to 1% per frame (at 60 fps):
interpolatedBrightness = averageBrightness * 0.01 + interpolatedBrightness * 0.99;
3. The tonemap multiplier is calculated like this:
middleGray = 0.12;
tonemapMultiplier = middleGray / (interpolatedBrightness + 0.001);
4. Final pixel color is tonemapped like this:
pixelColor.rgb *= tonemapMultiplier;
pixelColor.rgb /= (1.0 + pixelColor.rgb);
The "pixelColor.rgb /= (1.0 + pixelColor.rgb)" line compresses the color into [0,1) range. With this operation black stays black, pure white (1.0) becomes 0.5, double white (2.0) becomes 0.667, triple white (3.0) becomes 0.75, and infinitely bright pixels approach 1.0.
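Steps 3 and 4 together can be sketched per channel like this (scalar for clarity; the middle-gray key and epsilon follow the snippets above), which reproduces the values listed:

```c
/* Reinhard-style range compression: black stays black, 1.0 -> 0.5,
 * 2.0 -> 2/3, 3.0 -> 0.75, and infinity approaches 1.0. */
static float reinhard(float c)
{
    return c / (1.0f + c);
}

/* Exposure scale from the adapted brightness, then compression. */
static float tonemap_channel(float c, float interpolated_brightness)
{
    const float middle_gray = 0.12f;
    float multiplier = middle_gray / (interpolated_brightness + 0.001f);
    return reinhard(c * multiplier);
}
```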
The good thing about this conversion is that no color values are clamped. However, the contrast and color saturation of the image are reduced, and the full bit range is not used optimally (bright values are rarely reached). The resulting image looks pretty flat compared to the non-tonemapped and linear tonemapped (clamped) versions.
I have noticed that this kind of color compression to [0,1] range is pretty much standard (in all the whitepapers I have read about tone mapping). Is there any better way to do the range compression? Or should I maybe implement some kind of contrast and saturation enhancement filters to combat this effect? What kind of solutions have you implemented in your games and applications?