Is all you need for HDR a floating point rendering buffer?

You need a fairly hefty piece of hardware to handle true HDR rendering at respectable frame rates...

For a 1024x768 display, you can easily chew up ~20MB of video memory just on the render targets needed to produce the image you see on screen... add your geometry and textures on top of that and you're quickly going to max out a 128MB card.
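Back-of-the-envelope (assuming 128-bit floating-point targets and the buffer chain listed below):

```cpp
#include <cstdio>

int main() {
    const double MB = 1024.0 * 1024.0;
    const double pixels = 1024.0 * 768.0;

    double ldr   = pixels * 4 * 2;        // 32-bit back buffer, double-buffered
    double hdr   = pixels * 16;           // full-size 128-bit (FP32 RGBA) target
    double bloom = (pixels / 4) * 16;     // quarter-size 128-bit target (512x384)

    printf("render targets alone: ~%.0f MB\n", (ldr + hdr + bloom) / MB);
    // ~21 MB with 128-bit targets; roughly half that if you drop to 64-bit (FP16)
    return 0;
}
```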

Also, it's not as simple as just switching the frame buffer to "HDR mode" - the final image is still LDR, it's just had an HDR image tone-mapped down to LDR (often with some post-processing lens effects on the way).

- Take your original double-buffered frame buffer, 1024x768 at 32 bits per pixel.
- Now create another floating-point render target at 128-bit or 64-bit per pixel.
- Now create another floating-point render target at 1/4 the size (e.g. 512x384) for the bloom pass - creation sketch below.
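Something along these lines with D3D9 (assuming you already have a valid `IDirect3DDevice9*`; check the device caps first, since floating-point render target support varies by card):

```cpp
#include <d3d9.h>

// Creates the two extra floating-point targets described above.
// 'device' is assumed to be an already-initialised IDirect3DDevice9.
bool CreateHdrTargets(IDirect3DDevice9* device,
                      IDirect3DTexture9** hdrFull,
                      IDirect3DTexture9** hdrQuarter)
{
    // Full-size HDR target: FP16 RGBA (64-bit). Use D3DFMT_A32B32G32R32F for 128-bit.
    if (FAILED(device->CreateTexture(1024, 768, 1, D3DUSAGE_RENDERTARGET,
                                     D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT,
                                     hdrFull, NULL)))
        return false;

    // Quarter-size target for the bloom/star pass.
    if (FAILED(device->CreateTexture(512, 384, 1, D3DUSAGE_RENDERTARGET,
                                     D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT,
                                     hdrQuarter, NULL)))
        return false;

    return true;
}
```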

Render the entire scene to the HDR target, using bright light sources and nice shiny stuff that looks really cool.

Now downsample the HDR values into the 1/4-sized texture, and filter it with your bloom/star effect (read: this needs a couple of pixel shader passes).
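The real thing runs as pixel shader passes over the quarter-size target; this CPU-side C++ sketch just shows the idea (bright pass, then a separable blur), with made-up threshold and kernel values:

```cpp
#include <vector>
#include <cmath>
#include <algorithm>

// Illustration only: operates on one float channel of the quarter-size image.
void BrightPassAndBlur(std::vector<float>& img, int w, int h, float threshold)
{
    // 1) Bright pass: keep only the energy above the threshold.
    for (float& v : img)
        v = std::max(0.0f, v - threshold);

    // 2) Separable Gaussian-ish blur, horizontal then vertical (5-tap kernel).
    const float k[5] = { 0.0625f, 0.25f, 0.375f, 0.25f, 0.0625f };
    std::vector<float> tmp(img.size());

    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f;
            for (int t = -2; t <= 2; ++t)
                sum += k[t + 2] * img[y * w + std::clamp(x + t, 0, w - 1)];
            tmp[y * w + x] = sum;
        }

    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f;
            for (int t = -2; t <= 2; ++t)
                sum += k[t + 2] * tmp[std::clamp(y + t, 0, h - 1) * w + x];
            img[y * w + x] = sum;
        }
}
```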

Add the 1/4-sized texture back onto the full-size HDR texture, then sample its luminance to get a general idea of the scene's overall brightness.
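That "general idea of brightness" is usually a log-average luminance (on the GPU it's a chain of downsampling passes, but the maths is just this):

```cpp
#include <cmath>
#include <cstddef>

// Log-average luminance of an RGB float image (Reinhard-style).
// 'rgb' is pixelCount*3 floats; delta avoids log(0) on black pixels.
float LogAverageLuminance(const float* rgb, std::size_t pixelCount, float delta = 1e-4f)
{
    double sum = 0.0;
    for (std::size_t i = 0; i < pixelCount; ++i) {
        const float* p = &rgb[i * 3];
        float lum = 0.2126f * p[0] + 0.7152f * p[1] + 0.0722f * p[2];
        sum += std::log(delta + lum);
    }
    return std::exp(float(sum / pixelCount));
}
```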

Then tone-map: write the HDR values, divided down by that luminance, into the original LDR back buffer.
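Something along these lines, Reinhard-style (the key value and the L/(1+L) compression curve here are just one common choice, not the only way to do it):

```cpp
#include <cmath>

// Map one HDR luminance value into [0,1).
// avgLum comes from LogAverageLuminance above; 'key' (~0.18) sets overall exposure.
float ToneMap(float hdrLum, float avgLum, float key = 0.18f)
{
    float scaled = hdrLum * key / avgLum;   // expose relative to scene brightness
    return scaled / (1.0f + scaled);        // compress into [0,1) for the LDR target
}
```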

Present it to the screen...

You might find the following couple of articles (esp. the diagrams :D) interesting:
High Dynamic Range Rendering [Anirudh S Shastry]
Description of HDRLighting demo from the MS-DX9 SDK

Bottom line - conceptually simple, but lots of data, lots of operations, and lots of GPU power needed.

hth
Jack
 