Some new interesting information about valve's HDR implementation

The starting point was that there's no need to simulate the awful 5/6 stops dynamic range of low-end digicams in the HDR effects of games, when, for a variety of reasons, the average person is already familiar with a far more temperate rendering of real-world images by prints/monitors/TV screens.

The non-linearity in the lowest and highest zones of negative film (the toe and shoulder of the s-curve, as it were) is a pleasing-looking compromise between the real world and a print.

The HDR techniques in games seem to ignore decades of photography and the concept of a pleasing look. That's what gets me upset.

Martin Parr's colour is fun, but I really don't want games to take on the same aesthetic.

http://www.martinparr.com/


Jawed
 
wireframe said:
PS. What are the odds of the rest of us getting our greedy little hands on your demo?

Thanks for the positive comments :smile:

As for a demo, two things need to change first: I need to finish it ;), and VS2005 needs to be released, as you are not allowed to redistribute code made with the beta.

Neither is happening all that soon, unfortunately (cursed work!)
 
Perhaps I'm daft or it's late on a Friday afternoon and my brain is dead, but I couldn't find where they stated which drivers they used.

Anyone know?

Thanks.
 
Anandtech has an article on Valve's implementation of HDR today.

Some interesting things are mentioned.

Reading between the lines, it would seem that they never render to a floating point/HDR render target; they simply tack tone-mapping onto the end of all their pixel shaders, then do a post-process on the rendered output to get the next frame's tone-mapping value. Maybe I've interpreted this wrong, but that's what I got. It makes sense, but ultimately it's cheating.
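As a rough illustration of that reading (all names and the exposure-adjustment rule below are my own guesses, not Valve's code):

```python
# Hypothetical sketch: each pixel shader tone-maps its HDR result into
# [0,1] before writing to an ordinary low-precision render target, and a
# post-process over that LDR output drives the exposure used on the NEXT
# frame. None of this is Valve's actual code.

def tonemap(hdr_value, exposure):
    """Scale by exposure, then clamp into the displayable [0, 1] range."""
    return min(max(hdr_value * exposure, 0.0), 1.0)

def render_frame(hdr_scene, exposure):
    """The 'shader' pass: every pixel is tone-mapped as it is shaded."""
    return [tonemap(p, exposure) for p in hdr_scene]

def adjust_exposure(exposure, ldr_frame, target=0.5, rate=1.0):
    """Post-process: nudge exposure toward a target average luminance."""
    avg = sum(ldr_frame) / len(ldr_frame)
    return exposure * (1.0 + rate * (target - avg))

# Feedback loop: a bright scene pulls exposure down over a few frames.
scene = [0.2, 0.9, 2.5, 4.0]  # linear HDR radiance, some values > 1.0
exposure = 1.0
for _ in range(20):
    frame = render_frame(scene, exposure)
    exposure = adjust_exposure(exposure, frame)
```

The point being: the render target itself never holds anything above 1.0, so the "HDR" lives only inside the shader math and the frame-to-frame exposure feedback.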

So, rather than carry HDR data through the entire pipeline and all art assets, Valve made a different choice that gives a good balance of performance and HDR characteristics. Data is represented in fp16 or integer 4.12 linear space in light sources, cube maps, and static lighting data. This method is unable to store overbright information directly, but Valve is still able to add a blooming shader. Our understanding is that this method eliminates the possibility of transmissive or refracted overbright data (we won't be able to see a bloom inside a stained glass window or from sand under water through which light has passed). But blooming light sources and direct light reflection are still possible and well used in the Source engine.
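For what it's worth, here's my reading of what a 4.12 fixed-point format means (a sketch of the general idea, not Valve's actual representation): 16 bits per channel, 4 integer bits and 12 fractional bits, so linear values in [0, ~16) at 1/4096 precision, which leaves headroom above 1.0 without any floating point.

```python
# Sketch of a generic unsigned 4.12 fixed-point channel format: 16 bits,
# 4 integer bits, 12 fractional bits. Values span [0, ~16) with 1/4096
# precision, so moderately overbright data (> 1.0) fits without floats.
# This is the general idea only, not Valve's actual code.

SCALE = 1 << 12             # 4096 fractional steps per unit
MAX_VALUE = 0xFFFF / SCALE  # largest representable value, just under 16.0

def encode_4_12(value):
    """Clamp and quantize a linear light value into 16 bits."""
    v = max(0.0, min(value, MAX_VALUE))
    return int(round(v * SCALE))

def decode_4_12(bits):
    """Recover the linear light value."""
    return bits / SCALE

# A 2.5x overbright value survives the round trip; 20.0 clamps at the top.
```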

This image in particular surprised me:

bloom.png


Note that the blooming is not restricted to values above 1.0, which would suggest they are blooming everything above 0.5! Frankly, a very ugly and hacky way to do it :(
They also seem to suggest the light maps/cube maps are not stored as floating point textures on ATI (integer instead), but I'm not completely sure on this. Overall it's a bit disappointing.

The GeForce 6600 gets absolutely thumped in the tests too... Oh well...


[edit]


Using a combination of Paint Shop Pro and Paint.NET, I managed to do the following to the first image:

subtract 0.5,
blur a bit,
multiply the original by (1 - blurred image),
add blurred image * 2.

It looked _exactly_ the same as image number two above.
The sad thing is that initially I did the *2 after the blur, and it looks much better that way... *sigh*
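For anyone wanting to reproduce this, the steps above can be sketched on a 1-D row of pixel values standing in for the image (the 3-tap blur kernel and edge clamping are my own choices, not what the paint programs actually do):

```python
def bright_pass(img, threshold=0.5):
    """Step 1: subtract the threshold, clamping at zero."""
    return [max(p - threshold, 0.0) for p in img]

def blur(img):
    """Step 2: a crude 3-tap box blur with clamped edges."""
    n = len(img)
    return [(img[max(i - 1, 0)] + img[i] + img[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

def bloom(img):
    """Steps 3 and 4: original * (1 - blurred), plus blurred * 2."""
    b = blur(bright_pass(img))
    return [min(p * (1.0 - bv) + bv * 2.0, 1.0) for p, bv in zip(img, b)]

row = [0.1, 0.2, 0.9, 1.0, 0.9, 0.2, 0.1]  # a bright stripe on a dark row
out = bloom(row)
# dark pixels next to the stripe get brightened (the halo), while pixels
# far from anything above 0.5 are left untouched
```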
 
Graham said:
It makes sense, but ultimately it's cheating.

But isn't that the whole basis of 3D rendering? Finding ways to "cheat"? And I don't mean driver cheats, etc. I mean using approximations that let you get really close to the final target via a much easier road?

BTW, Anand did also re-confirm that the R520 can do AA with FP render targets:

Unfortunately, current hardware isn't able to handle full floating point data as fast as other methods, and no hardware (that is currently out) can allow MSAA to run on a floating point render target.

:)
 
jb said:
But isn't that the whole basis of 3D rendering? Finding ways to "cheat"? And I don't mean driver cheats, etc. I mean using approximations that let you get really close to the final target via a much easier road?
Yes, and this would have been a good HDR solution two years ago, when we had no parts that could blend with FP16 rendertargets. But we've had hardware that could do good HDR for a year and a half now. And now Valve is just releasing their first HDR implementation for SM2 hardware when ATI is on the verge of releasing SM3 hardware? :rolleyes:
 
Chalnoth said:
Yes, and this would have been a good HDR solution two years ago, when we had no parts that could blend with FP16 rendertargets. But we've had hardware that could do good HDR for a year and a half now. And now Valve is just releasing their first HDR implementation for SM2 hardware when ATI is on the verge of releasing SM3 hardware? :rolleyes:

Wow, not sure how to interpret this. First of all, good HDR was able to be done on the R300 cards. I have yet to see any screenshots showing otherwise. If you have them, I would love to see them. I am so sick of hearing people say that FP is needed for HDR. IT'S NOT!!!

In another write-up, Valve showed 4 different possible ways to implement HDR in Source. There were good and bad points to each method, and they chose this one as it seemed to have more good and less bad than the others. Also, if you look at the Steam stats on their user base's video cards, most users are not running hardware that can use FP to do HDR, so it only makes sense to use this method, as it will work for a very large portion of their current user base...
 
And yet floating point render targets are far and away the best way to do HDR. They have the best quality/performance ratio.
 
Chalnoth said:
And yet floating point render targets are far and away the best way to do HDR. They have the best quality/performance ratio.

I agree that the method is the BEST. I just have not yet seen how much "better" it is in a real game. And current hardware cannot run it at acceptable speeds with AA, so it's kind of a trade-off...
 
What I'm most disappointed about is not so much the fact that they are not doing full HDR rendering (they appear to be doing HDR computation, just like every other shader-based game out there); I'm more disappointed by the blooming and how they have promoted it.
The primary benefit of HDR rendering is much higher accuracy in the final rendered image, and that advantage is lost here. If the engine were not using lightmaps, I feel we would probably be seeing far more banding in the screenshots.
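To make the banding point concrete (a toy illustration of my own, not anything from the article): a dim gradient stored in an 8-bit buffer collapses onto a handful of discrete levels, where a higher-precision target would keep the ramp smooth.

```python
# Toy banding illustration: a dim linear ramp from 0.0 to 0.02 quantized
# into an 8-bit channel has only about six representable levels, so the
# "gradient" turns into visible stair-steps on screen.

def quantize_8bit(value):
    """Store a [0, 1] value in an 8-bit channel and read it back."""
    return round(value * 255) / 255

ramp = [i / 999 * 0.02 for i in range(1000)]      # 1000 dim input shades
levels = sorted({quantize_8bit(v) for v in ramp})
# 1000 distinct inputs survive as only len(levels) distinct outputs
```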
 
Apple740 said:
When AA is available and playable with FP, then yes. Till then, certainly not.
Well, I finally went back and read up on Valve's implementation from the Anandtech article, and it looks like you won't be able to play with AA anyway on nVidia hardware.

Basically, it looks like with the NV4x, Valve's HDR will indeed use an FP16 rendertarget, but still uses all of the hacks for lightmaps and other content that are used for ATI hardware. Thus it's rather likely that Valve's implementation of HDR will be significantly slower than it should be for nVidia hardware, and possibly slower even for ATI's own SM3 hardware (depending on support for FP16 filtering).
 
Chalnoth said:
Well, I finally went back and read up on Valve's implementation from the Anandtech article, and it looks like you won't be able to play with AA anyway on nVidia hardware.

Basically, it looks like with the NV4x, Valve's HDR will indeed use an FP16 rendertarget, but still uses all of the hacks for lightmaps and other content that are used for ATI hardware. Thus it's rather likely that Valve's implementation of HDR will be significantly slower than it should be for nVidia hardware, and possibly slower even for ATI's own SM3 hardware (depending on support for FP16 filtering).

Why not just run all cards on the R420 path then?
 
Chalnoth said:
nVidia hardware doesn't support 16-bit integer rendertargets.

Any particular reason why they don't?
 