But Splinter Cell: Chaos Theory uses FP16 textures and surfaces like the FP16 HDR mode in FarCry. The post-filter processing is different between these two titles.
overclocked said: But Splinter Cell: Chaos Theory uses FP16 textures and surfaces like the FP16 HDR mode in FarCry. The post-filter processing is different between these two titles.

That's just in the PC version, right (it must be)? Otherwise things are going to get crazy in the console forum.
bigz said: Thanks, is the HDR format that is used in SC: CT known?
bigz said: I was referring to the file format used, but I am intrigued as to the methods used to achieve HDR in SC: CT. It's certainly not the afterthought that was the case with FarCry, to a certain extent. My logic here is that FarCry is quite often over-exposed.

However, this could be down to NVIDIA's drivers, as there is overexposure in some instances without HDR enabled.
Subtlesnake said: Valve said they were using 16-bit integer buffers. What does this mean?

That it's low-quality. 16-bit integer buffers have limited dynamic range, and can thus lead to banding or a lack of proper luminance in high dynamic range scenes. By contrast, FP16 has effectively infinite dynamic range, for the purposes of color data.
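To put rough numbers on that contrast, here is a minimal sketch, assuming the 16-bit integer format is a plain fixed-point encoding and FP16 means a standard s10e5 half float; the exact ratios depend on how the integer format is scaled:

```cpp
#include <cstdio>
#include <cmath>

// Rough dynamic-range comparison: 16-bit fixed-point vs. FP16 (s10e5).
int main() {
    // Integer/fixed-point: steps are evenly spaced, so the ratio of the
    // largest to the smallest non-zero value is at best 65535 : 1,
    // and dark values band badly long before that.
    const double int16_ratio = 65535.0;

    // Half float: 10 mantissa bits, 5 exponent bits.
    const double fp16_max        = (2.0 - std::pow(2.0, -10)) * std::pow(2.0, 15); // 65504
    const double fp16_min_normal = std::pow(2.0, -14);                             // ~6.1e-5

    std::printf("int16 usable ratio : %.0f : 1\n", int16_ratio);
    std::printf("fp16 max value     : %.0f\n", fp16_max);
    std::printf("fp16 usable ratio  : %.2e : 1 (normalized values only)\n",
                fp16_max / fp16_min_normal);
    return 0;
}
```

The point is not the exact figures but that the float format keeps roughly constant relative precision across its whole range, which is what saves the dark end of an HDR scene from banding.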
bigz said: I was referring to the file format used, but I am intrigued as to the methods used to achieve HDR in SC: CT. It's certainly not the afterthought that was the case with FarCry, to a certain extent. My logic here is that FarCry is quite often over-exposed.

Essentially all of the guts of an HDR technique lie in the tone mapping pass. Everything else is just determining how bright things are supposed to be.
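For concreteness, a bare-bones tone mapping pass looks something like the sketch below: a plain Reinhard-style operator written as C++ rather than shader code, with illustrative luminance weights and an exposure parameter. It is not the operator any of these particular titles use.

```cpp
struct Color { float r, g, b; };

// Minimal Reinhard-style tonemapper: compresses an HDR linear-light
// color into [0, 1) for display. 'exposure' is the scene exposure scale.
Color Tonemap(Color hdr, float exposure)
{
    // Rec.709 luminance weights (an illustrative choice).
    float lum = 0.2126f * hdr.r + 0.7152f * hdr.g + 0.0722f * hdr.b;

    float scaled = lum * exposure;            // "how bright it's supposed to be"
    float mapped = scaled / (1.0f + scaled);  // Reinhard curve: [0, inf) -> [0, 1)

    // Rescale the color so its luminance becomes the mapped value.
    float scale = (lum > 0.0f) ? mapped / lum : 0.0f;
    return { hdr.r * scale, hdr.g * scale, hdr.b * scale };
}
```

Everything upstream of this (what format the scene is rendered into, how the exposure is chosen) is, as the post says, just deciding how bright things are supposed to be.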
"However, this could be down to NVIDIA's drivers, as there is overexposure in some instances without HDR enabled."

No. The tone mapping pass is a shader entirely written by the software developer, and thus independent of IHV drivers (assuming that the driver is compiling the shader properly).
Subtlesnake said: "So what we're doing in 'Lost Coast' through the use of HDR is, depending on where you are in proximity to the light source and how long you've been looking at it, your eyes will adjust and the lighting in that world will adjust." Is this new technology?

No. That's just the tonemapper. The first realtime one you might have seen was http://www.daionet.gr.jp/~masa/rthdribl/ . It's also in a sample in the DirectX SDK.
"By contrast, FP16 has effectively infinite dynamic range, for the purposes of color data."

Actually, far from it. The maximum value that fp16 can represent is only around 65,000, though this is bright enough for storage of color information. The format was designed so that it would be bright enough to cover the entire range of visible light intensities (the sun is roughly 50,000 cd/m2) while providing less quantization error than human vision can detect across the entire range. That said, it's not a movie-quality working format (the quantization resulting from several operations on fp16 values may accumulate enough error to be perceivable), but it works great for storage, and for games, since they aren't nearly that precise yet.
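The "around 65,000" figure and the quantization claim can be made concrete for an s10e5 half float (5 exponent bits, 10 mantissa bits):

```latex
\[
  \mathrm{fp16}_{\max} = \left(2 - 2^{-10}\right)\cdot 2^{15} = 65504,
  \qquad
  \frac{\Delta L}{L} \;\le\; 2^{-10} \approx 0.1\%
\]
```

A worst-case relative step of about 0.1% sits below the roughly 1% luminance contrast human vision is usually said to distinguish, which is the sense in which the quantization is imperceptible for storage even though errors can accumulate once you start doing arithmetic in the format.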
"One way you could hack a better modification of this technique would be to make the final brightness of the scene a function of the maximum brightness. This would have the effect of a bright point of light blinding you (which is fairly realistic), and would also reduce the white-out problem to essentially only appearing when your virtual eyes are adjusting."

This actually works quite poorly in practice, in my opinion. You need frame coherency on these things. Your eyes can't change very quickly, so you need to clamp the rate of change on these. For games, you need to do this without reading back the value, so keeping the rate of change sufficiently small requires a bit more work. Taking the maximum is highly subject to outliers (which is why the log average is generally taken over the arithmetic average). Think about the sun flickering through trees and how much the exposure would change based on whether you could see it that frame or not. It would turn your monitor into a strobe. Like I said, they probably just tweaked the blow-out constant in the photographic tonemapper to pump up the effect some.
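To make the frame-coherency point concrete, the usual fix is to drive exposure from an outlier-resistant log average and then let the adapted value chase it at a capped rate. A minimal CPU-side sketch, with an illustrative adaptation rate (a real renderer would keep all of this on the GPU precisely to avoid the read-back mentioned above):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Log-average (geometric-mean) luminance of the scene: far less
// sensitive to a few extremely bright pixels than a max or a plain mean.
float LogAverageLuminance(const std::vector<float>& luminance)
{
    const float delta = 1e-4f;                 // keeps log(0) out of black pixels
    double sum = 0.0;
    for (float L : luminance)
        sum += std::log(delta + L);
    return std::exp(static_cast<float>(sum / luminance.size()));
}

// Move the adapted luminance toward the current scene luminance, but
// clamp how far it may move per frame, so a sun flickering through the
// trees cannot strobe the exposure from one frame to the next.
float AdaptLuminance(float adapted, float target, float dt,
                     float maxStopsPerSecond = 1.5f /* illustrative */)
{
    float maxStep = maxStopsPerSecond * dt;
    float step    = std::clamp(std::log2(target / adapted), -maxStep, maxStep);
    return adapted * std::exp2(step);          // rate-limited in log space
}
```

The clamp is the frame coherency: however wild the per-frame measurement is, the exposure itself can only drift a fraction of a stop per frame.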
squarewithin said: Actually, far from it. The maximum value that fp16 can represent is only around 65,000, though this is bright enough for storage of color information.

"for the purposes of color data"

Yes, it's not as accurate as it could be, but FP16 has quite enough dynamic range for color data. And it's more than accurate enough for most anything outputting to a final 8-bit format, particularly as a framebuffer format.
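As a rough sanity check on "more than accurate enough" for an 8-bit output (again assuming an s10e5 half float, and comparing steps near full scale):

```latex
\[
  \underbrace{2^{-10} \approx 0.098\%}_{\text{fp16 relative step near white}}
  \;<\;
  \underbrace{\tfrac{1}{255} \approx 0.39\%}_{\text{one 8-bit output step}}
\]
```

Near the top of the range an fp16 step is roughly a quarter of an 8-bit code, and towards the dark end the float format only gets relatively finer, which is exactly where a gamma-encoded 8-bit output needs the precision.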
"This actually works quite poorly in practice, in my opinion. You need frame coherency on these things. Your eyes can't change very quickly, so you need to clamp the rate of change on these. For games, you need to do this without reading back the value, so keeping the rate of change sufficiently small requires a bit more work."

Well, I thought that was obvious.
"Taking the maximum is highly subject to outliers (which is why the log average is generally taken over the arithmetic average). Think about the sun flickering through trees and how much the exposure would change based on whether you could see it that frame or not. It would turn your monitor into a strobe. Like I said, they probably just tweaked the blow-out constant in the photographic tonemapper to pump up the effect some."

Actually, I think taking the maximum would probably be more realistic. After all, just think about why the eyes limit the amount of light that enters: it's a defense mechanism to prevent damage. Damage can occur on a small part of the eye quite easily, and so it makes sense to limit based upon a maximum. And since the eye has a different number of receptors in different areas, different areas can take differing amounts of brightness, so it may be even more realistic to modulate the maximum by some function of distance from the center of the screen.
Chalnoth said: Actually, I think taking the maximum would probably be more realistic. After all, just think about why the eyes limit the amount of light that enters: it's a defense mechanism to prevent damage. Damage can occur on a small part of the eye quite easily, and so it makes sense to limit based upon a maximum. And since the eye has a different number of receptors in different areas, different areas can take differing amounts of brightness, so it may be even more realistic to modulate the maximum by some function of distance from the center of the screen.

It is true that your retina responds to the brightest light, not the average, but it does so locally. To properly use the max value, you have to look at a weighted neighborhood around each particular pixel to determine what its exposure level is. The problem with this (besides the obvious fill-rate limitations, because the neighborhood is quite large) is that you get reverse gradients, where the area around a bright point gets progressively darker as you get near it. This can be fixed through bilateral filtering, but that requires a whole set of images computed with different blur weightings and blending between them based on the intensity in your neighborhood. The faster implementations use FFTW and still take several seconds per frame. I am not aware of any methods that produce acceptable results using purely local (current pixel only) information. The log scene average works quite well. It's not right, but at least it's consistent, to paraphrase Heckbert. With a few tweaks, you could easily fix the over-brightening problem.
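The log scene average being referred to is the standard key estimate from photographic tone reproduction; written out (δ is a small offset that keeps black pixels from blowing up the log, and a is the chosen "key" of the scene, so this is the same quantity the earlier sketch computes):

```latex
\[
  \bar{L}_w = \exp\!\Bigl(\tfrac{1}{N}\sum_{x,y}\log\bigl(\delta + L_w(x,y)\bigr)\Bigr),
  \qquad
  L(x,y) = \frac{a}{\bar{L}_w}\,L_w(x,y)
\]
```

Because a handful of extremely bright pixels barely move the log average, the exposure it produces is consistent from frame to frame, which is the property being argued for here.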
squarewithin said: It is true that your retina responds to the brightest light, not the average, but it does so locally.

I'm just not sure how that's possible. Light is limited by contraction of the iris. How is that local?
Chalnoth said: I'm just not sure how that's possible. Light is limited by contraction of the iris. How is that local?

It can limit light somewhat, but it only performs a small part of the light adaptation that your visual system can handle. I think the pupil can change by a factor of 16 in area. That's only 4 stops of the 34 or so stops that our visual system can resolve across the full range of adaptation.
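The stops arithmetic there is just base-2 logarithms (the 34-stop figure is the poster's estimate; the conversions themselves are exact):

```latex
\[
  \log_2 16 = 4 \ \text{stops},
  \qquad
  2^{34} \approx 1.7 \times 10^{10} : 1
\]
```

So even a generous pupil accounts for only a 16:1 slice of an overall adaptation range on the order of ten billion to one.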