Better color space

Nick

Hi all,

A thread at GameDev.net got me thinking about using a 'better' color space for real-time graphics: Color Interpolation.

It seems to have some similarities to HDR, but I'm not sure if it's actually physically correct. Could this create less aliasing when used for texture filtering?
 
It is extremely desirable to have the three color components orthogonal to each other, so that they can be interpolated independently (unlike this algorithm). That doesn't mean they have to be RGB -- YUV (more properly YCbCr) also has three orthogonal components. I'm not sure what you get if you interpolate in YCbCr space, however.

Physical correctness is a trickier question. Computer graphics is filled with hacks that are accepted because they look ok, but over time (as processing power increases), we seem to have been getting closer to physical correctness. Processing colors (including AA blending) in linear space instead of in perceptual (gamma corrected) space is one example. More elaborate lighting and reflectance algorithms are another.
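As a quick back-of-the-envelope sketch of what I mean (assuming a plain 2.2 power curve rather than the exact sRGB formula):

#include <math.h>
#include <stdio.h>

/* Average a black (0) and a white (255) pixel two ways, assuming a
   simple 2.2 power curve as the display transfer function. */
int main(void)
{
    const double g = 2.2;

    /* Naive: average the stored (gamma encoded) values. */
    double avg_encoded = (0.0 + 255.0) / 2.0;                   /* 127.5 */

    /* Correct: decode to linear light, average, re-encode. */
    double avg_linear = (pow(0.0 / 255.0, g) + pow(255.0 / 255.0, g)) / 2.0;
    double reencoded  = 255.0 * pow(avg_linear, 1.0 / g);       /* ~186 */

    printf("gamma-space average: %.1f\n", avg_encoded);
    printf("linear-space average, re-encoded: %.1f\n", reencoded);
    return 0;
}

The naive average of 127.5 displays much darker than the true 50% grey of roughly 186, which is exactly the kind of error you get from AA blending done in gamma space.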

On the other hand, our eyes do a lot of odd things to the light that hits them to produce what we see as reality. So I don't think that physical correctness, by itself, is enough -- there also needs to be a demonstration that the result looks better. The image presented in the link certainly looks better to me than RGB interpolation, but it would be a lot more practical if it were an orthogonal color space. And, of course, I don't know if I'm seeing the result properly gamma corrected -- there are so many places between that web site and my monitor where it could go wrong.

Enjoy, Aranfell
 
aranfell said:
It is extremely desirable to have the three color components orthogonal to each other, so that they can be interpolated independently (unlike this algorithm). That doesn't mean they have to be RGB -- YUV (more properly YCbCr) also has three orthogonal components. I'm not sure what you get if you interpolate in YCbCr space, however.
Linear interpolation in a linear Yab space should give the same result as linear interpolation in linear RGB space.
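A quick sanity check of that (the 3x3 matrix below is an arbitrary invertible example standing in for an RGB-to-Yab transform, not any particular standard):

#include <stdio.h>

/* Lerping in RGB and then transforming gives the same result as
   transforming first and lerping in "Yab", because the transform is linear. */
static void mul(const double m[3][3], const double v[3], double out[3])
{
    for (int i = 0; i < 3; i++)
        out[i] = m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2];
}

int main(void)
{
    const double m[3][3] = { { 0.3,  0.6,  0.1 },
                             { 0.5, -0.4, -0.1 },
                             {-0.2, -0.3,  0.5 } };
    const double a[3] = { 0.9, 0.2, 0.1 }, b[3] = { 0.1, 0.4, 0.8 };
    const double t = 0.3;
    double rgb_lerp[3], ya[3], yb[3], path1[3], path2[3];

    for (int i = 0; i < 3; i++)
        rgb_lerp[i] = (1.0 - t) * a[i] + t * b[i];
    mul(m, rgb_lerp, path1);                      /* lerp in RGB, then transform  */

    mul(m, a, ya);
    mul(m, b, yb);
    for (int i = 0; i < 3; i++)
        path2[i] = (1.0 - t) * ya[i] + t * yb[i]; /* transform, then lerp in Yab  */

    for (int i = 0; i < 3; i++)
        printf("%f  %f\n", path1[i], path2[i]);   /* the two columns match        */
    return 0;
}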

I don't think this normal-like interpolation is a good idea. It only looks ok if the output is nonlinear.
 
So just HDR with proper gamma correction should give the same results as the image on the right?

compare2.png
 
Nick said:
So just HDR with proper gamma correction should give the same results as the image on the right?
No, because the spherical interpolation used on the right might approximate a gamma curve at some brightness levels, but it isn't identical.
And why would you need HDR for interpolation in the [0, 1] range?

To make the left image look right, you just need to adjust your display gamma to linearize the output.
This reference image can help you with that. Adjust the gamma so that the dithered bottom area and the greyscale above have the same perceived brightness at the "linear" marker.
 
Xmas said:
And why would you need HDR for interpolation in the [0, 1] range?
Well I meant gamma correction before and after arithmetic color operations, which is often combined with HDR, no?
To make the left image look right, you just need to adjust your display gamma to linearize the output.
Then you lose a lot of color resolution for the dark tones. The human eye is less sensitive to intensity differences between bright colors; its response roughly follows a power law, just like the gamma curve. So adjusting the display isn't the correct solution (unless we had HDR displays, i.e. more than 8 bits per component).

So I think we should really correct the color interpolation. :?
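To put a rough number on the dark-tone resolution point (again using a plain 2.2 power law as a stand-in for the eye's response and the display curve):

#include <math.h>
#include <stdio.h>

/* Count how many of the 256 8-bit codes fall in the darkest 10% of
   perceived brightness, for linear storage vs. gamma 2.2 storage. */
int main(void)
{
    const double g = 2.2;
    int linear_codes = 0, gamma_codes = 0;

    for (int code = 0; code < 256; code++) {
        double v = code / 255.0;
        if (pow(v, 1.0 / g) <= 0.1) linear_codes++;  /* linear storage        */
        if (v <= 0.1)               gamma_codes++;   /* gamma encoded storage */
    }

    printf("codes in the darkest 10%% (linear storage): %d\n", linear_codes);
    printf("codes in the darkest 10%% (gamma storage):  %d\n", gamma_codes);
    return 0;
}

With 8-bit linear storage only a couple of codes are left for the darkest tones, which is exactly where the eye can still tell the steps apart.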
 
Nick said:
Well I meant gamma correction before and after arithmetic color operations, which is often combined with HDR, no?
Floating point textures or render targets should store color in linear color space. Shader arithmetic should be in linear color space, too, so no conversion is needed.


Then you lose a lot of color resolution for the dark tones. The human eye is less sensitive to intensity differences between bright colors. It's exponential, just like the gamma curve. So adjusting the display isn't the correct solution (unless we would have HDR displays, i.e. more than 8-bit per component).

So I think we should really correct the color interpolation. :?
Well, raising the gamma value was just a suggestion for how to make this image look right.

Generally, what should be done is this:
- constant colors and vertex colors should be in linear color space (edit: as long as they're FP, that is)
- fixed point color textures should be "sRGB" (which in this context just means gamma 2.2)
- FP textures should be in linear color space
- the TMU should linearize fixed point sRGB texels before filtering
- all shader arithmetic, including interpolation, should be linear
- with a fixed point color rendertarget, frame buffer reads for blending should be ^2.2, frame buffer writes should be ^(1/2.2). Same for AA downsampling.
- with a FP color target, those operations should be linear
- on scanout, the gamma LUT should take care that the sRGB values in the fixed point framebuffer are linearized again
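As a rough CPU-side sketch of the fixed point path in that list, with a plain 2.2 power curve standing in for the exact sRGB transfer function:

#include <math.h>
#include <stdio.h>

static double decode(unsigned char srgb)        /* sRGB code -> linear light */
{
    return pow(srgb / 255.0, 2.2);
}

static unsigned char encode(double linear)      /* linear light -> sRGB code */
{
    double v = 255.0 * pow(linear, 1.0 / 2.2) + 0.5;
    return (unsigned char)(v > 255.0 ? 255.0 : v);
}

int main(void)
{
    /* "TMU": bilinear filtering happens on linearized texels. */
    unsigned char t0 = 0, t1 = 255;
    double filtered = 0.5 * decode(t0) + 0.5 * decode(t1);

    /* Shader arithmetic stays linear (say, 60% lighting). */
    double shaded = 0.6 * filtered;

    /* Fixed point target: read ^2.2, blend 50/50, write ^(1/2.2). */
    unsigned char dest = 100;
    double blended = 0.5 * shaded + 0.5 * decode(dest);

    printf("value written to the framebuffer: %u\n", encode(blended));
    return 0;
}

The decode()/encode() calls are where the hardware's sRGB texture reads and sRGB framebuffer reads/writes would sit; everything in between is plain linear arithmetic.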
 
That color space is not a good idea for almost any area of graphics. It's no longer linear light, which means it doesn't behave linearly under addition, and I don't think it does under multiplication either. Basically, it's not anywhere near physically correct, and all the normal operations you do on it don't work properly.

His entire motivation is flawed. Why is that dark region a bad thing? That's how light works; it's linear. It probably only looks bad because he didn't bother to gamma correct his image before sending it to the monitor, so the "bad" image is unnaturally dark. Chances are he thinks those dark colors are wasted, prefers saturation, and found some random math to "fix" it.

Things like this confuse different problems. You want to store and process everything in a scene-referred format (something that reflects the real world, with linear light, known color primaries and all) and then gamma correct / tone map / etc. for whatever display medium you happen to be sitting in front of at that moment.
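A rough sketch of that split (the Reinhard-style curve and the 2.2 gamma below are only placeholders for whatever a particular display actually needs):

#include <math.h>
#include <stdio.h>

/* Scene-referred values are linear light and unbounded; they are only
   converted for a specific display at the very end of the pipeline. */
static double tone_map(double scene)   { return scene / (1.0 + scene); }  /* [0,inf) -> [0,1) */
static double gamma_encode(double lin) { return pow(lin, 1.0 / 2.2); }    /* display encoding */

int main(void)
{
    const double scene_luminance[] = { 0.02, 0.5, 1.0, 4.0, 16.0 };

    for (int i = 0; i < 5; i++) {
        double display = gamma_encode(tone_map(scene_luminance[i]));
        printf("scene %6.2f -> display code %3.0f\n",
               scene_luminance[i], 255.0 * display);
    }
    return 0;
}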
 
Xmas said:
To make the left image look right, you just need to adjust your display gamma to linearize the output.
This reference image can help you with that. Adjust the gamma so that the dithered bottom area and the greyscale above have the same perceived brightness at the "linear" marker.
My understanding is that it's actually not a good idea to use a pure alternating pixel pattern for these gamma setup images, because of the low pass filtering inherent in the analog hardware etc. You are better off lowering the horizontal frequency. Obviously you can leave the vertical at full rate.
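For example, alternating whole scanlines keeps the vertical rate at maximum while removing horizontal transitions entirely; something like this would do (the image size and PGM output are just for illustration):

#include <stdio.h>

/* Write a 256x64 PGM containing a 50% dither that alternates every
   scanline, so there are no horizontal black/white transitions for the
   analogue low pass filtering to smear. */
int main(void)
{
    const int w = 256, h = 64;
    FILE *f = fopen("dither.pgm", "wb");
    if (!f) return 1;

    fprintf(f, "P5\n%d %d\n255\n", w, h);
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            fputc((y & 1) ? 255 : 0, f);
    fclose(f);
    return 0;
}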
 
What Simon F says is (as usual) true. Except maybe for DVI-D on a TFT. (I haven't tried it.)

Here's some stuff that shows it quite clearly. In this zip (sorry for the popup), there's a file called tiltme_smiley2.bmp. Look at it in a viewer that can rotate the image 90°, rotate it, and be surprised. :D

That effect is exactly what Simon said, analog low pass filtering of a signal that isn't in linear format.
 
That makes sense. I edited my post above to show an 8x1 dither.

Although I have to say it works flawlessly on my TFT, whether connected digitally or via analog. And because of that, the smiley doesn't work as intended.
 
Xmas - that new pattern is no better. It has to be a purely vertical dither.

Look at the AIM dither patterns I linked.

Jawed
 
I just tried the smiley at different resolutions. And it seems that on this machine (X300 => DVI-VGA converter => decent VGA cable => Hitachi CM828), the effect starts to show somewhere around a 110MHz pixel clock.

There's still no visible blurring though; it can only be seen indirectly through this effect.
 
Jawed said:
Xmas - that new pattern is no better. It has to be a purely vertical dither.

Look at the AIM dither patterns I linked.

Jawed
I did look at them of course, but it doesn't have to be purely vertical IMO. Anyway, I didn't make this image for professional calibration. Originally it was just intended to show that 8 bits per channel is not enough for linear color storage and will result in banding in dark regions, which is why those 0-50% and 0-25% gradients are there.
 
Xmas said:
Jawed said:
Xmas - that new pattern is no better. It has to be a purely vertical dither.

Look at the AIM dither patterns I linked.

Jawed
I did look at them of course, but it doesn't have to be purely vertical IMO.

No, but a purely vertical dither is pretty safe for CRT users, which is why it's the norm.

Jawed
 