psurge said:
So the reason for not pre-baking the gamma into the actual texture is because 8 bits per component is too imprecise to hold color values in linear space without banding artifacts?

Basically, yes. The eye is non-linear (and, conveniently, so is a CRT in a compatible way) and so to get the best bang-for-the-bit, non-linear is the best approach. It's very similar to the A-Law (U-Law?) encoding used for audio - the ear is also non-linear.
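To put a rough number on it, here is a small sketch (assuming a display gamma of 2.2, which the post doesn't specify) that counts how many of the 256 codes fall within the darkest 1% of linear light under each encoding - the shortage of dark codes in a linear 8-bit format is what shows up as banding:

#include <math.h>
#include <stdio.h>

int main(void)
{
    int lin_codes = 0, gam_codes = 0;
    for (int c = 0; c < 256; ++c) {
        double as_linear = c / 255.0;               /* light level if the byte stores linear values  */
        double as_gamma  = pow(c / 255.0, 2.2);     /* light level if the byte stores gamma-encoded values */
        if (as_linear <= 0.01) ++lin_codes;
        if (as_gamma  <= 0.01) ++gam_codes;
    }
    printf("codes covering the darkest 1%% of light: linear=%d, gamma=%d\n",
           lin_codes, gam_codes);   /* prints 3 vs 32 */
    return 0;
}

Gamma encoding spends roughly ten times as many codes on the deep shadows, which is where the eye is most sensitive to visible steps.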
Given no internal HW for transforming from NonLin to Lin and back again, it makes sense to have the input texture in the same mode as the framebuffer. Of course, doing the lighting maths as if everything were linear, when the textures and framebuffer aren't, means the result is incorrect - but not horrendously so. It just results in pixels being a bit darker than they should be (except at the extremes, of course).
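A quick worked example of "darker but not horrendously so" (again assuming gamma 2.2, and treating the light as a simple scale factor - both assumptions are mine, for illustration only):

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double gamma = 2.2;        /* assumed display gamma */
    double texel = 0.5, light = 0.5; /* texel stored non-linearly, light is a linear scale factor */

    /* Cheap path: multiply the stored (non-linear) texel directly. */
    double gamma_space = texel * light;

    /* Correct path: decode to linear, multiply, re-encode for the framebuffer. */
    double linear_space = pow(pow(texel, gamma) * light, 1.0 / gamma);

    /* Compare the light that actually leaves the display. */
    printf("gamma-space maths : %.3f linear light\n", pow(gamma_space, gamma));   /* ~0.047 */
    printf("linear-space maths: %.3f linear light\n", pow(linear_space, gamma));  /* ~0.109 */
    return 0;
}

The gamma-space product displays at roughly 0.047 of full brightness instead of the correct 0.109 - visibly darker - while inputs of 0.0 and 1.0 come out identical in both paths.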
Question - why is hardware support for this so expensive (even if the gamma value on read/write is programmable)? For instance, when a texel is read, don't you just need one access into a 256 (maybe 1024) entry lookup table per color component?

I don't know about "expensive" but it's not exactly cheap:
For a start, if you are doing a single bilinear per clock, that's 4x3 (RGB) conversions (i.e. 12 copies of the hardware). Expand that to, say, 4 texture pipelines and it gets worse. On the output of each conversion you also need about 12 bits per component.
Then you should also do all the multiplies (e.g. shading and blending) at higher precision, and the "intermediate" image framebuffer would also have to be expanded from 8:8:8:8 to at least 8:12:12:12 before doing the framebuffer blending. Finally, you convert back to non-linear when storing the result in the framebuffer.
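A sketch of what that per-fetch cost looks like in software terms (gamma 2.2 and the function names are my own, purely illustrative) - note the 4 x 3 = 12 decodes for a single bilinear sample, and the wider intermediate needed to hold the linear results:

#include <math.h>

typedef struct { double c[3]; } rgb;   /* stand-in for a wider (12-bit-plus) intermediate format */

/* Decode one 8-bit non-linear channel to linear light (gamma 2.2 assumed). */
static double decode(unsigned char v) { return pow(v / 255.0, 2.2); }

/* One gamma-correct bilinear fetch: t00..t11 are the four 8-bit RGB texels,
   fx/fy are the sub-texel weights. 4 texels x 3 channels = 12 decodes. */
rgb bilinear_linear(const unsigned char t00[3], const unsigned char t10[3],
                    const unsigned char t01[3], const unsigned char t11[3],
                    double fx, double fy)
{
    rgb out;
    for (int ch = 0; ch < 3; ++ch) {
        double top = decode(t00[ch]) * (1.0 - fx) + decode(t10[ch]) * fx;
        double bot = decode(t01[ch]) * (1.0 - fx) + decode(t11[ch]) * fx;
        out.c[ch]  = top * (1.0 - fy) + bot * fy;   /* stays linear for shading and blending */
    }
    return out;
}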
Perhaps it is not a massive number of gates, but I think consumer IHVs have had other priorities. Workstations, OTOH, have had wider framebuffers for some time (e.g. SGI), so they could do calculations in linear space.
Galilee said:
Just a quick question (probably very dumb, but I'm no expert in this area). If a line or something is supposed to be white - bright white - won't gamma correction in many cases change that color to grey? Is it then the way the designer wanted it to be? (In those stars, the gamma-corrected version is always less bright, and more grey, than the non-corrected one.)

Do you mean with antialiasing or without? Without AA, a bright white dot would be exactly the same.
With 2x2 AA, a 1/4 pixel dot would appear brighter with gamma correction than without.
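The numbers behind that (again assuming gamma 2.2 for illustration): a white dot covering one of four 2x2 subsamples over black resolves to a much brighter pixel when the average is taken in linear light than when the stored codes are simply averaged:

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double gamma = 2.2;   /* assumed display gamma */

    /* Four 2x2 subsamples: one full-white hit, three black. */

    /* Naive resolve: average the stored (non-linear) codes, then display. */
    double naive_code  = (1.0 + 0.0 + 0.0 + 0.0) / 4.0;                 /* 0.25 */
    double naive_light = pow(naive_code, gamma);                        /* ~0.047 */

    /* Gamma-correct resolve: average in linear light, re-encode for the framebuffer. */
    double correct_light = (pow(1.0, gamma) + 0.0 + 0.0 + 0.0) / 4.0;   /* 0.25 */
    double correct_code  = pow(correct_light, 1.0 / gamma);             /* ~0.53 */

    printf("naive resolve  : code %.2f -> %.3f linear light\n", naive_code, naive_light);
    printf("correct resolve: code %.2f -> %.3f linear light\n", correct_code, correct_light);
    return 0;
}

The naive resolve displays at about 5% of full brightness; the gamma-correct one at the intended 25% - hence the corrected dot looks brighter, while a fully covered white pixel is unaffected either way.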