Future of MSAA?

Precisely. It's with this kind of stuff in mind:
But there are two other things to consider.
- First, a typical display device doesn't have a linear response curve, i.e. if you double the intensity value (voltage for analog transmission), you get more than double the photons. Usually, that relation can be approximated as
Luminance ~ signal^gamma
with signal being in the [0,1] range and gamma typically being about 2.

- And second, our perception isn't linear, but approximately logarithmic: the ratio between a just-noticeable difference and the luminance doesn't change much. This has a big impact on required precision, in that we need many more values to represent the darker colors.
....that I'm having a hard time with vember's posts.
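For what it's worth, the two effects quoted above are easy to sketch in a few lines of Python (the gamma value of 2.2 and the helper name are illustrative assumptions, not part of any spec):

```python
# Sketch of the two non-linearities quoted above. A gamma of 2.2 is a
# typical assumption for a display, not a universal constant.

def display_luminance(signal, gamma=2.2):
    """Approximate luminance emitted for a signal value in [0, 1]."""
    return signal ** gamma

# Doubling the signal more than doubles the photon count:
low = display_luminance(0.25)
high = display_luminance(0.5)
print(high / low)  # ~4.6x the luminance for a 2x signal increase
```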

Anyway, I'm not sure I can really agree that the answer is to do the gamma correction at each and every step. It makes more sense to me to just do all operations in linear space and leave the correction to the end (less hardware is required that way). The only potential problem is that this depends upon game developers making the switch.

Now, lastly, the thing that really bothers me about this whole issue: if I simply take the situation of a white line on a black background, why should a gamma setting of 2.2 be the proper setting to make that line look correct on any monitor, if the "proper" gamma correction setting is used?
 
"Anyway, I'm not sure I can really agree that the answer is to do the gamma correction at each and every step. It makes more sense to me to just do all operations in linear space and leave the correction to the end (less hardware is required that way). The only potential problem is that this depends upon game developers making the switch."

Yes, that's better in every way. But it really requires better than 8-bit precision; otherwise the banding in the dark areas of the image is nasty.

"Now, lastly, the thing that really bothers me about this whole issue: if I simply take the situation of a white line on a black background, why should a gamma setting of 2.2 be the proper setting to make that line look correct on any monitor, if the "proper" gamma correction setting is used?"

Without AA? No reason.

With AA? Then the AA averaging should be done with the proper gamma for the monitor in mind.

But the AA could also be done for any arbitrary gamma (including 1.0 and 2.2) and then corrected to the gamma of the monitor at the output stage. It's just an intermediary step, and whether that intermediary step uses gamma 1.0 (which you don't seem to mind) or gamma 2.2 doesn't really matter from a correctness point of view, as long as all image operations are done with this intermediary gamma in mind.
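A minimal sketch of that idea, assuming simple power-law gammas and made-up helper names: average the samples in linear light under whatever intermediary gamma the frame buffer uses, then remap the stored value for the actual monitor at the output stage.

```python
# Illustrative sketch only: pure power-law gammas, two-sample AA average,
# and invented function names. Real hardware would use LUTs and the exact
# sRGB curve rather than these closed-form conversions.

def to_linear(v, gamma):
    return v ** gamma

def from_linear(v, gamma):
    return v ** (1.0 / gamma)

def aa_average(samples, intermediary_gamma):
    # Convert the stored samples to linear light, average, re-encode.
    linear = [to_linear(s, intermediary_gamma) for s in samples]
    avg = sum(linear) / len(linear)
    return from_linear(avg, intermediary_gamma)

def output(stored, intermediary_gamma, monitor_gamma):
    # Correct the stored value for the actual monitor at the output stage.
    return from_linear(to_linear(stored, intermediary_gamma), monitor_gamma)

# A white and a black sample averaged under gamma-2.2 storage,
# then displayed on a (hypothetical) gamma-2.5 monitor:
pixel = aa_average([1.0, 0.0], 2.2)
print(output(pixel, 2.2, 2.5))
```

Whatever intermediary gamma is chosen, the displayed result is the same half-intensity pixel, which is the correctness point being made above.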

I'm not saying that the ATi AA approach is the best or only way to do this, but it does work, and it does provide AA with gamma taken into account, regardless of the monitor gamma.
 
It appears that two different usages of gamma correction are getting intertwined in this discussion.

First, there is the gamma correction required by the monitor. I think most (if not all) hardware is able to adjust for the monitor response curve via the lookup tables in the RAMDAC. (Whether it is easy or even possible to adjust this via the API is a separate issue.) Different monitors require different response curves. Flat panels are a whole separate issue -- IIRC, their response curves are not describable with a simple power curve.

Second, there is the gamma used to store color information in the frame buffer. This gamma doesn't need to have any relationship to the monitor gamma, e.g. it can be the sRGB standard (gamma 2.2, or "perceptual light"), linear (gamma 1.0, or "physical light"), or anything else. It doesn't even need to be a power curve, and in fact the sRGB standard isn't exactly a power curve -- it was given a linear ramp near zero to bound the slope of the curve. There are several standards besides sRGB.

There are only two reasons to store non-linear (gamma-corrected) color in the frame buffer. One reason is to approximately match the monitor gamma, so that a simple implementation that doesn't adjust for the monitor gamma in the RAMDAC LUTs will still look good. The sRGB standard was chosen to look good on typical monitors under typical lighting conditions (to summarize a long explanation that I once read). It certainly doesn't look good on all monitors or under all lighting conditions, but that is what LUT correction is for.

The other reason to store gamma-corrected values in the frame buffer is for compression. With sRGB gamma correction, 8-bits per color component are enough to mostly eliminate banding artifacts on a color ramp. With linear color at least 10-bits are needed to eliminate banding. Making the most of the encoding requires that incremental steps in the stored color approximately match incremental steps in our ability to perceive color differences, hence the term "perceptual light", as opposed to linear or "physical light", which measures the number of photons.
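The compression argument can be illustrated with a quick sketch (gamma 2.2 is used here as a stand-in for the full piecewise sRGB transfer, and the step-size comparison is only meant to show the order-of-magnitude difference near black):

```python
# Rough illustration of gamma encoding as "compression": with the same
# 8 bits, a gamma-encoded code spends far more of its values on dark
# colors, where perception is most sensitive.

BITS = 8
LEVELS = 2 ** BITS - 1  # 255

def decode(code, gamma):
    """Linear light represented by an integer code under a gamma encoding."""
    return (code / LEVELS) ** gamma

# Size of the first step above black, measured in linear light:
linear_step = decode(1, 1.0)  # ~0.0039 of full scale
gamma_step = decode(1, 2.2)   # ~0.0000051 of full scale
print(linear_step, gamma_step)
```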

Now here's the point: small changes in the gamma used to store data in the frame buffer don't significantly affect the compression efficiency. So there is no need to support multiple different gamma curves for storing colors in the frame buffer, *provided* that one corrects that to the actual monitor gamma for display. Supporting just one non-linear choice, e.g. sRGB, gets all the color-compression benefit that storing gamma-corrected color is likely to get. Granted, this doesn't help if you are playing a game where the color was sampled using some other gamma curve, so that the colors in the frame buffer aren't sRGB.

However one stores data in the frame buffer, doing the AA blend (and indeed, all blending) in linear space produces better results than blending in any non-linear color space. That's because blending is a physical operation. For example, covering half of a light source reduces the number of photons by one-half, though due to the eye's logarithmic response curve, it doesn't reduce the perceived light level by 1/2. (IIRC, it requires covering 86% of the light source to make it appear half as bright). Doing the AA blending in linear space produces a more physically correct (and more importantly, better looking) result, as I've verified by side-by-side comparisons (sorry, I don't have images or animations to post).
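A rough sketch of the comparison described above, assuming a pure gamma-2.2 encoding rather than true sRGB: blending in linear light gives half coverage exactly half the photons, while blending the stored codes directly does not.

```python
# Half-covering a white light over black should halve the photons.
# Blending in linear light does exactly that; blending the gamma-encoded
# codes directly lands noticeably darker. Gamma 2.2 is an assumed encoding.

GAMMA = 2.2

def blend_in_linear(a, b, coverage):
    lin = coverage * a ** GAMMA + (1 - coverage) * b ** GAMMA
    return lin ** (1 / GAMMA)

def blend_in_gamma_space(a, b, coverage):
    return coverage * a + (1 - coverage) * b

white, black = 1.0, 0.0
correct = blend_in_linear(white, black, 0.5)     # decodes to 0.5 in linear light
naive = blend_in_gamma_space(white, black, 0.5)  # decodes to ~0.22 in linear light
print(correct, naive ** GAMMA)
```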

Conclusion: gamma-corrected AA only needs to support the gamma curve(s) that are used to store data in the frame buffer. It doesn't need to support the gamma curves of the monitors. Therefore, the significantly higher cost of building in programmable gamma-adjustment for AA or blending is of limited benefit, *provided* that the industry can settle on a single gamma-curve for frame buffer data.

Enjoy, Aranfell
 
aranfell said:
Now here's the point: small changes in the gamma used to store data in the frame buffer don't significantly affect the compression efficiency. So there is no need to support multiple different gamma curves for storing colors in the frame buffer, *provided* that one corrects that to the actual monitor gamma for display. Supporting just one non-linear choice, e.g. sRGB, gets all the color-compression benefit that storing gamma-corrected color is likely to get. Granted, this doesn't help if you are playing a game where the color was sampled using some other gamma curve, so that the colors in the frame buffer aren't sRGB.
A well written post. It even agreed with my understanding of how things work and brought up some new points as well. I was unaware of the info in the quote above. I assume someone proved this with real data.
 
aranfell said:
Second, there is the gamma used to store color information in the frame buffer. This gamma doesn't need to have any relationship to the monitor gamma, e.g. it can be the sRGB standard (gamma 2.2, or "perceptual light"), linear (gamma 1.0, or "physical light"), or anything else. It doesn't even need to be a power curve, and in fact the sRGB standard isn't exactly a power curve -- it was given a linear ramp near zero to bound the slope of the curve. There are several standards besides sRGB.
This finally cuts to the heart of the issue: the definition of colors on the PC. This is what you need to pay attention to when doing any sort of operations on the framebuffer.

I do think that, moving forward, we should have proper gamma correction along with fully HDR rendering by virtue of using FP framebuffers and defining the color space to be linear throughout rendering, until the final tonemapping pass.
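One possible shape of such a pipeline, sketched with an assumed Reinhard-style tonemap operator and the standard sRGB transfer at the very end (the operator choice and the single-pixel walkthrough are illustrative, not a claim about any particular renderer):

```python
# Sketch of the pipeline described above: render and blend in linear FP,
# tonemap once at the end, then encode with the sRGB transfer function.
# The Reinhard operator is an assumed example of a tonemapping pass.

def tonemap_reinhard(x):
    return x / (1.0 + x)  # maps linear radiance [0, inf) into [0, 1)

def srgb_encode(x):
    # Piecewise sRGB transfer: linear toe near zero, then offset power curve.
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

hdr_pixel = 4.0                    # linear radiance, well beyond [0, 1]
ldr = tonemap_reinhard(hdr_pixel)  # 0.8 in linear light
print(srgb_encode(ldr))
```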
 
3dcgi said:
aranfell said:
small changes in the gamma used to store data in the frame buffer don't significantly affect the compression efficiency.
I assume someone proved this with real data.

Well, yes and no. It has been thoroughly proven that encoding by perceptual steps instead of linear steps dramatically reduces banding artifacts and so forth. Treating sRGB as a compression method, the quality of the compression depends on how close the sRGB encoding steps match the perceptual steps. Poynton claims that sRGB does well at encoding perceptual steps, though there are other opinions (as earlier posters noted). The issue is complicated because the result depends on the brightness of the display and the brightness of the ambient lighting, as well as varying in odd ways across the range. Or so I've read.

I don't have proof for my assertion that the exact gamma value doesn't alter the compression very much. I should have made it clear that I was referring to compression quality relative to a linear encoding, not compression quality relative to other gamma curves. I base that assertion on looking at graphs of different gamma functions. They don't differ all that much when compared to a linear ramp, hence my assumption that the visual differences among them will be small compared to using a linear encoding -- assuming that all of them are correctly converted to the monitor gamma for display, of course.

The other problem, as Chalnoth noted, is that a [0..1] intensity range does a poor job of rendering any image that has a large contrast ratio, such as any scene with bright sunlight, even if only glimpsed through a window or reflected off something shiny. Using floating point provides a larger range and builds in a step-wise gamma of 2.0 at the points where the exponent changes. That isn't ideal, but again, it is a lot better quality than even extended-range linear (if one can afford enough bits to store floating point).
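That "step-wise gamma" property of floating point can be seen directly from the spacing of representable values (shown here with Python's `math.ulp` on double precision; half-precision render targets behave the same way, just with coarser steps):

```python
# The gap between adjacent floats doubles each time the exponent
# increments, so relative precision stays roughly constant across the
# range -- a piecewise approximation of a perceptual (gamma-like) encoding.
import math

# math.ulp gives the spacing to the next representable float (Python 3.9+).
print(math.ulp(0.5), math.ulp(1.0), math.ulp(2.0))
```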

There was an HDR display demo at SIGGRAPH last summer that was REALLY impressive. Comparing a forest/river scene on an ordinary monitor (I think with a 200:1 contrast ratio) to one on their HDR display (with I think 40,000:1 contrast ratio) was like, well, comparing a picture to a picture-window. But I'm getting off-topic.

Enjoy, Aranfell
 