Future of MSAA?

radar1200gs said:
Bla Bla..

On top of all that monitors age and their image deteriorates over time, requiring adjustment until adequate adjustment can no longer be provided and the monitor reaches the end of its effective life span.

bla bla.
OMG..nevermind
 
karlotta said:
radar1200gs said:
Bla Bla..

On top of all that monitors age and their image deteriorates over time, requiring adjustment until adequate adjustment can no longer be provided and the monitor reaches the end of its effective life span.

bla bla.
OMG..nevermind
The point of your post is???

CRTs and LCDs do deteriorate. The deterioration can be compensated for to a certain extent with recalibration, but you can never get back to the image quality of a new tube.
 
radar1200gs said:
ATi's hardwired solution takes none of that into account, so rather than blame the monitor, which at least allows adjustments, blame ATi for not providing any.

You should set your color correction settings to make your monitor match a gamma of 2.2. This way the AA would be correct since the samples are summed prior to the color correction stage.

Granted, it will still suck when you're rendering content that's supposed to be in a linear color space, but since you can't use AA with render targets anyway, that isn't much of an issue, now is it?
 
Well, vember, the problem with that is that monitors typically don't have a gamma setting. This is because, from what I understand, the gamma of a monitor is typically the result of the particular phosphor that is used. And I don't think changing the software gamma setting will change anything in this case.
 
radar1200gs said:
ATi's hardwired solution takes none of that into account, so rather than blame the monitor, which at least allows adjustments, blame ATi for not providing any.
Exactly what gamma ramp values can you adjust your monitor between? I would like to see the control that lets you set a gamma of 1.0, for example (which is what AA hardware without gamma adjustment assumes is the correct ramp for AA samples). My monitor certainly doesn't have such a control, and as far as I am aware has a gamma of somewhere above 2.0. I believe that a gamma of 2.2 is a generally accepted standard on the PC.

The non-linear response is a result of the properties of the phosphor itself, rather than any of the settings in the monitor. I don't think that most monitors provide any options that actually allow you to alter the voltage curve to adjust their gamma response, although I guess it would be possible to have a non-linear voltage ramp on the gun.

Even as the phosphor ages, I don't think the gamma curve of a CRT monitor alters all that much; it's certainly unlikely to approach 1.0.

I'm not sure what the response of most LCD panels is set to; however, I don't believe that it is linear either.
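That 1.0 assumption is easy to check numerically. A quick sketch (assuming a 2.2 display response; this is an illustration, not any card's actual hardware path):

```python
# Why straight averaging of framebuffer values assumes a gamma-1.0 display:
# averaging a black (0.0) and a white (1.0) sample should give a pixel that
# emits half the light of white, but on a gamma-2.2 monitor it doesn't.

GAMMA = 2.2  # assumed monitor response: light_out = value_in ** GAMMA

def displayed_light(framebuffer_value):
    """Light the monitor actually emits for a framebuffer value in [0, 1]."""
    return framebuffer_value ** GAMMA

naive_avg = (0.0 + 1.0) / 2          # straight hardware average = 0.5
print(displayed_light(naive_avg))    # ~0.218 -> the edge pixel is too dark

# A gamma-aware resolve averages emitted light instead, then re-encodes:
correct = ((displayed_light(0.0) + displayed_light(1.0)) / 2) ** (1 / GAMMA)
print(displayed_light(correct))      # ~0.5 -> the correct half-intensity edge
```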

[edit]
Found this link that has a chart to help you to estimate your monitor's gamma -

http://www.normankoren.com/makingfineprints1A.html#gammachart

My flat panel and CRT monitor both appear to be somewhere around 2.2.

There is of course, also the gamma FAQ at

http://www.poynton.com/GammaFAQ.html
[/edit]
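For what it's worth, charts like that work off a simple identity: a 50/50 black-and-white dither emits half of white's light, so the solid grey value that visually matches it pins down the gamma. A sketch of the arithmetic (my own illustration of the idea, not Koren's exact chart):

```python
import math

def estimated_gamma(matching_grey):
    """Gamma implied by the solid grey value (0..1) that matches a 50% dither.

    The dither emits half of white's light, so matching_grey ** gamma = 0.5,
    which gives gamma = log(0.5) / log(matching_grey).
    """
    return math.log(0.5) / math.log(matching_grey)

print(round(estimated_gamma(0.73), 2))  # a match near 0.73 implies ~2.2
```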
 
radar1200gs said:
CRTs and LCDs do deteriorate. The deterioration can be compensated for to a certain extent with recalibration, but you can never get back to the image quality of a new tube.
I don't believe that such recalibration will significantly affect the gamma response.
 
Wow, I guess it has been way too long since I have examined the Radeon's FSAA. My new monitor apparently has a gamma of just about 2.1-2.2, so technically I should be seeing better FSAA in the Radeon shots, but I still don't see much difference. I suppose I'd need to take a good look at some synthetic tests again, where the aliasing differences are much more visible.
 
Chalnoth said:
Well, vember, the problem with that is that monitors typically don't have a gamma setting. This is because, from what I understand, the gamma of a monitor is typically the result of the particular phosphor that is used. And I don't think changing the software gamma setting will change anything in this case.

I was talking (a bit unclearly though) about the driver settings.

In the end, a game is just like any other application and should be able to assume (as should the GPU in its role as a graphics accelerator) that the output colorspace is gamma 2.2. It is then the job of the GPU (in its role as an output device) to do the color correction. If I'm not mistaken, the color correction on the Radeons is done just prior to the DAC.
 
Well, what I'm saying is that it's most desirable to have an adjustable gamma, because despite the conventions (and, in fact, because of them: apparently the Macintosh's gamma is supposed to be 1.8, according to the link andypski posted), not all monitors have the same gamma. Or, potentially, that gamma may deteriorate over time, as radar1200gs posted.

I know that my old monitor, for instance, was closer to a gamma of 1.8 than it was to 2.2. So, yes, it is the job of the video card to do the gamma correction on output, and all video cards have that setting, and it is adjustable.

The issue isn't the output in this case. The issue is that, to be correct, any color averaging should be done in the monitor's color space. The problem is that this typically requires the video card, every time it does a color average (whether texture sampling, FSAA sample recombination, or whatever), to first do an "inverse gamma adjustment," then average, then do the normal gamma adjustment. This is what ATI does on the FSAA sampling.

Now, another way to do this would be to get it done entirely in software: put all textures in linear color space before rendering. Then, all averaging that is done, right up to the FSAA sampling, will be correct. All you need after that is to use the video card's built-in gamma adjustment on output and you're done.

What are the problems with this approach? Well, put simply, precision suffers. The only real way around the precision problems is to use floating-point textures and framebuffers (you'd probably also need a higher-than-8-bits-per-color output buffer; Matrox's 10-10-10-2 buffer as a render target for the tone-mapping pass would be great here). The basic reason precision suffers is that gamma adjustment, at one end of the color spectrum, maps a few values in linear space to many values in screen space. This causes banding. You can see the effect very easily by just turning the gamma way up in any game. I don't know whether using an 8-bit buffer and gamma-corrected source art would result in significant noticeable banding in most situations in today's games or not (though dark areas/games are always more susceptible).
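To put a rough number on that banding argument, here's a sketch (assuming an 8-bit buffer and a display gamma of 2.2; the counts are my own arithmetic, not measured from any game):

```python
GAMMA = 2.2  # assumed display gamma

# With linear-space storage, the card gamma-encodes (x ** (1/GAMMA)) only at
# output. Count how many of the 256 8-bit codes land in the darkest 10% of
# the final gamma-encoded output range:
dark_linear = sum(1 for code in range(256)
                  if (code / 255) ** (1 / GAMMA) <= 0.1)

# With a conventional gamma-encoded framebuffer, the codes are spaced in the
# output range directly:
dark_encoded = sum(1 for code in range(256) if code / 255 <= 0.1)

print(dark_linear)   # 2  -> only codes 0 and 1 cover the darkest tones
print(dark_encoded)  # 26 -> the same tonal range gets 26 distinct codes
```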
 
Chalnoth said:
The issue isn't the output in this case. The issue is that, to be correct, any color averaging should be done in the monitor's color space. The problem is that this typically requires the video card, every time it does a color average (whether texture sampling, FSAA sample recombination, or whatever), to first do an "inverse gamma adjustment," then average, then do the normal gamma adjustment. This is what ATI does on the FSAA sampling.

No, averages should be done in a linear color space (gamma 1.0). This is what the AA does: it converts from gamma 2.2 to 1.0, does the averaging, and converts back to 2.2 again. The conversion is always the same regardless of the monitor gamma.

Think about it: a photograph with gamma 2.2 has no way of knowing the gamma of your monitor when it was taken, yet the "antialiasing" in it is correct as long as your monitor is corrected with the help of the GPU.

You're right about using a linear color space throughout, though; it would make blends etc. look better, but it would not play well with the gamma-corrected AA at all.
 
vember said:
No, averages should be done in a linear color space (gamma 1.0).
Right, this is what I was talking about when I said the monitor's color space.

This is what the AA does: it converts from gamma 2.2 to 1.0, does the averaging, and converts back to 2.2 again. The conversion is always the same regardless of the monitor gamma.
Actually, I think it's the other way around, but that's getting into the mathematical details and is relatively unimportant.

Think about it: a photograph with gamma 2.2 has no way of knowing the gamma of your monitor when it was taken, yet the "antialiasing" in it is correct as long as your monitor is corrected with the help of the GPU.
I'm not sure this can be done through the gamma correction available. You'd have to give me a mathematical explanation (I don't have the time right now...class).

As a side comment, though, I'm at my office right now, and my monitor here has a gamma of about 2.4.
 
Chalnoth said:
You'd have to give me a mathematical explanation (I don't have the time right now...class).

sure, it's not that difficult really. the relation between a linear color space and a gamma color space is a power function: x^(1/gamma), where x = [0 .. 1]

and to convert between two gamma spaces you use: x^(old_gamma/new_gamma)

if the fragments to be averaged for the aa are in gamma 2.2 space (this is up to the game content and blending, though) then the aa averaging will be done on linear samples.

x = mean(samples^(2.2/1)) - the samples are converted to linear colorspace and averaged. notice that the averaging is done in the linear color space (the power is applied before the average is calculated)

from this linear color space (x), the AA-stage of the GPU will bring the averaged samples back to gamma 2.2 :

gpu_out = x^(1/2.2)

the monitor takes a signal of gamma 2.4 and displays an image with gamma 1.0. thus:

monitor_out = gpu_out^(2.4/1) = x^(2.4/2.2)

that's quite close to the averaging being done in a linear color space already.

the right gamma correction for this monitor would be y = x^(2.2/2.4) thus:

gpu_out = (x^(1/2.2))^(2.2/2.4) = x^2.4
monitor_out = gpu_out^(2.4/1) = x

now the gamma of the monitor doesn't affect the aa averaging anymore since it is compensated for at the gpu output stage..
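putting the whole chain in one place, here's a numeric check of the derivation above (the function names are mine; the exponents follow the x^(old_gamma/new_gamma) convention):

```python
CONTENT_GAMMA = 2.2   # gamma space the game's samples are assumed to be in
MONITOR_GAMMA = 2.4   # this particular monitor's response

def aa_resolve(samples):
    """Gamma-corrected AA: decode 2.2 samples to linear, average, re-encode."""
    linear = [s ** CONTENT_GAMMA for s in samples]
    return (sum(linear) / len(linear)) ** (1 / CONTENT_GAMMA)

def gpu_output_lut(v):
    """Output-stage correction: map gamma-2.2 content onto a 2.4 monitor."""
    return v ** (CONTENT_GAMMA / MONITOR_GAMMA)

def monitor(v):
    """The monitor applies its own response: light out = signal ** 2.4."""
    return v ** MONITOR_GAMMA

# Average a black and a white sample and push the result through the chain:
light = monitor(gpu_output_lut(aa_resolve([0.0, 1.0])))
print(light)  # ~0.5 -> the averaging behaves as if done in linear light
```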
 
It seems to me, then, that if you do the gamma correction for this monitor, you won't be doing the averaging in the color space for the monitor, since the inverse gamma adjustment won't be done at 2.4 gamma.
 
andypski said:
I'm not sure what the response of most LCD panels is set to, however I don't believe that it is linear either.
My LCD at home seems to emulate ~2.2
 
Here is a link to adjust your gamma. Chalnoth, do you mean the pics posted by radar1200gs? There is a noticeable difference for me.
 
Chalnoth said:
It seems to me, then, that if you do the gamma correction for this monitor, you won't be doing the averaging in the color space for the monitor, since the inverse gamma adjustment won't be done at 2.4 gamma.

Neither are you supposed to. It IS supposed to be done in a linear (gamma 1.0) color space. The monitor is just a step in the chain for displaying an image. What the signal looks like at the input of the monitor is totally irrelevant; it's the image that is displayed on the screen, the light that reaches your eyes, that matters. Your monitor doesn't care how the AA is done, but your eyes & brain do.
 
You're not understanding me, vember. When I say the color space of the monitor, I'm talking about the displayed color space, in other words, linear color space (well, sort of: what I really mean is the color space in which our eyes average the colors 1 + 2 to get 1.5).

Anyway, you haven't convinced me that the averaged color space is, with this averaging, going to look correct on a gamma 2.4 monitor.

But, as a side note, I'd like to comment that on many monitors that don't display gamma 2.2 with no gamma correction, it is often unfeasible to properly set the gamma correction to compensate: it will frequently make the game either too dark or too light.
 
ok, I don't really understand why you're not getting it then.. :p

btw, I did a math typo above:

gpu_out = (x^(1/2.2))^(2.2/2.4) = x^2.4

is supposed to be:

gpu_out = (x^(1/2.2))^(2.2/2.4) = x^(1/2.4)
 
Here's what I'm not getting.

You haven't fully argued that two color values, averaged in the way you prescribe, will look the same from far away as the appropriate mixture of nearby colors. I'm sure it is possible to demonstrate this with very simple constructions.
 