X1800/7800gt AA comparisons

Status
Not open for further replies.
DemoCoder said:
And you would be wrong, and I say that as an LCD lover. LCD CR specs are some of the most outrageous bullshit. On/Off CR has nothing to do with IQ contrast response. It's the gamma ramp that determines it. Boot up a calibration disc like Avia on a CRT and on an LCD. Calibrate both. Now look at the gamma ramp distribution. The LCD will not look as good at the low and high IRE values. Maybe I misspoke in the original post; "superior contrast" is what I meant. Measuring contrast by measuring On/Off really tells you almost nothing, except what's the brightest white and darkest black, and even then ANSI CR is a better measure.

CRTs definitely have superior contrast. Black levels absolutely suck on LCDs.
 
OpenGL guy said:
4x AA can't do miracles, you know! The result is the best that can be expected with 4 samples. Try 6x and you'll get better results.
I am aware of that. My point is that in this particular case ATI's AA does a much worse job than Nvidia's supposedly inferior AA. I'd rather have visible jaggies than a dotted line.
 
Not sure whether it's to do with the funky Gamma settings of my 19" CRT monitor, but there is no comparison between those images posted in page one of this thread for me.

The 7800GTX image looks awful - so jaggy it's almost as if there is no AA applied at all, whereas the ATI image looks pretty good. Can't really see the dotted lines except in the zoomed-in image. Not sure how either would look in motion, of course.

Comparing image quality is going to be almost impossible when we have such a wide range of settings on our monitors and different display technology from monitor to monitor. Colour matching is a nightmare at the best of times without such a huge range of variables.
 
Man, this thread is ridiculous.

There is only one reason NVidia doesn't have the grey-black alternation:
The lines slant down-right

Look at the sample patterns for ATI and NVidia:
ATI:
aa_4x.jpg


NVidia:
4x.gif


They're mirror images of each other. If the lines were down-left, NVidia would have the chain-link look.

In any case, this shows exactly why gamma-corrected AA is so important. The NVidia AA looks like crap to me for the section in question. Even at 300% zoom, where I know CRT sharpness is a non-factor, the ATI image looks much better.
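For the curious, the difference between plain and gamma-correct downsampling can be sketched in a few lines of Python. This is a minimal sketch assuming a simple power-law display gamma of 2.2; real hardware may use a different curve:

```python
# Sketch: why gamma matters when averaging AA samples.
# Assumes a pure power-law display gamma of 2.2; values are 0..255.

GAMMA = 2.2

def to_linear(v):
    # Gamma-encoded 0..255 value -> linear light 0..1.
    return (v / 255.0) ** GAMMA

def to_gamma(lin):
    # Linear light 0..1 -> gamma-encoded 0..255 value.
    return round(255.0 * lin ** (1.0 / GAMMA))

def blend_naive(a, b):
    # Average the stored (gamma-encoded) values directly.
    return round((a + b) / 2)

def blend_gamma_correct(a, b):
    # Convert to linear light, average, convert back.
    return to_gamma((to_linear(a) + to_linear(b)) / 2)

# A pixel half-covered by a black edge on a white background:
print(blend_naive(0, 255))          # -> 128: too dark on screen
print(blend_gamma_correct(0, 255))  # -> 186: perceptually half-bright
```

The naive average of black and white lands at 128, which a gamma-2.2 display shows as only about 22% of full brightness; the gamma-correct blend at 186 is what actually looks like a 50% mix.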

I wish I knew why LCDs have such a weird gamma ramp. I don't think it would be very hard to correct for it. They always looked unnatural to me, and this must be why. From the sounds of it, they like to crush whites in order to increase perceived brightness.

EDIT: Just played with Photoshop, and I can totally see why LCD users see the "chain-link fence". God, I hate it when manufacturers play to the lowest common denominator. You see it with LCD monitors, projection TVs (LCD, DLP, LCOS), audio, and pretty much everywhere.
 
Jawed said:
To ensure that the gamma is correctly calibrated, check the gamma 2.2 patch here:

http://www.aim-dtp.net/aim/evaluation/gamma_space/index.htm
Jawed, looking closer at the pattern, it's easy to see why a weird S-shaped gamma curve would be rather undetectable.

If the striped side used solely black and white, LCD users could see the problem. But because they use their own faulty grey as a reference, it'll be hard to see the difference. L233, I'm sure your LCD doesn't follow the 2.0 curve for the top 1/3 of the gradient, at least if you could compare it with a CRT or a piece of paper.

Most P&S digital cameras do the same thing as these LCD's to give visually attractive results. They have an S-curve that jacks up the contrast, and makes objects pop out more.
 
Mintmaster said:
Jawed, looking closer at the pattern, it's easy to see why a weird S-shaped gamma curve would be rather undetectable.

If the striped side used solely black and white, LCD users could see the problem.
I'm confused. The left-hand greyscale is purely monochrome. So there's a reference.

But because they use their own faulty grey as a reference, it'll be hard to see the difference. L233, I'm sure your LCD doesn't follow the 2.0 curve for the top 1/3 of the gradient, at least if you could compare it with a CRT or a piece of paper.
I agree this is quite likely. All the steps on the greyscale at the bottom of this page:

http://www.dpreview.com/reviews/nikond50/

should be discernible.

Most P&S digital cameras do the same thing as these LCD's to give visually attractive results. They have an S-curve that jacks up the contrast, and makes objects pop out more.
Yes - I run my digicam at its lowest contrast setting in a bid to avoid these excesses.

Jawed
 
Mariner said:
Not sure whether it's to do with the funky Gamma settings of my 19" CRT monitor, but there is no comparison between those images posted in page one of this thread for me.

The 7800GTX image looks awful - so jaggy it's almost as if there is no AA applied at all whereas the ATI image looks pretty good. Can't really see the dotted lines except in the zoomed-in image.

That's the issue: if you view the image on a plasma/LCD you'll see a checkerboard line on the ATI card. Don't worry, the 7800GTX's AA still looks poop at 2x and 4x...
 
"Gamma correct" AA downsampling and display gamma correction should be two completely independent concepts.
 
Jawed said:
I'm confused. The left-hand greyscale is purely monochrome. So there's a reference.
If by monochrome you mean greyscale, then yes. Look at it closely, and you'll see that the stripes are not purely black/white, but rather white/grey or black/grey. Someone should make a chart with black/white/50%-grey 3-colour dithering only, and make both a 2x2 pixel grid for CRT users and 1x1 for LCD users.

The current chart is good for finding a best fit gamma value, but that doesn't mean it's a proper ramp. A nasty S-curve will still have one ramp that's closest. However, by using a compromised greyscale in the white/grey and grey/black reference scale, you can't find out.
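One variant of the chart suggested above can be generated in a few lines. This is a sketch under the assumption of gamma 2.2 (so the solid grey that should match a 50% linear-light black/white dither is 255 × 0.5^(1/2.2) ≈ 186); the 1×1 checkerboard is the pattern for LCD users, and a 2×2 version would suit CRTs. The filename and PGM output format are my choices, not from the thread:

```python
# Sketch: a minimal gamma-test chart in ASCII PGM format.
# Left half: 1x1 black/white checkerboard (50% linear light when the eye
# blurs it). Right half: solid grey at the value that *should* match it
# under gamma 2.2 (255 * 0.5 ** (1 / 2.2) ~= 186). On a well-behaved
# display the two halves blend together when viewed from a distance.

W, H, GREY = 128, 128, 186

rows = []
for y in range(H):
    row = []
    for x in range(W):
        if x < W // 2:
            row.append(255 if (x + y) % 2 == 0 else 0)  # checkerboard
        else:
            row.append(GREY)                            # reference grey
    rows.append(row)

with open("gamma_chart.pgm", "w") as f:
    f.write(f"P2\n{W} {H}\n255\n")
    for row in rows:
        f.write(" ".join(map(str, row)) + "\n")
```

Because the dithered side uses only pure black and pure white, an S-shaped display curve can't hide behind a faulty grey reference, which is exactly the point being made above.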

Xmas said:
"Gamma correct" AA downsampling and display gamma correction should be two completely independent concepts.
They are, but in order to view these images correctly on an LCD, you need a fancy correction curve. I know my 9700P can't do that, and I rather doubt any consumer video cards can.

For LCD owners to see this, fire up Paint and make some custom colours, say RGB=225 (10% grey) and RGB=243 (5% grey). Draw some stuff and write some text on the white background. For me, it's very clear on a CRT. I can go right up to 1% grey on a white background and read it, though I wouldn't want to read a book...
 
Mintmaster said:
For LCD owners to see this, fire up Paint and make some custom colours, say RGB=225 (10% grey) and RGB=243 (5% grey). Draw some stuff and write some text on the white background. For me, it's very clear on a CRT. I can go right up to 1% grey on a white background and read it, though I wouldn't want to read a book...
I just tried this and am quite able to differentiate the colors. I even tried 1% grey. This was on my Dell 19" panel (not sure of the model number).
 
Mintmaster said:
They are, but in order to view these images correctly on an LCD, you need a fancy correction curve. I know my 9700P can't do that, and I rather doubt any consumer video cards can.
Fancy in what way? With dependencies across color channels, or requiring very high precision? Besides that, which limitations do you see for the correction curve?
 
Mintmaster said:
The current chart is good for finding a best fit gamma value, but that doesn't mean it's a proper ramp. A nasty S-curve will still have one ramp that's closest. However, by using a compromised greyscale in the white/grey and grey/black reference scale, you can't find out.
That is the primary intention of the charts, to find the actual gamma of your monitor.

The ramps obviously should show a scale over the entire range, with neither the black blocked up nor the white washed out. Those two patches over on the right are there to see if the black point is good.

Jawed
 
OpenGL guy said:
I just tried this and am quite able to differentiate the colors. I even tried 1% grey. This was on my Dell 19" panel (not sure of the model number).

Well, since you're not complaining about a chain-link line, your gamma curve must be fine! Correct me if I'm wrong, but I don't think Dell is competing side by side on a Best Buy display floor for the average Joe who would be impressed by artificial contrast settings.

All I know is that if I go to Image->Adjust->Curves in Photoshop and make a little S-curve, then the "chain link" effect comes up. I assume that's what these LCD viewers are seeing. I've seen this effect quite often on LCDs. Light greys are not as dark as they should be, and dark greys are not as light as they should be.

Assuming the LCD users complaining of ATI's quality do indeed have gamma set to 2.2, then this is the only explanation I can think of.
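The Photoshop experiment described above can be sketched numerically. The smoothstep function here is a made-up stand-in for whatever contrast-boosting curve an LCD's electronics actually apply, and the input values are typical gamma-corrected AA greys for a black line on a white background, not measurements from the thread's screenshots:

```python
# Sketch: how a contrast-boosting S-curve in the display washes out the
# light transition greys of gamma-corrected AA, producing a "chain-link"
# look. The smoothstep below is a hypothetical stand-in for the real curve.

def s_curve(x):
    # Darkens shadows, lightens highlights; input and output in 0..1.
    return x * x * (3 - 2 * x)

# Representative gamma-corrected AA transition greys (0..255):
for v in (136, 186, 224):
    out = round(255 * s_curve(v / 255))
    print(v, "->", out)  # 136 -> 140, 186 -> 209, 224 -> 245
```

The lightest transition value gets pushed nearly to white, so the smooth ramp collapses toward isolated dark dots on a white background - a plausible mechanism for the dotted-line complaint.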
 
Xmas said:
Fancy in what way? With dependencies across color channels, or requiring very high precision? Besides that, which limitations do you see for the correction curve?
The correction curve needs more control points to correct for what many LCD's are doing. To compensate for an S-curve, you need a tilde-esque input-output relation.
 
Mintmaster said:
The correction curve needs more control points to correct for what many LCD's are doing. To compensate for an S-curve, you need a tilde-esque input-output relation.
I'm pretty sure every card can do this (because gamma correction is usually done using a lookup table, and there's a GDI function to fill this LUT). The driver panel might be more limiting, though.
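The LUT approach can be sketched as follows. This assumes a hypothetical smoothstep S-curve as the measured display response (a real curve would come from measurement, e.g. via the GDI gamma-ramp API mentioned above) and numerically inverts it into the tilde-shaped correction Mintmaster describes:

```python
# Sketch: a 256-entry correction LUT (the kind a video card's gamma ramp
# holds) that numerically inverts a display's S-curve.

def display_response(x):
    # Hypothetical monotonic S-curve, input and output in 0..1.
    return x * x * (3 - 2 * x)

def invert(f, y, iters=40):
    # Bisection: find x in [0,1] with f(x) == y (f monotonic increasing).
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Tilde-shaped correction: lifts shadows and pulls down highlights, so
# correction followed by the display's S-curve is close to identity.
lut = [round(255 * invert(display_response, i / 255)) for i in range(256)]
```

Because each of the 256 entries is free, this is strictly more expressive than the single gamma slider most driver panels expose - which matches Xmas's point that the hardware can do it even if the UI can't.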
 
Mintmaster said:
The correction curve needs more control points to correct for what many LCD's are doing. To compensate for an S-curve, you need a tilde-esque input-output relation.

I've been playing around with the wire image and its blown-up portions in the GIMP to see if adjusting the gamma curve helps. After failing to find a gamma curve that works well without distorting the background, I've discovered:

1) The blackest portions of the anti-aliased lines have a 0,0,0 rgb value (ie pure black) or very close to it.

2) The lightest portions of the "line" are indeed much lighter than the black portions. They hover around rgb: 165,165,150.

Based on this, I don't think this is actually a problem with gamma on these screens. I think the problem is more related to the thinness of the line, as OpenGL guy was saying. Looking at the blown-up image below, both nVidia's and ATI's implementations suffer from the "dotted line" effect. The black areas are where the line is actually being drawn, and the empty areas are where the line falls between two pixels. nVidia's implementation seems to be getting more samples from the line and fewer from the background (or perhaps it is simply the lack of gamma correction leaving the pixels darker?), so it draws the gaps in darker, while ATI's implementation seems to be sampling more of the background and less of the line, or perhaps adjusting it to be lighter due to gamma correction.

Here's a blown up screenshot at 6X magnification without interpolation:
aa6xzoom8wn.png


Nite_Hawk
 
Thanks nitehawk.

Well, if you're a pixel artist, when you want to smooth out a line you can choose any number of colours for the transition: 2, 4, 8, 16, whatever you want.

While both IHVs use four colours for the transition from black to the background colour, ATI's colour values are essentially half of NV's.
So while it essentially renders a better "phone line" on an LCD/plasma, the overall quality from a normal viewing distance is arguably worse, since the object stands out more from the background.
That's why 2x and 4x go to ATI (generally).

You can see by the stepping of the light pixels in the ATI shot that the boundaries of the pixels touch each other (light top / light bottom), while there's one pixel's distance between NV's lightest pixel steps.
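The four-colour transition being described can be sketched directly. This is a sketch assuming a black line on a white background at 4x AA and a display gamma of 2.2; plain averaging stands in for the NV-style resolve and linear-light averaging for the ATI-style one, per the thread's characterisation:

```python
# Sketch: the four-step transition for a black line on white at 4x AA,
# resolved with plain averaging vs. gamma-correct averaging (gamma 2.2).

GAMMA = 2.2

def resolve(black_samples, gamma_correct):
    coverage = black_samples / 4      # fraction of the pixel the line covers
    lin = 1.0 - coverage              # linear-light result (black=0, white=1)
    if gamma_correct:
        return round(255 * lin ** (1 / GAMMA))
    return round(255 * lin)

for n in range(5):
    print(n, resolve(n, False), resolve(n, True))
# samples  plain  gamma-correct
#   0       255      255
#   1       191      224
#   2       128      186
#   3        64      136
#   4         0        0
```

The gamma-correct transition greys sit much closer to white, which is why the black line pops out more against the background on a well-calibrated display - and why a display that then distorts those light greys can break the line up into dots.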
 
Why should the display gamma matter during rendering?

The whole point of a color management system (e.g. the ICC Profiles for your monitor) is that you work in a defined color space, like sRGB, and the operating system handles the rest by using the ICC profile to map sRGB into the gamut/gamma of your display.

So if the GPU renders in sRGB colorspace (gamma = 2.2), shouldn't the driver be able to apply an ICC correction post-process?
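For reference, the standard sRGB transfer function (roughly gamma 2.2, with a linear toe) looks like this - any driver post-process mapping rendered sRGB output through a display profile would convert through linear light with curves of this shape. The constants below are the published sRGB ones, not anything specific to these cards:

```python
# Sketch: the standard sRGB transfer function, per-channel, values in 0..1.

def srgb_to_linear(c):
    # Piecewise curve: linear segment near black, power 2.4 above it.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(lin):
    # Exact inverse of srgb_to_linear.
    return 12.92 * lin if lin <= 0.0031308 else 1.055 * lin ** (1 / 2.4) - 0.055
```

An ICC-style post-process would then only need to map this well-defined curve onto the measured response of the actual panel.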
 
Most systems don't have the monitor calibrated at all. That means there is no valid colour-management for monitor rendering.

Jawed
 
The ICC profiles are generated for factory defaults, so I see no reason why this shouldn't work on average. Unless DX doesn't apply ICC, only GDI.
 