[H]OCP does Radeon9700 I.Q....

DaveBaumann said:
Ummm, I'm pretty sure they have actually.

As for the gamma stuff, read the review.

Interesting. It appears that the 9700 applies gamma settings at pixel out (apparently with a decent curve...), as opposed to applying gamma at RAMDAC out. It makes sense that this would result in a higher-quality final image, and is, quite possibly, the only way to do higher-precision gamma when using a 32-bit framebuffer.

But it still has little to nothing to do with edge AA quality.
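The distinction above (gamma applied at pixel out versus in the RAMDAC) can be sketched roughly as follows. This is a hedged illustration, not ATI's actual hardware path; the function name and the gamma value of 2.2 are assumptions for the example:

```python
# Hypothetical sketch: encoding a linear-light intensity with a gamma
# curve at framebuffer write time, rather than remapping 8-bit values
# later in the RAMDAC LUT. Gamma 2.2 is an assumed typical display curve.
def encode_gamma(linear, gamma=2.2):
    """Map a linear [0, 1] intensity to an 8-bit gamma-encoded value."""
    return round((linear ** (1.0 / gamma)) * 255)

# A mid-grey linear intensity encodes well above 128, giving the dark
# range of an 8-bit channel more precision than a linear encoding would.
print(encode_gamma(0.5))  # 186
```

Doing this before the values are quantized to 8 bits is why it can be higher precision than a LUT applied after the fact: the LUT can only shuffle 256 already-rounded values around.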
 
McElvis said:
I'm not sure if anyone else has noticed this, but why, in Microsoft Flight Sim, does the Radeon 9700 have lots of shadowing that's not in either the 8500 or GF4 pictures? Was a setting changed?

They do have it - the light source is just in a different place, so it's not as easy to notice.
 
Chalnoth said:
But it still has little to nothing to do with edge AA quality.

Errr - yes it does. It means the resultant averaged pixels will slope better according to your monitor settings. That's the entire point of doing it. :rolleyes:
 
DaveBaumann said:
Errr - yes it does. It means the resultant averaged pixels will slope better according to your monitor settings. That's the entire point of doing it. :rolleyes:

Gamma is still applied via a similar curve in every video card out there. The main difference here is in the accuracy of the gamma. I have a really hard time believing that improved accuracy of gamma can significantly improve edge AA quality.

I can see how it could improve overall image quality (though mostly when using wildly-adjusted gamma settings...), but not edge AA quality.
 
Is it possible to turn it off?

That's pretty funny. It can't be shown in screenshots and cannot be turned off. Impossible to compare, in other words. :D

What happens if the user has two monitors of different types? How can it optimize then?
 
Hrm, I guess I'll have to change that.

Since the Radeon 9700 uses a multisampling algorithm, unless they dither the output of the multiple samples, the only improvement would be at the edges. I suppose it's something I'd need to see...though, at the same time, it's also combined with so many other things (such as sample pattern) that it would certainly be very hard to pick out the improvement based on per-sample gamma.

Oh, and btw, if enabling FSAA actually improves the accuracy of the gamma settings in the R9700, then you bet it will be visible in screenshots.
 
Galilee said:
Is it possible to turn it off?

That's pretty funny. It can't be shown in screenshots and cannot be turned off. Impossible to compare, in other words. :D

If you haven't changed the in-game gamma settings then, to all intents and purposes, it is 'off'.
 
Well, NVIDIA doesn't do it, so naturally it must be useless, right?

Seriously, from my understanding the gamma correction on the 9700 is calculated during the FSAA application, such that the color value of a blended pixel is not a linear interpolation of the samples, but a weighted value based on the samples and a gamma curve.

No other hardware does this, AFAIK. The result is that the blended pixels should be more natural to the eye, which should improve the visual quality of the AA.

I'm no 3d guru, but this was my understanding. It doesn't seem that complex, really.
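The blending described above can be sketched in a few lines. This is a minimal illustration of the general technique (decode to linear light, average, re-encode), under the assumption of a simple power-law gamma of 2.2; it is not claimed to match the R300's actual circuit:

```python
def blend_linear(samples):
    # Naive average of gamma-encoded 8-bit samples: the straight
    # 'mathematical' interpolation discussed in the thread.
    return sum(samples) / len(samples)

def blend_gamma_correct(samples, gamma=2.2):
    # Decode each sample to linear light, average there, then re-encode.
    linear = [(s / 255.0) ** gamma for s in samples]
    avg = sum(linear) / len(linear)
    return (avg ** (1.0 / gamma)) * 255

# A 4x multisampled pixel straddling a black/white edge at 50% coverage:
samples = [0, 255, 0, 255]
print(round(blend_linear(samples)))         # 128 - displays darker than half coverage
print(round(blend_gamma_correct(samples)))  # 186 - closer to perceived half coverage
```

The point of the last two lines: on a gamma-2.2 display, the code value 128 emits far less than half the light of 255, so a naive average makes antialiased edges on bright backgrounds look too dark and "roped". Weighting through the gamma curve fixes exactly that.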
 
Can we just retitle every thread to "Chalnoth and Galilee try their best to slam the 9700 while the rest of us watch and laugh"?

(sorry for the additional waste of bandwidth, but it's getting absurd around here)

Mize
 
?
I am just curious. I am sure it's a great feature. Unfortunately I can't see it in screenshots, but I'll take ATI's word for it :D (just kidding)

What is this gamma curve anyway? Just a basic thing, or something from my monitor drivers? What happens if I use Plug & Play drivers for my monitor, or if I have two different monitors?
 
Mize said:
Can we just retitle every thread to "Chalnoth and Galilee try their best to slam the 9700 while the rest of us watch and laugh"?

(sorry for the additional waste of bandwidth, but it's getting absurd around here)

Mize



And I get accused of ruining threads...sure.
 
Doomtrooper said:
Mize said:
Can we just retitle every thread to "Chalnoth and Galilee try their best to slam the 9700 while the rest of us watch and laugh"?

(sorry for the additional waste of bandwidth, but it's getting absurd around here)

Mize



And I get accused of ruining threads...sure.

Hehe, look what the cat dragged in.
 
Oh come on guys! Let's all be friends! ;)

Seriously though, this is getting rather annoying, and since there can't be any more complaints about me now that I don't do my thing anymore, why don't we just stop this and be reasonable?
 
DaveBaumann said:
If you haven't changed the in-game gamma settings then, to all intents and purposes, it is 'off'.

I tend to disagree with this. I think it depends on how "linear" the default gamma settings are; if they are "non-linear" to any significant extent, the difference will be there compared to the old way of doing things.

The difference should be most obvious when antialiasing high-contrast polygons. Instead of the blended pixels having the 'mathematical' average value, which is likely to be an easy-to-discern intermediate color, they will have the 'gamma'-average value, which the eye should blend together more readily.

It will also be most obvious with a lesser degree of anti-aliasing, because then there will be fewer intermediate colors anyway.
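The point about lower AA degrees can be made concrete by counting the levels an edge pixel can take. A hedged sketch, reusing an assumed power-law gamma of 2.2 (the function and values are illustrative, not the hardware's):

```python
# With n-sample multisampling of a black/white edge, coverage can take
# only n+1 values, so fewer samples means fewer (and more visible)
# intermediate colours - which makes the choice of blend curve matter more.
def edge_levels(n_samples, gamma=2.2):
    levels = []
    for covered in range(n_samples + 1):
        frac = covered / n_samples
        # Gamma-weighted encoding of each possible coverage fraction.
        levels.append(round((frac ** (1.0 / gamma)) * 255))
    return levels

print(edge_levels(2))  # [0, 186, 255] - 2x AA: a single intermediate step
print(edge_levels(4))  # [0, 136, 186, 224, 255] - 4x AA: three steps
```

With only one intermediate colour at 2x, whether that colour sits at the 'mathematical' average or the gamma-weighted one is exactly the difference the eye will pick up on a high-contrast edge.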
 
Chalnoth said:
Gamma is still applied via a similar curve in every video card out there. The main difference here is in the accuracy of the gamma. I have a really hard time believing that improved accuracy of gamma can significantly improve edge AA quality.
Oh well.
I can see how it could improve overall image quality (though mostly when using wildly-adjusted gamma settings...), but not edge AA quality.
The whole point is that it helps edge AA quality.
 
Galilee said:
It can't be shown in screenshots...

I thought it could be shown in screenshots; it's just that the gamma-corrected blended pixels may not match the gamma curve of another monitor. Therefore, the weighted color values may or may not make the edge AA quality visually better than a non-adjusted linear interpolation.

It was my understanding that this gamma correction was applied per pixel before the framebuffer, and would be seen in screenshots. I could be terribly wrong however. Perhaps someone with more intimate knowledge of the hardware could clarify this for us.
 