[H]OCP does Radeon9700 I.Q....

OpenGL guy said:
(to Chalnoth)...take the words of people who have said that the Radeon 9700 gives better AA quality than the GeForce 4 (of course, you won't).

The 9700's AA is clearly superior to everything in the consumer market today. Until I got this card, nothing had surpassed (or even tied) the 4x RGSS of the Voodoo 5 (granted, it was too slow to be of much use) - this does, and noticeably (I still have a V5 rig). The fact that Chalnoth cannot accept this doesn't seem to degrade the AA quality one bit.

So how's about that forced 32 bit feature OGg?
 
Mize said:
So how's about that forced 32 bit feature OGg?
It's a non-trivial task to force 32-bit...

Most games that don't support 32-bit are rather old, which means they use rather dated rendering techniques (LOCK being a big one). If the application were to do a lock, we would have to do several blts to convert the data to 16-bit, let the app do its thing, then convert back to 32-bit. Since this will likely be happening on a per-frame basis (and who's to say they won't do more than one lock per frame... it happens), it doesn't lend itself to good performance. Of course, this is just one example of where forcing 32-bit would be an issue.
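To put a rough shape on it, here's a sketch (not actual driver code, just an illustration, assuming an X8R8G8B8 surface being presented to the app as R5G6B5) of the kind of conversion that would have to run on every lock and then again on every unlock:

[code]
/* Hypothetical sketch of forcing a 32-bit surface behind a 16-bit app:
 * convert down before the app's lock, convert back up after it unlocks.
 * Doing this once or more per frame is pure overhead. */
#include <stdint.h>
#include <stddef.h>

/* X8R8G8B8 -> R5G6B5, done every time the app locks the surface */
void blt_32_to_16(const uint32_t *src, uint16_t *dst, size_t pixels)
{
    for (size_t i = 0; i < pixels; i++) {
        uint32_t p = src[i];
        dst[i] = (uint16_t)(((p >> 8) & 0xF800) |   /* top 5 bits of red   */
                            ((p >> 5) & 0x07E0) |   /* top 6 bits of green */
                            ((p >> 3) & 0x001F));   /* top 5 bits of blue  */
    }
}

/* R5G6B5 -> X8R8G8B8, done again when the app unlocks */
void blt_16_to_32(const uint16_t *src, uint32_t *dst, size_t pixels)
{
    for (size_t i = 0; i < pixels; i++) {
        uint16_t p = src[i];
        uint32_t r = (p >> 11) & 0x1F, g = (p >> 5) & 0x3F, b = p & 0x1F;
        /* replicate high bits into low bits to fill out 8 bits per channel */
        dst[i] = ((r << 3 | r >> 2) << 16) |
                 ((g << 2 | g >> 4) << 8)  |
                  (b << 3 | b >> 2);
    }
}
[/code]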

Another thing is that if the game uses GDI to draw things (like text or menus) then you'll have problems as well...

These sorts of issues can make for a support nightmare... which is why we're still looking at other options. We don't want to risk the driver stability which we worked so hard to achieve by creating more problems for ourselves.
 
I would expect gamma to have a big effect on edge anti-aliasing: if the gamma is set wrong, the number of colors available to anti-alias with goes down, giving you bigger jumps between transition steps. The gamma has to be corrected to the monitor's output to maximize the number of viewable colors. I could be wrong, but that makes sense to me.
 
Chalnoth said:
Interesting. It appears that the 9700 applies gamma settings at pixel out (apparently with a decent curve...), as opposed to applying gamma at RAMDAC out. It makes sense that this would result in a higher-quality final image, and is, quite possibly, the only way to do higher-precision gamma when using a 32-bit framebuffer.

But it still has little to nothing to do with edge AA quality.

AA has everything to do with gamma. There's this test image of a bunch of AA'd lines radiating out from a point to form a circle. I wish I had it handy so you could see for yourself.
 
Mize said:
Chalnoth, I think you're a bit confused about Hyp-X's explanation.
Perhaps a re-read would be prudent.

Mize

Hyp-X's explanation is obvious. "Gamma-corrected" AA is done so that it averages final output values (i.e. it averages colors that you would see on the screen, not colors stored in the framebuffer).

Additionally, if you go back and look at the equations, you'll realize that if a gamma value of 1 is selected, then "gamma-corrected" AA will look exactly like non-corrected AA.
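A rough sketch of what I mean (my own illustration, assuming the display response is a simple power curve):

[code]
/* Sketch of the idea being discussed, not anyone's actual resolve code:
 * average the samples in linear-light space rather than in framebuffer
 * space. With gamma == 1.0 the pow() calls are the identity and this
 * collapses to a plain average, which is the point made above. */
#include <math.h>

/* a, b are framebuffer color values in [0,1]; gamma is the display gamma */
double blend_plain(double a, double b)
{
    return (a + b) / 2.0;
}

double blend_gamma_corrected(double a, double b, double gamma)
{
    double lin_a = pow(a, gamma);          /* to linear light     */
    double lin_b = pow(b, gamma);
    double lin   = (lin_a + lin_b) / 2.0;  /* average real light  */
    return pow(lin, 1.0 / gamma);          /* back to framebuffer */
}
[/code]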

This is what I was trying to say: There should be a gamma setting on every video card where the FSAA would still be "gamma correct." By adjusting the gamma wildly on my GeForce4, I was unable to find a gamma setting that didn't result in showing the "dotted line" artifacts.

If anybody with a Radeon 9700 wants to prove the point, try this.

Take a screenshot of a game in wireframe mode (Morrowind is ideal...the command in the console is simply "twf"). For comparison's sake, 2x FSAA would be ideal. Before exiting the game, see how apparent the "dotted line" effects are, if you can see any.

After exiting the game, the final screenshot may show different "dotted line" effects (If the Radeon 9700 is indeed doing it optimally, it should look a bit worse...in the screenshot...since in-game gamma and desktop gamma are usually different). Please try to adjust the gamma with a photo-editing program to make it look closer to the way it looks in-game, if necessary.

If the final shot can offer significantly less "dotted line" artifacts than I see on my system, I will be convinced.

However, currently I am very skeptical that these particular artifacts will be solved all that well. That is, consider this: a white line on a black background (as shown by Hyp-X). The final output value of the "wider" portion of the line should always be halfway in between the black and the white when seen on the screen. Without proper gamma correction, it may not be. But even with proper gamma, a double-width segment at half brightness may not look the same as a single-width segment at full brightness, so it seems that the "dotted line" artifact should still result.
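For concreteness, here's roughly how that halfway case works out under an assumed monitor gamma of 2.2 (my own numbers, just an illustration); whether the eye then sees the wide half-brightness segment as matching the narrow full-brightness one is the separate question I'm raising:

[code]
/* Back-of-the-envelope check of the 50%-coverage case: a white line over
 * black, one pixel covered half by line and half by background.
 * (My own illustration, assuming a monitor response of roughly v^2.2.) */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double gamma = 2.2;

    /* plain framebuffer average of black (0.0) and white (1.0) */
    double plain = (0.0 + 1.0) / 2.0;            /* stored: 0.50        */
    double plain_light = pow(plain, gamma);      /* seen: ~0.22 of full */

    /* gamma-corrected average: average the light, then re-encode */
    double corrected = pow((0.0 + 1.0) / 2.0, 1.0 / gamma); /* stored: ~0.73 */
    double corrected_light = pow(corrected, gamma);         /* seen:   0.50  */

    printf("plain:     stored %.2f, on screen %.2f of full brightness\n",
           plain, plain_light);
    printf("corrected: stored %.2f, on screen %.2f of full brightness\n",
           corrected, corrected_light);
    return 0;
}
[/code]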
 
Chalnoth, somehow I think that no matter what anyone shows you, you still won't believe it. I don't know where I'd get that idea. Guess I'm funny that way. :rolleyes:
 
If the final shot can offer significantly less "dotted line" artifacts than I see on my system, I will be convinced.

And therein lies the rub.

Somehow, I get the idea that Chalnoth's idea of "significant" is a tad bit different from the mainstream....at least when it comes to someone having an "advantage" over the current nVidia implementation...
 
Someday you guys will learn you just simply can't argue with someone that has absolutely no clue about what they are talking about.

It's just not prudent to have the product sitting right in front of you, describing a definite and distinct variance, when there is someone else that has never, ever even been in the same county as one arguing against it.

Just go look at the various V5, 8500 and other non-NVIDIA product threads where the person in question has argued.

Hell, even when given the GF3/GF4, somehow he invents variances where none exist, or argues against variances where they are clearly illustrated... provided it is in favor of a single IHV (guess which)...

[Edit- and thanks for the GF4 4x->8x shots. They clearly illustrate what I already proved with benchmarks and screenshots in another thread, where Chalnoth simply tried to rationalize away the reality of the situation. And no, the setting IS taking effect, as benchmarking 4x/8x shows a distinct and VERY measurable change in performance]
 
Joe DeFuria said:
And therein lies the rub.

Somehow, I get the idea that Chalnoth's idea of "significant" is a tad bit different from the mainstream....at least when it comes to someone having an "advantage" over the current nVidia implementation...

Joe, if Hyp-X's screenshot is correct, then the difference should be extremely apparent.
 
It must be 'bag on Chalnoth day'. He says a lot of poop that deserves bagging on, but his inability to recreate the gamma diminishing act isn't necessarily one of them.

It does stand to reason that you should be able to gamma correct an image with AA'd lines to even out the intensity of the lines on a GF4 (or any video card). Instead of bagging on him, why not help him see why what he's doing doesn't work.

By the way chalnoth, I'm not following you here:

The final output value of the "wider" portion of the line should always be halfway in between the black and the white when seen on the screen. Without proper gamma correction, it may not be. But even with proper gamma, a double-width segment at half brightness may not look the same as a single-width segment at full brightness, so it seems that the "dotted line" artifact should still result.

Why is it that if you have a proper gamma correction the dotted line artifact should still result? (assuming it makes the 'average intensity' of the line equal at all points on the line)
 
Mize said:
The 9700's AA is clearly superior to everything in the consumer market today. Until I got this card, nothing had surpassed (or even tied) the 4x RGSS of the Voodoo 5 (granted, it was too slow to be of much use) - this does, and noticeably (I still have a V5 rig).
Wow, it actually surpasses your V5 in quality? I am even more inclined to get me one, but sadly most of the games that I have are kinda old and support only 16-bit (some even 8-bit alone), so a 9700 is a no-no... for now.

It also makes you wonder if this is the reason the V5 had such good anti-aliasing, thanks to the gamma settings it provided. :-?
 
RussSchultz said:
Why is it that if you have a proper gamma correction the dotted line artifact should still result? (assuming it makes the 'average intensity' of the line equal at all points on the line)

Actually, I'm not necessarily certain that the "dotted line" artifact should still be apparent, at least not from a distance.

That is, it shouldn't be apparent if a line of twice the width with half the brightness looks just as bright as the narrower, brighter portion of the line, from a fairly large distance, of course. Looking up-close is meaningless here.

To tell you the truth, I don't know what would be optimal in this situation. What does seem apparent, however, is that if plain gamma correction, as suggested by Hyp-X, is done, and it doesn't get rid of the "dotted line" problem, then a different algorithm should be used for line AA than for FSAA/edge AA.
 
Well, it's a slider that adjusts the gamma quite smoothly from very dark to very bright (and updates in realtime). Unless the real "1.0 gamma" setting is much darker than the lowest in-game gamma setting, then I should most definitely have seen the lines "smooth out," as it were. I will admit that that's not out of the question...as brighter gamma settings show very noticeable bit depth loss.
 
On the image that Hyp-X put up, I could adjust the gamma so that at a distance I did not notice any dashed line artifact.

Of course, at that setting, everything else was pretty washed out (or at least 'different' than I'm used to).

Which points to the benefit of this gamma-corrected anti-aliasing.

Assuming you've got your monitor calibrated (yeah...right), it should pick values that average the intensity and not muck up anything else.

Of course, those screen shots might look like ass on another monitor.
 
I could too, Russ. This is the main reason why I'd really like to see a wireframe shot from the 9700.

I suggested Morrowind because of its reasonably large polycounts: nearly any shot will result in a large array of angles for the lines, but few shots will have too many triangles to make out individual lines. Of course, outdoor scenes work perfectly for this.
 
Chalnoth said:
Additionally, if you go back and look at the equations, you'll realize that if a gamma value of 1 is selected, then "gamma-corrected" AA will look exactly like non-corrected AA.

Actually, no. There are two gamma values we are dealing with here: the currently set gamma value, and the gamma value the monitor needs in order to produce linear results. You need to take both into account. Currently the drivers just assume that the gamma should be roughly 2.2 (or maybe 2.4, I don't know), and read the current gamma in (or possibly assume 1.0).
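Roughly what I mean (just a sketch, assuming the gamma ramp is the usual v^(1/gamma) and the monitor's response is a simple power curve); note that it's the combined exponent, not the slider value alone, that decides when the corrected and uncorrected averages coincide:

[code]
/* Illustration of the two-gamma point (my own sketch, not driver code).
 * A framebuffer value v is shaped by the user's gamma ramp (roughly
 * v^(1/user_gamma)) and then by the monitor's response (roughly
 * x^monitor_gamma), so the light reaching the eye is about
 * v^(monitor_gamma / user_gamma). A gamma-correct blend has to use
 * that combined exponent. */
#include <math.h>

double blend_two_gammas(double a, double b,
                        double user_gamma,     /* current gamma setting   */
                        double monitor_gamma)  /* ~2.2 for a typical CRT  */
{
    double exponent = monitor_gamma / user_gamma;  /* framebuffer -> light */
    double light = (pow(a, exponent) + pow(b, exponent)) / 2.0;
    return pow(light, 1.0 / exponent);             /* light -> framebuffer */
}
[/code]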

Hold on, I'll add more material in a minute. :)
 
A topic on the subject at OpenGL.org:

http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/007176.html

The topic starter is an ex-SGI employee; if you visit opengl.org frequently you'd know that he has very deep and broad skills. He's very excited about the feature, and for good reason, of course. Fortunately he has loads of data to back up his claims for all the non-believers. :)
A very interesting read, certainly recommended.

For anyone still not converted ... I'll add an illustrative example in a minute ... hold on :)
 
Thanks, Humus, that was a really good read.

And it just makes me want to see a wireframe shot of the Radeon 9700 with 2x FSAA all the more....anybody?

Update: Argh, shoot. I just tried to take a screenshot in Morrowind, and the shot taken didn't show the FSAA at all. I wonder if the same will happen with a Radeon 9700? If so, that shoots the idea of using Morrowind out the window...but Quake3 does have a wireframe mode (I doubt I remember the command correctly, but I think it was r_show_tris 1 or something similar).

Anyway, the gist of what I read there was this:

Most computers use a hardware gamma of plain 1.0, and, by some miracle, this looks proper to the human eye. However, when displayed on the monitor, the actual brightness is not linear. A gamma of approximately 2.5 would produce a linear brightness on most monitors. The idea is to transform into this "linear brightness" space, do the FSAA averaging, and then go back.

However, it is an absolute must to do this type of gamma calculation at higher than 8-bits per color precision for it to look good.
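Here's a quick toy test (my own, with an assumed gamma of 2.5 as above, not anything from that thread) showing why an 8-bit intermediate isn't enough for the round trip:

[code]
/* Toy demonstration of why the to-linear, average, back-from-linear round
 * trip needs more than 8 bits of precision: quantizing the linear-light
 * intermediate to 8 bits collapses many of the dark framebuffer codes onto
 * the same value, i.e. visible banding. A float (or >8-bit) intermediate
 * preserves all 256 codes. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double gamma = 2.5;           /* monitor gamma assumed in the post above */
    int distinct = 0, prev = -1;

    for (int v = 0; v < 256; v++) {
        /* framebuffer -> linear light, quantized to 8 bits */
        int lin8 = (int)floor(255.0 * pow(v / 255.0, gamma) + 0.5);
        /* back to framebuffer space */
        int back = (int)floor(255.0 * pow(lin8 / 255.0, 1.0 / gamma) + 0.5);
        if (back != prev) {
            distinct++;
            prev = back;
        }
    }
    printf("distinct codes surviving an 8-bit round trip: %d of 256\n",
           distinct);
    return 0;
}
[/code]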
 