MSAA and alpha textures

Initially, the ATI Radeon 9700 Pro was rumored to be able to antialias alpha textures despite using multisample antialiasing.
Well, I was quite disappointed to learn that not only was this feature unavailable at product launch, but even if it had been, it would only have worked in OpenGL. Many current games make heavy use of alpha textures, and I imagine they will continue to be used.

A few quick questions:

1) Why is it possible on the 9700 in OpenGL only, and how is it done?

2) Why have we not seen a graphics chip that is able to both AA alpha textures and use multisample AA?

3) What are the chances that NV30 might pull this off?

Regards,

Lincoln
 
One way to antialias alpha textures is to set the alpha test threshold very low and convert the alpha value into a sample coverage mask instead of using alpha blending. That is, if alpha is high, all samples get written; if it's 50%, only half of the samples are used, and so on.
The result is actually not very different from alpha blending. It's lower quality, but faster, since it needs no read-modify-write cycles.
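
The idea can be sketched in a few lines. This is purely illustrative (my own simplification, not how any particular chip implements it): the pixel's alpha value is quantized to a number of covered samples, which becomes a per-sample write mask.

```python
# Illustrative sketch of alpha-to-coverage: convert a pixel's alpha value
# into a sample coverage mask for multisampling. High alpha -> all samples
# written, 50% alpha -> half the samples, low alpha -> none.

def alpha_to_coverage(alpha, num_samples=4):
    """Return a bitmask with round(alpha * num_samples) sample bits set."""
    covered = round(alpha * num_samples)
    mask = (1 << covered) - 1   # set the lowest `covered` bits
    return mask

print(bin(alpha_to_coverage(1.0)))   # 0b1111 - fully opaque
print(bin(alpha_to_coverage(0.5)))   # 0b11   - half the samples
print(bin(alpha_to_coverage(0.1)))   # 0b0    - effectively transparent
```

Real hardware would also vary which sample bits get set (not always the lowest ones), but the quantization step is the essential point.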

This method is part of the GL_ARB_multisample extension, so it should be supported on other existing chips, too.
I don't like this one very much, because I think alpha blending is better quality-wise (though this method is order-independent, with a few flaws, which is at least better than alpha blending, so it could be forced by the driver).

Another way would be to enable supersampling when an application uses alpha test. It's the better way imho.

Either way, I can see some people complain about blurry fonts in their games ;)
 
Well, it would be possible to effectively enable super-sampling when the alpha test is enabled without inducing a fillrate hit: just perform the alpha test multiple times (filtering only the alpha value, probably at 8-bit accuracy, once for each sample) and write only those pixel sub-samples that pass.
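
A rough sketch of that idea (purely illustrative, not a description of any actual hardware): filter the alpha channel at each sub-sample position, test each one against the threshold, and build a write mask from the samples that pass.

```python
# Sketch of a per-sample alpha test: instead of one pass/fail decision per
# pixel, the alpha value is filtered at each sub-sample position and tested
# individually, and only passing samples are written.

def per_sample_alpha_test(sample_alphas, threshold=0.5):
    """sample_alphas: alpha filtered at each sub-sample position.
    Returns a bitmask of the samples that pass the alpha test."""
    mask = 0
    for i, a in enumerate(sample_alphas):
        if a > threshold:
            mask |= 1 << i
    return mask

# Alpha filtered at four sub-sample positions along a cutout edge:
print(bin(per_sample_alpha_test([0.9, 0.7, 0.4, 0.1])))  # 0b11: two samples pass
```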

Btw, blurry fonts generally don't happen with an alpha blend because the texture samples are about the same size as screen pixels. So, you essentially get anti-aliased text, not blurry text. Blur would only happen if the texture pixels are larger than screen pixels.
 
Chalnoth said:
Well, it would be possible to effectively enable super-sampling when the alpha test is enabled without inducing a fillrate hit: just perform the alpha test multiple times (filtering only the alpha value, probably at 8-bit accuracy, once for each sample) and write only those pixel sub-samples that pass.
Well, that would still mean sampling the textures multiple times (and you cannot read alpha separately) and doing all the shader calculations involved. So there is a fillrate hit. And if the shader uses swizzling... you get the point. There are quite a few problems with this approach.

Btw, blurry fonts generally don't happen with an alpha blend because the texture samples are about the same size as screen pixels. So, you essentially get anti-aliased text, not blurry text. Blur would only happen if the texture pixels are larger than screen pixels.
I put that ;) there for a reason. Though you're right, people tend to call this "blurry", and it's really not always better.

btw, I am using ClearType on my tft display, and I really like it :D
 
Xmas said:
Well, that would still mean sampling the textures multiple times (and you cannot read alpha separately) and doing all the shader calculations involved. So there is a fillrate hit. And if the shader uses swizzling... you get the point. There are quite a few problems with this approach.

Not necessarily. While it would definitely be somewhat lower-quality, you could just do different weighted averages on the exact same texture data instead.

And, of course, it would definitely fail whenever PS were used, making this hardware only good for older games, essentially. At the same time, as long as all channels are calculated independently, it should be possible to implement multiple pipelines for 8-bit alpha calculations, one for each sample supported with multisample AA. But... this is probably even more unlikely.

But...with respect to the 9700, I believe I remember an ATI employee stating that it supports the "AA'd alpha" through an OpenGL extension (Don't remember which...), so what I was stating is mostly theoretical, and will probably never come to fruition.
 
Chalnoth said:
But...with respect to the 9700, I believe I remember an ATI employee stating that it supports the "AA'd alpha" through an OpenGL extension (Don't remember which...), so what I was stating is mostly theoretical, and will probably never come to fruition.

It works, but in some OpenGL games like Half-Life, that extension is used for a completely different purpose. Sorry, but the extension has slipped my mind... something about making hoagies for 5 hours that kills the memory LOL
 
Hopefully Dave will touch this topic in his 9700 follow up article, cause this is my major concern about the 9700.

Would like to know whether ATI plans to add SSAA to the drivers. That would kinda fix the problem, probably with a bigger performance hit, though...
 
There are several good reasons for avoiding alpha test at all nowadays - most notably because it can cause problems for Z bandwidth reduction features.

With the geometry rates of the R9700, in many cases you might be as well off doing the same thing with geometry rather than alpha test. Of course, there are cases where alpha test is still useful.
 
From Dave's article. He mentions the OpenGL call.
It's previously been mentioned that ATI's Multisampling scheme had a method of applying Anti Aliasing to Alpha textures, a potential issue that Multi Sampling FSAA overlooks. It transpires that R300 supports the OpenGL 1.2 alpha coverage mask call, which can be used to convert Pixel Shader output to anti-aliased coverage. If this were forced on then there could be some issues with older titles, such as Half Life, which already use this function for different purposes. At the moment ATI have not included an option to force this via the driver but may still at some point. There is presently no DirectX equivalent as yet so this would be specific to OpenGL titles at the moment anyway.
 
Let's see if I understand it correctly. (Correct me if I'm wrong.)

Say the application does an alpha test with a 50% cutoff.
Say we do 2x AA.
With simple MSAA:
- if the pixel alpha is below 50% the coverage is 0 (none of the 2 samples affected)
- if the pixel alpha is above 50% the coverage is 2 (both 2 samples affected)

With coverage masks we could use 0, 1 and 2 coverage at most with this level of AA.
So how does it do it? It needs new cutoff values, doesn't it?

Say:
- pixel alpha is below 33% --> coverage 0
- pixel alpha is between 33% and 67% --> coverage 1
- pixel alpha is above 67% --> coverage 2

This works - but the problem is that it's not so hard to create a texture that has 40% alpha at its transparent areas - that would look really bad with this method.

I'm pretty sure that this method would introduce artifacts.
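
The quantization described above can be sketched directly (an illustrative toy, not anyone's actual implementation): with 2x AA, alpha snaps to the nearest of three coverage levels, and a texture with 40% alpha in its "transparent" regions lands on coverage 1, i.e. half-opaque instead of invisible.

```python
# Sketch of alpha quantized to coverage levels with 2x AA: possible
# coverages are 0, 1, or 2 samples, with thresholds at ~33% and ~67%.

def coverage_level(alpha, num_samples=2):
    """Snap alpha to the nearest whole number of covered samples."""
    return round(alpha * num_samples)

print(coverage_level(0.20))  # 0 -> invisible, as intended
print(coverage_level(0.40))  # 1 -> half the samples written: the artifact
print(coverage_level(0.80))  # 2 -> fully opaque
```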
 
Alpha to Coverage is supported on GF3/4, too. As those cards have a maximum of 4 samples per pixel, NVidia uses dithering to increase the number of possible transparency levels to 16. I'm not quite sure if that's a good idea...
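
A sketch of how such dithering might work (my guess at the idea, not NVidia's actual algorithm): a per-pixel dither offset is added before quantizing alpha to a sample count, so neighboring pixels round the same alpha up or down differently and the eye averages them into intermediate levels.

```python
# Illustrative dithered alpha-to-coverage: 4 samples per pixel plus a 2x2
# ordered-dither offset gives 4 * 4 = 16 apparent transparency levels.

DITHER = [0.0, 0.5, 0.25, 0.75]  # hypothetical 2x2 ordered-dither offsets

def dithered_coverage(alpha, x, y, num_samples=4):
    offset = DITHER[(y % 2) * 2 + (x % 2)]
    return min(num_samples, int(alpha * num_samples + offset))

# 40% alpha over a 2x2 pixel block: coverage alternates between 1 and 2
# samples, averaging 1.5/4 = 37.5% instead of snapping to one level.
print([dithered_coverage(0.4, x, y) for y in range(2) for x in range(2)])  # [1, 2, 1, 2]
```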

If anyone's interested, I wrote a little multisample demo today which you can download from http://www.samx.de.vu/Multisample.exe

It's an OpenGL app that requires support for
WGL_ARB_pixel_format
WGL_ARB_multisample
GL_ARB_multisample
It has only been tested on GF3/4 cards; I don't know if it runs on other cards. You have to set antialiasing to 4x in the driver in order to get correct rendering; the application cannot force this.
 
Well, from what I gather, using "alpha coverage" cannot possibly produce better results than straight alpha blending.
 
Chalnoth said:
Well, from what I gather, using "alpha coverage" cannot possibly produce better results than straight alpha blending.
As I wrote in my first post in this thread, alpha to coverage is worse in quality (actually much worse, alpha blending edges are "perfect" as long as textures only get minified, not magnified).
However, alpha to coverage is order independent as long as you don't put multiple transparent layers above each other.
 
Xmas said:
alpha to coverage is worse in quality (actually much worse, alpha blending edges are "perfect" as long as textures only get minified, not magnified)

It seems to me that alpha textures ARE commonly magnified....
 
Only in older games, Althornin. Half-life, for example, has such low texture quality that at high resolution, it's hard for any texture not to be magnified.

A "newer" game like Unreal Tournament will usually minify alpha textures (quick note: as far as I'm aware, there are no 1024x1024 alpha textures in UT, even with the compressed textures pack).

The only glaring problem that I can see going into the future will be foliage that is meant to be moved through (i.e. a realistic army-type game where the soldiers crawl through the grass) or hidden behind (bushes). And even then, I have yet to see an alpha-tested texture that didn't look terrible when viewed up close. I still fail to see why just doing the alpha blend would look worse.

As for depth ordering, it seems to me that it shouldn't be all that hard to use an alpha test for z-writing if the game seems to have a problem with depth-ordering.
 
Chalnoth said:
I still fail to see why just doing the alpha blend would look worse.

As for depth ordering, it seems to me that it shouldn't be all that hard to use an alpha test for z-writing if the game seems to have a problem with depth-ordering.
I also don't think alpha blending would look worse, and I'd prefer to have the choice between alpha test (faster, sufficient with supersampling) and alpha blending (better quality).

But I don't know what you mean by using an alpha test for z writing. The problem with alpha blending is that the result written to the frame buffer depends on the destination pixel color. So when you're rendering an alpha-blended polygon, everything that's behind it must already be rendered. You cannot "insert" something behind it afterwards.
This is usually done by rendering the opaque polygons first, and then rendering a depth-sorted list of transparent objects. Intersecting transparent polygons will always be rendered incorrectly this way, so you must avoid them.
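
The usual draw order described above can be sketched like this (`DrawItem` is a hypothetical stand-in for an engine's draw-list entry, not any real API): opaque geometry first, then transparent objects sorted back-to-front by distance from the camera.

```python
# Sketch of opaque-first rendering with back-to-front sorted transparency.

from dataclasses import dataclass

@dataclass
class DrawItem:
    name: str
    depth: float        # distance from the camera
    transparent: bool

def draw_order(items):
    """Opaque items first, then transparent items sorted farthest-first."""
    opaque = [i for i in items if not i.transparent]
    transparent = sorted((i for i in items if i.transparent),
                         key=lambda i: i.depth, reverse=True)
    return opaque + transparent

scene = [DrawItem("wall", 10.0, False),
         DrawItem("window", 5.0, True),
         DrawItem("smoke", 8.0, True)]
print([i.name for i in draw_order(scene)])  # ['wall', 'smoke', 'window']
```

This is exactly where intersecting transparent polygons break down: no per-object sort order is correct when two transparent polygons pass through each other.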

Oh, and regarding this sentence I wrote above:
"However, alpha to coverage is order independent as long as you don't put multiple transparent layers above each other."
This means that you don't have to render alpha polygons after all opaque polygons, because a2c does not depend on the color of the destination samples. To be truly able to put several transparent layers above each other however, you need a massive amount of samples (16+) and a pseudo-random alpha to coverage algorithm that differs per triangle and per pixel.
 
Xmas said:
But I don't know what you mean by using an alpha test for z writing. The problem with alpha blending is that the result written to the frame buffer depends on the destination pixel color.

Right... you can't make it look "right" this way, but you can at least make it render something. This way, if your engine isn't particularly good at sorting transparent polys, or there are some intersecting polys, the parts behind won't just disappear entirely. They won't look entirely correct, but it should look better than having them disappear completely.

Of course, this would likely cost even more performance than just using a more rigorous sorting algorithm, but I suppose it might be done in one pass in some pixel shader versions...
 
Actually, I mentioned that same paper previously as an option.

It turns out that it requires a separate pass for each layer of transparency:

The essence of what happens with this technique is that with n passes over a scene, we can get n layers deeper into the scene.

In other words, it's very expensive to implement for a game.

But I've been doing some thinking, and it may well be possible to use a variant of alpha blending that works with destination alpha to offer a limited form of order-independent transparency for alpha textures. Perhaps I'll post it in more detail after I've done some more thinking.
 