Would it be wrong to say AF is the AA for textures?

My assumption was that the hardware would be designed to deal with it; it wasn't meant for present hardware.
 
OpenGL guy said:
Here's why this stuff doesn't work well. Imagine doing texkill instead of alpha test. End result is the same if you use texkill to kill texels based on alpha value. Now, if you say the driver should enable SSAA in this case, then I say, "What if no pixels are killed? You're incurring a huge cost for no reason!" Of course, this applies to alpha test as well.
New APIs should support switching, because the application knows best when it's necessary to calculate several samples per pixel. Not only for alpha test, but also for high-frequency shaders. For older games, many people would be happy if they could get rid of the annoying alpha test shimmering, even if that means a performance hit (and it would improve texture quality too, btw). Something like
if alpha_test and not alpha_blending: enable supersampling
would be far from perfect, but it would help.
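
To make that concrete, here is a minimal C++ sketch of such a per-draw heuristic. Everything in it is hypothetical (RenderState and chooseAAMode are invented names, not any real driver's interface); it only illustrates the "alpha test without blending, so supersample" rule above.

// Hypothetical snapshot of the render states a driver could inspect
// at draw time; these names are invented for illustration only.
struct RenderState {
    bool alphaTestEnabled;
    bool alphaBlendEnabled;
    bool msaaRequested;   // user enabled multisampling in the control panel
};

enum class AAMode { None, Multisample, Supersample };

// Decide per draw call, not per pixel: a per-pixel decision would already
// require shading all the extra samples, defeating the purpose.
AAMode chooseAAMode(const RenderState& rs) {
    if (!rs.msaaRequested)
        return AAMode::None;
    // Alpha test without blending is where MSAA fails to smooth the
    // cut-out edges, so pay for supersampling only on those draws.
    if (rs.alphaTestEnabled && !rs.alphaBlendEnabled)
        return AAMode::Supersample;
    return AAMode::Multisample;
}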

What I thought you folks were talking about was a dynamic (i.e. hardware based) MSAA -> SSAA transition. However, even if you assume it's the driver that makes the switch from MSAA to SSAA and back, I still think you are making some very large assumptions. That doesn't mean the problems can't be solved, but there's a lot more involved than just some driver tweaks.
What would be the big difference between a hardware-based and a driver-based decision? Both could use the same criteria, i.e. other render states. You cannot decide per pixel, because that would require doing all the supersampling calculations anyway.

Even on today's hardware, the driver could decide to use supersampling (via multisample masking) for alpha-tested polygons. Very costly, but it may be worth it.
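
In modern OpenGL terms (the GL entry points came later than this discussion), the multisample-masking idea could look roughly like the following C++ sketch. glSampleMaski and GL_SAMPLE_MASK are the real GL 3.2 names; applySubpixelJitter and drawAlphaTestedGeometry are hypothetical helpers standing in for the projection offset and the actual draw calls.

#include <glad/glad.h>  // any loader exposing GL 3.2+ works

// Hypothetical helpers: nudge the projection matrix by the subpixel
// offset of sample s, and issue the alpha-tested draw calls.
void applySubpixelJitter(int s, int sampleCount);
void drawAlphaTestedGeometry();

// Supersample only the alpha-tested polygons on multisample hardware:
// render them once per sample, masking rasterization to a single sample
// each pass so every sample gets an independently evaluated alpha test.
void drawAlphaTestedSupersampled(int sampleCount) {
    glEnable(GL_SAMPLE_MASK);
    for (int s = 0; s < sampleCount; ++s) {
        glSampleMaski(0, 1u << s);            // write only sample s
        applySubpixelJitter(s, sampleCount);  // shift view by s's offset
        drawAlphaTestedGeometry();
    }
    glDisable(GL_SAMPLE_MASK);
}

The cost is exactly what makes it "very costly": sampleCount full passes over the alpha-tested geometry, with one shaded sample surviving per pass.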

We certainly need shader antialiasing in the future. Gradient instructions might help; switchable supersampling, on the other hand, pretty much always works and is rather simple, but costly.
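
To show what a gradient instruction buys for the alpha-test case, here is the math written out as plain C++ (a real shader would obtain alphaGradient from a gradient instruction such as fwidth; the function and its inputs are an illustrative sketch, not any API):

#include <algorithm>
#include <cmath>

// Turn the hard alpha-test threshold into a smooth, roughly pixel-wide
// edge. 'alphaGradient' stands in for the screen-space rate of change of
// alpha that a gradient instruction would supply per pixel.
float smoothAlphaCoverage(float alpha, float alphaGradient, float threshold) {
    // Transition band width from how fast alpha changes across the pixel;
    // clamp to avoid division by zero on flat regions.
    float w = std::max(std::abs(alphaGradient), 1e-6f);
    // Equivalent to smoothstep(threshold - w, threshold + w, alpha).
    float t = std::clamp((alpha - (threshold - w)) / (2.0f * w), 0.0f, 1.0f);
    return t * t * (3.0f - 2.0f * t);  // coverage in [0,1] instead of a hard cut
}

The returned coverage would then be used for blending, which is what brings in the ordering problem discussed further down.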
 
OpenGL guy said:
Here's why this stuff doesn't work well. Imagine doing texkill instead of alpha test. End result is the same if you use texkill to kill texels based on alpha value. Now, if you say the driver should enable SSAA in this case, then I say, "What if no pixels are killed? You're incurring a huge cost for no reason!" Of course, this applies to alpha test as well.

What I thought you folks were talking about was a dynamic (i.e. hardware based) MSAA -> SSAA transition. However, even if you assume it's the driver that makes the switch from MSAA to SSAA and back, I still think you are making some very large assumptions. That doesn't mean the problems can't be solved, but there's a lot more involved than just some driver tweaks.
Automatically enabling the switchable supersampling (say, whenever alpha test is enabled) may not be a good way to implement things on the driver side, but it would certainly be possible.

The biggest problem, of course, is that the vast majority of software relies upon video card drivers to enable FSAA, offering no options within the program itself. This sort of technology would work best when enabled by the software.

And of course there would be a performance hit, but sometimes it just isn't possible to use a smooth step function. Replacing the alpha test with a smooth step means switching to alpha blending, and alpha blending is no longer order independent. Obviously using a smooth step function is better where possible, but it may also produce too much overhead: depending on the scene, alpha blending may be untenable if there are a large number of blended surfaces that need to be sorted.
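
For a sense of that sorting cost, a minimal C++ sketch (Surface and viewDepth are invented for illustration):

#include <algorithm>
#include <vector>

// Once alpha blending replaces alpha test, draw order matters, so the
// translucent surfaces must be re-sorted back to front every frame.
struct Surface {
    float viewDepth;  // distance from the camera, recomputed per frame
    // ... mesh, material, etc.
};

void sortBackToFront(std::vector<Surface>& translucent) {
    // O(n log n) per frame, and intersecting surfaces cannot be ordered
    // correctly by any sort, which is why many blended surfaces can be
    // untenable, as noted above.
    std::sort(translucent.begin(), translucent.end(),
              [](const Surface& a, const Surface& b) {
                  return a.viewDepth > b.viewDepth;  // farthest first
              });
}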
 