What's happened about texture AA?

Shifty Geezer

Having been given a Radeon 9600 to upgrade my Ti4200, I thought I'd try out GuildWars to see what difference I could get with AA and framerate. The jaggy textures were of course still apparent, and suddenly I remembered talk of nVidia doing texture AA (transparency AA) to remove that. I'm pretty sure it was announced with the 7800. So, is it being used in games? Is it a PS3 only thing, or can XB360 do it too, or can neither do it, or is it too darned expensive to use? I find transparency jaggies look even more out of place in AA'd screens, and am wondering why they're not being removed.
 
They should both be able to do it; all nVidia did was add it to the driver. I would guess that any dev who wanted that feature could code it themselves, the same as any other AA algorithm.

How taxing it would be is another question, one that I can't answer.
 
They should both be able to do it; all nVidia did was add it to the driver. I would guess that any dev who wanted that feature could code it themselves, the same as any other AA algorithm.
That's the correct answer.

What the engine needs is to support alpha to coverage; in other words, supersample the alpha channel of the texture and then distribute the obtained data to the MSAA buffers. I think Humus has a demo of that tech on his site.
 
Alpha to coverage does not involve supersampling. It just converts the alpha value (calculated as usual) into a coverage mask that masks out writes to subsamples.
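To make that concrete, here's a minimal sketch (illustrative only, not any vendor's actual hardware logic) of turning a single per-pixel alpha into a 4x MSAA coverage mask:

Code:
/* Illustrative sketch: convert a fragment's single alpha value into a
   4-sample coverage mask. The shader still runs once per pixel; alpha
   just decides how many subsample writes survive. */
#include <stdint.h>

uint32_t alpha_to_coverage_4x(float alpha)
{
    /* Round alpha in [0,1] to the nearest count of covered subsamples. */
    int covered = (int)(alpha * 4.0f + 0.5f);
    if (covered < 0) covered = 0;
    if (covered > 4) covered = 4;

    /* e.g. alpha = 0.5 -> covered = 2 -> mask = 0011b: writes land in
       subsamples 0 and 1 only; the MSAA resolve then softens the edge. */
    return (1u << covered) - 1u;
}

I believe real hardware also dithers the mask between neighbouring pixels so intermediate alpha values don't band, but the principle is the same.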
 
Alpha to coverage does not involve supersampling. It just converts the alpha value (calculated as usual) into a coverage mask that masks out writes to subsamples.
I need to read about it, then.
Not that I was ever interested in the technique in the first place, but from what I gathered skim-reading threads about it, I was sure that it did involve some form of supersampling.
 
I need to read about it, then.
Not that I was ever interested in the technique in the first place, but from what I gathered skim-reading threads about it, I was sure that it did involve some form of supersampling.
I think there were two methods: the lower-quality one used alpha to coverage, and the higher-quality one used supersampling. I'm not sure of the details.
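As I understand it, the difference would look roughly like this (a sketch under my own assumptions; tex_alpha_at() is a made-up stand-in for a texture fetch):

Code:
/* Transparency multisampling derives the mask from one alpha value per
   pixel (see the alpha_to_coverage_4x sketch above). Transparency
   supersampling runs the alpha test independently at each subsample,
   costing one texture fetch per subsample. Illustrative only. */

/* Made-up stand-in for sampling a texture's alpha channel: a round leaf. */
static float tex_alpha_at(float u, float v)
{
    float du = u - 0.5f, dv = v - 0.5f;
    return (du * du + dv * dv < 0.25f) ? 1.0f : 0.0f;
}

unsigned coverage_supersampled(const float u[4], const float v[4],
                               float threshold)
{
    unsigned mask = 0;
    for (int s = 0; s < 4; ++s)
        if (tex_alpha_at(u[s], v[s]) > threshold)
            mask |= 1u << s;   /* this subsample passes the alpha test */
    return mask;
}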
 
Having been given a Radeon 9600 to upgrade my Ti4200, I thought I'd try out GuildWars to see what difference I could get with AA and framerate. The jaggy textures were of course still apparent, and suddenly I remembered talk of nVidia doing texture AA (transparency AA) to remove that. I'm pretty sure it was announced with the 7800. So, is it being used in games? Is it a PS3 only thing, or can XB360 do it too, or can neither do it, or is it too darned expensive to use? I find transparency jaggies look even more out of place in AA'd screens, and am wondering why they're not being removed.
I'm not sure about any limitations on the 9600, but I could enable transparency AA on a Radeon x700 just fine. I'm not sure if it matters, but I use the venerable ATI Tray Tools to wrestle with my driver settings. I gave up on ATI's own control panel efforts a long time ago.

The more efficient solution on the game side would be blending or explicit alpha-to-coverage. Forcing selective supersampling on a game externally may work, but it's pretty wasteful and wrong engineering™.
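For reference, explicit alpha-to-coverage is a one-line state toggle in OpenGL (GL_SAMPLE_ALPHA_TO_COVERAGE, core since 1.3); the sketch below assumes an MSAA framebuffer is already bound, and draw_fences() is a made-up placeholder:

Code:
/* Enable alpha-to-coverage around the alpha-tested geometry. Requires
   GL 1.3+ headers (on Windows, GL/glext.h as well). Sketch only. */
#include <GL/gl.h>

void draw_fences(void);   /* placeholder: chain-link fences, foliage, etc. */

void draw_alpha_tested_pass(void)
{
    glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE);
    draw_fences();
    glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE);
}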
 
So this is a common feature that's not too demanding? Any particular reason it's not being used, and FSAA'd screens are still blighted with texture jaggies?
 
A lot of 360 games use what appears to be something similar to transparency AA, although they do it by simply applying a stipple-based effect that, on the edges, resembles the way the Saturn did transparency.
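That stipple approach is essentially screen-door transparency; something like this (an illustrative sketch, not code from any actual title):

Code:
/* Screen-door ("stipple") transparency sketch: compare alpha against a
   4x4 ordered-dither threshold and keep or kill whole pixels instead of
   blending. Illustrative only. */
#include <stdbool.h>

static const float bayer4[4][4] = {
    { 0.0f/16,  8.0f/16,  2.0f/16, 10.0f/16},
    {12.0f/16,  4.0f/16, 14.0f/16,  6.0f/16},
    { 3.0f/16, 11.0f/16,  1.0f/16,  9.0f/16},
    {15.0f/16,  7.0f/16, 13.0f/16,  5.0f/16},
};

bool stipple_keep_pixel(int x, int y, float alpha)
{
    /* Higher alpha keeps more pixels of each 4x4 tile; at a distance the
       pattern averages out, much like the Saturn's mesh transparency. */
    return alpha > bayer4[y & 3][x & 3];
}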
 