Why AA doesn't get the respect it probably deserves

I'm going to be pedantic and point out that AF is a form of AA.
I'm trying to understand what you mean here: if a signal (texture) has been correctly prefiltered and if the mip map selection scheme is also correctly implemented then AF shouldn't be regarded as an AA process, right?
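To pin down what "correctly implemented mip map selection" usually means, here's a minimal sketch of the standard isotropic LOD formula (the notation is mine, not from this thread):

```latex
% rho: screen-space texture footprint (texels per pixel), taken from the
% larger of the two screen-axis derivatives; lambda: the selected mip level,
% so that roughly one texel maps to one pixel.
\rho = \max\!\left(
  \sqrt{\left(\frac{\partial u}{\partial x}\right)^{2} + \left(\frac{\partial v}{\partial x}\right)^{2}},\;
  \sqrt{\left(\frac{\partial u}{\partial y}\right)^{2} + \left(\frac{\partial v}{\partial y}\right)^{2}}
\right),
\qquad
\lambda = \log_{2}\rho
```

With λ chosen from the larger of the two footprints, minification shouldn't alias in the first place; trilinear just over-blurs elongated footprints, and AF then recovers sharpness along the major axis rather than acting as an extra AA pass.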
 
IMO AA does more for visual quality than AF by a noticeable margin. Shimmering and crawling are really annoying.

Those are results of underfiltering, meaning that some driver is skimping on basic texture antialiasing (otherwise known as trilinear filtering) when the application specifically calls for it.

Unoptimized AF does NOT create any shimmering or crawling; if side-effects like that appear, it means that either the developer doesn't have a clue how to use proper MIP-mapping or there's shader aliasing showing up. For the first, not even Supersampling can help (best available texture AA thingy by far IMO), and for the latter only an insane amount of shader AA within the application could save the day.

Given the topic here I don't see any developers seriously bothering with shader AA anyway.
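For reference, here's roughly what "the application specifically calls for it" looks like in Direct3D 9 terms. This is a minimal sketch; the sampler stage 0 and the anisotropy level passed in are illustrative assumptions.

```cpp
#include <d3d9.h>

// Minimal sketch of an application requesting trilinear + anisotropic filtering.
// Sampler stage 0 and the maxAniso value are assumptions for illustration.
void RequestTrilinearAniso(IDirect3DDevice9* device, DWORD maxAniso)
{
    // Anisotropic minification and linear magnification...
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);

    // ...plus LINEAR mip filtering: this is the "trilinear" part that blends
    // between adjacent mip levels instead of snapping to the nearest one.
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

    // Upper bound on how many anisotropic probes the hardware may take.
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, maxAniso);
}
```

If a driver profile quietly degrades that mip filter behind the application's back, that's the kind of underfiltering being complained about above.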
 
Unoptimized AF does NOT create any shimmering or crawling; [...]

By shimmering I meant edge/polygon shimmering, not texture shimmering.
 
I still think there's no reason devs shouldn't code their games so that control panel AA works. They have been doing so for years. Games like R6 Vegas, STALKER etc. really frustrate me with their lack of AA support.

Sure, games were compatible with control panel AA for a long time. They also didn't do HDR or post-processing or any of the many other techniques that require the primary scene rendering to go somewhere other than the backbuffer; nor did they use complex shaders or lots of lights, which are what make deferred rendering techniques important.

Control panel AA worked in the past because there was only one rendertarget that needed AA: the backbuffer. With app-controlled AA, the game tells the driver which surfaces to apply AA to, so the memory and performance hit only land on surfaces that really benefit from AA. With control-panel AA, the driver has to guess which rendertargets matter -- how do you do that? It could apply AA to every rendertarget, but that would waste tons of memory and performance for no gain, or it could apply it only to the backbuffer, which doesn't work if the primary scene rendering goes to some other rendertarget.
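To make the "game tells the driver which surfaces get AA" part concrete, here's a rough Direct3D 9 sketch of app-controlled MSAA on a single rendertarget with an explicit resolve. The size, the D3DFMT_A8R8G8B8 format and the 4x sample count are assumptions, and the matching multisampled depth/stencil surface is omitted for brevity.

```cpp
#include <d3d9.h>

// Sketch of app-controlled AA: only the one surface that benefits gets
// multisampling, and the app resolves it explicitly before any post-processing.
HRESULT RenderSceneWithMsaa(IDirect3DDevice9* device,
                            IDirect3DSurface9* backBuffer,
                            UINT width, UINT height)
{
    IDirect3DSurface9* msaaColor = NULL;

    // The game, not the driver, decides that this rendertarget gets 4x MSAA.
    HRESULT hr = device->CreateRenderTarget(width, height, D3DFMT_A8R8G8B8,
                                            D3DMULTISAMPLE_4_SAMPLES, 0,
                                            FALSE, &msaaColor, NULL);
    if (FAILED(hr))
        return hr;

    device->SetRenderTarget(0, msaaColor);
    // ... draw the main scene into the multisampled target here ...

    // Switch back to the backbuffer, then do the explicit resolve; later
    // passes (post-processing, UI) run at one sample per pixel.
    device->SetRenderTarget(0, backBuffer);
    hr = device->StretchRect(msaaColor, NULL, backBuffer, NULL, D3DTEXF_NONE);

    msaaColor->Release();
    return hr;
}
```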

Control panel AA was a useful hack for when AA was still novel enough that game devs couldn't be expected to support it, and for legacy games using APIs that didn't even have AA. But it's been widely supported and available for a very long time now; can we please stop relying on this crutch?

Count me in with the won't-play-it-if-it-doesn't-have-AA crowd.
 
HDR is no excuse for not supporting AA; look at Oblivion, Far Cry, Test Drive Unlimited etc. I'm also guessing Crysis is very likely to support AA.
 
While I'm not quite in the "no AA = trash" camp, I am very sold on AA, and I think a lot of this debate revolves around education of the average consumer. I'll wager most people buy graphics cards expecting them to take their games to XXXX vaunted resolution, not to apply effects like AF and AA. Even amongst the diehard gamers I've been preaching to for years, only one or two have really paid attention to these details. So I think devs take some of the flak when their game forces the noobs and the clueless to game at lower-than-anticipated resolutions. However, you and I know that for most games, AA at a lower resolution is nicer looking than a higher resolution -- especially for those system-killers that bring you down to 10x7 or (gasp!) 800x600.

So the solution here would seem to be educating moron consumers (or waiting for the baseline hardware to be so fast at AA that it doesn't matter). I really would like to see some major devs take an extra couple of man-days to include some tooltips -- a brief tutorial-style demo would be awesome, but even just some long tooltips about what each setting in the Video Options page does. What it really, really does... this might really enlighten a lot of people. Of course, with all the multiplatform games, and console gamers by and large not knowing anything about AA or AF...

[edit: the other solution, I guess, is grassrootsier. Sites like B3D are invaluable in helping people understand. When I first started coming here religiously, it was to find out more about different AF implementations... preach it in the streets! No new graphics! Just better stories and more AA! ;)]
 
One option is just to never have AA default to on under any circumstances, and have it be a checkbox option that requires manual activation, with a popup warning about the performance loss.

Maybe even require it to be enabled in the ini file, so only people who know what they are doing can turn it on.
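For what it's worth, a tiny sketch of that ini-file gate; the section, key and file name are invented for illustration.

```cpp
#include <windows.h>

// Hypothetical ini gate for MSAA. GetPrivateProfileIntA returns the supplied
// default (0 = AA off) when the key or the file is missing, so shipping
// without the entry keeps AA off unless someone deliberately adds it.
int ReadMsaaSamplesFromIni()
{
    return (int)GetPrivateProfileIntA("Video", "MSAASamples", 0, ".\\game.ini");
}
```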
 
Many of the graphics old-timers here can confirm how enthusiastic I have been about AA.

However, most of the PC games I have bought over the past couple of years were bought because I was primarily intrigued by the graphics. Besides AA, one of the most expensive 3D features in most of the latest games is shadows. Disregarding the fact that one of these two 3D features is more important to a developer (it's a major design decision made very early, and the game's entire gameplay may depend on it), I personally have found myself having absolutely no hesitation in preferring to enable the highest shadow mode available in the game and disabling any AA to get better performance.

A number of people here have stated they wouldn't play any game without enabling AA. I actually find this hard to believe in this age of enhanced shader effects, but this is just my purely subjective opinion. Which is to say, the current shader and texture effects are generally lousy, or not good enough, for you to be able to ignore aliasing. I will leave gameplay out of this even though gameplay probably has a larger influence.

To be honest, I don't actually know what I'm trying to get across. I'm a little troubled by the fact that I have pushed AA back in terms of my priorities, but I cannot deny that I look out more for great textures, great shadows and neat shader effects than I do for aliasing. Which probably leads to the philosophical question: do you actually look out for aliasing, or is aliasing simply naturally intrusive to you?
 
Oddly enough I'll kill off shadows entirely in a heartbeat to get enough performance for a decent AA level. :LOL:

Different strokes for different folks I reckon.

EDITED BITS: Aliasing is intrusive as hell for me personally, I don't go looking for it so much as it jumping out and smacking me in the face.
 
Reverend, the issue is we aren't getting great effects, lighting, shadows or textures to justify the design decision not to support AA. We are just getting very mediocre or dated graphics that don't support AA.
 
HDR is no excuse for not supporting AA, [...]
Oh yes it is, IMO. This combination is possible now, but if it wasn't, it goes right back to my original post: why did devs implement HDR if it couldn't be used with AA at that time? To give everyone a choice? No.
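To put the "possible now" part in concrete terms, here's a hedged Direct3D 9 sketch of the check a game could do before offering HDR and MSAA together. The FP16 format and the 4x level are assumptions; whether this call succeeds was exactly the hardware-dependent part of the HDR-vs-AA argument.

```cpp
#include <d3d9.h>

// Ask the runtime whether the FP16 surface format commonly used for HDR
// rendering can be multisampled on this adapter (assumed 4x level).
bool SupportsHdrWithMsaa(IDirect3D9* d3d, D3DDEVTYPE devType, BOOL windowed)
{
    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, devType,
                                                 D3DFMT_A16B16G16R16F, windowed,
                                                 D3DMULTISAMPLE_4_SAMPLES,
                                                 &qualityLevels);
    return SUCCEEDED(hr);
}
```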
 
Reverend, the issue is we aren't getting great effects, lighting, shadows or textures to justify the design decision not to support AA. [...]
AA is supported by most developers nowadays (and has been for the past few years, actually). Menus in a lot of games allow you to enable/disable/specify AA. That means support for AA by developers in my book. You have sidetracked the thread, which wasn't about developers supporting AA (see what I wrote in this post; they do); it's about why developers should care about something they cannot control, why you think they should, what you think can be done to minimize the trouble AA presents to developers, and why you think developers should not use any kind of feature that would mean lower-than-satisfactory performance if AA must be applied.

Many more pixels than 1600x1200 without any AA may render this thread obsolete but we're not there yet, right? That's why I said this really is a big open question! :)
 
They can control it: some games support AA, but only the AA selectable through the in-game menu, and control panel AA has no effect. TrackMania United, for example. The problem with this is that you only get AA on the edges the developers choose, which sucks. I want to enable FULL SCENE AA as I deem necessary. IMO devs should just design their games so that control panel AA works. That gives the user complete control over how many samples they want, and devs don't have to worry about supporting the various formats of various video cards. It's up to the user to determine how much performance he wants to sacrifice.

I'm not saying devs should scale back features so games run faster with AA; I'm saying their games shouldn't be incompatible with AA.
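One way a developer could still hand over the "how many samples" choice without guessing at card-specific formats: enumerate what the installed hardware supports and expose every level in the menu. A rough Direct3D 9 sketch; the 2..16 range, the HAL device type and the backbuffer format parameter are assumptions.

```cpp
#include <d3d9.h>
#include <vector>

// Enumerate the multisample counts the adapter actually supports for a given
// backbuffer format, so the AA menu isn't a hard-coded list.
std::vector<D3DMULTISAMPLE_TYPE> EnumerateMsaaLevels(IDirect3D9* d3d,
                                                     D3DFORMAT backBufferFormat,
                                                     BOOL windowed)
{
    std::vector<D3DMULTISAMPLE_TYPE> supported;
    for (int samples = 2; samples <= 16; ++samples)
    {
        D3DMULTISAMPLE_TYPE type = (D3DMULTISAMPLE_TYPE)samples;
        if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT,
                                                      D3DDEVTYPE_HAL,
                                                      backBufferFormat, windowed,
                                                      type, NULL)))
        {
            supported.push_back(type);
        }
    }
    return supported;
}
```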

I think you'd need at least a 10-fold increase in resolution-to-monitor-size scaling over current LCDs before AA becomes useless.
 