Too much AA?

If it looks like it's been smeared with Vaseline, then it's too much.


It's something I see more and more with SweetFX shaders... people tend to overuse it (often just to create screenshots, or maybe they really like playing at 5 fps). It reminds me of the first 3D scenes we made with AutoCAD in the '90s (they looked totally unrealistic, as if everything were made of plastic).

Now, looking at the screenshots used in this post, I can already tell you where the problem lies. If you compare the far vegetation, no AA will give an impression of sharpness (especially with the LOD applied). The vegetation is so far away that you will not see the aliasing, but that aliasing will give you a feeling of sharpness. More AA doesn't always lead to a sharper impression on far objects; the edges of objects far away from you will always look sharper without AA, simply because the aliased edges are proportionally bigger at that distance (at least as long as the object isn't a simple line, like an electric cable).

Never use far-range vegetation as a comparison point for AA. For far vegetation that already has a LOD applied to its texture, no AA means you will still see the edges of the vegetation (the texture borders are defined somewhere, and those are the aliased edges), so it will look sharper... Don't forget that the far environment normally uses a different LOD and is subject to effects like DOF, etc.
 
Blurring is caused by excessive filtering removing high-frequency details present in the scene being rendered. However, filtering is just one component of anti-aliasing, so a statement such as "too much AA leads to blurring" strikes me as over-simplified and somewhat misleading.
 
Depends on what the AA is. If you're doing rotated-grid SSAA (call it sparse grid if you wish, but most often your samples will be in the "rotated" position), then this AA is better than sex.

You're really adding information, not smearing things. I used it a decade ago (albeit at 800x600 on an old CRT, and occasionally at 1024x768 for testing, or for an old game that could run on a Voodoo1).
This lets you see the counter-terrorists at a distance much more clearly. You see something realistic instead of a pixel-popping mess (even MSAA helps with this). A lot more physical detail.

You could apply a negative texture LOD bias, so ground textures, walls, etc. become crazy sharp (they get noisy if you push it too far). That gave me a substitute for AF, and even bilinear filtering was very bearable (my card didn't support AF and didn't really support trilinear).

And anti-aliasing on the 1-bit transparencies (fences, and that vegetation): well, better with than without. That's why you may have a "transparency supersampling" option that selectively applies supersampling. The worst offenders are power lines: there's no point in keeping them "sharp" if they constantly pop in and out of existence. You'd better AA them.

Also, on an nvidia card you may try running an old Direct3D game with the "3x3" supersampling (dumb, ordered grid); it's inefficient but magnificent.
So I think there's no such thing as too much supersampling (or too much MSAA, though I don't really see a difference between 4x and 8x MSAA; it gets useless, and the noise it doesn't correct is still there).
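The "dumb" ordered-grid supersampling mentioned above can be sketched as a plain box downsample: render at 3x the resolution on each axis, then average each 3x3 block. This is only a toy illustration over grayscale values, not how a driver actually implements it:

```python
def downsample_3x3(hi_res, w, h):
    """Box-filter a (3*w) x (3*h) image (rows of pixel values)
    down to w x h by averaging each 3x3 block."""
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            block = [hi_res[3 * y + dy][3 * x + dx]
                     for dy in range(3) for dx in range(3)]
            row.append(sum(block) / 9.0)
        out.append(row)
    return out

# A hard white/black edge that lands inside a block gets an
# intermediate coverage value instead of a jagged step:
hi = [[1.0 if x < 4 else 0.0 for x in range(6)] for _ in range(3)]
print(downsample_3x3(hi, 2, 1))  # left pixel 1.0, right pixel 3/9
```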

With post-processing AA and all these "clever" techniques, you probably don't want to go too far. Their point is to make the image look less bad at a small cost. So they are about removing noise while adding no information, or the least amount of information possible (such as a combination of 2x real AA with clever techniques). An extremely dumb version of that was 2x Quincunx AA over a decade ago; everyone hated it, but I found it was good in Doom 3 (not much elsewhere, unless you wanted that blur).
 
The question sometimes arises: what's better, more resolution, or more anti-aliasing at a lower resolution?
Perhaps if you choose that incorrectly then you can be accused of having too much anti-aliasing.

But if you have spare processing power and you're at the maximum resolution, I think you more often benefit from adding more samples, especially if you start considering motion blur as AA over time.
 
The question sometimes arises: what's better, more resolution, or more anti-aliasing at a lower resolution?
Perhaps if you choose that incorrectly then you can be accused of having too much anti-aliasing.

But if you have spare processing power and you're at the maximum resolution, I think you more often benefit from adding more samples, especially if you start considering motion blur as AA over time.

I can answer that as I game at 2560x1600.

With hard polygon edges you can get away for the most part with 2xMSAA, but some games do require 4xMSAA to completely get rid of all the aliasing.

Transparent objects, on the other hand, can still require up to 8xTrSSAA to completely get rid of all the jaggies in certain games, even at 2560x1600.
 
There's no such thing as too much antialiasing, IMHO. Supersampling should come with a negative LOD offset proportional to the number of samples used. With 4x SSAA, typically -1.0 is used, which gives output comparable to 2xAF in terms of texture sharpness, but not more.
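The -1.0 figure quoted for 4x SSAA matches a common rule of thumb: under grid supersampling, each axis gains sqrt(N) resolution, so the mip bias can be shifted by -log2(sqrt(N)) = -0.5 * log2(N). A quick sanity check of that formula (the rule of thumb is an assumption here, not something stated elsewhere in the thread):

```python
import math

def ssaa_lod_bias(samples):
    # Each axis gains sqrt(samples) resolution under grid SSAA,
    # so mip selection can be shifted by -log2(sqrt(samples)).
    return -0.5 * math.log2(samples)

print(ssaa_lod_bias(4))   # -1.0, the value quoted above for 4x SSAA
print(ssaa_lod_bias(16))  # -2.0
```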

The unfortunate thing with today's antialiasing algorithms is that they aren't adaptive, as anisotropic filtering algorithms have become in recent years. Instead of selectively applying as many AA samples as each polygon edge/intersection actually needs, the algorithms just apply all requested samples to all surfaces, regardless of whether they need them or not.

Granted, Pixar's rendering is an entirely different chapter of its own, but there's a reason why they use up to 64x stochastic AA, and temporal AA as well, for their work. Otherwise there's no such thing as too much AA. It's mostly hardware and resource restrictions that keep us away from high-sample AA:

1. AA algorithms aren't adaptive.
2. High sample counts consume a large portion of the memory footprint and bandwidth; multisampling to a far smaller degree than supersampling, but that still doesn't make 16x MSAA affordable even under today's conditions.
3. Although the cost scales linearly with increasing sample counts, the differences become less noticeable as sample counts increase. For example, you'll notice the difference between 2x and 4xAA far more easily than between 8x and 16xAA (and no, I obviously don't mean in static screenshots).

Maybe in the distant future, when desktop architectures become in some way more micropolygon-oriented, more flexible antialiasing algorithms might appear with them.

***edit: and no, as already mentioned above, aliasing doesn't disappear with increased resolution; transparencies are one of those cases that remain noticeable, and for the rest (always depending on the display-size vs. resolution ratio) the jaggies just get smaller, yet don't disappear, at increased resolutions. Consider that Apple still uses font antialiasing on its latest iPads at 2048*1536 spread over a 10" screen. Desktop monitors don't have 4:3 ratios anymore, but that isn't far from the 2560*1600 typically found on 30" desktop displays these days.
 
The question sometimes arises: what's better, more resolution, or more anti-aliasing at a lower resolution?
Perhaps if you choose that incorrectly then you can be accused of having too much anti-aliasing.

But if you have spare processing power and you're at the maximum resolution, I think you more often benefit from adding more samples, especially if you start considering motion blur as AA over time.

More resolution is useless if the PPI doesn't change. Hence 2560x1600 on a 30" monitor is virtually no different from 1600x1200 on a 20" monitor. The jaggies will be the same and AA will be needed regardless. Likewise, 2560x1600 on a 30" monitor is only very slightly better than 1920x1200 on a 24" monitor with regard to pixel density.

Resolution only starts to matter and help with perceived rendering artifacts (aliasing) when resolution increases while monitor size remains constant, i.e. increased PPI. And even then it doesn't resolve all rendering artifacts, although it may make some of them less noticeable to the naked eye.
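The pixel-density comparison above is easy to check numerically, since PPI is just the diagonal pixel count divided by the diagonal size in inches:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, d in [(2560, 1600, 30), (1600, 1200, 20), (1920, 1200, 24)]:
    print(f"{w}x{h} @ {d}\": {ppi(w, h, d):.1f} PPI")
# 2560x1600 @ 30" and 1600x1200 @ 20" both land at ~100 PPI,
# while 1920x1200 @ 24" is only slightly lower at ~94 PPI.
```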

Just stating "more resolution" is a rather meaningless statement. :)

Regards,
SB
 
"real" temporal AA means downsampling a framerate so it's a lot more applicable to a CGI movie (and perhaps a real life shooting of a movie or video)
In theory you could probably render a game at 240fps and composite the frames at 60fps, a the cost of some latency. You'd make a demo or a quake 3 hack of this. The real and easier solution is to use a 120Hz display.

120Hz IPS displays, even non-3D ones, are still missing. As for display bandwidth, I've calculated that if you can display 3840x2160 at 60Hz, you have enough for 2560x1600 at 120Hz.
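That bandwidth claim checks out on raw pixel rates (ignoring blanking intervals, which add some overhead in real display timings):

```python
modes = {
    "3840x2160 @  60 Hz": 3840 * 2160 * 60,
    "2560x1600 @ 120 Hz": 2560 * 1600 * 120,
}
for name, rate in modes.items():
    print(f"{name}: {rate / 1e6:.1f} Mpixels/s")
# 3840x2160 @ 60 Hz needs ~497.7 Mpixels/s,
# 2560x1600 @ 120 Hz needs ~491.5 Mpixels/s - slightly less.
```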

There's no such thing as too much antialiasing, IMHO. Supersampling should come with a negative LOD offset proportional to the number of samples used. With 4x SSAA, typically -1.0 is used, which gives output comparable to 2xAF in terms of texture sharpness, but not more.

I used -1.5 with 4x AA and that was the sweet spot (on that particular DX6/Glide dual-GPU card made in the year 2000, with no AF). -1 seems the theoretically perfect spot, and what you use for the guarantee of no added noise.
 
The question sometimes arises: what's better, more resolution, or more anti-aliasing at a lower resolution?
Of course it depends on the comparison, e.g. 16xAA vs. 50% more pixels,

but 4x the pixels vs. 4xAA?
Give me the pixels all the way.

iPhone 3 & 480x320 & 4xAA
vs.
iPhone 4 & 960x640 & 1xAA

which one are you gonna choose? :)
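Worth noting (my own observation, not something stated in the post): the two options shade exactly the same number of samples per frame, so the comparison really isolates "where do the samples go" rather than raw cost:

```python
# (width, height, AA samples per pixel)
iphone3 = (480, 320, 4)
iphone4 = (960, 640, 1)

cost3 = iphone3[0] * iphone3[1] * iphone3[2]
cost4 = iphone4[0] * iphone4[1] * iphone4[2]
print(cost3, cost4)  # 614400 614400 - identical sample counts
```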
 
"real" temporal AA means downsampling a framerate so it's a lot more applicable to a CGI movie (and perhaps a real life shooting of a movie or video)
In theory you could probably render a game at 240fps and composite the frames at 60fps, a the cost of some latency. You'd make a demo or a quake 3 hack of this. The real and easier solution is to use a 120Hz display.

120Hz IPS displays, non-3D if be it, are still missing. As for the display bandwith I've calculated that if you can display 3840x2160 60Hz, you have enough for 2560x1600 120Hz.

The "temporal AA" modes in older ATI cards weren't aimed at addressing temporal aliasing. All they did was alternate 2x sample patterns every frame with idea being that display persistence would provide you with an effective 4x MSAA rate when the alternating frames were blended together by the display.
 
Yes, I could have made this clear.
It wasn't really great. I'd say it relied on your eyesight to blend the frames as well. A dirty hack. But I read somewhere on this board that it's used in at least one member of the FXAA/SMAA/TXAA etc. family, such as using 2x MSAA + 2x "temporal" + clever processing (aware of what is being done, and doing what's good psycho-visually speaking).

Of course it depends on the comparison, e.g. 16xAA vs. 50% more pixels,

but 4x the pixels vs. 4xAA?
Give me the pixels all the way.

iPhone 3 & 480x320 & 4xAA
vs.
iPhone 4 & 960x640 & 1xAA

which one are you gonna choose? :)

960x640 at 4xAA :)
The thing is, contrary to popular wisdom, AA only looks better at higher resolution. Have good AA and a high resolution, and then you'll be playing bullshots.

But as to your choice, the second one is the obvious one, given that the higher-res display is more versatile. But lower res with AA can possibly clear some of the noise you will get with high res and no AA: higher-res noise.
When I found myself for some reason running a good 14" CRT from 1992 or '91 and a graphics card that did RGSSAA, I would choose 640x480 with 4xAA over 1024x768 without AA (the monitor would give me 85Hz rather than 70Hz, admittedly; then again, arbitrarily choosing resolutions is a thing of the past, as are GPUs that only support supersampling).
 
Agreed on high AA with high resolution...

I play most of my games at 2560x1600 with 12xEdge Detect+12xTrSSAA and the IQ is to die for!!!

I can even manage Crysis at those settings....
 
But as to your choice, the second one is the obvious one, given that the higher-res display is more versatile. But lower res with AA can possibly clear some of the noise you will get with high res and no AA: higher-res noise.
When I found myself for some reason running a good 14" CRT from 1992 or '91 and a graphics card that did RGSSAA, I would choose 640x480 with 4xAA over 1024x768 without AA (the monitor would give me 85Hz rather than 70Hz, admittedly; then again, arbitrarily choosing resolutions is a thing of the past, as are GPUs that only support supersampling).

Careful, CRTs were quite different animals compared to current monitors; for one thing, aliasing is slightly less apparent on the former.

I'm still using a gigantic 21" CRT for my home desktop; max resolution is 2048*1536*32 @ 75Hz. The trick in that case is that at that resolution it's overriding the mask on one axis, and I virtually get 2x oversampling from the monitor for free.

It's clearly a matter of taste, but even on a CRT, where I still have the choice over a good range of resolutions, I'd still pick the highest possible resolution with the highest possible amount of multisampling + AF. DPI is something supersampling cannot replace.

Overall there's no clear answer to such questions, because besides taste there are quite a few variables that play a role.
 
Careful, CRTs were quite different animals compared to current monitors; for one thing, aliasing is slightly less apparent on the former.

I'm still using a gigantic 21" CRT for my home desktop; max resolution is 2048*1536*32 @ 75Hz. The trick in that case is that at that resolution it's overriding the mask on one axis, and I virtually get 2x oversampling from the monitor for free.

It's clearly a matter of taste, but even on a CRT, where I still have the choice over a good range of resolutions, I'd still pick the highest possible resolution with the highest possible amount of multisampling + AF. DPI is something supersampling cannot replace.

Overall there's no clear answer to such questions, because besides taste there are quite a few variables that play a role.

What monitor is that? I wish I could get a stupidly high-resolution CRT monitor.
 