Anti-aliasing theory

Hi folks, I've heard a theory about AA which says that the more resolution you have, the less AA you'll need. If that's true, does it mean we'll eventually reach a resolution where AA isn't needed at all?

Thanks in advance.
 
It's kind of a silly question. AA will improve the quality of any image, regardless of resolution.

The key thing to realize about any image you see with AA and AF is that if you disable them and increase the resolution until performance drops back to the same level, it will look far, far worse than when they were enabled at the lower resolution.
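
(A rough back-of-the-envelope sketch of the cost side of that trade-off, in Python, treating the AA as brute-force 4x supersampling (actually the most expensive way to do it) and assuming shading cost scales with the number of shaded samples; the resolutions are just example numbers.)

# Rough cost comparison: 4x supersampled 1080p vs. plain 4K.
# Assumes shading cost is roughly proportional to the number of shaded
# samples, ignoring bandwidth, AF cost, etc. -- illustration only.

def shaded_samples(width, height, samples_per_pixel=1):
    return width * height * samples_per_pixel

ssaa_1080p = shaded_samples(1920, 1080, samples_per_pixel=4)  # 4x SSAA
native_4k = shaded_samples(3840, 2160)                        # no AA

print(ssaa_1080p, native_4k)  # 8294400 8294400: the same sample count,
# but the AA image filters those samples per pixel instead of relying
# on the pixels themselves being tiny.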
 

I thought the smaller the pixels get, the less noticeable the jaggies are, right?
 
Yes, but viewing distance matters. If you sit 50 cm from a 15" display, you should sit 1 m from a 30" display for the extra pixels to help against aliasing in the same way. Many people use larger screens with more pixels to get a wider viewing angle instead.
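
(A small Python sketch of that geometry, assuming both hypothetical displays have the same 1920x1080 pixel count and a 16:9 shape: what matters to the eye is the angular size of one pixel, and doubling both the diagonal and the viewing distance keeps it the same.)

import math

def angular_pixel_size_deg(diagonal_in, h_pixels, distance_cm, aspect=(16, 9)):
    # Angle subtended by one pixel at the given viewing distance.
    aw, ah = aspect
    width_cm = diagonal_in * 2.54 * aw / math.hypot(aw, ah)
    pitch_cm = width_cm / h_pixels
    return math.degrees(2 * math.atan(pitch_cm / (2 * distance_cm)))

print(angular_pixel_size_deg(15, 1920, 50))   # 15" display at 50 cm
print(angular_pixel_size_deg(30, 1920, 100))  # 30" display at 1 m: same angle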

Anyway, the human eye can perceive aliasing artifacts even if the rendered resolution exceeds the resolution of the eye. And many modern AA methods are less hardware-demanding (and more effective) than rendering at a higher resolution :)
 

I'm not certain about the perceptual aspects of aliasing, but no: aliasing won't go away with higher resolution. You're only shifting the cut-off frequency upwards (above which you only get noise - the "crawlies"). That doesn't help with edges, for which you'd theoretically need an infinite sampling frequency to avoid aliasing completely.
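
(A tiny 1D illustration of that fold-over in Python; the sample rate and frequencies are arbitrary example values. Anything above half the sampling rate doesn't disappear, it comes back as a lower frequency.)

import math

fs = 10.0                        # samples per unit; Nyquist limit is 5
ts = [n / fs for n in range(10)]

hi = [math.sin(2 * math.pi * 7 * t) for t in ts]  # 7 cycles/unit, above Nyquist
lo = [math.sin(2 * math.pi * 3 * t) for t in ts]  # its 3 cycles/unit alias

# The two signals produce identical samples (up to sign), so after
# sampling they cannot be told apart: the 7 has "folded" down to a 3.
print(all(abs(a + b) < 1e-9 for a, b in zip(hi, lo)))  # True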
 

I agree that aliasing won't go away altogether, BUT the higher the resolution on a display of the same (comparative) size, the less noticeable aliasing becomes. Generally, the higher the resolution (again, without increasing the display size), the smaller the pixel pitch and the finer the "displayable details".
 
It is true that aliasing is reduced at higher resolutions, but that doesn't mean higher resolution solves every problem.

We can pretty much test this by rendering a high-contrast scene without any mipmaps on the textures and seeing if it looks good when printed or displayed at very high resolution (say 2400 dpi).
It should look better than at a normal resolution, but the fact is that without pre-filtering of the textures we get a lot of aliasing even at ridiculous resolutions.
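
(A minimal 1D sketch of that test in Python, with made-up numbers: a stripe "texture" far finer than the output resolution, point-sampled versus averaged over each pixel's footprint as a crude stand-in for mipmapping/pre-filtering.)

def stripes(u, count=300):
    # 300 alternating black/white stripes across the unit interval.
    return int(u * count) % 2

def render_row(width, prefilter, sub=64):
    row = []
    for x in range(width):
        if prefilter:
            # Average 'sub' texture samples inside the pixel's footprint.
            s = sum(stripes((x + (i + 0.5) / sub) / width) for i in range(sub))
            row.append(round(s / sub, 2))
        else:
            row.append(stripes((x + 0.5) / width))  # one point sample per pixel
    return row

print(render_row(16, prefilter=False))  # erratic 0/1 values: texture aliasing
print(render_row(64, prefilter=False))  # 4x the resolution, still erratic 0/1
print(render_row(16, prefilter=True))   # values near 0.5: resolves to grey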
 
As Mintmaster said above, the main point here is that increasing the resolution beyond where pixels are individually visible is similar to adding AA, except 1) it's like supersampling in cost and 2) it's a regular grid, which is much worse than the jittered sampling patterns of MSAA. Thus for equal performance, the higher resolution case will look worse than the higher AA case.
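
(A small Python sketch of why the placement of the samples matters, not just their count: sweep a vertical edge across one pixel and count the distinct coverage levels 4 samples can produce. With an ordered 2x2 grid the samples share only two x positions; the rotated-grid offsets below are made up for illustration, not any real GPU's MSAA pattern.)

ordered_2x2  = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rotated_grid = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]

def coverage_levels(pattern, steps=1000):
    # Slide a vertical edge across the pixel; geometry covers x < edge_x.
    levels = set()
    for i in range(steps + 1):
        edge_x = i / steps
        covered = sum(1 for (sx, sy) in pattern if sx < edge_x)
        levels.add(covered / len(pattern))
    return sorted(levels)

print(coverage_levels(ordered_2x2))   # [0.0, 0.5, 1.0]: only 3 shades along the edge
print(coverage_levels(rotated_grid))  # [0.0, 0.25, 0.5, 0.75, 1.0]: 5 shades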
 
If pixels were so small that your eyes had trouble noticing patterns (i.e. jaggies), wouldn't jittered sampling provide little benefit?
 
I think you can detect patterns at a finer "resolution" than you can distinguish noise/pixels, so it would still drive that "critical resolution" down a lot and thus be more efficient than uniform sampling at a higher frequency.
 
Quote about the resolution of the human eye:
"I understand that a typical person has a maximum resolution of about 17000 point sources per inch. This doesn't really equate to pixels, but pixels can be changed into pixels per inch, and that should be close enough."
 