ATi & nVidia: Differences in Anti-Aliasing?

rigdon

Newcomer
Why is ATi's Anti-Aliasing so much better than nVidia's? I've heard people go so far as to say ATi's 4xAA is equivalent to nVidia's 8xAA :oops:. What causes this large difference in AA? Is it due to drivers or architecture?

Thanks.
 
It's mostly the sampling pattern. Not every distribution of n samples over the area of a pixel results in the same quality and smoothness of edges.
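
To make that concrete, here's a minimal sketch (the sample positions and the coverage_levels helper are illustrative, not the actual hardware patterns): sweep a horizontal edge through a pixel and count the distinct coverage fractions each 4-sample pattern can produce.

```python
# Toy comparison of two 4-sample patterns (positions are illustrative,
# not the real hardware layouts). An ordered 2x2 grid reuses the same
# two y positions; a rotated/sparse pattern has four distinct ones.

ORDERED_GRID = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
ROTATED_GRID = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def coverage_levels(pattern, steps=100):
    """Distinct coverage fractions as a horizontal edge sweeps the pixel."""
    levels = set()
    for i in range(steps + 1):
        edge_y = i / steps                              # edge position, 0..1
        covered = sum(1 for _, y in pattern if y < edge_y)
        levels.add(covered / len(pattern))
    return sorted(levels)

print("ordered:", coverage_levels(ORDERED_GRID))  # [0.0, 0.5, 1.0]
print("rotated:", coverage_levels(ROTATED_GRID))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```

The rotated pattern gives three intermediate shades along near-horizontal (and, by symmetry, near-vertical) edges where the ordered grid gives only one, and those near-axis-aligned edges are exactly where aliasing is most visible.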

Gamma correction for the sample blending is another factor.
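
A rough sketch of the gamma point (assuming a simple 2.2 power-law display response rather than the exact sRGB curve): averaging gamma-encoded framebuffer values directly makes a half-covered white-on-black edge come out too dark, while decoding to linear light before averaging keeps the blend perceptually even.

```python
GAMMA = 2.2  # simplified display response; real sRGB uses a piecewise curve

def blend_naive(samples):
    """Average gamma-encoded values directly (non-gamma-corrected AA)."""
    return sum(samples) / len(samples)

def blend_gamma_correct(samples):
    """Decode to linear light, average, then re-encode."""
    linear = [s ** GAMMA for s in samples]
    return (sum(linear) / len(linear)) ** (1.0 / GAMMA)

edge = [1.0, 1.0, 0.0, 0.0]  # 4xAA pixel half-covered by white over black
print(blend_naive(edge))          # 0.5  -> displays at ~22% intensity (too dark)
print(blend_gamma_correct(edge))  # ~0.73 -> displays at 50% intensity
```

The darkening is worst on high-contrast edges, which is why gamma-corrected AA looks noticeably smoother there.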
 
rigdon said:
I've heard people go so far as to say ATi's 4xAA is equivalent to nVidia's 8xAA :oops:. What causes this large difference in AA? Is it due to drivers or architecture?

It's certainly not equivalent.

ATI's 4xAA has better edge quality at vertical edges.
ATI's AA has better quality at high-contrast edges (due to gamma correction).
nVidia's 8xAA could give better quality at corners, but you'd need extremely (unrealistically) small triangles to show that.

nVidia's 8xAA is partially supersampled (it's 4xMS × 2xSS), which has its own pros/cons. It can remove aliasing of edges created by alpha test (trees in many games are done this way; see the sketch below). It can, however, cause blurriness of 2D screen elements (e.g. text).

ATI's 4xAA is also way faster...
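
Per-sample shading is both why the supersampled part helps alpha-tested cutouts and why it costs so much fillrate. A toy sketch of the difference, with a hypothetical alpha() standing in for a foliage texture's alpha channel: multisampling runs the shader (and thus the alpha test) once per pixel, so every covered sample inherits the same pass/fail result, while supersampling repeats the test per sample and recovers fractional coverage.

```python
def alpha(u):
    """Hypothetical 1D alpha channel: opaque for u < 0.5, cut out after."""
    return 1.0 if u < 0.5 else 0.0

OFFSETS = [-0.375, -0.125, 0.125, 0.375]  # 4 sample offsets within a pixel

def msaa_coverage(pixel_u):
    # Alpha test runs once at the pixel centre; the binary result applies
    # to all covered samples, so the cutout edge gets no gradient.
    return 1.0 if alpha(pixel_u) >= 0.5 else 0.0

def ssaa_coverage(pixel_u, pixel_size):
    # Alpha test re-evaluated at every sample: fractional coverage appears.
    passed = sum(1 for off in OFFSETS
                 if alpha(pixel_u + off * pixel_size) >= 0.5)
    return passed / len(OFFSETS)

# Pixel centred right on the cutout edge, 0.05 uv units wide:
print(msaa_coverage(0.5))        # 0.0 -> hard, aliased cutout
print(ssaa_coverage(0.5, 0.05))  # 0.5 -> smoothed edge
```

The multisampled result flips straight from opaque to cut out as the pixel crosses the threshold, which is the stair-stepping you see on alpha-tested foliage without any supersampling.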

rigdon said:
Is it due to drivers or architecture?

It's architecture.
Although ATI's hardware could do nVidia's AA mode if they wanted to.
(Gamma correction can be disabled, and the samples re-arranged.)
 