Many times I've seen claims that ATI's 4xAA is superior to Nvidia's 8xAA (or something similar). Is there any way to measure whether this is actually the case? For instance, are the Nvidia drivers simply lying to the user about what level of AA is being displayed, or using an inferior algorithm that doesn't qualify (by some accepted standard) as the reported level?
How do you measure AA? It seems to me that there is unlimited room for video card manufacturers to slap something on the screen and call it AA or FSAA of whatever level they like, if there is no way to verify their claims.
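For what it's worth, here is one crude sanity check a reviewer could script. With N-sample box-filtered AA, a pixel straddling a high-contrast edge can take on roughly N+1 distinct coverage values, so counting the distinct intermediate intensity levels along the edge gives a lower bound on the real sample count. The sketch below assumes a hypothetical screenshot ("aa_test.png") of a white near-vertical edge on a black background, rendered at the AA setting under test; gamma correction and non-box filters would complicate the count in practice.

from PIL import Image
import numpy as np

# Load the screenshot of the test scene as 8-bit grayscale.
# (The scene and filename are hypothetical, for illustration only.)
img = np.asarray(Image.open("aa_test.png").convert("L"))

levels = set()
for row in img:
    # Intensities that are neither pure black nor pure white can only
    # come from the AA filter blending partial edge coverage.
    transition = row[(row > 0) & (row < 255)]
    levels.update(transition.tolist())

# N samples permit coverage fractions 0/N .. N/N, i.e. N-1 intermediate
# levels, so the count of intermediate levels bounds the sample count.
print(f"distinct intermediate levels: {len(levels)}")
print(f"implied minimum sample count: {len(levels) + 1}")

For example, a card honestly doing 4x AA should show up to three intermediate gray levels along the edge; if an "8x" mode never produces more than three, something other than 8-sample coverage is going on.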
rms
rsquires@flash.net