Oh dear, we've gone from one extreme to the other.
First we had the framerate, and it was good... nothing else mattered, a good framerate meant the card was good.
Then reviewers looked down on the framerate, and it was bad. So, along came IQ and it was good... nothing else mattered, a good IQ meant the card was good.
Where's the balance? What good is framerate without IQ on a high-end card? What good is IQ without framerate? One of the first things reviewers should do after concluding which AA and AF methods are better is to push them in games to see which ones are actually playable. If nVidia's 8x AF is better than ATI's 16x AF, yet isn't usable in-game, that's like having the sleekest plane ever without giving it the engines it needs for take-off.
The way the article was written, a GeForce 2 could have taken on a GeForce FX in AF and they would have scored pretty close... never mind the fact that the GeForce FX can push its AF to far higher settings; no, we want a (in such cases) meaningless so-called "apples to apples" comparison. Best setting vs. second-best setting is not "apples to apples", and you should remember that 8x AF by itself means nothing. Best setting, second-best setting... these have meaning. In the end, the questions that matter are: "Who gives me better theoretical IQ? Who gives me the best playable IQ?"
Let's not lose track of what these cards are used for: gaming. 95% of players will be seeing the game from the player's point of view, and it is from THERE that these things should be compared.