#1 The 8500 images look more aliased at the far left and right.
#2 Why is the standing grass on the right missing in the 8500 screenshot?
#3 These tests are invalid until we see the two cards compared in motion.
#4 3D graphics is all about the art of cheating. Most people can't tell the subtle difference. While ATI's algorithm may be "less correct," I bet almost no one will be able to tell the difference in the vast majority of game situations.
People on this board agonize over blown-up still shots of various 3D scenes, arguing over small patches of pixels and translating those into comments like "WAY SUPERIOR." The average person, seeing the same game running side by side on two monitors, would not experience a "night and day" effect; in fact, they might be hard pressed to tell the difference at all.
Obviously, rasterization engines should adapt their texture filtering and AA sampling to the areas that need them most, and they should endeavor to be "correct" most of the time, avoiding artifacts by making that logic smart.
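For what it's worth, the standard way hardware decides which areas "need it most" for filtering is to look at the screen-space derivatives of the texture coordinates: a nearly square texel footprint gets cheap trilinear, while a stretched one (floors, roads at a shallow angle) gets more taps along its long axis. A rough sketch in C of that idea; the function name and clamping policy are mine, not any vendor's actual logic:

```c
#include <math.h>

/* Sketch: pick an anisotropic filtering degree per pixel from the
 * screen-space derivatives of the texture coordinates (du/dx, dv/dx,
 * du/dy, dv/dy). Nearly isotropic footprints need no extra taps;
 * stretched footprints get more taps along the major axis. */
int aniso_degree(float dudx, float dvdx, float dudy, float dvdy, int max_degree)
{
    float len_x = sqrtf(dudx * dudx + dvdx * dvdx); /* footprint extent along x */
    float len_y = sqrtf(dudy * dudy + dvdy * dvdy); /* footprint extent along y */
    float major = len_x > len_y ? len_x : len_y;
    float minor = len_x > len_y ? len_y : len_x;

    if (minor <= 0.0f)
        return max_degree;          /* degenerate (edge-on) footprint: clamp */

    float ratio = major / minor;    /* 1.0 == isotropic, no aniso needed */
    int degree = (int)ceilf(ratio);
    return degree > max_degree ? degree = max_degree, max_degree : degree;
}
```

So a wall seen head-on costs no more than trilinear, and the budget goes to the shallow-angle surfaces where aniso actually shows.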
My opinion is that there isn't a big enough difference to worry about. Any form of anisotropic filtering is better than plain old bilinear/trilinear. Score 1 for ATI. On the other hand, ATI doesn't do multisampling and doesn't adjust its AA to where it is needed most. Score 1 for NVidia.
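To be clear about why the multisampling point matters: supersampling shades every sample, while multisampling shades once per pixel and only tests coverage per sample, so it smooths polygon edges at a fraction of the cost. A toy contrast between the two (all names and stubs here are made up for illustration):

```c
/* Toy contrast between supersampling and multisampling for one pixel.
 * shade() stands in for the full per-fragment color computation and
 * coverage() for a cheap point-in-triangle test. */
typedef struct { float r, g, b; } Color;

static Color shade(float x, float y)       /* expensive: textures, lighting */
{
    Color c = { x * 0.5f, y * 0.5f, 1.0f };
    return c;
}

static int coverage(float x, float y)      /* cheap: is this sample inside the tri? */
{
    return x + y < 1.0f;                   /* stand-in half-plane test */
}

/* Supersampling: shade every sample, then average. Smooths both edge
 * aliasing and texture/shader aliasing, at n times the shading cost. */
Color supersample_pixel(const float sx[], const float sy[], int n)
{
    Color acc = { 0, 0, 0 };
    for (int i = 0; i < n; i++) {
        Color c = shade(sx[i], sy[i]);
        acc.r += c.r; acc.g += c.g; acc.b += c.b;
    }
    acc.r /= n; acc.g /= n; acc.b /= n;
    return acc;
}

/* Multisampling: shade once at the pixel center, weight by coverage.
 * Much cheaper, but only polygon edges get smoothed; aliasing inside
 * the triangle (e.g. texture shimmer) is untouched. */
Color multisample_pixel(float cx, float cy, const float sx[], const float sy[], int n)
{
    Color c = shade(cx, cy);
    int covered = 0;
    for (int i = 0; i < n; i++)
        covered += coverage(sx[i], sy[i]);
    float w = (float)covered / n;
    c.r *= w; c.g *= w; c.b *= w;
    return c;
}
```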
Obviously, the ideal next-gen card would do the optimal thing for both texture filtering and FSAA. Let's have 128-tap filtering and 16X AA on the texels and pixels that need them most, in the optimal pattern for each particular triangle and texture map.
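A toy software version of "spend the samples where they're needed" is adaptive supersampling: take a few samples per pixel and subdivide only where they disagree. A minimal sketch, with an arbitrary contrast threshold and a made-up eval() standing in for the renderer:

```c
#include <math.h>

typedef struct { float r, g, b; } Color;

/* Stand-in for the renderer's per-sample color evaluation. */
static Color eval(float x, float y)
{
    Color c = { x, y, 0.5f };
    return c;
}

static float color_dist(Color a, Color b)
{
    return fabsf(a.r - b.r) + fabsf(a.g - b.g) + fabsf(a.b - b.b);
}

/* Adaptive supersampling of one pixel region: sample the corners and
 * recursively subdivide only where they disagree, so flat areas get
 * 4 samples while high-contrast edges get up to 4^depth. */
static Color adaptive_pixel(float x0, float y0, float x1, float y1, int depth)
{
    Color c00 = eval(x0, y0), c10 = eval(x1, y0);
    Color c01 = eval(x0, y1), c11 = eval(x1, y1);
    float spread = color_dist(c00, c11) + color_dist(c10, c01);

    if (depth == 0 || spread < 0.1f) {  /* 0.1f: arbitrary contrast threshold */
        Color c = { (c00.r + c10.r + c01.r + c11.r) * 0.25f,
                    (c00.g + c10.g + c01.g + c11.g) * 0.25f,
                    (c00.b + c10.b + c01.b + c11.b) * 0.25f };
        return c;
    }

    float mx = (x0 + x1) * 0.5f, my = (y0 + y1) * 0.5f;
    Color q0 = adaptive_pixel(x0, y0, mx, my, depth - 1);
    Color q1 = adaptive_pixel(mx, y0, x1, my, depth - 1);
    Color q2 = adaptive_pixel(x0, my, mx, y1, depth - 1);
    Color q3 = adaptive_pixel(mx, my, x1, y1, depth - 1);
    Color c = { (q0.r + q1.r + q2.r + q3.r) * 0.25f,
                (q0.g + q1.g + q2.g + q3.g) * 0.25f,
                (q0.b + q1.b + q2.b + q3.b) * 0.25f };
    return c;
}
```

Hardware would obviously do this very differently, but it shows the payoff: the sky costs 4 samples, a fence against the sky costs 16+, and nobody pays 16X everywhere.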