Sharkfood said: Because the 8500 doesn't make performance compromises with LOD Bias like the NVIDIA cards do. We already had this discussion earlier. I took shots at default LOD Bias for both cards, but this can, of course, be reduced.
What? Texture aliasing is a rendering artifact that should be avoided. The Radeon 8500 (like the Radeon before it...) quite obviously uses overly aggressive MIP LOD settings. This will usually result in superior screenshots, but there will also be in-game situations that produce horrid texture aliasing.
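To be concrete about what "MIP LOD settings" means here, below is a minimal C sketch, assuming an OpenGL context that exposes the GL_EXT_texture_lod_bias extension, of how an application (or, in effect, a driver control panel) shifts the mipmap LOD bias. A negative bias is exactly what buys sharper-looking screenshots at the cost of aliasing in motion; the bias value and function name are illustrative, not anyone's actual driver code.

```c
/* Minimal sketch: adjusting the mipmap LOD bias via the
 * GL_EXT_texture_lod_bias extension (assumes an OpenGL
 * context where the extension is available). */
#include <GL/gl.h>
#include <GL/glext.h>

void set_mip_lod_bias(float bias)
{
    /* bias < 0.0f : sharper textures, more shimmer/aliasing
     * bias = 0.0f : the LOD as computed by the hardware
     * bias > 0.0f : blurrier textures, less aliasing       */
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT,
              GL_TEXTURE_LOD_BIAS_EXT,
              bias);
}
```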
Sharkfood said: The shots you provided show the EXACT same thing: little to no change from doubling the anisotropic filtering samples.
There is a change, which proves beyond a shadow of a doubt that your claim of no difference between 4x and 8x aniso was false. That you continue to argue the point baffles me.
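For what it's worth, the "samples" being argued about are just a per-texture request; here is a minimal C sketch, assuming an OpenGL context exposing GL_EXT_texture_filter_anisotropic, of how an application asks for 4x versus 8x anisotropy. Whether the hardware actually does more filtering work at 8x is precisely what's in dispute above; the function name here is hypothetical.

```c
/* Minimal sketch: requesting a maximum degree of anisotropy via
 * GL_EXT_texture_filter_anisotropic (assumes an OpenGL context
 * where the extension is available). The request is clamped to
 * the hardware limit before being applied to the texture. */
#include <GL/gl.h>
#include <GL/glext.h>

void set_anisotropy(GLuint texture, GLfloat degree)
{
    GLfloat max_degree = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_degree);

    if (degree > max_degree)
        degree = max_degree;  /* e.g. 8.0f requested, 4.0f supported */

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D,
                    GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    degree);
}
```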
Sharkfood said: Especially given how the GF3 (on early drivers), the 8500, and SGI/USparc graphics workstations do not exhibit this "GF4-only" behavior. Just the GF4.
Try showing this, not just stating it.
Sharkfood said: You really need to try an 8500. From your commentary it's plainly obvious you have *never* used, seen, or operated an 8500. This much is clear. On that basis, I discount your opinions as those of someone with zero experience, and this rings truer with every post you make.
No, I have not used a Radeon 8500, and I don't intend to, unless, perhaps, I happen to be near somebody who owns one (which hasn't happened yet...). This is, in fact, why I am here (well, half the reason... I also like to argue). I don't want to have to try every piece of hardware in existence to get an idea of which hardware I would like best. For example, if I wanted increased texture detail at the expense of texture aliasing, I would adjust the texture LOD bias on my GeForce4.