Silently switching the texture format from uncompressed to compressed can in some cases be really bad for image quality.
I wasn't trying to argue that it's bad. I personally think it's a great thing (assuming the textures are actually analysed for quality beforehand). I've always enabled AI on my ATi cards. A performance increase with little or no visible difference, great.
I wasn't even trying to argue that ATi is the only one who does this (although the ATi fanboys here of course assumed that for trolling's sake)...
I was just pointing out that there are documented cases of this happening. One of the first occurrences was with 3DMark2000, I believe... we noticed that some video cards scored better on the fillrate tests than their theoretical specs allowed. These video cards had just introduced texture compression, and further investigation showed that the driver forced texture compression on during the fillrate tests.
Bottom line is just that if you design a benchmark that assumes a certain pixel format, and the driver silently uses another, your bandwidth calculations will be inflated and won't represent the actual hardware capabilities.
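To make the inflation concrete, here's a rough back-of-the-envelope sketch (my own example numbers, not taken from any actual benchmark) of what happens when a benchmark assumes 32-bit RGBA but the driver forces DXT1:

```python
# Benchmark assumes uncompressed RGBA8888: 4 bytes per texel.
UNCOMPRESSED_BPP = 4.0
# DXT1/S3TC: 8 bytes per 4x4 block = 0.5 bytes per texel.
DXT1_BPP = 0.5

# Suppose the card can actually move 1 GB/s of texture data.
real_bandwidth = 1e9  # bytes per second

# If the driver silently forces DXT1, the same 1 GB/s moves
# 8x as many texels as it would with uncompressed textures:
texels_per_second = real_bandwidth / DXT1_BPP

# The benchmark still multiplies texel throughput by the 4 bytes/texel
# it *assumed*, so the bandwidth figure it reports is:
reported_bandwidth = texels_per_second * UNCOMPRESSED_BPP

print(reported_bandwidth / real_bandwidth)  # -> 8.0, an 8x inflated figure
```

So with these (hypothetical) numbers the benchmark would report 8x the bandwidth the hardware actually has, which matches the "better than theoretical specs" scores seen in the fillrate tests.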