I just think it is funny that this optimisation has been in the driver since 3.4 and no one noticed until now. Am I the only one concluding that, if that really is the case, there is no real issue with IQ loss? Sure, let them put an option for Full Trilinear in the control panel, but Nvidia should do the same (their option didn't work the last time I checked). I don't think ATI would have a problem with that. Bit-wise comparisons do no good for comparing IQ, so we need better ways to judge; right now it is all subjective.
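As a rough illustration of why a bit-wise check is the wrong tool, here is a minimal Python sketch comparing a bit-exact test against PSNR, which at least scales with how big the differences are. The screenshot file names are hypothetical, and PSNR is itself a crude stand-in for real perceptual judgement:

```python
# Minimal sketch: bit-wise equality vs. PSNR for comparing screenshots.
# Assumes two same-sized screenshots; file names here are made up.
import numpy as np
from PIL import Image

ref = np.asarray(Image.open("full_trilinear.png").convert("RGB"), dtype=np.float64)
opt = np.asarray(Image.open("optimised_trilinear.png").convert("RGB"), dtype=np.float64)

# Bit-wise test: a single differing pixel makes the images "different",
# even when the difference is invisible to the eye.
identical = np.array_equal(ref, opt)

# PSNR: higher means closer; values above roughly 40 dB are generally
# considered visually indistinguishable for 8-bit images.
mse = np.mean((ref - opt) ** 2)
psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

print(f"bit-identical: {identical}, PSNR: {psnr:.1f} dB")
```

Two screenshots could fail the bit-wise test on every frame and still be indistinguishable on screen, which is exactly why these driver comparisons end up subjective.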
One other thing: why should ATI have to explain every optimization in their driver? Do game developers explain every optimization they have put into their code since they started writing the game? I think not.