overclocked_enthusiasm
Regular
I have blown that image up as far as I can (1600%, I think) and I CANNOT tell a difference between the two to save my life. The simple fact that ATI is being so forthcoming (albeit late) and giving Dave B. an in-house diagnostic tool leads me to believe that they have NOT degraded IQ for the sake of performance. If this optimization proves to be within +/- 2% of true trilinear AND improves performance, aren't we just splitting hairs here?
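For anyone who wants to go beyond eyeballing a zoomed screenshot, a quick per-pixel diff makes the comparison objective. Here's a rough sketch using Pillow in Python; the filenames trilinear.png and optimized.png are just placeholders for whatever two captures you grab, not anything from Dave B.'s tool.

    # Rough sketch: quantify the difference between two screenshots instead of
    # squinting at them at 1600% zoom. Requires Pillow (pip install Pillow).
    # "trilinear.png" and "optimized.png" are placeholder filenames.
    from PIL import Image, ImageChops

    ref = Image.open("trilinear.png").convert("RGB")
    opt = Image.open("optimized.png").convert("RGB")

    # Per-pixel absolute difference between the two captures.
    diff = ImageChops.difference(ref, opt)

    # Average per-channel error as a percentage of the full 0-255 range.
    pixels = list(diff.getdata())
    mean_error = sum(sum(px) for px in pixels) / (len(pixels) * 3)
    print(f"Mean per-channel difference: {mean_error / 255:.2%}")

    # Save an amplified diff image so any divergence is easy to spot by eye.
    diff.point(lambda v: min(v * 16, 255)).save("diff_x16.png")

If the mean difference comes back as a fraction of a percent, that would back up the "splitting hairs" argument with an actual number.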
ATI's responsibility is to give us the highest IQ/performance ratio possible for our money. I am content with this optimization because it appears to be done for the right reasons. The motives for Nvidia's application-specific optimizations were obvious...FUD...and to blatantly cheat benchmark scores. I do NOT see ATI being that stupid, and their motive (similar IQ at higher performance) seems reasonable and well placed.
At some point we will be forced to compare apples to oranges no matter what we do because of the different proprietary solutions from each company. IQ, performance and other metrics concerning GPUs will only become more and more subjective as time goes on.