Hanners said:
What I find kind of bizarre is that [H] still don't seem to understand why ATi had a gripe with their benchmarking in the first place, despite the marked performance differences we've seen in UT2003 with application detection (and thus the forced bilinear filtering) disabled.
It would have been wonderful to see [H] being brave enough to give Antidetector a shot to show the difference, but I guess they are too scared of external influences to do anything like that.
I don't think fear has anything to do with it--more like a love affair with ignorance that [H] seems determined to carry on, no matter what. He dismisses his detractors as "anal-technical" simply because they discuss topics he cannot grasp. I am literally stunned that in 2003 we have "major" hardware sites saying the difference between trilinear and bilinear filtering isn't important because in certain screenshots you "can't see the difference," while they studiously ignore the screenshots in which a difference can be seen, such as the UT2K3 color-coded filtering tests designed specifically to expose the difference between the two filtering types. Amazing and selective hypocrisy--not to mention the fact that in some cases the mipmap boundaries are visible while playing the game, a fact which [H] declares to be unimportant.
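For anyone who hasn't run the color-coded test themselves, here's a minimal sketch of why those boundaries appear (plain Python of my own devising, with hypothetical names like mip_shade--it's nobody's driver code): dye each mip level a distinct shade, as the UT2K3 test does, and watch what each filter returns as the level-of-detail increases with distance.

[code]
import math

# Each mip level is dyed a distinct shade, like UT2003's color-coded
# mipmap mode (level 0 -> 0.0, level 1 -> 1.0, level 2 -> 2.0, ...).
def mip_shade(level):
    return float(level)

def bilinear_sample(lod):
    # Bilinear filtering reads from only the nearest mip level, so the
    # result jumps abruptly wherever the LOD crosses a level boundary.
    return mip_shade(round(lod))

def trilinear_sample(lod):
    # Trilinear filtering blends the two adjacent mip levels by the
    # fractional LOD, so the result varies smoothly with distance.
    lo = math.floor(lod)
    frac = lod - lo
    return (1.0 - frac) * mip_shade(lo) + frac * mip_shade(lo + 1)

# Sweep the LOD from near to far, as if looking down a long hallway.
for i in range(21):
    lod = i / 10.0
    print(f"LOD {lod:3.1f}:  bilinear {bilinear_sample(lod):4.1f}   "
          f"trilinear {trilinear_sample(lod):4.2f}")
[/code]

The bilinear column is a step function--those steps are exactly the band edges you can see in the color-coded screenshots--while the trilinear column ramps smoothly, which is why the boundaries vanish under proper trilinear.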
He basically says it's OK to compare trilinear filtering on an ATi product with bilinear filtering on a GFFX product because, in his opinion, the IQ differences aren't noticeable while playing the game. OK, then: to avoid the hypocrisy, the proper thing would be to compare ATi performance with GFFX performance while both products are running bilinear, and to simply concede that the GFFX has been deliberately crippled by nVidia so that it cannot provide trilinear filtering of detail textures at all.
But no, it seems that if [H] cannot declare a winner in a deliberately stacked contest (ATi's trilinear versus the GFFX's bilinear), it isn't interested in declaring a winner at all. Like it or not, folks, the record at [H] post-NV30 has been one of consistent apology for nVidia, regardless of how obviously and flagrantly it must skew its reporting to keep it up.
Kyle is simply a nincompoop of the first order if he thinks nVidia's elimination of detail-texture trilinear support in its drivers is a non-issue. Lots of people care about issues such as these.
I said it in an earlier post: this is unfortunately boiling down to a classic struggle between the haves and the have-nots, the generally educated versus the unwashed ignorant. Basically, the boys at [H] have simply shut off the old brain cells and decided to substitute whatever nVidia says and does for rational thought.
Who but nVidia would ever say or suggest, "Oh, come now. Trilinear filtering isn't really important, is it?" And who but [H], it seems, would ever officially respond in the affirmative to such an idiotic question? The hole just gets deeper and deeper--and only those digging it seem oblivious to it. Remarkable.