Chalnoth said:
With quotes like:
Tomshardware.com said:
As we will see repeatedly throughout our benchmark section, anisotropic filtering is the greatest strength of the new Radeon X800 cards.
You'd think that some reviewers would have bothered to pay attention to how each card applied its anisotropic filtering. Fortunately, one did:
http://www.beyond3d.com/reviews/ati/r420_x800/index.php?p=13#tex
Notice that while the benchmarks around the web are sure to have used nVidia's "highest-quality" setting, a setting which removes the "brilinear," ATI is still using full bilinear filtering on texture stages other than the base texture.
Chalnoth:
What happened? We all have our preferences, and yours, gathering from your past posts, happens to be nVidia, which is fine. I don't believe I've seen you make quite this inflammatory a post before, though. You are a smart guy, and given the number of posts you've made (compared to my measly 300-400), I'd assume that you knew this was the case with ATI drivers since well before the X800XT was launched. Now, I'm not saying that this is a non-event, but it is certainly not the huge revelation your thread title makes it out to be.
So anyway, for now I'll ignore the juvenile tone of your post and actually focus on your issue with it. I have mixed feelings about the issue. Ever since the driver developers moved from the "bilinear" and "trilinear" labels to "performance" and "high quality" ones, this kind of stuff has started happening. In that respect it's not exactly lying or cheating, as they simply offer a "high quality" setting that may or may not be the highest quality the card can produce. Obviously one goal is to make the high quality setting actually display high quality IQ, but the other is to make it as fast as possible. ATI must have felt that disabling trilinear filtering on the other texture stages didn't hurt image quality enough to justify its inclusion.
An important distinction for me is that if a program requests trilinear, the driver does in fact deliver trilinear. In this way, ATI is distinguishing between real trilinear being requested and the "high quality" mode in the driver control panel.
So where does this leave us? Well, ever since the move to the ambiguous labels in the drivers, it's been pretty much impossible to benchmark nVidia's "quality" settings versus ATI's "quality" settings, as each one takes different shortcuts. Neither is really "cheating" per se (in this case) because they never claim to be doing anything specific. It just means that reviewers trying to compare the different high quality modes will unfortunately have to note that they are doing slightly different things, and check whether or not the images being produced are comparable.
On the other hand, if the application (or for that matter a user) requests trilinear filtering on all texture stages and doesn't get it, then we have a problem. The same goes for replacing shaders, inserting clip planes, or other mischief meant to deceive the user. I don't think the case mentioned in this thread is in the same category.
Nite_Hawk