Russ is making a good point by drawing our attention to the Quack issue and some of the explanations proffered for it. That is, it seems plausible that bugs were involved. But if we're a bit more careful, we can flesh out all the issues:
In both cases, it is obvious that the driver is making application-specific changes to various texture-filtering parameters (AF settings in the 3dMark case) which result in a speed-up over the default rendering path (i.e. the one taken when the application recognition is defeated). Of course, it should be said that from FM's point of view, any application-specific driver code constitutes an "illicit optimization" (not to be confused with...well, you know), and it is perfectly clear that Nvidia has run afoul of FM's rules on that count. But it does not necessarily follow that Nvidia is guilty of "the really bad thing", namely intentionally sacrificing image quality in order to get a speedup in a popular benchmark. It is instead plausible that Nvidia coded an application-specific optimization which was meant to cause a speedup with no harm to IQ but which, due to a bug in either hardware or software, actually does cause IQ problems.
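For anyone wondering what "application recognition" amounts to in practice, here's a rough sketch of how executable-name detection generally works (purely illustrative guesswork on my part; the file name, function names and structure are my own assumptions, not anything pulled from Nvidia's drivers). It also makes clear why simply renaming the .exe defeats the check:

/* Hypothetical sketch of name-based application detection in a driver:
 * look up the host process's executable name and compare it to a known
 * benchmark name. Rename the .exe (3dMark -> 3dMurk) and the comparison
 * fails, so the special-case path is skipped. */
#include <windows.h>
#include <string.h>
#include <stdbool.h>

static bool is_known_app(const char *target_exe)
{
    char path[MAX_PATH];
    if (GetModuleFileNameA(NULL, path, MAX_PATH) == 0)
        return false;

    /* Isolate the file name from the full path. */
    const char *name = strrchr(path, '\\');
    name = name ? name + 1 : path;

    return _stricmp(name, target_exe) == 0;   /* case-insensitive match */
}

void select_filtering_path(void)
{
    if (is_known_app("3DMark03.exe")) {
        /* application-specific AF parameters (the faster path) */
    } else {
        /* default rendering path */
    }
}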
The argument that this may have been the case with the Quack issue relies on the notion that the optimization was left over from the R100 driver code base, where it caused no IQ problems, but that, when carried over to the R200, it did cause IQ problems due to a bug or at least a "different way of doing things" in the R200 hardware. For this theory to be true, it would have to be the case that renaming Quake3 to Quack causes a performance loss on R100 hardware (although possibly a smaller performance delta than seen on R200) but no change in IQ. Unfortunately, it seems the only evidence we have (and are likely to get) is that the Quake-recognition code may have been in the R100 drivers, and that R100 at least did not display the same IQ problem that R200 did when running Quake3 under its proper name. As such, the "bug" theory remains a somewhat plausible explanation for a continuing mystery.
Luckily for us, we can test this sort of thing for the 3dMurk controversy. We already know that, just as R100 did not display the mip-map selection problem with Q3, NV2x does not display the AF problem with 3dMark03. What we haven't seen yet (to my knowledge) is whether NV2x nonetheless suffers a performance loss when 3dMark is renamed to 3dMurk. If it does, this problem is most likely the result of an IQ-preserving optimization written for NV2x which, due to some bug in the interaction between the optimization and NV3x hardware, causes IQ problems on NV3x. If, on the other hand, there is no performance difference between 3dMark and 3dMurk on NV2x hardware, it seems clear that the special-casing of "3dMark.exe" is aimed not at NV2x but at NV3x, which makes the conclusion that Nvidia intended an IQ-for-performance trade-off pretty inescapable.
Of course, given the timing of everything, the "inadvertent bug" explanation seems more plausible for Quack than for 3dMurk. That's because Quake3 was out in the R100 timeframe, and the "Quack drivers" were the very first set for R200; whereas 3dMark03 wasn't released until NV30 was, and this is Nvidia's 4th or 5th set of NV3x drivers. But that's all circumstantial evidence, and very weak at that.
Personally, I don't believe the "inadvertent bug" explanation for either Quack or 3dMurk. But at least with 3dMurk we can easily test whether that theory is plausible or not.