grrrpoop said:
Personally I don't see how you can have a "performance" driver which doesn't sacrifice something to achieve that performance, and with nVidia's tendency to ditch IQ at the drop of a hat recently, galperi1's conclusions are not unreasonable imo.
They could set whatever they want in-game (and I believe they say max settings), but if they've chosen "balanced" in the drivers then they're not getting true trilinear filtering.
I don't know why they put that comment there. Maybe because it's one of those '3DMark-optimized' drivers; it's really the only reason I can see. On the other hand, this driver supports 8xS and 16x antialiasing in OpenGL, which don't really fit the "performance" label.
Yes, it is very likely that they used the 'balanced' setting (which, btw, has been renamed to 'quality' in those new drivers). It's the default, I think.
Maybe someone who has a GFFX card could test how much of a difference there is in Q3 between the 'application' and 'balanced' settings?
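If anyone wants to try, here's a rough recipe, assuming the 1.32 point release (which bundles the 'four' demo; older installs have demo001/demo002 instead). These lines can go in an autoexec.cfg or be typed at the console:

    r_texturemode GL_LINEAR_MIPMAP_LINEAR  // explicitly request trilinear from the application side
    vid_restart                            // in case the new filter mode needs a restart to take effect
    timedemo 1                             // report average fps instead of playing the demo in realtime
    demo four                              // benchmark demo bundled with the 1.32 point release

Run it once with the driver panel set to 'application' and once on 'balanced'/'quality', then compare the averages. If I remember right, r_colormiplevels 1 (it's cheat-protected, so load a map with devmap first) paints each mip level a different colour, which makes it easy to see whether the mip transitions are smoothly blended (trilinear) or hard bands (bilinear).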