DaveH: NVIDIA 44.03 Texture Filtering

Dave Baumann

In the other thread you mentioned NVIDIA "explicitly" stating Quality = Trilinear (or "Application without Debug"). I've not gone through the recording yet (there was, after all, about four hours' worth!), however here is a snip from the reviewer's guide. The guide itself doesn't explicitly state that Quality is Trilinear, but it does display the following:

guide.jpg


As you can see, the guide advocates the use of SamX's app, downloaded from here no less, to display the quality of the AF filtering. In that shot they have the full AF quality, with full Trilinear enabled.

Following the guide, a number of reviewers (Lars at Tom's, for instance) diligently downloaded the app and showed all the quality options available, clearly demonstrating full Trilinear in all modes when AF was enabled. Here, for example, is the OpenGL app running in Quality with the 44.03s:

ogl.jpg


Now, for interest, there is also a D3D version; this is its output:

d3d.jpg


As we'd expect, given their guide: full Trilinear. Curiously, look what happens when the driver control panels are kept the same but the exe of the D3D version is renamed to "ut2003.exe":

ut2003.jpg


(I kid you not, please try this yourselves.) Not actually that much of a surprise, since we knew they were dropping down the filtering levels. However, they are clearly advocating the use of this application to demonstrate the filtering quality and then doing something else when "ut2003.exe" is detected.

Interestingly, let's take a look at what it does to the aniso quality. First is just the standard app name with Quality 8X AF. I can't use the full colours because, curiously, the app flicks back to 1X AF:

d3d8x.jpg


These are the same settings with the exe renamed to "ut2003.exe":

ut20038x.jpg


Though it's difficult to see, it appears that the Bi/Trilinear mix is still there; however, they haven't altered the filter kernel shape, which is quite interesting.
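
For anyone wondering what a bi/trilinear mix actually looks like in code, here's a minimal sketch of the idea: the blend between adjacent mip levels is confined to a narrow band around the transition instead of spanning the whole interval. Purely illustrative; whether NVIDIA does it this way, and with what band width, is unknown.

```cpp
#include <cmath>

static float saturate(float x) { return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x); }

// lambda: continuous mip level; band in (0, 1]: fraction of each mip
// interval over which the two levels are blended. band = 1.0 reproduces
// full trilinear; as band shrinks toward 0 the result approaches bilinear.
float mipBlendWeight(float lambda, float band) {
    float f = lambda - std::floor(lambda); // position within the mip interval, 0..1
    float t0 = 0.5f * (1.0f - band);       // start of the blend zone
    return saturate((f - t0) / band);      // flat outside the zone, ramp inside
}

// The final texel would then be something like:
//   lerp(bilinearSample(level), bilinearSample(level + 1), mipBlendWeight(lambda, band));
```

With a small band, the tinted mip levels in the AF tester show up as mostly solid colours with thin gradients between them, which is what the "ut2003.exe" shots above resemble.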
 
I don't know what to say except Nvidia still manages to surprise me. With all the crap that has been going on, you'd expect that they would at least use a slightly less inept method of application detection. It's like they want to get caught.
 
Dave,
your "ut2003.exe" AF test picture looks very much like my 9600pro if I turn down the texture quality to Performance (2 notches down) at stage 0. What happens with the other stages in the case of your GFFX?
With the setting related above on my 9600pro the following stages are close to, but not quite, bilinear.

Just wondering.
 
your "ut2003.exe" AF test picture looks very much like my 9600pro if I turn down the texture quality to Performance (2 notches down) at stage 0. What happens with the other stages in the case of your GFFX?

It goes very close to Bilinear. That's what I always found a little odd about the 44.03s: the left and middle settings both appeared to be near Bilinear, while the right setting (Quality) goes to full Trilinear. What UT2003 appears to be getting is what the middle setting used to be under the earlier drivers.
 
To a certain extent you can understand why they are doing it when AF is applied, since most reviewers will be using ATI's control panel, which only applies Trilinear filtering at texture stage 0. However, in non-AF circumstances this really is an 'apples-to-oranges' comparison.
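
For reference, here is roughly what "Trilinear at stage 0 only" would look like expressed through the D3D9 sampler states; a sketch of how the control-panel setting presumably maps to the API, not anything from ATI's actual driver.

```cpp
#include <d3d9.h>

// Apply trilinear filtering on texture stage 0 and plain bilinear on the
// rest: trilinear means linear interpolation between mip levels, bilinear
// means picking the nearest mip level (point mip filtering).
void setStage0OnlyTrilinear(IDirect3DDevice9* dev, DWORD numStages) {
    for (DWORD stage = 0; stage < numStages; ++stage) {
        dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(stage, D3DSAMP_MIPFILTER,
                             stage == 0 ? D3DTEXF_LINEAR : D3DTEXF_POINT);
    }
}
```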

Now, if only we could get that control panel option for application control of AF in UT2003. Mr Vogel, fancy putting one in the next patch? (And potentially making this AF control a command line option as well?)
 
While I would think they might mention it elsewhere, it bears noting that in the excerpt you posted, Nvidia doesn't even mention bilinear vs. trilinear, only their AF kernel. (Incidentally, I'm sure you're not supposed to post the whole thing, but that reviewer's guide looks interesting. It's always fun to see the sleaze Nvidia tries to smear around.)

(On second thought, I cringe at the thought of the forum traffic that would create.)

As for their method of app detection, it would be nice if we could push for some sort of gentlemens' agreement among the IHVs that all app detection would only be triggered by app name. (Or, even better, selectable in the registry or some such.) I think we'd all be happier with app detection if we could be somewhat confident that it was easy to catch and disable. While publishing details of what exactly is being done would certainly be the best result (in release notes with new driver versions, say), with the anti-professionalism of the 3d hardware industry we all know that's not going to happen. Making app detection transparent and defeatable would be a great first step.
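
Name-based detection needs nothing more exotic than a check like the following; this is an assumption about the mechanism, consistent with the rename experiment above, rather than anything confirmed.

```cpp
#include <windows.h>
#include <string.h>

// Guess the host application from its executable file name alone.
bool looksLikeUT2003() {
    char path[MAX_PATH] = {0};
    GetModuleFileNameA(NULL, path, MAX_PATH);    // full path of the running exe
    const char* slash = strrchr(path, '\\');
    const char* name = slash ? slash + 1 : path; // strip the directory part
    return lstrcmpiA(name, "ut2003.exe") == 0;   // case-insensitive compare
}
```

Which is exactly why renaming the exe is enough to toggle the behaviour in both directions.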

Um...just thought of something. If the app detection is based on the executable name...does renaming it to u[T]2003.exe (or whathaveyou, that was the best I could do on short notice) re-enable full trilinear?? :?:
 
It's not trivial to rename the UT2003 exe, since the name is referenced in a bunch of logs/INIs and there is another file somewhere that needs to be renamed as well. Someone once mentioned how to do it, but I've forgotten now.

Given the AMDMB article, though, and the differences with AF, there may be other detection going on besides just the .exe name, as with 3DMark03.
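
If the detection isn't purely name-based, one plausible (and entirely speculative) mechanism is fingerprinting something the app submits, such as shader bytecode; the hash and fingerprint table below are illustrative assumptions, not known values.

```cpp
#include <cstdint>
#include <cstddef>

// FNV-1a style hash over the DWORD tokens of a shader (illustrative only).
uint32_t hashShaderTokens(const uint32_t* tokens, size_t count) {
    uint32_t h = 2166136261u;
    for (size_t i = 0; i < count; ++i) {
        h ^= tokens[i];
        h *= 16777619u;
    }
    return h;
}

// Compare against a table of fingerprints taken from a known benchmark.
bool isKnownBenchmarkShader(const uint32_t* tokens, size_t count) {
    static const uint32_t fingerprints[] = { 0xDEADBEEF /* hypothetical */ };
    const uint32_t h = hashShaderTokens(tokens, count);
    for (uint32_t f : fingerprints)
        if (h == f) return true;
    return false;
}
```

Detection like that would survive an exe rename, which would fit the 3DMark03 case.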
 
Dave B, your new find made it into the Nvidia chat. :D
<SwedBear> FWIW - Dave Bauman over at Beyond3D has investigated the UT2k3 issue and found that when you rename a test application to ut2k3.exe it goes bilinear. You should check the threads out. (no Q)
 
Dave, what I think you're failing to take into consideration is that by renaming the application you are possibly disabling a very valid application specific detection for that benchmark that nVidia put in to fix a bug.....







....nah, it's a smoking gun. Nail 'em to a cross! :devilish:
 
OMFG! Just when I think things can't get any worse for Nvidia... *rofl*
 
"Interesting..." ??

The words that come to my mind are more like: sad, outrageous and pathetic. Of course these are merely subjective reactions and are not indicative of actual gameplay.

I especially like the part where [T] plays it both ways and says first,

"UT2K3 is a first person shooter. You run around real fast fragging people, you don’t exactly stand around to smell the roses. While you are doing this constant running around you are concentrating on so many other things that texture filtering between mipmaps is the least of your worries."

So of course it doesn't matter then that this optimization is done without your control because you won't notice it. Although...

In a thread quote I can't find now, Kyle stated that he would have preferred if this was done with user control. (I won't go to [T] for the original.)

And...

"So for summing up which card is ‘best’ at image quality in UT2003 the 9800 Pro still stands as that card. The sharper texture quality with AF enabled plus its better AA quality combined gives the 9800 Pro its image quality edge in this game"

And....(as posted by Scorched at R3D)

"Don't worry. He compensated by putting "[H]arder Than Trilinear Filtering on GFFX" as the latest [H] description on the front page earlier today. No kidding."

Kyle just loves playing it ALL ways; that way you can take whatever you want from his "position". Can someone say future congressman material?

Caps

Nvidia and [T]: optimizing the future in ways you won't notice OR from which we can easily weasel. POOF! All is now AOK!
 
nelg said:
Dave B, your new find made it into the Nvidia chat. :D
<SwedBear> FWIW - Dave Bauman over at Beyond3D has investigated the UT2k3 issue and found that when you rename a test application to ut2k3.exe it goes bilinear. You should check the threads out. (no Q)

And of course, nVidia completely ignored the point and moved on to other questions. Speaks volumes.
 
I thought I had some basic understanding of AF, but maybe I don't, so take this for what it's worth (if anything).
It occurred to me that Dave's "tinted" AF pics look just about exactly like my 9600pro with 2xAF and LOD bias -1.5 (actually the 9600 looks a little better), using the "tinted" setting in the D3D AF tester.

So I did this test: with 2xAF in the driver panel and mip bias at -2 (in ut2003.ini), my screenshots revealed that these settings actually produced pictures of higher quality than 16xAF with mip bias 0.
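
The standard mip-selection math shows why a negative bias sharpens things; this is textbook behaviour, not anything driver-specific (a minimal sketch):

```cpp
#include <cmath>
#include <cstdio>

// Standard mip-level selection with an LOD bias: the level is log2 of the
// texel-to-pixel footprint plus the bias, clamped to the available levels.
float mipLevel(float texelsPerPixel, float lodBias, float maxLevel) {
    float lambda = std::log2(texelsPerPixel) + lodBias;
    if (lambda < 0.0f) lambda = 0.0f;
    return lambda > maxLevel ? maxLevel : lambda;
}

int main() {
    // A pixel covering ~8x8 texels normally lands on mip level 3...
    std::printf("bias  0: level %.1f\n", mipLevel(8.0f, 0.0f, 9.0f));  // 3.0
    // ...while a bias of -2 pulls the same footprint to the sharper level 1.
    std::printf("bias -2: level %.1f\n", mipLevel(8.0f, -2.0f, 9.0f)); // 1.0
    return 0;
}
```

The cost is potential texture aliasing and shimmer in motion, which static screenshots won't show.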

I also ran some benchmarks with the Citadel flyby at these settings. At 2x/-2 the score was 135 fps; at 16x/0 it was 115. With 4xAA added, the score at 2x/-2 was 108 fps, and at 16x/0 it was 90.

It seems clear that a low AF setting together with an aggressive LOD bias gives at least as good IQ while producing clearly higher scores (roughly 17-20% higher in these runs).


I don´t know if this is in any way related to what nVidia is doing, but I thought someone might find it interesting.

To me the interesting thing is the effect on IQ, and that ATI's AF "star" is almost perfectly circular under these settings.

rubank
 