Bolloxoid said:
I'll just quote 3DGPU quoting Brian Burke:
I spoke with Brian Burke, PR Director for Desktop Products at NVIDIA, regarding the UT2003 situation and was told that the drop in trilinear filtering levels does not degrade IQ, and therefore does not violate their new optimization methods.
So, their interpretation of the "leaked" optimisation guidelines is pretty flexible.
Link to the whole editorial:
http://www.3dgpu.com/modules/wfsection/article.php?articleid=70
Wow. I cannot contain my amazement and stupefaction... Every time I think we might be on the verge of a reformation, the guys at nVidia PR bring me solidly back to earth.
Assuming this is an accurate representation of what BB said (I can't figure out why the guys who write these pieces and call these people on the phone don't get direct quotes), it would seem to directly contradict an answer given in a recent DriverHeaven interview with Derek Perez:
Zardon: Will you ever re-add the option to force quality (trilinear etc) settings for AA/AF?
Derek: Yes we will – we hope to do so in a future driver rev.
http://www.driverheaven.net/dhinterviews/nvidia/
Sounds like the same issue to me... Admittedly, there's "weasel room" in that DP could always say "I wasn't talking about UT2K3"... but... come on...
If that's the answer frgmstr was given--that disabling full trilinear doesn't affect image quality--then no wonder he blew his top. Sadly, they'll probably cite frgmstr's UT2K3 filtering article as "proof" of their point--so I hope he stays a step ahead and deletes it.
I have to say that I'm disappointed, but hardly surprised, actually. Even so, public displays of incredible stupidity do shock me, even if they're anticipated. nVidia seems to have a real knack for doing it though, if you know what I mean.
In the DH interview, Perez also lays out the "guidelines." But if Burke is now saying those guidelines may only be interpreted and defined by nVidia, then it almost appears as if they were never anything more than a wholesale fabrication of nVidia PR. There's not a software engineer alive who makes his living writing 3d-card drivers who would be caught dead saying trilinear filtering doesn't make an image-quality difference--heh. Certainly not at nVidia--a company which has been enabling and supporting full trilinear in its products for years precisely because of the benefits it brings to IQ. Remarkable.
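For anyone who wants the concrete version of why the claim is absurd, here's a minimal toy sketch (my own code, obviously not nVidia's driver internals) of what full trilinear buys you over plain bilinear: bilinear snaps to a single mip level, while trilinear blends the two nearest levels, which is exactly what hides the visible mip-transition bands:

    # Toy sketch: mip levels as plain 2D grids of grey values (Python).
    def lerp(a, b, t):
        return a + (b - a) * t

    def bilinear(mip, u, v):
        # sample ONE mip level with the usual four-texel bilinear weights
        h, w = len(mip), len(mip[0])
        x, y = u * (w - 1), v * (h - 1)
        x0, y0 = int(x), int(y)
        x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
        fx, fy = x - x0, y - y0
        top = lerp(mip[y0][x0], mip[y0][x1], fx)
        bot = lerp(mip[y1][x0], mip[y1][x1], fx)
        return lerp(top, bot, fy)

    def trilinear(mips, u, v, lod):
        # full trilinear: blend bilinear samples from the two nearest levels
        lo = min(int(lod), len(mips) - 1)
        hi = min(lo + 1, len(mips) - 1)
        return lerp(bilinear(mips[lo], u, v), bilinear(mips[hi], u, v), lod - lo)

    def bilinear_only(mips, u, v, lod):
        # the "reduced" mode just snaps to the nearest level--no blend at all
        return bilinear(mips[min(int(lod + 0.5), len(mips) - 1)], u, v)

At lod = 0.5 the trilinear sample is a 50/50 blend of the two levels, while the bilinear-only path jumps abruptly from one level to the next as lod crosses the boundary--and that hard jump is the mip banding the whole UT2K3 fuss is about. Saying that removing the blend "does not degrade IQ" is saying the blend never did anything in the first place.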
The very best nVidia can hope for here is that Burke's comments were badly mangled in the retelling.
Edit:
I'm sure nVidia will "luv" this now that it has ponied up some fees and rejoined the FM Program:
3dgpu said:
We will not use any synthetic benchmarks to test performance.
That means 3DMark, CodeCreatures, ShaderMark, etc. Since there is no way for us to be sure of the validity of the tests, we will leave them by the wayside. We don’t play synthetic benchmarks anyway. We may use programs to compare IQ, like FSAATester.
Personally, I think this is tossing out the baby with the bathwater. It's a cryin' shame that FM didn't stick to its original audit report, but then again, FM's just digging its own grave here. IMO, there is absolutely nothing inherently "wrong" with synthetic benchmarks. With or without nVidia's co-operation, I think 3dMk03 is probably the most impressive synthetic benchmark for a 3d-card to date. It's too bad that FM can only measure the quality of its software by the number of paying IHVs in its program--really too bad. But it's their company, not mine.
*sigh* Synthetic benchmarks, like individual 3d games with differing engines, will return differing performance results, even on the same hardware. You might as well say that all 3d games are bad standards to use because their results differ among them as say that all synthetics are bad for the same reason. The point of benchmarks and games is that you use *a lot of them* to get an aggregate picture of 3d hardware. It's just as wrong to look only at Doom3 performance as it is to look only at 3dMk03 performance--wrong, because either method is woefully incomplete. But that's not to say you should avoid looking at either as part of a complete picture. My, my... what have we come to...
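To put the "aggregate picture" point in concrete terms, here's a toy example (the numbers and function names are mine, purely for illustration): roll up relative scores from a pile of games *and* synthetics with a geometric mean, instead of crowning any single test:

    from math import prod  # Python 3.8+

    def geomean(scores):
        # geometric mean: the sane way to average performance ratios
        return prod(scores) ** (1.0 / len(scores))

    # hypothetical card-A-vs-card-B ratios per test (made-up numbers):
    results = {"UT2K3": 1.10, "Doom3": 1.25, "3dMk03": 0.90, "ShaderMark": 0.95}
    print(round(geomean(list(results.values())), 2))  # -> 1.04

Cherry-pick any single entry there and you can "prove" either card the winner; only the aggregate tells you anything useful. Which is the whole argument for keeping synthetics in the mix rather than throwing them out.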