trinibwoy said:
Yes your point is well understood but given this forum's premise I would think that FPS would not be the be all and end all of all discussions. If it were a gaming forum or some reseller's review page that's another story. My understanding of filtering methods on 3D hardware has been enhanced by all the press on this ATI trilinear issue. All I am saying is that there is merit in their investigation which you seem to have dismissed completely.
Well, I don't think that fps is the be-all, end-all, certainly...
The point is that defeating performance optimizations which don't degrade IQ is more of a dead-end, get-nowhere-fast kind of thing, imo. Again, there are driver performance optimizations of all kinds which do not degrade IQ, and there are those which do. While I can see the benefit to defeating optimizations which clearly degrade IQ, there doesn't seem to be any benefit to defeating those which increase performance without sacrificing IQ. Defeating non-IQ-related performance optimizations seems tantamount to deliberately slowing down performance for no good reason at all--which is senseless, I think. The flaw in some thinking here seems to be that all optimizations *must* degrade IQ; so, since ATi has admitted to including a trilinear optimization in its drivers, there *must* be IQ degradation somewhere, which will become apparent if only we look for it hard enough...
To that end, people are looking so hard that they are creating silly 2d "movies" in obscenely low res--movies which skip frames like crazy, suffer from artifacts internal to the conversion process (in which lots of pixels are dropped, distorted, doubled, or halved), and are limited by the constraints of the 2d encoding and playback programs used to make them. All of this for the strange and bizarre purpose (to me) of attempting to find that which they cannot find in screen shots taken directly from actual gameplay, or actually see while playing the game...
I predict that they will find bunches of artifacts using such methods--but may not realize (or want to realize) the artifacts have been created by their methods as opposed to reflecting artifacts visible in actual gameplay using these products. That's the problem with setting out to prove an assumption not clearly in evidence: the methods and tests and settings used to demonstrate the hypothesis may be manipulated, consciously or otherwise, to reflect the premise instead of the reality. Can you live with that? If you can, fine, but I can't.
Take the original case: the discovery that nVidia had special-case optimized its drivers to defeat trilinear filtering on detail textures in UT2K3, regardless of any control-panel or in-game settings made by the user to the contrary, in which the trilinear filtering of detail textures should have occurred but didn't. It was discovered by visually noticing filtering artifacts while playing UT2K3 that shouldn't have been there if detail textures were being filtered correctly--namely, mipmap boundaries which were visible but shouldn't have been. Screen shots were also easily made which proved the premise--it surely was never necessary to make low-res 2d "movies" to demonstrate what was happening, was it?
It was only after the degradation to IQ was observed while playing the game, and clearly evidenced in direct screen shots, that other tests, like the DX rasterizer, were employed to analyze the issue at a deeper level and to further confirm its root cause as being nVidia's trilinear optimization--which originally affected UT2K3 gameplay and benchmarking and nothing else.
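For anyone curious why defeating trilinear produces exactly that kind of visible mipmap boundary, here's a rough illustrative sketch--purely hypothetical numbers, flat gray "mip colors," not any vendor's actual filtering code:

```python
# Illustrative sketch: why defeating trilinear filtering makes mipmap
# boundaries visible. Mip levels are reduced to flat gray values
# (hypothetical) so the transition is easy to see numerically.

def mip_color(level):
    # Hypothetical: each mip level renders as a distinct gray value.
    return 0.25 * level

def bilinear_only(lod):
    # Bilinear samples a single mip level: the nearest one.
    return mip_color(round(lod))

def trilinear(lod):
    # Trilinear blends the two adjacent mip levels by the fractional LOD,
    # so the sampled color varies continuously with distance.
    lo = int(lod)
    frac = lod - lo
    return (1 - frac) * mip_color(lo) + frac * mip_color(lo + 1)

# Walk the LOD across a mip transition (think of a floor texture
# receding into the distance).
for lod in [1.3, 1.49, 1.51, 1.7]:
    print(lod, bilinear_only(lod), trilinear(lod))
# Bilinear-only jumps abruptly from 0.25 to 0.5 at lod = 1.5 -- that jump
# is the visible mipmap boundary. Trilinear changes smoothly through it.
```

That abrupt jump at the mip transition is what people spotted in-game, and it's trivially visible in an ordinary screen shot.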
But what happened in the present case? Basically, a web site said, "While we can't detect an IQ degradation in playing these games with our naked eyes, or through an examination of screen shots, use of the DX rasterizer shows some pixel differential that in our opinion ought not be there." Then came ATi's statement about its adaptive trilinear filtering method--and next followed all of these "looking for artifacts in a haystack" efforts, based wholly on the assumption that because the optimization exists, visible artifacts relating to its use *must* also exist. Next, a very interesting statement was made by an unknown M$ employee--first quoted by Tech Report, and later by THG--plainly stating that the DX rasterizer had *not* yet been updated to reflect the capabilities of the new generation of 3d gpu/vpus.
This should have immediately caused the original site to *discard* its assumptions about the current DX rasterizer pixel differentials for R420. If the DX9 rasterizer currently in use by M$ did not accurately reflect the rendering capability of nV40--which M$ said is better than its current DX9 rasterizer reflects--then it was not accurately reflecting similar rendering capability for R420, either. (This is actually what the unknown M$ employee said in his statement, even though it was made in an nV40-specific context.) But sadly, such was not the case; objectivity was thrown out of the window in the quest to find The Artifacts Not Actually There, more or less, and the rest is history.
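To be clear about what a rasterizer comparison actually measures, here's a toy sketch of that kind of per-pixel differential test--the image data and tolerance are made up for illustration, not taken from the actual tool:

```python
# Toy sketch of a per-pixel differential against a reference rasterizer.
# A nonzero count only shows the two outputs disagree; by itself it says
# nothing about which image is "right," or whether the difference is
# visible in gameplay -- especially if the reference itself is stale.

def pixel_diff(img_a, img_b, tolerance=0):
    """Count pixels whose RGB channels differ by more than `tolerance`."""
    diffs = 0
    for row_a, row_b in zip(img_a, img_b):
        for px_a, px_b in zip(row_a, row_b):
            if any(abs(a - b) > tolerance for a, b in zip(px_a, px_b)):
                diffs += 1
    return diffs

# Hypothetical 1x2 images: reference output vs. hardware output that is
# off by one least-significant bit in a single channel.
reference = [[(128, 128, 128), (64, 64, 64)]]
hardware  = [[(129, 128, 128), (64, 64, 64)]]

print(pixel_diff(reference, hardware))               # 1 differing pixel
print(pixel_diff(reference, hardware, tolerance=1))  # 0 within 1 LSB
```

The point of the sketch: a reported "pixel differential" is only as meaningful as the reference it's measured against, which is exactly why the stale-rasterizer admission mattered.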
The basis of the theory seems to be this: if an IHV admits to any kind or type of trilinear filtering optimization, then visible IQ degradation *must* result. Presumably, the theory rests solely on the fact that IQ degradation was visible when nVidia first used such an optimization for trilinear with nV3x in UT2K3. However, it conveniently overlooks the fact that driver performance optimizations need not produce IQ degradation--many of them certainly do not--and that nVidia's own trilinear optimizations have improved dramatically since they were first slipped into UT2K3 way back then, with far less visible IQ degradation.
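For context, the general idea behind this family of "brilinear"-style optimizations can be sketched roughly like this--this is my own simplified illustration of the concept, not ATi's or nVidia's actual algorithm:

```python
# Simplified sketch of the "brilinear" idea (illustrative only, not any
# vendor's actual algorithm). Full trilinear blends two mip levels across
# the entire fractional-LOD range; the optimized mode blends only inside a
# narrow band around the mip transition and samples a single mip level
# elsewhere, saving texture bandwidth on every pixel outside the band.

def blend_weight_trilinear(frac_lod):
    # Full trilinear: blend weight equals the fractional LOD everywhere,
    # so every sample reads from two mip levels.
    return frac_lod

def blend_weight_brilinear(frac_lod, band=0.25):
    # Blend only within `band` of the transition at frac_lod = 0.5
    # (band width is a made-up parameter); outside it, use a single
    # mip level (weight exactly 0 or 1).
    lo, hi = 0.5 - band, 0.5 + band
    if frac_lod <= lo:
        return 0.0
    if frac_lod >= hi:
        return 1.0
    return (frac_lod - lo) / (hi - lo)  # steeper ramp across the band

for f in [0.1, 0.4, 0.5, 0.6, 0.9]:
    print(f, blend_weight_trilinear(f), blend_weight_brilinear(f))
```

Done well--with the band wide enough, or adapted per texture--the transition still reads as smooth in motion, which is exactly why nobody could spot it in gameplay or screen shots.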
Simply put, it just isn't true that trilinear optimizations *must* degrade IQ when used correctly or intelligently (as ATi's algorithms attempt to do). Way back when all of this started relative to nVidia, Kyle B. of [H] made his now-infamous proclamation (paraphrased): "If you can't see the cheating, it isn't cheating." The only disagreement I had with KB about that then was that it was precisely because we *could* see the cheating that we objected...
I think what Kyle might have meant to say at the time was, "If you are not supposed to see the cheating, it isn't cheating." Heh...
We weren't supposed to take the camera off track in 3dMk03, you see, and if we'd have done what we were supposed to do we'd never have seen the cheating and it wouldn't have been cheating...
Just doesn't quite fly, though, still, does it?...
Anyway, I agree with KB to this extent: if I can't see any visible difference between brilinear and trilinear when playing a game (when I'm playing the game, mind you, and not watching a 3d camera on a fixed track), then yes, I agree that if I can't see the cheating, it isn't cheating--and I'm pleased to accept the performance benefits the optimization provides at no cost to IQ. That is where this issue stands for me at present.