Chat Transcript: ATI's texture filtering algorithms

CyanBlues said:
…it’s a cheat cuz they tried to hide it…

Because they didn’t file a report to let you know about their new Adaptive-Trilinear, it’s a cheat? Like I say, ridiculous. Are ATI supposed to file a report with us every time they develop a new technology?

ATI hasn’t tried to deceive anyone about anything. And ATI didn’t try to hide it; it looks like they just didn’t want to tell anyone about it. And I can see why now, for good reason: ATI has developed a great new Adaptive-Trilinear technology that works as well as or better than the old Trilinear and improves performance at the same time. Why would they want to blab about it and let their competitors know about it? The likely reason ATI never told anyone about this new Adaptive-Trilinear technology is for competitive reasons.



… We encourage users to experiment with moving the texture preference slider from “Quality” towards “Performance” – you will see huge performance gains with no effect on image quality until the very end, and even then, the effect is hardly noticeable. We are confident that we give gamers the best image quality at every performance level. …

If the mipmap transitions are barely noticeable with the control panel set all the way to Performance, this is amazing. It sounds like that Adaptive-Trilinear is really good. The Performance setting on the X800 may be as good as or better than the Brilinear on the 6800. Maybe we should be benchmarking ATI X800 cards set all the way to Performance against the 6800’s Brilinear.
 
Maybe we should be benchmarking ATI X800 cards set all the way to Performance against the 6800’s Brilinear.

NVIDIA's current settings are not that bad at all - they have considerably improved since their initial introduction.
 
FUDie said:
Mephisto said:
Our algorithms are exercised by the stringent MS WHQL tests for mipmap filtering and trilinear, and we pass all these tests. These tests do not look for exact bit matches but have a reasonable tolerance for algorithmic and numeric variance.
One more lie. Users here have proved that WHQL doesn't test trilinear filter quality at all.
Is that so? :LOL:

-FUDie
Nevertheless, it seems to be true, as christoph showed:
http://www.beyond3d.com/forum/viewtopic.php?p=286266&highlight=#286266
 
DaveBaumann said:
Maybe we should be benchmarking ATI X800 cards set all the way to Performance against the 6800’s Brilinear.

NVIDIA's current settings are not that bad at all - they have considerably improved since their initial introduction.

Yes, actually it is interesting: with colored mipmaps on, it looks like the bi/tri ratio has increased, but in real in-game situations it looks a lot better than when it was introduced. Maybe nVidia has some kind of adaptive algorithm too.
 
Tim said:
Yes, actually it is interesting: with colored mipmaps on, it looks like the bi/tri ratio has increased, but in real in-game situations it looks a lot better than when it was introduced. Maybe nVidia has some kind of adaptive algorithm too.

Well, I'm trying to get a reply to that question, but my initial reaction is that they probably aren't, given what's displayed by the coloured mipmaps. What they could be doing is setting a default level and then doing some more fine tuning at an application level; there's no generic logic behind it. I think what is clear is that the initial implementation they used was too aggressive with the areas that had trilinear blend and pure bilinear; that's been dialed back somewhat now, so the transitions aren't as noticeable as they once were.
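
To make the "bi/tri ratio" idea concrete: trilinear filtering blends two adjacent mip levels using the fractional part of the LOD, while a Brilinear-style scheme only blends inside a window around each mip transition and snaps to pure bilinear elsewhere. The sketch below is purely illustrative; the window width is a made-up parameter, not NVIDIA's actual setting:

    def blend_weight(lod_fraction, band=1.0):
        # band=1.0 reproduces full trilinear: the blend weight equals the
        # fractional LOD. Smaller bands snap to pure bilinear outside a
        # window centred on the mip transition.
        lo, hi = 0.5 - band / 2, 0.5 + band / 2
        if lod_fraction <= lo:
            return 0.0
        if lod_fraction >= hi:
            return 1.0
        return (lod_fraction - lo) / (hi - lo)

    # Fraction of the LOD range that still blends two levels -- the "tri"
    # share of the bi/tri ratio that colored mipmaps make visible.
    for band in (1.0, 0.5, 0.25):
        blended = sum(0.0 < blend_weight(i / 100, band) < 1.0 for i in range(101))
        print(f"band {band:.2f}: ~{blended}% of LOD fractions blend two levels")

Shrinking the band towards zero approaches pure bilinear; on colored mipmaps the narrowed window shows up as hard colour bands instead of smooth gradients, which is why "dialing it back" makes the transitions less noticeable.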
 
Bouncing Zabaglione Bros. said:
NutSack said:
Interesting, it looks like they dodged anonymouscoward's question a little bit when he asked the straightforward question of "is it the same".

The answer is "sometimes it is, and sometimes it isn't". The algorithm uses full trilinear when it decides the textures warrant it. When the textures don't need it, levels drop down towards bilinear.
I'm not trying to be an ass :rolleyes:, but if it only does it part of the time, wouldn't that make it different from full trilinear in benchmarking, for instance average FPS? I agree with the second part completely; thanks for a good comment.
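
For a rough sense of why this shows up in average FPS: bilinear reads 4 texels per sample, while trilinear reads 4 from each of two mip levels. If only some of the filtered samples actually blend two levels, the average texture fetch count lands in between. A back-of-the-envelope sketch (the blend fractions are made-up numbers, and texture fetches are only one component of frame time, so FPS will not scale linearly with this):

    def expected_fetches(blend_fraction):
        # 4 texels for the nearer mip level, plus 4 more from the second
        # level only for the samples that actually blend two levels.
        return 4 + 4 * blend_fraction

    for p in (1.0, 0.5, 0.2):
        print(f"{p:.0%} of samples blending: "
              f"{expected_fetches(p):.1f} texel fetches per sample")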
 
NutSack said:
I'm not trying to be an ass :rolleyes:, but if it only does it part of the time, wouldn't that make it different from full trilinear in benchmarking, for instance average FPS? I agree with the second part completely; thanks for a good comment.


Yes, it does give a different workload on the graphics card compared to doing old trilinear... But as has been said before, filtering isn't controlled by the benchmark, but by the users. Thus it is different from replacing shaders and adding clip planes to reduce workload. (Although an option to choose when to use this optimisation would be nice.)

And no matter what we do to get as close as possible to an apples-to-apples comparison, it just isn't possible, since each IHV is using different algorithms. At best we get approximations of what is equal, and with that in mind I don't see any problem with this trylinear (apart from the lack of a checkbox).
 
I think that some people, though not too many in this thread, are coming off as hypocrites. If IQ is the same, you would consider it an optimization; that is what you are saying. However, if FP16 replaces FP24 (or FP32) on an NV card and the output is identical, which it many times was, that would imply it is simply an optimization. The point is that many complained when NV did similar things to this.

Now, the difference as I see it is that ATI supposedly has a general algorithm that decides what to do, while NV had people specifically go in and replace stuff, and I do see that as an important distinction. However, if NV's compiler actually made the decision, and even fubared it sometimes, people would still complain.

Another contentious point is that the IQ comparisons many sites do rely on throwing colored mipmaps up and then saying "look how swell it looks". Unfortunately, since ATI does full trilinear with colored mipmaps, that makes the IQ look better than it otherwise would. This alone is very disturbing, and whether it is truly an accident has yet to be determined.
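
On the FP16-replacing-FP24/FP32 point above: for short arithmetic chains, half precision often really does round to the same 8-bit display value as full precision, which is why the output was identical "many times". A small sketch of that comparison (the shader-like expression and its constants are invented for illustration):

    import numpy as np

    # Evaluate a simple shader-like expression at full and half precision,
    # then quantise both to 8-bit display output and compare.
    x = np.linspace(0.0, 1.0, 256, dtype=np.float32)
    full = x * np.float32(0.8) + np.float32(0.1)
    half = (x.astype(np.float16) * np.float16(0.8)
            + np.float16(0.1)).astype(np.float32)

    def to_8bit(v):
        return np.round(np.clip(v, 0.0, 1.0) * 255.0).astype(np.uint8)

    match = np.mean(to_8bit(full) == to_8bit(half))
    print(f"{match:.0%} of 8-bit outputs identical")

Longer shader chains accumulate more half-precision error, which is when the substitution stops being invisible.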
 
NutSack said:
I'm not trying to be an ass :rolleyes:, but if it only does it part of the time, wouldn't that make it different from full trilinear in benchmarking, for instance average FPS? I agree with the second part completely; thanks for a good comment.

Yes, it would, but that's the point. It's supposed to be faster where it can be, without sacrificing IQ. It should bench faster in these circumstances, because it is generically faster. It's a different form of adaptive filtering, where we are used to one-size-fits-all bilinear or trilinear with much more rigid rules. Alternatively, we have Nvidia's Brilinear, which is more fixed and seems to noticeably lose IQ more often because it isn't adaptive in nature.

The reason we're seeing full trilinear with coloured mipmaps (and an associated performance dip) is that coloured mipmaps are just the kind of case where the algorithm decides it needs full trilinear.
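
ATI never published the heuristic, but the behaviour described here (full trilinear on coloured mipmaps, reduced blending on ordinary ones) is consistent with checking whether each stored mip level looks like a downsampled copy of the level above it. A toy 1-D version, with an invented tolerance and a plain box filter standing in for whatever the hardware actually does:

    def box_downsample(tex):
        # Average adjacent pairs: what an ordinary mip chain generator does.
        return [(tex[i] + tex[i + 1]) / 2.0 for i in range(0, len(tex) - 1, 2)]

    def mips_look_generated(mips, tolerance=0.05):
        # If every stored level is close to a box-filtered copy of the level
        # above it, reduced blending is unlikely to be visible. Hand-painted
        # (e.g. coloured) mip levels fail this check.
        for upper, stored in zip(mips, mips[1:]):
            expected = box_downsample(upper)
            err = max(abs(a - b) for a, b in zip(expected, stored))
            if err > tolerance:
                return False
        return True

    # Ordinary chain: each level derived from the previous one.
    base = [0.1, 0.9, 0.3, 0.7, 0.2, 0.8, 0.4, 0.6]
    chain = [base]
    while len(chain[-1]) > 1:
        chain.append(box_downsample(chain[-1]))
    print(mips_look_generated(chain))    # True  -> reduced blending is safe

    # "Coloured mipmap" chain: levels are unrelated flat colours.
    coloured = [[0.0] * 8, [1.0] * 4, [0.5] * 2, [0.2]]
    print(mips_look_generated(coloured))  # False -> fall back to full trilinear

Under a check like this, hand-painted coloured mip levels are exactly the case that fails, so the filter reverts to full trilinear there, which matches what the testers saw.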
 
Sxotty said:
Another contentious point is that the IQ comparisons many sites do rely on throwing colored mipmaps up and then saying "look how swell it looks". Unfortunately, since ATI does full trilinear with colored mipmaps, that makes the IQ look better than it otherwise would. This alone is very disturbing, and whether it is truly an accident has yet to be determined.
So are you suggesting that ATI should have deliberately broken their algorithm for the colored mipmap case, so that it gave worse image quality? What on earth would be the point of that?
 
Evildeus said:
FUDie said:
Mephisto said:
Our algorithms are exercised by the stringent MS WHQL tests for mipmap filtering and trilinear, and we pass all these tests. These tests do not look for exact bit matches but have a reasonable tolerance for algorithmic and numeric variance.
One more lie. Users here have proved that WHQL doesn't test trilinear filter quality at all.
Is that so? :LOL:
Nevertheless, it seems to be true, as christoph showed:
http://www.beyond3d.com/forum/viewtopic.php?p=286266&highlight=#286266
Read that however you want, but I know there is a trilinear test, as I have seen it myself. The texture filter test runs through all combinations of exposed min/mag/mip filters with all exposed texture formats.

-FUDie
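
FUDie's description maps onto a straightforward conformance loop: every combination of exposed min/mag/mip filters against every exposed texture format, judged against a tolerance rather than a bit-exact match (consistent with ATI's "reasonable tolerance" wording). Schematically, with placeholder filter/format lists and tolerance, and a stubbed-out render step rather than the actual WHQL kit:

    import itertools

    MIN_FILTERS = ["point", "linear"]
    MAG_FILTERS = ["point", "linear"]
    MIP_FILTERS = ["none", "point", "linear"]  # mip "linear" == trilinear
    FORMATS = ["R8G8B8A8", "R5G6B5"]           # placeholder format list
    TOLERANCE = 2 / 255                        # placeholder per-channel tolerance

    def worst_error(min_f, mag_f, mip_f, fmt):
        # Stub: render a test scene with the driver and with a reference
        # rasteriser, and return the largest per-channel difference.
        return 0.0

    failures = [combo for combo in itertools.product(
                    MIN_FILTERS, MAG_FILTERS, MIP_FILTERS, FORMATS)
                if worst_error(*combo) > TOLERANCE]
    print("failed cases:", failures or "none")

Note that a loop like this only exercises filter modes the driver exposes, which is the crux of the "doesn't mean it's required" objection below.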
 
Blastman said:
ATI hasn’t tried to deceive anyone about anything. And ATI didn’t try to hide it; it looks like they just didn’t want to tell anyone about it. And I can see why now, for good reason: ATI has developed a great new Adaptive-Trilinear technology that works as well as or better than the old Trilinear and improves performance at the same time. Why would they want to blab about it and let their competitors know about it? The likely reason ATI never told anyone about this new Adaptive-Trilinear technology is for competitive reasons.

Nvidia hasn’t tried to deceive anyone about anything. And Nvidia didn’t try to hide it; it looks like they just didn’t want to tell anyone about it. And I can see why now, for good reason: Nvidia has developed a great new Brilinear technology that works as well (with negligible IQ difference) as the old Trilinear and improves performance at the same time. Why would they want to blab about it and let their competitors know about it? The likely reason Nvidia never told anyone about this new Brilinear technology is for competitive reasons. :rolleyes:
 
Nvidia hasn’t tried to deceive anyone about anything. And Nvidia didn’t try to hide it; it looks like they just didn’t want to tell anyone about it. And I can see why now, for good reason: Nvidia has developed a great new Brilinear technology that works as well (with significant IQ difference) as the old Trilinear and improves performance at the same time. Why would they want to blab about it and let their competitors know about it? The likely reason Nvidia never told anyone about this new Brilinear technology is for competitive reasons. :rolleyes:

Shall we continue this or stop here, CyanBlues?
 
FUDie said:
Read that however you want, but I know there is a trilinear test, as I have seen it myself. The texture filter test runs through all combinations of exposed min/mag/mip filters with all exposed texture formats.

-FUDie
That doesn't mean it's required ;) Brilinear?
 
I am impressed with ATI's attitude about the whole subject. Nvidia didn't bring in senior engineers to account for optimizations in their hardware/drivers.

I can see ATI's point. Just because you use a different algorithm to eliminate unnecessary work doesn't mean you can't produce the same or better quality output.
 
rwolf said:
I am impressed with ATI's attitude about the whole subject. Nvidia didn't bring in senior engineers to account for optimizations in their hardware/drivers.
True dat, and it is a major difference.
 
Evildeus said:
FUDie said:
Read that however you want, but I know there is a trilinear test, as I have seen it myself. The texture filter test runs through all combinations of exposed min/mag/mip filters with all exposed texture formats.
That doesn't mean it's required ;) Brilinear?
Get a clue. If you expose the cap, you must pass the test.

-FUDie
 
Tahir said:
Nvidia hasn’t tried to deceive anyone about anything. And Nvidia didn’t try to hide it; it looks like they just didn’t want to tell anyone about it. And I can see why now, for good reason: Nvidia has developed a great new Brilinear technology that works as well (with significant IQ difference) as the old Trilinear and improves performance at the same time. Why would they want to blab about it and let their competitors know about it? The likely reason Nvidia never told anyone about this new Brilinear technology is for competitive reasons. :rolleyes:

Shall we continue this or stop here, CyanBlues?

Significant to whom? Some people say they can't tell the difference; just cuz it's significant to you doesn't mean it is to them. Whatever happened to the argument that Nvidia cheats cuz you can't disable their Brilinear, and now it's all okay? I don't know about you, but when I get my X800 XT I would still like the option to have full tri.
 