Ways to resolve the complaints about ATI's filtering?

I think a bit of "Criminal Justice 101" is in order. For a person to be found guilty of a crime (at least in the US...), INTENT must be proven. In this case, ATI has been accused of "cheating", and it looks like ATI is guilty as charged. What people seem to be getting confused over is the context in which ATI is guilty.

The case:

For the majority of the consumers ATI is targeting, framerate = everything. The marketing people at ATI are not stupid; they know this, and they also know that most potential customers rely on benchmarks done by review sites to make their decision.

Did ATI design and implement an optimization in their product without telling anyone? YES

Did ATI tell reviewers that while performing benchmarks, competitor products MUST be set to full trilinear? YES

Did this change skew the results of said benchmarks? YES

Conclusion:

In the context of the benchmark arena, ATI is guilty of "fixing" parameters with the intent of maximizing the delta between their product and the product of their competitor(s). ATI is guilty as sin. GUILTY, GUILTY, GUILTY :!: :!: THEY SHOULD ALL BE TARRED AND FEATHERED IN THE MIDDLE OF THE FORUMS FOR TAKING A "VOODOO MINDSET" :!:

:idea: Ok, back to the real world. Does anyone really give a shit? Is this "smartfiltering" a bad thing? NO! In fact, the opposite is true. This is a beautiful solution which serves to maximize the potential of the product while satisfying the wants of the consumer. I feel sorry for the dev who thought this up and has been forced to hide in some deep cave under the ATI home office for the last year rather than receive the credit he is due. Someone find ATI's "Count of Monte Cristo", free him from his imprisonment, lead him out into the world of sunlight, and then give him a cookie... :D
 
Stryyder said:
bGeek said:
Stryyder said:
DaveBaumann said:
While this remains the default for ATI, just compare with Tri optimisations enabled for both.

Noting the IQ differences, of course, I am assuming.

digitalwanderer said:
DaveBaumann said:
If there are any.
If there are any, I'm starting to think that the image quality is comparable, if not identical, to trilinear with this new method... I'm going a bit nuts just trying to tell the difference, and if I DO start to notice a difference, I still can't decide if it's better or worse looking; it's too damned close.

I actually took Dave's comment to mean any difference between Nvidia's Brilinear and ATI's Trylinear.

Unfortunately I took it the same way... So is ATI's adaptive trilinear on par with Brilinear or with legacy Trilinear?

Why do I see reviews going from 30 pages long to 50 in the near future?

Well, they do say, "Beauty is in the eye of the beholder." :D As long as both (Trylinear and Brilinear) are considered accurate representations of "Legacy" Trilinear, I can see no harm in that. :)

The only harm done was not mentioning that the option for "Legacy" Trilinear Filtering would no longer be a possibility, or a concern. :D
 
I agree with Dave that they both should just be benched with the optimizations on. Even though I correctly identified the true trilinear filtering in Dave's blind test, I just can't see a difference in any of the screenshots posted so far. Even though NV's optimizations look worse in filtering tests, it's not readily apparent in some of the NV screenshots posted either.
Ideally, I would like to see ATI put in a checkbox for full trilinear filtering. That way reviewers could run the tests both ways on each card, with comments on the image quality differences, and provide screens of both modes for each card.
 
You know what I honestly think? I think ATI developed this solution, tested it on the 9600, saw that no one noticed, put it on the X800, and didn't tell anyone. That way everyone would think ATI was better at trilinear filtering than NV. Then, when they finally got caught with their pants down, instead of trying to get out of it by saying it was a bug or a mistake, they simply said: hey, wearing your pants around your knees is now in style, so deal with it. :D

They then go into detail describing just how they wear their pants, showing they really planned it carefully, with the belt just so, so that they won't fall off and trip you up. Really, it is the hip new way. :)
 
The Dig unbuttons his pants and joyously lets them fall down around his ankles!

I knew this day would finally come, I knewdled it! 8)
 
MrGaribaldi said:
But DaveB has gotten a copy of a program which shows legacy-trilinear and trylinear filtering, and the blind test said that most of us couldn't see any diff...
I disagree with the bold part.
 
Evildeus said:
MrGaribaldi said:
But DaveB has gotten a copy of a program which shows legacy-trilinear and trylinear filtering, and the blind test said that most of us couldn't see any diff...
I disagree with the bold part.

Ok, that is your choice, but why?
Currently the poll is at 46% "Can't Tell"/"see diff, but can't decide which is correct", with 39% saying "Right" and 8% saying "Left".
And when I posted the post you quoted, the % for "Can't Tell" was even higher (51% I believe).

Thus I think that the majority still has trouble determining which is correct.
Alas, the answer has been posted in the thread, so the results are getting "tainted", since we cannot determine if the vote was based on what one saw or on what one read in the thread.

So if you'd care to elaborate why you disagree I'd appreciate it :)
 
I also agree. I was at home when I took the test, using a fairly high-end CRT monitor (22" Mitsubishi DP 2070SB), and it did not take me long to see which one was filtering better.
I am not too concerned about it. The image quality is still excellent. If he were to post actual screenshots in those different modes I would not be able to tell the difference. It's just that the synthetic test tends to highlight any differences.

Edit: I think part of the problem is that some people did not know what they were supposed to be looking for.
 
Stryyder said:
Unfortunately I took it the same way... So is ATI's adaptive trilinear on par with Brilinear or with legacy Trilinear?

Why do I see reviews going from 30 pages long to 50 in the near future?

Keep in mind that Dave has said many times over the past couple of days that Nvidia's method has improved by leaps and bounds over the last year. I really did take it to mean

Nvidia ~= ATI ~= Full Trilinear.

I'm sure there are some differences, but this is what I gather from Dave's posts over the last couple of days.
 
The fact that only 50% (less now) can't find a difference should give you the answer. If the quality were the same, I would say that at least 80% of the voters wouldn't see it.
Note that ~3/4 of the other voters did find the "good" answer.

MrGaribaldi said:
Evildeus said:
MrGaribaldi said:
But DaveB has gotten a copy of a program which shows legacy-trilinear and trylinear filtering, and the blind test said that most of us couldn't see any diff...
I disagree with the bold part.

Ok, that is your choice, but why?
Currently the poll is at 46% "Can't Tell"/"see diff, but can't decide which is correct", with 39% saying "Right" and 8% saying "Left".
And when I posted the post you quoted, the % for "Can't Tell" was even higher (51% I believe).

Thus I think that the majority still has trouble determining which is correct.
Alas, the answer has been posted in the thread, so the results are getting "tainted", since we cannot determine if the vote was based on what one saw or on what one read in the thread.

So if you'd care to elaborate why you disagree I'd appreciate it :)
 
It is entirely possible that ATI's 'trylinear' is close enough to 'traditional' trilinear for it not to matter. A related example: there are those who assert that any AA above 4X is useless at higher resolutions.

The bottom line is that I would want to test the theory myself, on my own terms. A few FPS in a benchmark will neither stop nor compel me to buy the X800XT.

I know with my 9800 Pro there were some games where setting AF in the game was impossible, so I had to get the level of filtering I wanted from a third-party application. If that avenue had not been open to me, I likely would not have kept the card. There were cases where the 'standard' filtering produced rough mipmap transitions and mipmap banding.

I am of the mind that the X800XT is more than powerful enough to run the games I want at solid, playable frame rates with the old traditional trilinear. I do not see why the feature shouldn't be made available, as it would squash this entire issue for me (and many others).

I understand the catch-22 ATI is in: customers want bleeding-edge IQ and screaming FPS. Well, I understand that when I make decisions like turning on 6xAA, I am saying I am willing to take a performance hit for IQ. This filtering issue is no different.

I was constantly switching back and forth over what my next upgrade would be, ATI or Nvidia, but if ATI is going to refuse me the option to test their adaptive way against the traditional one, I will pass. To me this has more to do with them being paranoid about review sites than anything else.

There are many of us who are more than willing to take the FPS hit so we can see for ourselves. If I found the adaptive form close enough and more useful, I would use it, but this is a conclusion I would have to reach on my own through direct testing. IQ means far more to me than 15% performance. I hope they change their minds about the options.

EDIT: Guess I should edit this in:

I care about both of these; it is not confusion. In older games you cannot use the application preference/game setting, because it simply isn't there in some games. I'd like to see it where third-party applications are not needed to address this issue in those cases. They could address #1 as well, at the same time. How about one setting or checkbox for both?
1. Adaptive trilinear filtering (off): switch to 'traditional' trilinear
2. Texture filtering on the first texture stage vs. all texture stages: altered to all stages

This would handle X800 and 9800 Pro owners in one shot... (a rough sketch of such a pair of toggles follows below).
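Purely as illustration of the proposal, here is what such a combined pair of toggles might look like (entirely hypothetical; these names correspond to no real Catalyst driver setting, and a driver would of course expose them in its control panel rather than in code):

```python
from dataclasses import dataclass

@dataclass
class FilteringOptions:
    """Hypothetical control-panel toggles for the proposal above."""
    adaptive_trilinear: bool = True       # False => force legacy full trilinear
    af_all_texture_stages: bool = False   # True => AF on every stage, not just stage 0

# The single "legacy mode" checkbox suggested above would just set both at once:
legacy_mode = FilteringOptions(adaptive_trilinear=False, af_all_texture_stages=True)
```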
 
Evildeus said:
The fact that only 50% (less now) can't find a difference should give you the answer. If the quality were the same, I would say that at least 80% of the voters wouldn't see it.
Note that ~3/4 of the other voters did find the "good" answer.

Ah, then I understand you better. I read it more along the lines of: 61% either can't see the difference or thought the trylinear was better. And how many of the remaining 39% can really see a difference, I'm not sure, since the answer, along with a helping picture, was posted in the thread, thus skewing the results towards the correct answer.

Also, we seem to disagree about what "most of us" means in terms of percentages.
(You: 80+, me: 50/60+.)

But since we both "spin" the numbers differently, I think we should just agree to disagree on them for now, or at least until a new test can be posted, one where the correct answer (and helping pictures) will not be revealed before a representative number of votes has been collected.
 
I think a good look-ahead method for ATi in the future would be to decide whether an optimization causes tiny pixel differences between its output and the DX rasterizer's output. If this = true, then they should come out with a snazzy name for it. In this case, maybe something like "Quadaptive-linear filtering" might have sufficed... ;)
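Mechanically, the check being described is easy enough to run. A minimal sketch, purely my own illustration (it assumes Pillow is available; the function name and file inputs are not anyone's actual test tool):

```python
from PIL import Image

def max_pixel_delta(rendered_path, reference_path):
    """Largest per-channel difference between two same-sized frames."""
    a = Image.open(rendered_path).convert("RGB")
    b = Image.open(reference_path).convert("RGB")
    assert a.size == b.size, "frames must match in resolution"
    worst = 0
    for pixel_a, pixel_b in zip(a.getdata(), b.getdata()):
        for chan_a, chan_b in zip(pixel_a, pixel_b):
            worst = max(worst, abs(chan_a - chan_b))
    return worst

# A delta of a couple of LSBs out of 255 is the sort of "tiny pixel
# difference" at issue here; visible mipmap banding is far larger.
```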

With respect to nVidia's initial brilinear efforts, what was first noted was a visible degradation in IQ, characterized by bilinear-type mipmap boundaries being visible in images supposedly getting the benefit of trilinear filtering. It was only afterwards that we saw DX rasterizer comparisons emerge to further pin down the differences.
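For anyone who hasn't followed the brilinear saga, here is a toy sketch of the difference as it is commonly described (the 0.25 window is my illustrative assumption, not a published NVIDIA value). Narrowing the blend band too far is what can make those bilinear-type mipmap boundaries show up:

```python
def trilinear_weight(lod):
    """Full trilinear: the blend fraction between adjacent mip levels is
    simply the fractional part of the LOD, so transitions stay smooth."""
    return lod - int(lod)

def brilinear_weight(lod, window=0.25):
    """Brilinear as commonly described: blend only inside a narrow band
    around the mip transition, and use a single mip level elsewhere, which
    saves texture fetches but can leave visible mip boundaries."""
    f = lod - int(lod)
    lo, hi = 0.5 - window / 2, 0.5 + window / 2
    if f <= lo:
        return 0.0
    if f >= hi:
        return 1.0
    return (f - lo) / (hi - lo)
```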

In this X800 case, the "testers" saw no signs of IQ degradation, but ran the rasterizer comparison anyway, just to "check," and then drew all sorts of astounding conclusions based on slight pixel differences between the output and the rasterizer. In the nVidia case it was an effort to explain a loss of IQ; in the present ATi case it looks to be an effort to establish a loss of IQ where there isn't any such loss.

If we look at the so-called Microsoft quote first published by TR and then clipped and copied by THG, Microsoft said that the nV40 image is superior to the DX rasterizer's, because M$ hadn't yet updated the capabilities of the DX rasterizer to match the rendering capability of newer hardware. It would seem, then, that such is very likely the case for the X800 as well, wouldn't it? So until M$ gets around to updating the rasterizer, it would seem rather foolish to use it as a basis for determining IQ degradation at this stage of the game, for either nV40 or X800.

An objection to the initial nVidia brilinear scheme because of visible mipmap boundaries where there should be none seems legitimate to me. Objections to filtering optimizations which cannot be demonstrated to produce visible IQ degradation, otoh, seem spurious to me, regardless of the IHV. Who cares what method is used so long as the mipmap boundaries disappear? I certainly don't.
 
Big Bertha EA said:
Looks like Tech Report has got their own take on this issue...

http://www.techreport.com/etc/2004q2/filtering/index.x?pg=1
It's sad to see a website publish this kind of article without actually absorbing what ATI said. They are still claiming ATI is "detecting" colored mipmaps and turning off the optimization just to fool reviewers. If they had actually read what ATI said, they would know the driver compares the two mipmaps to see how different they are, then applies the appropriate amount of trilinear filtering. Artificially colored mipmaps get the full amount of trilinear filtering for one reason and one reason only... they are completely different from each other. They use completely different primary colors, for one thing, and have no similarity at all.
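To make that concrete, here is a rough sketch of the comparison being described (the metric and the threshold are my guesses for illustration only; ATI has not published the actual math):

```python
from PIL import Image

def mip_difference(mip_hi, mip_lo):
    """Mean per-channel difference between a mip level and its successor,
    after upscaling the smaller level for a pixel-wise comparison."""
    hi = mip_hi.convert("RGB")
    lo_up = mip_lo.convert("RGB").resize(hi.size)
    total = count = 0
    for pixel_a, pixel_b in zip(hi.getdata(), lo_up.getdata()):
        for chan_a, chan_b in zip(pixel_a, pixel_b):
            total += abs(chan_a - chan_b)
            count += 1
    return total / count

def needs_full_trilinear(mip_hi, mip_lo, threshold=16.0):
    """Near-identical mips (the normal box-filtered case) allow a reduced
    blend; wildly dissimilar mips, such as artificially colored ones, fail
    the similarity test and so get the full trilinear blend."""
    return mip_difference(mip_hi, mip_lo) > threshold
```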
 
Ack!

Different from trilinear does not mean inferior. Actually, the same seems to be true of current AF methodologies... a "smarter filtering" implementation might show up that diverges even more, by directly and more efficiently solving the problem that both current AF and trilinear solutions address, and it would likely do that by departing from trilinear in circumstances similar to brilinear, and even bilinear, but to a different result. Being sharper than trilinear filtering is good when it is sharper without increasing aliasing, and when the criterion for achieving that is universally applicable. It is worse when it is sharper and increases aliasing.

That unsuccessful brilinear filtering, or even bilinear filtering, departs from trilinear in the same locations as something else does not mean that it departs to the same end result... concluding that it does is a logical fallacy.

Addressing a new viewpoint on the matter:

Operation Mindcrime said:
...

Did ATI design and implement an optimization in their product without telling anyone? YES

True. The problem for the question of cheating, however, is whether it is actually an optimization. The meaning of that word in the technology media has been corrupted, and you seem to be using the corrupted meaning as the basis for your conclusion. What you should say is "either always an optimization or not always an optimization", and the "verdict" would depend on which it is.

What this illustrates is that, with the proper word choice still unknown, concluding with certainty either way (as many are doing) is premature.

Did ATI tell reviewers that while performing benchmarks, competitor products MUST be set to full trilinear? YES

Yes, this was misleading, and it sets the bar very high for the methodology to have been accurately represented by that practice. The problem with your "verdict" is that you are presuming this high bar cannot possibly be met, whereas there are some characteristics of the observed behavior that might still allow it to meet the following, AFAICS:
  • Always anti-alias at least as much as trilinear filtering. This is where "blurring" seems to get confused with "anti-aliasing"... trilinear doesn't limit its blending to effective anti-aliasing, only to fixed linearity with respect to LOD and between mipmaps. The virtue of this is consistency; the drawback is the possibility of excessive blending, and therefore blur beyond that which reduces aliasing (reducing color detail without benefit, and so reducing image quality). To be at least as good, deviation from trilinear would have to occur consistently where excessive blur occurred, yet consistently still achieve as much anti-aliasing. In relation to the initial article and its "binary" deviation analysis, a technique that deviated and failed to achieve as much anti-aliasing could look similar or identical, in such a comparison to trilinear, to a technique that deviated and succeeded.
    It should be possible to achieve a performance benefit by removing excessive work in texture fetch/cache demands, and an image quality benefit by reducing blur. This doesn't even violate the "nothing is free in 3D" rule :p, as it costs increased analytical "work" put into finding a way to achieve this consistently. This has been demonstrated clearly in things like edge AA before; it just seems more complicated to achieve, or to explain the consistency of, in this case. (A toy sketch of this idea follows the list.)
  • Allow a way to access fixed linear blending for the possibility that screen space anti-aliasing is not the primary purpose of trilinear filtering (i.e., not typical color texture usage).
  • Universally/generally apply a consistent and somewhat transparent methodology of assuring the above behavior.
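Here is the toy sketch of the first point, just to show the shape of the idea (every name and number here is an assumption of mine for illustration, not ATI's algorithm):

```python
def shortened_blend(f, window=0.5):
    """Blend only inside a central window; snap to one mip outside it."""
    lo, hi = 0.5 - window / 2, 0.5 + window / 2
    if f <= lo:
        return 0.0
    if f >= hi:
        return 1.0
    return (f - lo) / (hi - lo)

def adaptive_weight(lod, detail_score, cutoff=1.0):
    """Where the two mips carry little differing detail, the full linear
    blend is mostly excess blur, so a shortened blend loses nothing; where
    detail is high, keep the full trilinear fraction so anti-aliasing is
    never worse than trilinear. 'detail_score' and 'cutoff' stand in for
    whatever analysis the hardware/driver actually performs."""
    f = lod - int(lod)
    return shortened_blend(f) if detail_score < cutoff else f
```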

This relates to your final point...

Did this change skew the results of said benchmarks? YES

The answer (at the moment, and for those of us with insufficient information) is the much-hated "maybe".

Here are two valid ways I see to deal with the above concerns: presume the results were skewed, because you won't know that the "high bar" was met until you are shown that it was; or presume that ATI's general indication that they met that "high bar" is supported by the failure to spot an image quality degradation (not a difference from trilinear, a degradation), until such time as a degradation is found.

Here are two invalid ways I see that seem common and, in my view, conducive to circular discussions that go nowhere useful: conclude with finality that the results must be skewed, because the method is different from trilinear and you don't realize, or simply don't care about, the details of image quality and trilinear filtering, the idea of considering the "high bar" criteria I mentioned, or the distinction between an actual optimization and the corrupted meaning of that word that has saturated the "media"; or conclude with finality that the methodology is perfectly suitable, because you don't see evidence of image quality being worse, for the same set of reasons as above.

Conclusion:
...
Heh, disregarding the apparent levity, it is a conclusion with invalid finality, but a valid presumption or preliminary conclusion as long as there is recognition that it might yet be proven incorrect. The same goes for proclaiming that image quality isn't degraded and that a switch is therefore "absurd".
 
Evildeus said:
The fact that only 50% (less now) can't find a difference should give you the answer. If the quality were the same, I would say that at least 80% of the voters wouldn't see it.
Note that ~3/4 of the other voters did find the "good" answer.

Minor nit-pick: The question wasn't which side had better quality.
 