ATi is Cheating in Filtering

DaveBaumann said:
The fundamental stumbling block here for people is how do you rationalise pushing for "mathematically equivalent" shader optimisations (via compilers), but then not push for mathematically equivalent (but potentially imperceptibly different) texture stage optimisations.

The fundamental issue isn't "mathematically equivalent". When dealing with floating point, changing the order of operations or combining separate operations into something like a MAC_MAC_ACC does lead to results that are not mathematically equivalent. This has been known for some time, and generally enforcing strict mathematical equivalence will result in slower performance (this can be seen in the SPEC benchmarks, which specifically make an exception to the mathematically correct result requirements, with a margin in the result for the error caused by operation re-ordering during compilation).
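As a minimal illustration of why reordering breaks strict equivalence (plain Python, arbitrary example values), floating-point addition is not associative, so regrouping the same operands gives a different rounded result:

```python
# Floating-point addition is not associative: regrouping the same operands
# (as an optimizing compiler might) changes the rounded result.
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c   # 0.0 + 1.0  -> 1.0
right = a + (b + c)  # b + c rounds back to -1e16, so the sum is 0.0

print(left, right, left == right)  # 1.0 0.0 False
```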

The fundamental issue is whether the "performance enhancement" relies on explicit knowledge (AKA clairvoyance or ESP) of the application in order to generate its result. In that case you've jumped the shark from a general optimization that may only work in some cases to a specific optimization that may or may not work in one or a few cases, and is fragile enough that it can be broken easily, resulting in either an incorrect result or a significant performance differential.

IF ATI is telling the truth, then this appears to be a general-case optimization that applies to most uses of the hardware. There appear to be no known cases of it breaking, and it should apply in general to all software and the permutations of said software. So it appears to be fairly resilient.

Though I do await more detailed analysis of its effect on actual image quality, and I'm sure that any issues will quickly come to light.

Aaron Spink
speaking for myself inc.
 
Geeforcer said:
Kombatant said:
Geeforcer said:
Kombatant said:
Geeforcer said:
In certain situations, yes, they can.

Well, if they always can, bring them on. If they can only do so in certain situations, no thanks.

Think of it like sound. There are certain frequencies that the human ear cannot hear. If you remove those from a sound file in order to reduce its size, will you be missing out on sound quality? No you won't.

My problem is that such an approach starts accounting for taste. Some people may not see/hear the difference - to them it would be "valid". Others might see it, but not care. Another group might see it and dislike it, while yet another might see it and prefer it over the other. Thus, an optimization valid to you may not be "valid" to someone else. How would you write a review in that case?

I never talked about "perception" and "taste". You either see it or you don't.

Yes, what if some do see it and some don't?

Again, you bring subjectivity to the matter, whereas I don't. I don't care if some people don't mind. I am sure that many nVidia users don't mind having degraded quality, as long as the speed is OK. If you can see the difference, the user should be able to disable it. Simple as that. Whether he cares to do so or not, that's another issue.
 
Well, at least it is encouraging that their behavior for pre-generated mip maps seems to fit my proposal earlier.

In all other regards, their answer is not satisfactory without extensive testing, every bit as stringent and severe as possible, for examining image quality, because there is no option to turn the optimizations off. Even assuming their statements are literally true, ATI fails by putting their self-interest in being secretive, and in not placing a technical expectation/whitepaper on their methodology up for scrutiny, ahead of the consumer's interest in having them do some work to explain why the optimization is image quality equivalent. It also warrants vigilance on the part of reviewers to check for "image quality slippage", even if that investigation shows image quality being maintained currently.

The problem here is establishing a methodology to accurately evaluate the assertion of image quality equivalence relatively quickly, objectively, and effectively. ATi has laid out the challenge to find any cases of image quality degradation, and we should oblige them and judge them by their response, but placing the burden of finding assurance on consumers is not the best way to treat consumers, regardless of whether image quality is lost or not. This remains a consistent area of deficiency for ATI, IMO, including where they fail to inform consumers of things that would actually show them more favorably (promoting compiler optimizations and extra ALU functionality more prominently, as nVidia does, for example). In any case, ATI's suggestion will at best build up evidence that image quality is equivalent in the cases where equivalence is observed, and will only inform potential buyers beforehand if a large collection of in-motion evaluation videos is generated. :-? PR doesn't seem to have that degree of technical sophistication (to provide tools/examples/more simply tested explanations as an alternative) within its purview (which can be good if it prevents misinformation, but can prevent other things), nor does engineering and software design seem to have communicating with consumers at such a level of sophistication within theirs. This is a situation exhibiting a disservice to consumers as a result.

*sigh* Perhaps a test pattern consisting of lines or fixed gradients, or some such, and something that analyzed for "pixel jumping" by checking the color weighting at mip map transitions against an analytical expectation representing a lack of aliasing? Based on a fixed fps, and fixed movement, etc...what a difficult task to manage. And it's bass ackwards, because the work is being required of people outside ATI to prove that ATI's optimization is not cheating, when the basic factors observed indicate that it could be...this seems like work/information ATI should at least be trying to provide to consumers for assurance. :-?
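As a very rough illustration of that idea, the sketch below (Python/numpy, with hypothetical `frame_optimized` and `frame_full_trilinear` captures of the same test pattern, the reference rendered with full trilinear forced) just measures how much and how widely the two frames differ; actually localizing the differences to mip map transition bands would take further work:

```python
import numpy as np

def filtering_difference_report(optimized, reference, threshold=2):
    """Compare a frame rendered with the driver's filtering against a
    full-trilinear reference frame. Both are H x W x 3 uint8 arrays,
    assumed to be captures of the same test pattern at the same instant
    of a fixed camera path."""
    diff = np.abs(optimized.astype(np.int16) - reference.astype(np.int16))
    per_pixel = diff.max(axis=2)          # worst channel error per pixel
    suspect = per_pixel > threshold       # pixels that differ noticeably
    return {
        "max_error": int(per_pixel.max()),
        "mean_error": float(per_pixel.mean()),
        "suspect_fraction": float(suspect.mean()),
    }

# Hypothetical usage with two captured frames:
# print(filtering_difference_report(frame_optimized, frame_full_trilinear))
```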
 
PatrickL said:
Or perhaps, since his site is the source for that, he is now blind to any explanation? :)
Nope, if this explanation is completely valid, I'd be glad to accept it, of course.
 
demalion said:
Well, at least it is encouraging that their behavior for pre-generated mip maps seems to fit my proposal earlier.

In all other regards, their answer is not satisfactory without extensive testing, every bit as stringent and severe as possible, for examining image quality, because there is no option to turn the optimizations off. Even assuming their statements are literally true, ATI fails by putting their self-interest in being secretive, and in not placing a technical expectation/whitepaper on their methodology up for scrutiny, ahead of the consumer's interest in having them do some work to explain why the optimization is image quality equivalent. It also warrants vigilance on the part of reviewers to check for "image quality slippage", even if that investigation shows image quality being maintained currently.

The problem here is establishing a methodology to accurately evaluate the assertion of image quality equivalence relatively quickly, objectively, and effectively. ATi has laid out the challenge to find any cases of image quality degradation, and we should oblige them and judge them by their response, but placing the burden of finding assurance on consumers is not the best way to treat consumers, regardless of whether image quality is lost or not. This remains a consistent area of deficiency for ATI, IMO, including where they fail to inform consumers of things that would actually show them more favorably (promoting compiler optimizations and extra ALU functionality more prominently, as nVidia does, for example). In any case, ATI's suggestion will at best build up evidence that image quality is equivalent in the cases where equivalence is observed, and will only inform potential buyers beforehand if a large collection of in-motion evaluation videos is generated. :-? PR doesn't seem to have such a degree of technical sophistication within its purview (which can be good if it prevents misinformation, but can prevent other things), nor does engineering and software design seem to have communicating with consumers at that level within theirs. This is a situation exhibiting a disservice to consumers as a result.

*sigh* Perhaps a test pattern consisting of lines or fixed gradients, or some such, and something that analyzed for "pixel jumping" by checking the color weighting at mip map transitions against an analytical expectation representing a lack of aliasing? Based on a fixed fps, and fixed movement, etc...what a difficult task to manage. And it's bass ackwards, because the work is being required of people outside ATI to prove that ATI's optimization is not cheating, when the basic factors observed indicate that it could be...this seems like work/information ATI should at least be trying to provide to consumers for assurance. :-?

I am sure they did their own testing for image quality degradation before using this algorithm. Should we fully trust them? No. That's why we have the media; not only in this industry, but in any industry.

On a totally unrelated note, Dem, do you ever give one-line responses man? :p :LOL:
 
:rolleyes:
[Image: IMG0007851_1.jpg]

You can get another one over here:
http://www.hardware.fr/news/lire/18-05-2004/
 
Kombatant said:
dlo_olb said:
Kombatant said:
My take on this is simple. If you can see the difference (see = in game, not via some image subtraction technique), you should be able to disable it if you want to. If you can't see the difference, it's valid.

when it comes to ATI, the standard has relaxed to "it is ok when you can't see it"; when it is nVidia, people kept going on about 100% mathematical equivalence.

it is very simple: ATI is cheating if the image is not 100% mathematically equivalent (unless it *improves* quality, though I find that highly unlikely) and it is not doing full trilinear filtering as the program option says it is;
whether your eye can see the difference or not has never been the judge of a program cheating. I wouldn't have any problem with it if ATI had told us about this behaviour beforehand.

you gotta admit some are more equal... ;)

Allow me to disagree. As far as I am concerned, I would have no problem with nVidia's "optimizations" if my eyes weren't able to detect them. See the sound file example I posted earlier to better understand what I believe.

But unfortunately - according to your theory - for those who *do* see the difference, it is cheating...
 
dlo_olb said:
Kombatant said:
dlo_olb said:
Kombatant said:
My take on this is simple. If you can see the difference (see = in game, not via some image subtraction technique), you should be able to disable it if you want to. If you can't see the difference, it's valid.

when it comes to ATI, the standard has relaxed to "it is ok when you can't see it"; when it is nVidia, people kept going on about 100% mathematical equivalence.

it is very simple: ATI is cheating if the image is not 100% mathematically equivalent (unless it *improves* quality, though I find that highly unlikely) and it is not doing full trilinear filtering as the program option says it is;
whether your eye can see the difference or not has never been the judge of a program cheating. I wouldn't have any problem with it if ATI had told us about this behaviour beforehand.

you gotta admit some are more equal... ;)

Allow me to disagree. As far as I am concerned, I would have no problem with nVidia's "optimizations" if my eyes weren't able to detect them. See the sound file example I posted earlier to better understand what I believe.

But unfortunately - according to your theory - for those who *do* see the difference, it is cheating...

You still don't get the point; must be my English :p I will attempt to be clearer about it. Imagine that you have such an optimization in front of you. You can see it. You can either choose to ignore it, or not. What I am saying is that, if you can see it, you should be able to disable it, whether you care about it or not. It's not a matter of personal preference, it's a matter of whether you see it or not.

Hope that makes sense :)
 
DaveBaumann said:
The fundamental stumbling block here for people is how do you rationalise pushing for "mathematically equivelent" shader optimiatsions (via compilers), but then not push for mathematically equivelent (but potentially imperceptibly different) texture stage optimisations.

Well, mathematically non-equivalent, but (so far?) imperceptibly so.

ATI have moved the goalposts without telling anyone, which is why so many folks are narked. The synthetic IQ tools will give perfect trilinear results, but that's not what's being applied in games. ATI should have made it clear, or allowed the user to disable the optimisation.

It's only a matter of time before someone comes up with a tool that makes the IQ differences apparent, perhaps a motion based IQ tool. People are like that ;)
 
Kombatant said:
You still don't get the point; must be my English :p I will attempt to be clearer about it. Imagine that you have such an optimization in front of you. You can see it. You can either choose to ignore it, or not. What I am saying is that, if you can see it, you should be able to disable it, whether you care about it or not. It's not a matter of personal preference, it's a matter of whether you see it or not.

I understand your point very well, it just seems that you don't want me to understand ;)
The fact as it stands now is that it seems we don't have an option to disable it; and most importantly, we didn't get informed beforehand by ATI that this isn't the full trilinear filtering the program options say it is, whether you can see it or not ...that's why I said it is cheating... it is not a matter of personal preference, nor a matter of whether you can see it or not...
 
dlo_olb said:
Kombatant said:
You still don't get the point; must be my English :p I will attempt to be clearer about it. Imagine that you have such an optimization in front of you. You can see it. You can either choose to ignore it, or not. What I am saying is that, if you can see it, you should be able to disable it, whether you care about it or not. It's not a matter of personal preference, it's a matter of whether you see it or not.

I understand your point very well, it just seems that you don't want me to understand ;)
The fact as it stands now is that it seems we can't disable it, or don't have an option to disable it; and most importantly, we didn't get informed beforehand by ATI that this isn't the full trilinear filtering the program options say it is, whether you can see it or not ...that's why I said it is cheating... it is not a matter of personal preference, nor a matter of whether you can see it or not...

But why should you care, when the result with ATI's method is the same as the result you'd get with Full tri? If it isn't, I'm with you. But if it is, let's say you want to produce 10. Does it matter if you add 3+7 instead of 4+6?
 
Can anyone in the know look in ATI's Benchmark Guide PDF to see whether the following might be the result of a corrupted download on my side?
[Image: ATi PDF on Benchmarking, Section Texturing Bottlenecks.JPG]
 
ATI's answer is just PR....no surprise here. Microsoft WHQL - well, you don't even need trilinear to pass that test, AFAIK.

5.554.1 Graphics adapters must support Direct3D compliant MIP-mapped textures.

5.554.2 Graphics adapters must support Direct3D compliant bilinear or better filtered textures.
 
DSC said:
[Image: IMG0007852_1.jpg]


Trilinear on by default? :LOL: :LOL: :LOL:

I don't think that passes the laugh test..... :rolleyes:
How narrow-minded! Trilinear is on by default; however, the driver/hardware will optimize the filtering when possible. Not real complicated. If the mipmaps aren't box-filtered mipmaps (i.e. they are colored), then it reverts to the default setting of trilinear.
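As a guess at the kind of check being described (only a sketch of the idea, not ATI's actual driver logic), one could test whether each mip level closely matches a 2x2 box-filtered downsample of the level above it, and fall back to full trilinear when it doesn't (e.g. hand-colored mip levels):

```python
import numpy as np

def box_downsample(level):
    """Average each 2x2 block (assumes even width/height, H x W x C floats)."""
    h, w = level.shape[:2]
    return level.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

def mip_chain_is_box_filtered(mips, tolerance=1.0):
    """mips: list of H x W x C float arrays, largest level first.
    True if every level closely matches a box-filtered downsample of the
    level above it (i.e. the chain looks auto-generated)."""
    for upper, lower in zip(mips, mips[1:]):
        if np.abs(box_downsample(upper) - lower).max() > tolerance:
            return False  # e.g. hand-colored mip levels
    return True

# Hypothetical policy: only use the cheaper filtering on ordinary chains.
# use_optimized_filtering = mip_chain_is_box_filtered(texture_mips)
```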

People are comparing this to what NVIDIA did for the last year with the GeForce FX series, which is a real disservice to ATI. It looks like ATI has implemented a clever scheme to analyze texture data in order to carefully balance quality and performance with little image quality difference. What NVIDIA did in the past was simply a global enabling of "brilinear" that looked awful because no care was taken over image quality. ATI's and NVIDIA's methods may use some of the same tools (brilinear), but the results are completely dissimilar.

-FUDie
 
christoph said:
ATI's answer is just PR....no surprise here. Microsoft WHQL - well, you don't even need trilinear to pass that test, AFAIK.

There are actually specific IQ tests for which you have to be within certain tolerances; these include trilinear filtering.
 