ATi is ch**t**g in Filtering

Re: Few questions:

Suspicious said:
And my 2 cents on WHQL signing and ATI's poor excuse:

1. A WHQL signature means only that the driver has passed stress testing; it has nothing to do with IQ. There is no display-specific WHQL; WHQL testing is the same for all drivers.

2. If it is true that Microsoft worked closely with ATI when it accepted their FP24 as the DirectX standard instead of FP32, how can we rely on them being neutral when doing IQ testing, if they do it at all?

not true. actually there's a buttload of specific display driver tests in the whql test kit. you can take a look at the test specs here:
http://www.microsoft.com/whdc/hwtest/pages/specs.mspx
unfortunately the texture filter – mipmapping test utilizes colored mips, which is one of the extreme cases where ati's algo decides to do full trilinear.

as for the whql process, the ihv's do the tests on their own. the encrypted log files with the test results are then sent to microsoft. at this point the ihv's already know whether they will get the digital signature back or not.
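nobody outside ati knows how the detection actually works, but the kind of heuristic people are speculating about is easy to sketch: check whether each mip looks like a plain box-downsample of the level above it. hand-colored test mips fail that check, so a driver doing this would fall back to full trilinear exactly where reviewers look. everything below is purely hypothetical, python rather than driver code:

```python
# hypothetical detection heuristic, NOT ati's actual code:
# a "normal" mip chain is roughly a 2x2 box filter of the level
# above it; hand-colored test mips are not, so they stand out.

def looks_hand_colored(upper, lower, threshold=0.1):
    """upper: 2Nx2N grayscale mip (list of rows), lower: NxN next
    mip. True if lower differs noticeably from a 2x2 box filter
    of upper, i.e. the chain looks like colored test mips."""
    n = len(lower)
    diff = 0.0
    for y in range(n):
        for x in range(n):
            box = (upper[2*y][2*x] + upper[2*y][2*x+1] +
                   upper[2*y+1][2*x] + upper[2*y+1][2*x+1]) / 4.0
            diff += abs(box - lower[y][x])
    return diff / (n * n) > threshold

# a driver using such a check would pick full trilinear when this
# returns True and the cheaper "bri" blend otherwise.
```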
 
Bry said:
ATI's competitor did not have the same kind of optimization, nor was it similar. They never used TRI at all... it was all Bri.

And what makes you think ATI ever used real TRI except in colored mipmap test?

Suppose you buy and install an alarm that, instead of being ON all the time guarding the house, has "optimizations" so it turns itself OFF when your neighbour comes, because it ASSUMES something about him?

Now just replace the word "alarm" with graphics card, "guarding" with trilinear filtering, "house" with 3D scene, and "neighbour" with mipmap level. How does it sound? Would you use such an alarm?
 
Re: Few questions:

christoph said:
unfortunately the texture filter – mipmapping test utilizes colored mips, which is one of the extreme cases where atis algo decides to do full trilinear

That is just what I had in mind when I said we don't have valid reference.
 
Suspicious said:
Bry said:
ATI's competitor did not have the same kind of optimization, nor was it similar. They never used TRI at all... it was all Bri.

And what makes you think ATI ever used real TRI except in colored mipmap test?

Suppose you buy and install an alarm that, instead of being ON all the time guarding the house, has "optimizations" so it turns itself OFF when your neighbour comes, because it ASSUMES something about him?

Now just replace the word "alarm" with graphics card, "guarding" with trilinear filtering, "house" with 3D scene, and "neighbour" with mipmap level. How does it sound? Would you use such an alarm?

Isn't that what we are trying to find out? And like I said, if it is doing what ATI said, it is a good thing. I am waiting on Dave to let us know what he has found. I know he is investigating thoroughly and will let us know what he thinks. If ATI is only doing the colored mipmaps right, as you suggest, then it is a cheat, plain and simple. If, however, they are doing what they say, can you explain why it would be a bad thing? Can you prove what they said to be wrong?

As for the alarm scenario you posted: would it not be nice if the alarm noticed when you or a family member came to the house and automatically disarmed? Then you would not have to rush to disarm it and cancel the alarm. If, however, it recognized the neighbour as a family member and let them in, I would have to contact the company and have it changed. Ironically, ATI said if you see any problems, contact them and they will fix it. Unlike an alarm, though, having a slight IQ difference that can be fixed is much different from having someone break into my house and rob me, or much worse :rolleyes:
 
Suspicious said:
Take a look at this shot:
http://www.aths.de/files/bri_ogl.jpg

Hint: you should notice loss of detail in the far distance and misalignment of lines.

OK, where is the screenshot of the same thing using ATI's optimization? Until you can get comparison shots of both, I would say this is a moot argument, as there are no answers yet. It is known that Bri does have IQ issues compared to Tri. However, it is not known whether those issues appear in ATI's optimization, if it works as they say it is supposed to.
 
Bry said:
If however they are doing what they say then can you explain why it would be a bad thing

1. Because they "forgot" to tell us?
2. Because there is no option to turn it off?
3. Because they so far insisted that such "optimizations" are unfair?

Bry said:
Can you prove what they said to be wrong.

I can prove it:

1. Because they said that there is no exact definition of trilinear?

That is plain BS and you know it is not true. There is a mathematical definition, and we are not talking about rounding errors here.
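For reference, the textbook definition is short enough to sketch in a few lines of Python (names here are mine, purely illustrative): trilinear is just a linear blend between bilinear samples taken from the two mip levels that bracket the pixel's level of detail.

```python
# textbook trilinear filtering: blend the bilinear samples from
# the two mip levels bracketing the level-of-detail (lod).

def lerp(a, b, t):
    # linear interpolation between a and b
    return a + (b - a) * t

def trilinear(sample_bilinear, u, v, lod):
    """sample_bilinear(u, v, level) is assumed to return one
    bilinearly filtered texel from the given mip level."""
    lo = int(lod)        # coarser of the two bracketing mip levels
    frac = lod - lo      # blend weight between the two levels
    return lerp(sample_bilinear(u, v, lo),
                sample_bilinear(u, v, lo + 1),
                frac)
```

Anything that shortens that blend, or skips it outside a narrow band around mip transitions, is by this definition not trilinear, whatever you call it.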

2. Because the only official filtering IQ test so far was with colored mipmaps, and they admitted that they revert to full TRI in that case?

That basically means they have used the "optimization" almost all the time since Cat 3.4, and with all the cards, because the driver is unified, and that is why nobody spotted it -- it was just BAD ALL THE TIME.
 
Suspicious said:
2. Because the only official filtering IQ test so far was with colored mipmaps, and they admitted that they revert to full TRI in that case?

That basically means they have used the "optimization" almost all the time since Cat 3.4, and with all the cards, because the driver is unified, and that is why nobody spotted it -- it was just BAD ALL THE TIME.
Take your FUD elsewhere. This wasn't available for all cards, only the RV350/360 and the mobile variants of those chips. Bad image quality all the time? Funny, not a single person pointed it out until a year later. Maybe ATI is right after all.

-FUDie
 
Suspicious said:
That basicaly means they used "optimization" almost all the time since Cat 3.4 and with all the cards because the driver is unified and that is why nobody spotted it -- it's just because it was BAD ALL THE TIME.

Sure, and you, as a 9600 Pro owner, never noticed it before it was pointed out to you. I don't think you have any qualification to continue your FUD.
 
@Suspicious
i think we already have a consensus here about the 'ethical' part of this 'issue' so i don't see much sense in arguing that further.....

any news on the technical side of things?
 
1. Because they "forgot" to tell us?
2. Because there is no option to turn it off?

I never disagreed with you here.
3. Because they so far insisted that such "optimizations" are unfair?

It depends. NV can use PP hints, dropping to FP16 when FP32 is not needed (no IQ difference), and it is not considered a cheat. NV's compiler in the Det 50s changed things and increased performance, yet was never classified as a cheat. So who is to say this is any different until it is thoroughly investigated?

I can prove it:


1. Because they said that there is no exact definition of trilinear?

That is plain BS and you know it is not true. There is mathematical definition and we are not talking about rounding errors here.
Can you explain why NV's version of trilinear, ATI's version of full trilinear, and the rasterized image of full trilinear show different results? That in itself shows differences that cannot be construed as an exact definition producing equal results.
2. Because only official filtering IQ test so far was with colored mipmaps and they admited that they revert to full TRI in that case?

They admit that whenever there is a problem with IQ, they revert to full tri. With colored mipmaps there would obviously be a difference, so if their optimization is working properly, it would change to full tri there. Unfortunately, those mipmaps are only used for testing; that is not ATI's fault. And there are people checking to make sure this is not the only place it happens. If it is found that this is the only place it happens, I will yell CHEAT just as loud as the next person.
That basically means they have used the "optimization" almost all the time since Cat 3.4, and with all the cards, because the driver is unified, and that is why nobody spotted it -- it was just BAD ALL THE TIME.

Wow, how was it bad? Every IQ comparison at the time showed it much better than NV's cards. It was the IQ king. And no one has yet shown that ATI has worse IQ than NV40 in this area on the R420, which is the first card where we can really test whether it works properly in the first place.
 
FUDie said:
Take your FUD elsewhere.

You can take your hostility up your a$$ as far as I am concerned -- I have every right to comment on ATI's IQ since I have their card. You don't have to agree with me, though.

FUDie said:
This wasn't available for all cards, only the RV350/360 and mobile variants of those chips.

Can you prove it? Can anyone explain why this would be hardware dependent "optimization"?

FUDie said:
Bad image quality all the time? Funny, not a single person pointed it out until a year later.

So ATI scr*wed us for a year, and you forgive them because it was a good f*ck?

If I understand correctly, you actually give them credit for scr*wing us so well that we didn't even notice it?

So that's what it's all about? The one who "optimizes" without being caught is good?

FUDie said:
Maybe ATI is right after all.

Yeah, they must be right.

They went to great lengths to optimize filtering in the driver, thus reducing the amount of work to be done, instead of making faster hardware that could do the real work without breaking a sweat.

And they even managed to convince you that this "novel approach" is a Good Thing(TM).
 
Can you prove it? Can anyone explain why this would be hardware dependent "optimization"?

You need hardware that is specially designed to support it. Just as the GeForce FX can do a mixed bi/trilinear optimisation that no GeForce before it could, this was added to ATI's hardware for the RV350 but not for the R300 boards: anything derived from the 150nm line (R300/R350/R360) has no capability for doing this (and that is how it was highlighted in the first place).
 
DaveBaumann said:
Can you prove it? Can anyone explain why this would be hardware dependent "optimization"?

You need hardware that is specially designed to support it. Just as the GeForce FX can do a mixed bi/trilinear optimisation that no GeForce before it could, this was added to ATI's hardware for the RV350 but not for the R300 boards: anything derived from the 150nm line (R300/R350/R360) has no capability for doing this (and that is how it was highlighted in the first place).
Face it Dave, you're just a hopeless ATi fanboy. :rolleyes:
;)
 
PatrickL said:
Sure, and you as a 9600 pro owner you have never seen it before it was told to you. I don't think you have any qualification to continue your fud.

You don't have a point. Just because I haven't noticed something doesn't mean it is OK to do it behind my back.

What you are basically saying is "go on people, steal from me, it is ok as long as I don't catch you".

Since everyone was taking ATI's IQ as the reference, what chance did I have to notice anything?

Now that you mention it, I was wondering why the floor textures in some games I have played (Jedi Academy, for example) appeared to "crawl" and cause eye strain if you look at them long enough.
 
I wonder: does this ATI statement still hold after what we have learned about their "optimization"?

http://www.ati.com/developer/sdk/RADEONSDK/Html/Info/Design.html

"The RADEON will do 3 bilinear texture fetches at its peak rate. This means that a trilinear-filtered base map modulated with a non-trilinear light map will run at full speed through the pixel pipeline. Modulating in a third trilinear-filtered detail texture increases the number of bilinear fetches to five, and drops the pixel pipeline performance. Developers should understand this tradeoff when developing on the RADEON. Three bilinear texture fetches (no matter how they're distributed) will run at peak rate on the RADEON. Anything above that will still draw correctly (i.e. true trilinear will be used rather than an approximation) but will impact performance."
 
Yo Suspicious, if'n it makes you feel any better, I'm extremely miffed about how ATi dealt with choosing to keep this secret, and about their FUD over trilinear and reviewers using colored mipmaps to see their full trilinear in action. :?

But even though they screwed up in the way it was presented, that doesn't necessarily make it a bad thing.

I can sympathize with your angst over being lied to, I don't like it either... but try to weigh the new technique on its technical merits rather than your emotions about it, to give it a fair chance. It really isn't all that bad, and I still can't see a difference.

Granted they didn't tell us and that is without a doubt bad and wrong, but the technique itself is actually quite clever and good.
 
wow Suspicious, you're really reaching with the stealing.... Most of what you're going on about was discussed already; you could read the threads and do a search.
I have been a 9600 user and the IQ is better than my 9700; only my 9800xt seemed better, due to the higher res and speed I can run it at. So until I find it noticeable in a game, a game I play, the IQ is great. As has been said before, most of the posters like you didn't even know what filtering was, let alone bi to tri to bri.... you sound like DCS ;)
 