ATi is Cheating in Filtering

Someone had to do it. I've attached the resulting difference image computed from the "cheat demo" pictures 1.png and 2.png. Black areas indicate no difference; anything else indicates a difference between the two images.
EDIT: removed pic

So much for "These pictures exhibited easy differences in some places (e.g. the marble tiles in the foreground and the lower rock surface in the background)." It's amazing what you can see if you really want to see it badly enough.

You can download a png file here:
diff.png
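A minimal sketch of the difference calculation described above, with small synthetic arrays standing in for the actual 1.png/2.png screenshots: per-pixel absolute difference, where zero (black) means the pixels match exactly.

```python
# Per-pixel absolute difference of two same-sized RGB images.
# Synthetic 4x4 arrays stand in for the thread's 1.png and 2.png.
import numpy as np

img1 = np.full((4, 4, 3), 120, dtype=np.int16)
img2 = img1.copy()
img2[0, 0] = [121, 120, 120]   # a one-count difference in the red channel

diff = np.abs(img1 - img2).astype(np.uint8)

print(int(diff.sum()))              # total error: a single off-by-one texel
print(bool((diff[1:] == 0).all()))  # every row below the changed texel is pure black
```

In a real comparison the two arrays would come from decoding lossless screenshots; a one-count difference like this is invisible at normal contrast, which is exactly what the thread ends up arguing about.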
 
Should the penalty for deviation from the proper image be linear? Of course not; as with most things (such as the normal distribution), it should be squared. I did this back in my first posts on the forum about the NV30 cheats, and I think the exact same treatment should be applied to ATI.

http://www.lexicon.net/mccann/diff2.png
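The squared-penalty idea can be sketched like this (synthetic data; the pixel values are illustrative, not taken from the actual screenshots). A squared, MSE-style metric weights one large error far more heavily than many tiny ones, which is the point being made:

```python
# Linear (MAE) vs squared (MSE) per-pixel penalties on a difference image.
import numpy as np

rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(8, 8, 3)).astype(np.float64)
b = a.copy()
b[2, 3] += 40.0    # one clearly wrong texel (all three channels)
b[5, 5] += 2.0     # one barely wrong texel

diff = a - b
mae = np.abs(diff).mean()   # linear penalty: both errors count proportionally
mse = (diff ** 2).mean()    # squared penalty: the large error dominates
print(mae, mse)
```

With the linear metric the big error is only 20x worse than the small one; squared, it is 400x worse, so a few grossly wrong texels can no longer hide behind thousands of near-perfect ones.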
 
Bjorn said:
MrGaribaldi said:
Wouldn't it be possible to add that to the compiler, so that it recognises when such situations arise and can then deal with them regardless of the game/app?

I'm guessing that it will be very difficult to build a perfect compiler that covers all the corner cases.

Agreed :)
Bjorn said:
MrGaribaldi said:
But if I understood davepermen correctly, that would fall under the bug-fix part of his post.

If the situation only occurs under certain circumstances, and the performance is still great when it happens, then I doubt the developers would call it a bug. And what if the performance is, say, 20% below what it could be? That's definitely not enough to immediately call it a bug, yet it's the difference between being 10% faster than your competitor or 10% slower. That's a very big difference for an IHV, unfortunately.

Well, if the IHV has already done the work that would enhance the performance, it shouldn't be a problem for the ISV to include it in the next patch, should it?
The "only" things they'd have to do are compatibility testing with other cards and adding it to the patch... (Yes, I'm sure there is more, but that's what springs to mind.)


But maybe we should continue this discussion elsewhere, since it's getting more and more OT from what the thread is about...?
 
Please note that we were not comparing Shot 1 to Shot 2, but bilinear AF and trilinear AF on the respective cards...
(I don't know how many times I've had to mention that by now.)

additionally:
diff.jpg


Your difference pic exhibits some nice compression artefacts, besides the inevitable differences of the same scene rendered on two different architectures...
 
ChrisW said:
Chalnoth said:
Where can we see the original image(s), Bloodbob?
http://www.computerbase.de/artikel/hardware/grafikkarten/versteckspiele_atis_texturfilter/4/
1.png and 2.png

There are slight differences between the images, but not visible ones. If you increase the contrast by around 100%, you can bring out the difference between the shades of black.

With those images, I found that a 200% zoom and a 400% increase in contrast were the only way to get the 'major visible differences' up to a level where you could actually see them.

I've already had this debate at Rage3D, pointing this out as well as mentioning that there are visible differences between the two images, but that 'different' doesn't necessarily mean 'worse'. Then I got lambasted by Quasar for it, so there you go.
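What the contrast boost does can be sketched as follows (synthetic values, and assuming a simple linear gain rather than whatever curve a particular image editor applies): faint "shades of black" in a difference image are multiplied up into the visible range.

```python
# Boosting a near-black difference image so faint errors become visible.
import numpy as np

diff = np.zeros((4, 4), dtype=np.float64)
diff[1, 2] = 3.0   # a "shade of black": invisible at normal contrast
diff[3, 0] = 1.0

gain = 4.0         # a 400% boost, as in the post above
boosted = np.clip(diff * gain, 0, 255)

print(boosted[1, 2], boosted[3, 0])
```

The clip to 0-255 matters: once the gain is large enough, every nonzero texel saturates toward white, which is why a heavily boosted difference image exaggerates how "different" two nearly identical shots look.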
 
Quasar said:
Please note that we were not comparing Shot 1 to Shot 2, but bilinear AF and trilinear AF on the respective cards...
(I don't know how many times I've had to mention that by now.)

additionally:
diff.jpg


Your difference pic exhibits some nice compression artefacts, besides the inevitable differences of the same scene rendered on two different architectures...
LOL! I guess you didn't see the .png file at the bottom that you can download? Why would you try to enhance a JPEG file? The JPEG is only there so that everyone doesn't have to download the PNG. bloodbob didn't seem to have a problem with this.

Are you trying to tell me they are showing us bilinear on one of the cards and trilinear on the other? And one of them is a 9800XT while the other is an X800? It's great that you increased the contrast by 100%; now why don't you show us a picture of the actual game with its contrast increased by 100%, and let us see if that is how we are going to play the game? What you are doing is taking several shades of black and enhancing them so they appear to be different colors.
 
christoph said:
@ChrisW
it doesn't make much sense to do a bit-compare difference between those two pics, for obvious reasons.
Which are... :?:
The entire point is to show people the difference between ATI's full trilinear and their adaptive trilinear filtering (which nobody seems to want to do). Nobody has a problem with showing bit-difference pictures of the X800 and the NV17.
 
ChrisW said:
christoph said:
@ChrisW
it doesn't make much sense to do a bit-compare difference between those two pics, for obvious reasons.
Which are... :?:
The entire point is to show people the difference between ATI's full trilinear and their adaptive trilinear filtering (which nobody seems to want to do). Nobody has a problem with showing bit-difference pictures of the X800 and the NV17.
It's not adaptive.

Use the PNGs. He just didn't display the PNG on the page because people who are still on modems, like me, don't want to wait forever to download the damn image. (I hate monopolies, especially Tel$tra.)
 
@ChrisW
like already said, you're trying to [edit]bit-[/edit]compare two different chips/architectures, so the result will not show what you want it to... this is not what CB has done. Reread the thread please; it's on page 2 or so.
 
christoph said:
@ChrisW
like already said, you're trying to compare two different chips/architectures, so the result will not show what you want it to... this is not what CB has done. Reread the thread please.
That doesn't make any sense. What you are seeing is the visual difference between the two pictures. The allegation is that ATI is lowering visual quality for speed; if there is no visible difference, then where is the lower image quality? Besides, we all know the X800 is based on the same architecture as the 9800XT. What are your feelings about the other difference pictures? Are they also invalid?

If these tests are invalid then how are we supposed to judge image differences?

EDIT: Maybe the difference picture is not a fair way to judge image quality differences after all. There must be some way to accurately judge the difference between the two trilinear methods. Anyone have any ideas?
 
ChrisW said:
christoph said:
@ChrisW
like already said, you're trying to compare two different chips/architectures, so the result will not show what you want it to... this is not what CB has done. Reread the thread please.
That doesn't make any sense. What you are seeing is the visual difference between the two pictures. The allegation is that ATI is lowering visual quality for speed; if there is no visible difference, then where is the lower image quality? Besides, we all know the X800 is based on the same architecture as the 9800XT. What are your feelings about the other difference pictures? Are they also invalid?

Yes, the X800 is based on the 9800XT, but there are still some small differences that make it impossible to compare images from these two chips in this way.

You can take a shot with the bi filter on both chips and compare them with TheCompresator. You will find small differences all over the place.

I want to invite you to a little experiment. Take one of your games and make two shots with the same settings. The only thing you should change between the two shots is the texture filter: swap it from trilinear to bilinear. Compare the two shots with the same method you already used and tell us your conclusion.
 
Demirug said:
ChrisW said:
christoph said:
@ChrisW
like already said, you're trying to compare two different chips/architectures, so the result will not show what you want it to... this is not what CB has done. Reread the thread please.
That doesn't make any sense. What you are seeing is the visual difference between the two pictures. The allegation is that ATI is lowering visual quality for speed; if there is no visible difference, then where is the lower image quality? Besides, we all know the X800 is based on the same architecture as the 9800XT. What are your feelings about the other difference pictures? Are they also invalid?

Yes, the X800 is based on the 9800XT, but there are still some small differences that make it impossible to compare images from these two chips in this way.

You can take a shot with the bi filter on both chips and compare them with TheCompresator. You will find small differences all over the place.

I want to invite you to a little experiment. Take one of your games and make two shots with the same settings. The only thing you should change between the two shots is the texture filter: swap it from trilinear to bilinear. Compare the two shots with the same method you already used and tell us your conclusion.
Funny thing is, that is exactly what I did before editing my previous post. I came to the same conclusion as you.
 
ChrisW said:
EDIT: Maybe the difference picture is not a fair way to judge image quality differences after all. There must be some way to accurately judge the difference between the two trilinear methods. Anyone have any ideas?

If we have to think this hard to find the 'difference,' then maybe the change is irrelevant.
 
Bit comparisons are useless. The architectures don't need to be bit-equal; everyone is free to choose how to implement something bitwise (and not only in the filtering unit :D).

Simple difference images are useful. If you scale them up until you see a difference, you can tell whether there is one at all. If you don't scale them up but take the raw difference instead, you can see whether there is really any visible artefact at all.

There is one huge, visible artefact there, and that's the flame. And that's just because it's animated. For the rest of the image there is simply a filter running which looks equal to trilinear, and the difference is not visible to the eye (especially if you can't compare against the "real tri" mode).

People try to map this optimisation onto the brilinear escalation on NVIDIA's hardware, but by now there is about enough proof that it does not behave the same. Instead, it works.



And about corner cases: you can't possibly solve all corner cases, this is true; there are too many code paths. But you can, with 100% certainty, estimate how good or bad a value can be (determining the range in which the value should lie, and evaluating whether it can stay in there: that's simple statistics, and something done all the time in high-end rendering, raytracers, global-illumination solutions, etc.). ATI can implement such an estimator and, whenever it sees a possible issue, drop back to trilinear. That way, 100% of all corner cases are handled. It's a conservative solution, because in quite a few corner cases we could have the faster filtering and still would not see anything.

But it doesn't matter. Instead it guarantees one thing: there is no visible difference. All cases where there IS one ARE handled. The variance is calculable.
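The conservative-estimator idea above can be sketched roughly like this. This is purely hypothetical logic, not ATI's actual hardware: it bounds the worst-case colour error of a reduced ("brilinear"-style) blend weight against true trilinear, and takes the fast path only when that bound stays below a chosen visibility threshold.

```python
# Hypothetical sketch of a conservative fast-path test for mip blending.
# All names, cutoffs, and thresholds are illustrative assumptions.

def brilinear_weight(frac: float, cutoff: float = 0.25) -> float:
    """Snap blend fractions near 0 or 1 to pure bilinear; rescale the middle."""
    if frac <= cutoff:
        return 0.0
    if frac >= 1.0 - cutoff:
        return 1.0
    return (frac - cutoff) / (1.0 - 2.0 * cutoff)

def worst_case_error(frac: float, mip_contrast: float, cutoff: float = 0.25) -> float:
    """Upper bound on the colour error: the weight delta times the largest
    difference between adjacent mip levels (mip_contrast, in 0..255 counts)."""
    return abs(brilinear_weight(frac, cutoff) - frac) * mip_contrast

def use_fast_path(frac: float, mip_contrast: float, threshold: float = 2.0) -> bool:
    """Take the cheaper filter only when the bound stays below a 'just
    noticeable' threshold; otherwise fall back to full trilinear."""
    return worst_case_error(frac, mip_contrast) < threshold

print(use_fast_path(0.1, mip_contrast=4.0))    # low-contrast mips: fast path is safe
print(use_fast_path(0.2, mip_contrast=200.0))  # high contrast: fall back to trilinear
```

Because the test uses an upper bound rather than the actual on-screen error, it is conservative in exactly the sense described: some falls back to trilinear are unnecessary, but no visible case slips through.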
 
dksuiko said:
If we have to think this hard to find the 'difference,' then maybe the change is irrelevant.

to whom?
from an ethical/PR point of view it's not.
from a technical point of view it's not.
from a reviewer's point of view it's not.
from a developer's point of view it's not.
from a customer/gamer point of view it's not.

at least imo ;)
 
christoph said:
from an ethical/PR point of view it's not.
From a PR standpoint, 100% fine (they all lie). Ethically... it depends; theoretically, yes, it's not okay.
from a technical point of view it's not.
From this view it is fine. The image "differences" show the algorithm replaces the old way perfectly well (or falls back all the time), so it's fully "backward and forward compatible". That's okay.
from a reviewer's point of view it's not.
Well... more okay than all the fancy ways people now use to detect that it's not the same. It looks like some pages just do this to get a lot of clicks now. AHH, A WHITE PIXEL, THERE IS A DIFFERENCE.
from a developer's point of view it's not.
I don't care as a developer, because of the technical point above.
from a customer/gamer point of view it's not.
There is no gamer/customer who CAN see a difference by eye, and most of them won't even understand what trilinear means at all (or filtering, for god's sake :D). THEY don't bother. They get the same image; they are happy. It could be a played-back DVD; as long as it looks the same and feels the same, it is the same.

at least imo ;)
ditto :D
 
christoph said:
dksuiko said:
If we have to think this hard to find the 'difference,' then maybe the change is irrelevant.

to whom?
from an ethical/PR point of view it's not.
from a technical point of view it's not.
from a reviewer's point of view it's not.
from a developer's point of view it's not.
from a customer/gamer point of view it's not.

at least imo ;)

Hmm... Well, from a practical standpoint, I couldn't care less so long as it's not a significant degradation in quality (and in this case it's very insignificant). But since ATI chose not to tell anyone and presented this as full trilinear when it wasn't, I guess some criticism is warranted. In any case, I'm glad this whole situation was brought to light so we know what's really going on. That said, I don't think this little optimization/cheat, whatever you want to call it, deserves the whole witch hunt it's getting.

But we all draw our lines in different places, so I'll leave it at that.
 