Chat Transcript: ATI's texture filtering algorithms

radar1200gs said:
...
Oh really? Then try explaining why ATi recommends reviewers disable Brilinear filtering on nVidia products when benchmarking but says not a word about their own trilinear optimizations.

Very simply, if ATi's brilinear > nVidia's brilinear in terms of general IQ, such that you'd have to turn off nVidia's brilinear to get close to ATi's general standard IQ, including brilinear, then such advice would be very helpful in terms of providing a fair comparison.

The point to remember here is that just as nV40 does not = R420, Forcenator brilinear does not equal Catalyst brilinear. If both were equal then it would indeed not be fair to turn one off while keeping the other on, but since they aren't equal, any comparisons which assume equivalence between them are themselves invalid, since they assume non-existent equalities.

IE, it does not follow that IQ is equal when both are on, or both are off, between nV40 and R420, and their respective drivers.
 
Drak said:
MrGaribaldi said:
With nvidia you would only get brilinear at any given time. It would not do trilinear filtering on colour mipmaps or any other texture with mipmaps not generated from a box filter, whereas with ATI's trylinear you would.

Are you complaining that nvidia should do full trilinear filtering on colour mipmaps? The purpose of the colour mipmaps is to show us the filtering patterns and thankfully, when they're used on nvidia cards, everybody can see what the filtering pattern of nvidia's brilinear is like.
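
For anyone who hasn't seen such a test: here is a minimal sketch of why coloured mipmaps expose the filtering pattern, assuming each mip level is filled with its own solid colour. All of the colours, band widths and names below are invented purely for illustration and are not taken from any driver:

```python
# Illustrative sketch only: how coloured mipmaps expose the filtering pattern.
# Each mip level is a distinct solid colour, so the colour a pixel ends up with
# reveals which mip levels the filter blended, and in what ratio.

# Hypothetical "texture": one RGB colour per mip level (level 0 = largest).
MIP_COLOURS = [
    (255, 0, 0),    # level 0: red
    (0, 255, 0),    # level 1: green
    (0, 0, 255),    # level 2: blue
    (255, 255, 0),  # level 3: yellow
]

def trilinear_sample(lod):
    """Full trilinear: always blend the two nearest mip levels by the LOD fraction."""
    lo = min(int(lod), len(MIP_COLOURS) - 2)
    frac = lod - lo
    a, b = MIP_COLOURS[lo], MIP_COLOURS[lo + 1]
    return tuple(round(x * (1 - frac) + y * frac) for x, y in zip(a, b))

def brilinear_sample(lod, band=0.25):
    """'Brilinear' (reduced trilinear): blend only inside a narrow band around the
    mip transition; elsewhere use plain bilinear on the nearest level.
    The band width is a made-up parameter for illustration."""
    lo = min(int(lod), len(MIP_COLOURS) - 2)
    frac = lod - lo
    half = band / 2
    if frac < 0.5 - half:
        frac = 0.0                              # pure bilinear on the lower level
    elif frac > 0.5 + half:
        frac = 1.0                              # pure bilinear on the upper level
    else:
        frac = (frac - (0.5 - half)) / band     # compressed blend inside the band
    a, b = MIP_COLOURS[lo], MIP_COLOURS[lo + 1]
    return tuple(round(x * (1 - frac) + y * frac) for x, y in zip(a, b))

if __name__ == "__main__":
    for lod in (0.0, 0.3, 0.5, 0.7, 1.0):
        print(lod, trilinear_sample(lod), brilinear_sample(lod))
    # Trilinear shows a smooth red-to-green gradient; brilinear shows solid
    # colour bands with only a thin blended strip between them.
```

With real game textures the two can look nearly identical, which is exactly why the coloured test textures matter: they turn the LOD blend into something you can see directly.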

I think you'd better read my entire post to get the context, and not just that part alone, as you are misunderstanding what I'm trying to say...

My point is that to get a fair comparison when using coloured mipmaps one should compare nvidia's trilinear and ati's trylinear, as that would be comparing trilinear with trilinear. If you did not turn off nvidia's brilinear, nvidia would/could be judged to have a much worse trilinear implementation than ati. (Yes, this is to be fair with nvidia, not ati)

However, when testing in general I'd say it was up to the reviewer to compare the IQ of the different settings to get the best "apples-to-apples" comparison. And it is this that the rest of my previous post goes on to ask questions about...
 
lyme said:
MrGaribaldi said:
So the question becomes; what optimisations are equal and should be on during testing?

Using ATI's suggested testing methods, they would be allowed to use any/all optimizations while nvidia uses none. Thankfully for nVidia, they have kept their mouths shut after all the shit they caught last time.

which is where I think the reviewers' opinions come into play.
Instead of just disabling all of nvidia's optimisations and leaving ati's in play, they should do an IQ comparison and show the reader why they feel the chosen settings give a fair comparison between the two cards.
(I.e. nvidia with brilinear on, and ati's trylinear halfway between quality and performance; no idea if that would yield comparable IQ, it's just an example.)

lyme said:
MrGaribaldi said:
Can you really say that trylinear and brilinear give the same result and should both be on during testing, or is trylinear so close to trilinear that it's fairer to compare trylinear with trilinear?

No in both cases. If you're going to call it a full trilinear vs full trilinear test, you had better be using full trilinear on both. In addition, if ATI tells you it's full trilinear, you expect full trilinear and not a close facsimile.

Agreed!
Which is why I think (along with a lot of others) that ati should've labeled it as something else. Whether they should leave legacy trilinear as a setting is another question...

lyme said:
MrGaribaldi said:
Also, what if you're interested in testing each card's trilinear?
By using textures with coloured mipmaps on ATI hardware you get full-/legacy-trilinear, and as such, wasn't it correct for ATI to tell reviewers to turn nvidia's brilinear off, so one could compare their respective trilinear filtering? Otherwise the difference in quality in that test would be extreme, since ati would be doing full-/legacy-trilinear whereas nvidia would've only done brilinear.

True, however I don't know how much of a difference in quality there would be comparing trilinear vs brilinear, nor how nvidia would perform in that comparison. I don't remember seeing any reviews using brilinear.

Me neither. Which is part of what I feel is becoming a problem with shoot-outs lately. The "find comparable IQ settings" has taken a back-seat to "just set everything to 2x and it's OK", with certain notable exceptions.
(But I've been saying this quite a lot lately, so I'll spare you for yet another iteration of it)

lyme said:
In addition, no matter how many people tell me the quality of trylinear is great and wonderful, if it's not trilinear you can't call it that. It seems a lot of people are trying to say 'it looks close, so who cares'. Well, we went through that with nvidia last time and it didn't fly, so why should it now?

Agreed.



lyme said:
MrGaribaldi said:
Also, how close does an optimisation need to be to the original result before it is judged to be a valid way of rendering that effect?
Or how close does an optimisation need to be to a "correct" level of IQ before it's judged valid?

I'm quite curious about this, since it hasn't been discussed that much (/at all?)

Personally I don't own a video card capable of running games decently with either AA or AF on, so I don't see the big deal. It is generally the case that if you don't notice a problem during gameplay then it's not an issue.
However, since the release of the nv3x, ATI-fans have been jumping all over nvidia, sometimes over the smallest nit-picky things. I'm not one bit surprised that when ATI pulls something, nvidia-fans do the exact same thing to them in return.

It seems we agree on most points. And at least this issue has given nvidia fans something other than the "Quack" incident to talk about... ;)

And if anyone else would like to share their thoughts about these questions, it'd be interesting :)
 
MrGaribaldi said:
I think you'd better read my entire post to get the context, and not just that part alone, as you are misunderstanding what I'm trying to say...

I was just asking for a clarification on the colour mipmaps.

MrGaribaldi said:
My point is that to get a fair comparison when using coloured mipmaps one should compare nvidia's trilinear and ati's trylinear, as that would be comparing trilinear with trilinear. If you did not turn off nvidia's brilinear, nvidia would/could be judged to have a much worse trilinear implementation than ati. (Yes, this is to be fair with nvidia, not ati)

Colour mipmaps are usually used to compare filtering patterns and IQ. Why can't the reviewer just put all the filtering patterns up and leave it to us to compare? It's up to us to judge the filtering pattern of nvidia's brilinear, trilinear and ati's trilinear, but we cannot see ati's trylinear because the coloured mipmaps make it revert to full trilinear, which breaks the test.
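
Nobody outside ATI has published how the driver decides when to drop back to full trilinear, but the kind of check people speculate about might look roughly like the sketch below: compare each mip level against a box-filtered downscale of the level above it, and fall back to full trilinear when they differ too much, which is exactly what coloured test mipmaps would trigger. Every function name, threshold and value here is hypothetical:

```python
# Hypothetical sketch of the kind of mipmap check speculated about in this thread;
# ATI never published the actual heuristic, so every detail here is made up.

def box_downscale(level):
    """Box-filter a mip level (a 2D list of greyscale values) down by 2x."""
    h, w = len(level) // 2, len(level[0]) // 2
    return [[(level[2*y][2*x] + level[2*y][2*x+1] +
              level[2*y+1][2*x] + level[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

def looks_box_filtered(upper, lower, tolerance=8.0):
    """True if `lower` is close to a box-filtered downscale of `upper`."""
    expected = box_downscale(upper)
    diffs = [abs(a - b) for row_e, row_l in zip(expected, lower)
             for a, b in zip(row_e, row_l)]
    return sum(diffs) / len(diffs) <= tolerance

def choose_filtering(mip_chain):
    """If every level matches a box-filtered chain, the reduced ('trylinear') path
    is assumed safe; otherwise (e.g. coloured test mipmaps) fall back to full
    trilinear, which is why the coloured-mipmap test cannot show the reduced path."""
    for upper, lower in zip(mip_chain, mip_chain[1:]):
        if not looks_box_filtered(upper, lower):
            return "full trilinear"
    return "reduced (trylinear)"

if __name__ == "__main__":
    base = [[(x * 16 + y) % 256 for x in range(4)] for y in range(4)]
    normal_chain = [base, box_downscale(base), box_downscale(box_downscale(base))]
    coloured_chain = [[[0]*4]*4, [[128]*2]*2, [[255]]]   # distinct flat "colours"
    print(choose_filtering(normal_chain))     # reduced (trylinear)
    print(choose_filtering(coloured_chain))   # full trilinear
```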

MrGaribaldi said:
However, when testing in general I'd say it was up to the reviewer to compare the IQ of the different settings to get the best "apples-to-apples" comparison. And it is this that the rest of my previous post goes on to ask questions about...

The reviewer can give his opinion of what's "apples-to-apples", but I think it's much better if he lumps the brilinear, trilinear and trylinear results into one chart and leaves it up to the readers to pick the filtering methods they prefer.
 
WaltC said:
Very simply, if ATi's brilinear > nVidia's brilinear in terms of general IQ, such that you'd have to turn off nVidia's brilinear to get close to ATi's general standard IQ, including brilinear, then such advice would be very helpful in terms of providing a fair comparison.

IF ATI brilinear > nV brilinear and ATI trilinear < nV trilinear, then wouldn't it seem a bit unfair to remove nVs brilinear and leave ATi brilinear in place, when even ATis trilinear is inferior to nVs trilinear?
 
Quasar said:
IF ATI brilinear > nV brilinear and ATI trilinear < nV trilinear, then wouldn't it seem a bit unfair to remove nVs brilinear and leave ATi brilinear in place, when even ATis trilinear is inferior to nVs trilinear?
I think this might actually have made sense... but I'm not so sure. o_O Maybe it'll help if we all remove the "linear" part from everything--it's just confusing me now! :p
 
Quasar said:
WaltC said:
Very simply, if ATi's brilinear > nVidia's brilinear in terms of general IQ, such that you'd have to turn off nVidia's brilinear to get close to ATi's general standard IQ, including brilinear, then such advice would be very helpful in terms of providing a fair comparison.

IF ATI brilinear > nV brilinear and ATI trilinear < nV trilinear, then wouldn't it seem a bit unfair to remove nVs brilinear and leave ATi brilinear in place, when even ATis trilinear is inferior to nVs trilinear?

Actually, ati's try is about equal to nvidia's tri. It's especially a good choice to call it "try", because it tries to cheat, and if it sees the cheat would become visible, it stops cheating. That's what bri can't do, and because of that, bri < try & tri.

But try = tri, because if it isn't, it is tri (ouch, that reminds me of: 1) I'm always right, 2) if 1) is not true, 1) becomes true).
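
Nobody outside ATi knows the real criterion, but "stops cheating when it would get visible" could, in spirit, be a per-sample check like the sketch below. The threshold and names are invented for illustration and do not describe the actual hardware:

```python
# Pure speculation: one way an "adaptive" trilinear could decide, per sample,
# whether the reduced blend would be visible. Nothing here reflects ATI's
# actual hardware; thresholds and names are invented for illustration.

def adaptive_blend(lower_texel, upper_texel, lod_frac, visible_threshold=4.0):
    """Blend two mip-level texels (greyscale floats) for a given LOD fraction.

    If the two levels are nearly the same at this sample, skipping the blend is
    invisible, so take the cheap path (nearest level only). If they differ
    enough that banding would show, do the full trilinear blend ("stop cheating")."""
    if abs(upper_texel - lower_texel) < visible_threshold:
        # Cheap path: the viewer couldn't tell the difference anyway.
        return lower_texel if lod_frac < 0.5 else upper_texel
    # Visible difference: fall back to the full trilinear interpolation.
    return lower_texel * (1.0 - lod_frac) + upper_texel * lod_frac

if __name__ == "__main__":
    print(adaptive_blend(100.0, 102.0, 0.3))   # 100.0  (cheap path, invisible)
    print(adaptive_blend(100.0, 200.0, 0.3))   # 130.0  (full blend, would band)
```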
 
WaltC said:
radar1200gs said:
...
Oh really? Then try explaining why ATi recommends reviewers disable Brilinear filtering on nVidia products when benchmarking but says not a word about their own trilinear optimizations.

Very simply, if ATi's brilinear > nVidia's brilinear in terms of general IQ, such that you'd have to turn off nVidia's brilinear to get close to ATi's general standard IQ, including brilinear, then such advice would be very helpful in terms of providing a fair comparison.

The point to remember here is that just as nV40 does not = R420, Forcenator brilinear does not equal Catalyst brilinear. If both were equal then it would indeed not be fair to turn one off while keeping the other on, but since they aren't equal, any comparisons which assume equivalence between them are themselves invalid, since they assume non-existent equalities.

IE, it does not follow that IQ is equal when both are on, or both are off, between nV40 and R420, and their respective drivers.

:rolleyes: Oh boy, how did I know you would totally misinterpret what I wrote? :rolleyes:

let me spell it out for you.

ATi tells reviewers that they are using full trilinear and encourages reviewers to disable nVidia's brilinear filtering to "level the playing field". This is a proven fact. Go check out the Tech Report article.
http://techreport.com/etc/2004q2/filtering/index.x?pg=1

You CAN compare optimized vs optimized,
You CAN compare unoptimized vs unoptimized,
You CANNOT compare optimized vs unoptimized.

And yes, nVidia was just as guilty as ATi with NV3x for forcing on optimizations and providing no mechanism for users to turn them off.

I don't care who you are, don't do it!

 
radar1200gs said:
You CAN compare optimized vs optimized,

If the optimisations are in the same ball park :)

Though since ATI doesn't give you an option to remove those optimisation I'm happy to let ppl do NV vs ATI optimised benchies.
 
bloodbob said:
radar1200gs said:
You CAN compare optimized vs optimized,

If the optimisations are in the same ball park :)

Though since ATI doesn't give you an option to remove those optimisation I'm happy to let ppl do NV vs ATI optimised benchies.

It doesn't matter if the optimizations are in the same ballpark or not. All that matters is that they are optimizations.

Obviously some optimizations may prove better than others. That is what reviews should attempt to show.

 