Trilinear Filtering Comparison (R420 vs R360 vs NV40 vs NV38)

Hellbinder, you have got to be at your absolute most hypocritical right now.

You continually created a huge ruckus about NVidia's brilinear, but now that ATI is doing the same thing, you're defending them vehemently. I always thought NVidia's FX filtering method was a pretty good way of saving performance with relatively little IQ cost, but you were raising hell then. Now you're saying ATI isn't affecting image quality. If you want me to quote you a few times, I will (or someone else will, I'm sure).

Take a look at this post from Lars, and ATI's benchmarking comments:
By default, one of our competitors does not offer Trilinear filtering, even when it’s requested by the application. By default, it uses an enhanced Bilinear technique commonly referred to as “brilinear” filtering. This can be disabled, after several mouse-clicks, from within their control panel.

ATI is doing the same filtering, but they're asking for it to be turned off on NVidia cards! Completely unfair, and very deceitful. Now, it is entirely possible that ATI's benchmarking guide was written by people not aware of this, or that they put this comment in the guide before the R420 was introduced and forgot to remove it. Either way, several sites, like Tech-Report and HardOCP, disabled trilinear optimisations on the NV40 when comparing to the R420.

ATI should have a box in their driver with three options for trilinear filtering optimization: "Always on", "Always off", and "Auto".
 
trinibwoy said:
Outside of the COD screenshots is there any evidence which points to reduced filtering quality of the R420 vs the previous generation ATI products?

Exactly. Bring on some more screenshots made by someone more careful than the guys at digit-life ;)

Oh, and a short video capture would be nice too (weren't there videos of GFFX brilinear in the past?).
 
Malfunction said:
Quote:

Coloured mipmaps naturally show full trilinear - statement

our image quality analysis reveals that there could be visible differences in the image. - issue

It should be noted that trilinear was originally invented to smooth transitions between mip-levels, and in the original definition mip-levels should be filtered versions of each other, as coloured mip-levels clearly are not. - statement

Despite this, we understand that people often make use of hardware for purposes beyond that originally envisioned, so we try to make sure that everything always functions exactly as expected. - issue

:?

What's your point? There could be visible differences with other video cards, too. Yet all video cards show visual differences from the reference rasterizer. If a certain percentage of difference is allowed there, shouldn't that extend to this?

Not only that, but they say could. They also ask in their release that people report any issues.



How is the other one an issue? They are making sure that everything always functions as expected. To me that is not an issue; that is a good thing.

I like it when things work as expected. You know, like when my car starts. When my shower works. All that is good and is working as expected.
 
ATI should have a box in their driver with three options for trilinear filtering optimization: "Always on", "Always off", and "Auto".

The only reason to have those three options would be that "Always off" would allow benchmark bullcrap... If the level of IQ is the same between "Always on" and "Auto", why have the option? What would be the business reason, or advantage to the consumer?
 
Stryyder said:
The only reason to have those three options would be that "Always off" would allow benchmark bullcrap... If the level of IQ is the same between "Always on" and "Auto", why have the option? What would be the business reason, or advantage to the consumer?

I think that's an issue here that certain people are glossing over. Nobody except ATI knows whether the level of IQ is the same between an R420 with pure TRI and an R420 with optimized TRI. And we never will, since ATI refuses to expose the capability. So we cannot make any assumptions that this optimization is the same as pure TRI on an R420. So please stop saying this as if it is gospel. If you want to say that optimized TRI on R420 is better than anything else out there... that's something else entirely. ;)
 
trinibwoy said:
Stryyder said:
The only reason to have those three options would be that "Always off" would allow benchmark bullcrap... If the level of IQ is the same between "Always on" and "Auto", why have the option? What would be the business reason, or advantage to the consumer?

I think that's an issue here that certain people are glossing over. Nobody except ATI knows whether the level of IQ is the same between an R420 with pure TRI and an R420 with optimized TRI. And we never will, since ATI refuses to expose the capability. So we cannot make any assumptions that this optimization is the same as pure TRI on an R420. So please stop saying this as if it is gospel. If you want to say that optimized TRI on R420 is better than anything else out there... that's something else entirely. ;)

Until someone proves otherwise, I will continue to say it, especially since all the previews of the X800 show it has IQ at the same or possibly higher levels than the previous-generation product and the competing NV40.
 
trinibwoy said:
I may have missed it but have we determined the reason for filtering differences between R360 and R420 or have such differences even been confirmed? Outside of the COD screenshots is there any evidence which points to reduced filtering quality of the R420 vs the previous generation ATI products?

Good points; this is indeed a case of much ado about nothing, and I think you've pretty much nailed the situation.
 
Stryyder said:
Until someone proves otherwise, I will continue to say it, especially since all the previews of the X800 show it has IQ at the same or possibly higher levels than the previous-generation product and the competing NV40.

Stryyder has the point in his sights....steadies his aim....HE MISSES!!!!
 
trinibwoy said:
Stryyder said:
The only reason to have those three options would be that "Always off" would allow benchmark bullcrap... If the level of IQ is the same between "Always on" and "Auto", why have the option? What would be the business reason, or advantage to the consumer?

I think that's an issue here that certain people are glossing over. Nobody except ATI knows whether the level of IQ is the same between an R420 with pure TRI and an R420 with optimized TRI. And we never will, since ATI refuses to expose the capability. So we cannot make any assumptions that this optimization is the same as pure TRI on an R420. So please stop saying this as if it is gospel. If you want to say that optimized TRI on R420 is better than anything else out there... that's something else entirely. ;)

But in the end, isn't that what's important?

The best IQ and the fastest?
 
trinibwoy said:
I think that's an issue here that certain people are glossing over. Nobody except ATI knows whether the level of IQ is the same between an R420 with pure TRI and an R420 with optimized TRI. And we never will, since ATI refuses to expose the capability. So we cannot make any assumptions that this optimization is the same as pure TRI on an R420. So please stop saying this as if it is gospel.
The man (or "bwoy" in this case ;) ) has a very legitimate point and I for one am not going to gloss over or dismiss it.

I think that's the problem I'm having in getting my head around this whole trilinear-dealy-thingy: I haven't really seen any screenshots that show the difference to me.

I'm not saying there is no difference or that there is, just that I haven't really seen it yet and until we can do comparisons there just isn't any way to really tell. :(
 
Until someone proves otherwise, I will continue to say it, especially since all the previews of the X800 show it has IQ at the same or possibly higher levels than the previous-generation product and the competing NV40.

We wouldn't expect anything less from you.
 
jvd said:
But in the end, isn't that what's important... the best IQ and the fastest?

Yes, that's what's most important in the context of a consumer purchase. I was addressing ATI's claim that no IQ was sacrificed. How are we to know that an R420 without optimizations wouldn't give even better IQ?
 
pino said:
...
What's this supposed to mean? I really don't get it... can someone translate, please? :?

My extremely non-technical interpretation of what he's trying to say:

Coloured mipmaps naturally show full trilinear as our image quality analysis reveals that there could be visible differences in the image.

There could be visible differences when using full trilinear versus "smart/optimized/less-than-full trilinear", so the driver defaults to full trilinear.

It should be noted that trilinear was originally invented to smooth transitions between mip-levels, and in the original definition mip-levels should be filtered versions of each other

So, in this case, there would be no visible difference when using "smart/optimized/less-than-full trilinear"

as coloured mip-levels clearly are not

And so the driver uses full trilinear ;).

Despite this, we understand that people often make use of hardware for purposes beyond that originally envisioned, so we try to make sure that everything always functions exactly as expected.

:?: No clue here...


Cheers, Joey.
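For reference, the difference between full trilinear and "brilinear" can be sketched in a few lines. This is a toy model, not any vendor's implementation: `sample_mip` stands in for a bilinear fetch from a single mip level, and the `band` width is an arbitrary illustrative value.

```python
def trilinear(sample_mip, lod):
    """Full trilinear: always blend the two nearest mip levels
    by the fractional part of the LOD."""
    base = int(lod)
    frac = lod - base
    return (1.0 - frac) * sample_mip(base) + frac * sample_mip(base + 1)

def brilinear(sample_mip, lod, band=0.25):
    """'Brilinear': only blend inside a narrow band around the mip
    transition; elsewhere use plain bilinear from one level.
    The band width of 0.25 is a made-up number for illustration."""
    base = int(lod)
    frac = lod - base
    if frac < 0.5 - band:
        return sample_mip(base)        # pure bilinear from the lower mip
    if frac > 0.5 + band:
        return sample_mip(base + 1)    # pure bilinear from the upper mip
    # remap the band to a 0..1 blend weight
    w = (frac - (0.5 - band)) / (2.0 * band)
    return (1.0 - w) * sample_mip(base) + w * sample_mip(base + 1)
```

The performance win comes from the two early-out branches: over most of the LOD range only one mip level is fetched, and the visible cost is a harder mip transition, which is exactly what shows up under coloured mipmaps.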
 
digitalwanderer said:
trinibwoy said:
I think that's an issue here that certain people are glossing over. Nobody except ATI knows whether the level of IQ is the same between an R420 with pure TRI and an R420 with optimized TRI. And we never will, since ATI refuses to expose the capability. So we cannot make any assumptions that this optimization is the same as pure TRI on an R420. So please stop saying this as if it is gospel.
The man (or "bwoy" in this case ;) ) has a very legitimate point and I for one am not going to gloss over or dismiss it.

I think that's the problem I'm having in getting my head around this whole trilinear-dealy-thingy: I haven't really seen any screenshots that show the difference to me.

I'm not saying there is no difference or that there is, just that I haven't really seen it yet and until we can do comparisons there just isn't any way to really tell. :(

Good point, but my point is that comparing X800 method 1 to X800 method 2 is meaningless. What really matters is comparing the X800 against the NV40 and the R3xx series; if it shows improvement over the R3xx and is on par with or better than the NV40, this is a null point.
 
trinibwoy said:
jvd said:
But in the end, isn't that what's important... the best IQ and the fastest?

Yes, that's what's most important in the context of a consumer purchase. I was addressing ATI's claim that no IQ was sacrificed. How are we to know that an R420 without optimizations wouldn't give even better IQ?

Does it matter? What if it's the same? What if it's so great that nothing comes close to it? What then? Do we now test this uber trilinear, which looks so great, against the NV40's trilinear? Would that be a fair test?
 
jvd said:
Malfunction said:
Quote:

Coloured mipmaps naturally show full trilinear - statement

our image quality analysis reveals that there could be visible differences in the image. - issue

It should be noted that trilinear was originally invented to smooth transitions between mip-levels, and in the original definition mip-levels should be filtered versions of each other, as coloured mip-levels clearly are not. - statement

Despite this, we understand that people often make use of hardware for purposes beyond that originally envisioned, so we try to make sure that everything always functions exactly as expected. - issue

:?

What's your point? There could be visible differences with other video cards, too. Yet all video cards show visual differences from the reference rasterizer. If a certain percentage of difference is allowed there, shouldn't that extend to this?

Not only that, but they say could. They also ask in their release that people report any issues.



How is the other one an issue? They are making sure that everything always functions as expected. To me that is not an issue; that is a good thing.

I like it when things work as expected. You know, like when my car starts. When my shower works. All that is good and is working as expected.

anonymouscoward: Image quality is a relative term. The real question is, "does the claimed 'trilinear filtering' produce a byte-for-byte replica of 'true trilinear filtering'?" Whether or not the image quality is "the same" or "essentially" the same is irrelevant to this question.

Andy/Raja: Byte for byte compared to what? "True trilinear" is an approximation of what would be the correct filtering, a blending between two versions, one which is too blurry and one too sharp. An improved filter is not byte for byte identical to anything other than itself, but that doesn't mean it isn't a better approximation.

There are "apparently" visible differences between their own products, let alone when you get another IHV involved.

You're not gonna have filtering that appeals to everyone. Some will spot the usefulness in other areas, and so on.

Function as expected... according to what? Reviewers, analysts, and enthusiasts were all expecting full trilinear filtering. ATi hadn't mentioned that they may not be producing it at all times.

You are celebrating their enthusiasm about working with people who discover problems, yet you are not happy hearing about the problems being discovered or investigated. It is going to help them resolve the issue eventually, but how is anyone supposed to react when they see something that wasn't even discussed at launch?

It's like asking someone to identify a problem with an Armegus. You gotta know WTF an Armegus is and what it is supposed to do before you can report a problem with it doing what it was meant to do.
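One way to read the adaptive scheme being debated here (a sketch, not ATI's actual algorithm): if the driver decides between full trilinear and a cheaper blend by measuring how similar adjacent mip levels are, coloured debug mipmaps would always fail the similarity test and therefore always show full trilinear. The function name, the metric, and the threshold below are all invented for illustration.

```python
def mips_similar(upper, lower, threshold=8):
    """Crude similarity test: compare each texel of the smaller mip
    against the average of the 2-texel block it would have been
    filtered from (1-D texture for brevity). 'threshold' is a
    made-up tolerance in 8-bit colour units."""
    for i in range(len(lower)):
        block = upper[2 * i : 2 * i + 2]
        avg = sum(block) / len(block)
        if abs(avg - lower[i]) > threshold:
            return False  # levels are unrelated: use full trilinear
    return True           # levels look like a real mip chain: optimize

# A normal mip chain: the lower level is the box-filtered upper level
print(mips_similar([10, 20, 30, 40], [15, 35]))    # True

# Coloured debug mips: levels are unrelated flat colours
print(mips_similar([255, 255, 255, 255], [0, 0]))  # False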
 
Stryyder said:
Good point but my point is that comparing x800 method 1 to x800 method 2 is meaningless what really matters is comparing x800 against the NV40 and the R3xx series if it shows improvement over the R3xx and is on par or better than the NV40 this is a null point.

Whoa!!! :LOL: You're changing the rules on the fly, man. Let's assume we have R420A (optimized) and R420B (regular). Now we don't know if R420B has better IQ than R420A, cause ATI is being stingy. But let's assume that R420B has better IQ, since I don't feel like taking ATI's word for it. You would prefer R420A over R420B if A has an extra 2-5% advantage over NV40? Why does this comparison have to be made in the context of this discussion at all?
 
Stryyder said:
Good point, but my point is that comparing X800 method 1 to X800 method 2 is meaningless. What really matters is comparing the X800 against the NV40 and the R3xx series; if it shows improvement over the R3xx and is on par with or better than the NV40, this is a null point.

You're right and wrong all at the same time. :p

Sure, if an X800 is compared against a 9800 and 6800 and comes out on top filtering wise, this is great and that card gains the IQ crown in that discipline. But - Are you still getting the best possible quality for that particular board? Comparing it to other boards isn't going to answer that question (although it can help us make suppositions), only having the full gamut of filtering options will allow us to see for ourselves.

Besides which, what if ATi decided to introduce this new filtering on R3x0 boards (which we only assume isn't possible right now) - Then what do we compare to?
 
DemoCoder said:
If we discount this driver optimization, does this mean, hardware-wise, the XT is losing to the 6800U in texture fillrate benchmarks? Has anyone adjusted their benchmarks or run them with colored mips? If so, this would be surprising given the extra memory bandwidth and fillrate of the XT.
ATi (or at least OpenGL Guy, who isn't really speaking for ATi when he's posting here ;)) has already claimed that their memory controller could use some tweaking, so perhaps immediate fillrate testing won't give us a glimpse beyond how optimized their drivers are?

Remember that the XT's extra bandwidth amounts to 20MHz above a 6800U, so that seems like a non-issue. The XT's tremendous theoretical fillrate does seem to be getting lost somewhere relative to the 6800U, though--provided the 6800U isn't optimizing for texture fillrate tests (if it's possible--yes, I'm still suspicious of nV, and now somewhat of ATi).
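The "20 MHz" point can be put in rough numbers. The specs below (520/1120 MHz for the X800 XT, 400/1100 MHz for the 6800 Ultra, 16 texture units and 256-bit buses on both) are the commonly cited launch figures as recalled from period reviews, so treat them as approximate back-of-the-envelope inputs.

```python
def fillrate_mtexels(core_mhz, texture_units):
    """Theoretical texture fillrate: one texel per unit per clock."""
    return core_mhz * texture_units

def bandwidth_gbs(mem_mhz_effective, bus_bits):
    """Theoretical memory bandwidth in GB/s."""
    return mem_mhz_effective * (bus_bits / 8) / 1000.0

# X800 XT: 520 MHz core, 16 TMUs, 1120 MHz effective GDDR3, 256-bit
xt_fill = fillrate_mtexels(520, 16)   # 8320 Mtexels/s
xt_bw   = bandwidth_gbs(1120, 256)    # ~35.8 GB/s

# 6800 Ultra: 400 MHz core, 16 TMUs, 1100 MHz effective, 256-bit
nv_fill = fillrate_mtexels(400, 16)   # 6400 Mtexels/s
nv_bw   = bandwidth_gbs(1100, 256)    # ~35.2 GB/s
```

So the bandwidth gap is well under 2%, while the theoretical fillrate gap is around 30%, which is why a fillrate deficit in colored-mip tests would be the surprising number, not the bandwidth.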
 