ATi is Cheating in Filtering

From some of the prior linked discussion, ATI seems to have indicated that the changes result from some analysis of the textures. The role of texture stages in this is also left unclear (from my understanding of the article), and it isn't even clear to me whether texture stages above 0 are upgraded from bilinear to brilinear under control-panel AF (which would at least support the idea of a general-case analysis). As described, the difference method sounds like it would respond to the slightest change in any texture layer that has more than zero impact on the output color exactly as it would respond to that change in all texture layers. If so, a colored mip level could "erase" differences that might exist in other texture layers by pushing them into bits lost to rounding error (though perhaps not, or less often, if it were applied in a program like the "transparent" option in the D3D AF tester).

Given the difference to the 9800XT and the benchmark results, however, this seems irrelevant to whether the X800 is 1) getting a performance benefit from this occurring with Application Preference selected, or 2) producing something different; it is only relevant to removing doubt about what the differences are doing to image quality.

Where the article falls short, and what Wavey seems to be discussing, I think, is in evaluating the image-quality significance of the pixels marked by the difference methodology used. The problem with the difference method (as I understand it) is that the magnitude of the difference isn't represented at all. Some extra work could have clarified this fairly simply: for example, comparing known "all brilinear" output to "what we believe is brilinear" output with the same compare method. That would have ruled out a significant departure in methodology whose differences might be misrepresented by the compare method (things like fast trilinear and "gamma adjusted" edge sampling can achieve better or equivalent image quality while still being different). It would be nice if Wavey could get access to the tool to evaluate this.
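To illustrate the point about magnitude, here's a minimal sketch (not the tool the article used - just a hypothetical compare in plain Python) showing the difference between a binary "these pixels differ" mask and a per-pixel magnitude map. The images and values are made up for illustration:

```python
def compare(img_a, img_b, threshold=0):
    """Compare two RGB screenshots given as nested lists of (r, g, b)
    tuples. Returns a binary 'different' mask (what a simple compare
    tool reports) and a per-pixel magnitude map (what it omits): the
    largest channel difference at each pixel."""
    mask, magnitude = [], []
    for row_a, row_b in zip(img_a, img_b):
        mask_row, mag_row = [], []
        for pa, pb in zip(row_a, row_b):
            m = max(abs(ca - cb) for ca, cb in zip(pa, pb))
            mag_row.append(m)
            mask_row.append(m > threshold)
        mask.append(mask_row)
        magnitude.append(mag_row)
    return mask, magnitude

# Hypothetical 1x2 image pair: one pixel differs by 1 LSB, the other by 64.
a = [[(0, 0, 0), (0, 0, 0)]]
b = [[(1, 0, 0), (64, 0, 0)]]
mask, mag = compare(a, b)
# mask flags both pixels identically; mag shows one difference is
# negligible (1) and the other is large (64).
print(mask, mag)   # [[True, True]] [[1, 64]]
```

A binary mask makes a 1-LSB rounding difference look exactly as significant as a gross filtering change, which is the ambiguity being described.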

However, this does not make the article necessarily erroneous and misrepresentative, as some prior "cheating" articles have been, because the differences shown would have to reflect the significant methodological departure described above to be something other than what the article proposes. ATI hasn't explained any such departure, so the article seems quite reasonable, as does the conclusion of cheating. All it means is that, at the moment, we don't have the clarity we could have had.
 
Compressonator Screen Caps 2.8MB

OK, the above link shows some screen grabs of compressonator images - there are X800 XT, 9800 XT and a 5950 for comparison. In all cases bilinear is on the left, trilinear on the right and the middle shows a 400% brightness increase. App is our standard SS:SE texture test using the DX renderer.
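For anyone wondering what the "400% brightness increase" in the middle image does: it amplifies a near-black difference image so faint discrepancies become visible. A tiny hedged sketch of that step (simple per-channel gain with clamping - the actual Compressonator processing may differ):

```python
def amplify(pixel, gain=4):
    """Multiply each channel by `gain`, clamping to the 8-bit range.
    A 400% brightness increase (gain = 4) makes faint values in a
    near-black difference image visible to the eye."""
    return tuple(min(255, c * gain) for c in pixel)

# A subtle channel difference of 12 becomes an obvious 48;
# a large value of 200 clamps at 255.
print(amplify((12, 0, 200)))   # (48, 0, 255)
```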
 
DaveBaumann said:
Compressonator Screen Caps 2.8MB

OK, the above link shows some screen grabs of compressonator images - there are X800 XT, 9800 XT and a 5950 for comparison. In all cases bilinear is on the left, trilinear on the right and the middle shows a 400% brightness increase. App is our standard SS:SE texture test using the DX renderer.
Thx. You can clearly see that there are more black surfaces on the X800 than on the 9800.
 
My issue is that even though there are more black surfaces on the X800 in comparison to the 9800, the actual game images look pretty much the same, with the X800 even edging out the 9800 a bit. So do the optimizations actually help image quality, or am I blind :(?
 
gordon said:
My issue is that even though there are more black surfaces on the X800 in comparison to the 9800, the actual game images look pretty much the same, with the X800 even edging out the 9800 a bit. So do they actually help image quality?
It is not easy to notice the differences in a screenshot. You can notice them during gameplay; in motion the differences are far more noticeable.
 
gordon said:
My issue is that even though there are more black surfaces on the X800 in comparison to the 9800, the actual game images look pretty much the same, with the X800 even edging out the 9800 a bit.

You can't evaluate the differences from trilinear filtering well in still screenshots, even with the higher texture detail evident in current games. This is why nVidia's brilinear was so often misrepresented, and with so much distortion.

To "witness" a lack of difference, you need consistent behavior and a tool that highlights it, or an evaluation "in motion". To conclude "no difference" in the in-game image, you'd need a highly detailed video for the evaluation to be at all relevant to what trilinear is supposed to offer.
 
gordon said:
My issue is that even though there are more black surfaces on the X800 in comparison to the 9800, the actual game images look pretty much the same, with the X800 even edging out the 9800 a bit. So do the optimizations actually help image quality, or am I blind :(?

I do not know; it seems to depend on what you want to see. Because of the lack of interpolation with the lower-resolution mip from farther back, some areas can appear a little sharper than with correct trilinear AF - as was the case with the brilinear drivers on the GFFX - but that isn't the idea behind texture filtering, or you could just revert to point sampling. ;)
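The "lack of interpolation" point can be made concrete. Full trilinear blends linearly between two adjacent mip levels across the whole LOD range; brilinear only blends inside a narrow band around each mip transition and snaps to pure bilinear elsewhere, which saves texture samples but can leave banding and artificial "sharpness". A hedged sketch (the band half-width of 0.125 is an assumption, not a measured driver value; assumes non-negative LODs):

```python
def trilinear_weight(lod):
    """Full trilinear: blend factor toward the next (lower-resolution)
    mip level is simply the fractional part of the LOD."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.125):
    """'Brilinear' sketch: blend only inside a narrow band around the
    mip transition (hypothetical half-width `band`); elsewhere snap to
    pure bilinear from the nearer mip level."""
    f = lod - int(lod)
    if f < 0.5 - band:
        return 0.0                          # nearer mip only
    if f > 0.5 + band:
        return 1.0                          # next mip only
    return (f - (0.5 - band)) / (2 * band)  # remap band to 0..1 blend

# At LOD 2.25 trilinear already blends 25% of the next mip,
# while this brilinear sketch is still pure bilinear.
print(trilinear_weight(2.25), brilinear_weight(2.25))   # 0.25 0.0
```

The regions where the brilinear weight is pinned at 0.0 are exactly where textures look "sharper" than correct trilinear - the sharpness of the nearer mip with no softening from the blend.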
 
DaveBaumann said:
Compressonator Screen Caps 2.8MB

OK, the above link shows some screen grabs of compressonator images - there are X800 XT, 9800 XT and a 5950 for comparison. In all cases bilinear is on the left, trilinear on the right and the middle shows a 400% brightness increase. App is our standard SS:SE texture test using the DX renderer.

Interesting - the X800 seems to fall between the 5950 and the 9800XT. So is the X800 performing a more refined brilinear, or something else?
 
gordon said:
So do the optimizations actually help image quality or am I blind :(?

I think that most people misunderstand what the benefit of filtering is, that's all. It's definitely not as simple as comparing colored mipmaps or comparing the "crispness" of the image, as some reviewers seem to imply.

Now, if that proves true, it could seem like:
- a justification of the brilinear option nVidia opted for
- but then a strange question arises: why does ATI need it on the platform with the "fastest aniso" on earth? Did they think that comparing full trilinear against nVidia's brilinear would be an unfair comparison, but not think it would be OK to be fair and clear about it?
 
Is it just me, or were the 9800XT shots taken with 16-bit color depth or texture depth? From the Compressonator screen caps. :?:
 
DaveBaumann said:
Compressonator Screen Caps 2.8MB

OK, the above link shows some screen grabs of compressonator images - there are X800 XT, 9800 XT and a 5950 for comparison. In all cases bilinear is on the left, trilinear on the right and the middle shows a 400% brightness increase. App is our standard SS:SE texture test using the DX renderer.

Have you reached a conclusion yet?
 
DSC said:
Is it just me, or were the 9800XT shots taken with 16-bit color depth or texture depth? From the Compressonator screen caps. :?:

I doubt DaveB would make such a grand error in a comparison like this.
 
Well, there's another interesting point: NV40 is losing performance with colored mipmaps in UT2004 as well.

8x Aniso (set in control panel for both)

NV40U (Normal / Colored Mips)
1024x768: 103.81 / 103.77
1280x1024: 95.04 / 91.45
1600x1200: 74.99 / 71.13

R420XT
1024x768: 105.66 / 105.73
1280x1024: 105.07 / 105.13
1600x1200: 103.61 / 95.93

Keep in mind that the test is CPU-limited at the lower resolutions. I was using v60.72 for NV40, with TrilOpt on. So NV40 is losing performance even when using brilinear!
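Putting the 1600x1200 run (the least CPU-limited one) into relative terms - a quick worked calculation from the figures above:

```python
def pct_drop(normal, colored):
    """Percentage performance lost when colored mips are enabled."""
    return 100.0 * (normal - colored) / normal

# 1600x1200 figures from the table above.
nv40 = pct_drop(74.99, 71.13)    # NV40U, brilinear already active
r420 = pct_drop(103.61, 95.93)   # R420XT, nominally "full trilinear"
print(round(nv40, 1), round(r420, 1))   # 5.1 7.4
```

So with colored mips forcing real trilinear-style work, NV40 drops about 5.1% despite brilinear being on, and R420 drops about 7.4% - consistent with the normal path doing less filtering work than the colored-mip path.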

I asked a guy at Epic what they are doing when firstcoloredmip 1 is used:

all the engine does when you use colored miplevels is replace the color data. Texture format, dimension and all other properties remain unchanged so you shouldn't see a change in performance if everything is handled according to the specification.

You might want to talk to NVIDIA and ATI about this directly :)


Lars - THG
 
If someone answered this already, please forgive me - does this seem to occur using Unreal Tournament 2004 as well as 2003? If no one has checked, then could the same test be done against UT2004 for completeness' sake?
 
Quasar said:
Have you reached a conclusion yet?

It certainly looks fishy. I'm keen to hear ATI's response - is it:

a.) "Differences between how R420/RV3x0 handles mip map transitions in comparison to R3x0"
b.) "An error in the R420 driver path" - "It's new, we got the settings mixed up" / "It's confused between RV3x0's texturing abilities".
c.) "A bug" *cough*

As for the coloured mipmap performance differences - I do believe we've had an explanation of why that isn't necessarily a like-for-like test somewhere around here before.
 