ATi is ch**t**g in Filtering

DrawPrim said:
Your mip filter would be LESS aggressive because the textures are vastly different between mip levels.
Hmmmm... I don't know whether the chip is able to detect the differences between different mipmap levels. :?
 
Colourless said:
What I think is, yes, they are doing some 'brilinear' trick, and on textures where it would show, they are doing standard trilinear. Now, one thing I'm curious about is what exactly the threshold between using it and not using it is. Quite possibly it's not purely intended to hide what's going on with coloured mip levels, but could also be intended to be used with any texture that has mip levels that are quite different. We need some sort of filtering test application that lets you set a blending level for the coloured mip levels, to see if you can notice the change.
I remember having read somewhere that either NVidia or ATI implemented such a dynamic "what does the texture look like? how different are the mip maps? what filtering do the textures need?" solution. But I might be wrong; it was quite some time ago.

Anyway, the idea is certainly not bad. But as with every optimization it should at least be officially documented - and even better would be a switch with which we could turn it off. Adding such a thing secretly and without an option to turn it off leaves quite a bad taste in my mouth.
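To make the idea concrete: such a heuristic could, in principle, compare each mip level against a box-filtered reduction of the level above it and fall back to full trilinear whenever adjacent levels differ too much. This is strictly a sketch of the concept; the threshold, the box-filter comparison, and both function names are my own assumptions, not anything either vendor has documented:

```python
import numpy as np

def mip_difference_score(upper, lower):
    """Mean absolute difference between a mip level and a 2x2
    box-filtered reduction of the level above it. Hypothetical
    driver heuristic, purely for illustration."""
    h, w = lower.shape[:2]
    # Box-filter the upper level down to the lower level's resolution.
    boxed = upper.reshape(h, 2, w, 2, -1).mean(axis=(1, 3))
    return float(np.abs(boxed - lower).mean())

def choose_filter(mip_chain, threshold=0.05):
    """Use 'brilinear' only when adjacent mip levels look like ordinary
    box-filtered reductions of each other; anything unusual (such as
    coloured mip levels) falls back to full trilinear."""
    for upper, lower in zip(mip_chain, mip_chain[1:]):
        if mip_difference_score(upper, lower) > threshold:
            return "trilinear"
    return "brilinear"

# Example: solid coloured mip levels differ wildly, forcing trilinear.
red = np.full((4, 4, 3), (1.0, 0.0, 0.0))
green = np.full((2, 2, 3), (0.0, 1.0, 0.0))
print(choose_filter([red, green]))  # trilinear
```

A check like this would run once per texture upload, so its cost could be hidden in the driver; that may be why nobody noticed until coloured mip levels made the behaviour visible.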
 
Colourless said:
... but could also be intended to be used with any texture that has mip levels that are quite different.

Hmm. I suppose it would be easy to check. Just make the colour mipmaps more similar. Or, instead of 'solid' colour levels, use different random levels of colour; e.g. instead of using solid red, use random levels of red, etc. That may fool any colour mip map detection code into not using trilinear.

Rob.
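Rob's randomised-colour idea is easy to prototype offline. A minimal sketch (pure illustration; the function name, texture sizes, and noise level are my own choices, and nothing here reflects any driver internals):

```python
import numpy as np

def make_coloured_mip_levels(base_size=256, levels=4, noise=0.3, seed=0):
    """Build one tinted texture per mip level. Each level keeps its
    identifying hue (red, green, blue, ...) but is modulated by
    per-texel random brightness, so no level is a constant colour."""
    rng = np.random.default_rng(seed)
    tints = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0)]
    mips, size = [], base_size
    for level in range(levels):
        tint = np.asarray(tints[level % len(tints)], dtype=np.float32)
        # Brightness varies in [1 - noise, 1], so a "solid colour"
        # detector that compares texels against each other should fail.
        brightness = 1.0 - noise * rng.random((size, size, 1), dtype=np.float32)
        mips.append(brightness * tint)
        size //= 2
    return mips
```

If the optimisation really is keyed to solid-colour mip levels, these textures should still make the mip transitions visible while triggering the reduced filtering.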
 
Exxtreme said:
DrawPrim said:
Your mip filter would be LESS aggressive because the textures are vastly different between mip levels.
Hmmmm... i don't know whether the chip is able to detect the differences between different mipmap levels. :?

No, but a driver sure could.
 
DrawPrim said:
No, but a driver sure could.
Hmm, there is no reason to detect the differences between the mipmaps; the differences are usually small. But if you want to detect whether someone is using coloured mipmaps, then this detection is very useful.
 
madshi said:
I remember having read somewhere that either NVidia or ATI implemented such a dynamic "what does the texture look like? how different are the mip maps? what filtering do the textures need?" solution. ...... Anyway, the idea is certainly not bad.
It's certainly very bad. Those decisions must be made by the application developer or content developer, or by the respective development tools.
 
no_way said:
madshi said:
I remember having read somewhere that either NVidia or ATI implemented such a dynamic "what does the texture look like? how different are the mip maps? what filtering do the textures need?" solution. ...... Anyway, the idea is certainly not bad.
It's certainly very bad. Those decisions must be made by the application developer or content developer, or by the respective development tools.
That is false.

Most "application developers" don't even care about anisotropic or other filtering methods, or even have controls for them in their software. It is rare that developers even think down that road, which is why Nvidia and ATi have control panel settings for AF in the first place.

Now, in the case where the application is trying to use a specific filtering method in a predetermined way, then yes, the IHV should leave it alone - again, *IF* it is set to "application" in the control panel.

However, I must reiterate that the idea that developers are sitting out there laboring over filtering levels - anisotropic, trilinear, bilinear, whatever - is just false.
 
Colourless said:
Quite possibly it's not purely intended to hide what's going on with coloured mip levels, but could also be intended to be used with any texture that has mip levels that are quite different.

My preliminary understanding is that this is probably closest to the case. One may want to "think historical" as well.
 
L233 said:
AlphaWolf said:
L233 said:
Whether someone personally perceives IQ to be noticeably lower or not is largely inconsequential since most comparisons to the GF6800U have been done with brilinear disabled on the GF6800U.

Well if you look earlier in this thread, Lars from THG states that when color mip maps are used performance drops in UT2004 on the 6800U also.

Yep, but nowhere near 20%... and that's the point exactly.

Sorry if someone mentioned it already - I have quite a few pages to catch up on, since I was forced to work all day - but I disagree.

I disagree because it doesn't matter how successfully you cheat (note that this is not necessarily related to the current issue!); what matters is the fact that you cheat at all.

Same thing some time ago, when nV did its 3DMark thingy and ATi was also accused of doing some "application-specific" shader replacement (IIRC they (ATi) admitted it, which I respect very much). nV gained about 20-30% (don't remember now) and ATi a meager 2%.

Both were doing the same thing morally - but with differences in success.
 
PatrickL said:
I am also not very comfortable with that article being published during a weekend, when they say they will send a PDF and ask ATI later. I can understand if you ask, wait 24 h for an answer, then publish whatever the answer is. But publishing before asking, as if the website were unable to make mistakes, does not sound totally right.

Perfectly alright - that is your opinion.
But we had published a news item one week ago regarding exactly the same thing (colour mips did "cripple" performance on a R9600XT in CoD) - but we weren't calling it anything, just asking what was going on.

No reaction.

We just expanded this matter and tried to find some proof for or against our suspicion.
 
Quasar said:
Same thing some time ago, when nV did its 3DMark thingy and ATi was also accused of doing some "application-specific" shader replacement (IIRC they (ATi) admitted it, which I respect very much). nV gained about 20-30% (don't remember now) and ATi a meager 2%.

Both were doing the same thing morally - but with differences in success.

Not necessarily - you are not taking into account exactly what they were replacing. For instance: in both of these cases, were they giving a 100% mathematical equivalent of the original shader?
 
DaveBaumann said:
Colourless said:
Quite possibly it's not purely intended to hide what's going on with coloured mip levels, but could also be intended to be used with any texture that has mip levels that are quite different.

My preliminary understanding is that this is probably closest to the case. One may want to "think historical" as well.
Can I quote you on that Dave? :|

;)




Sorry Dave, I just keep feeling bad for you whenever I see you quoted and causing a ruckus in spite of yourself. :LOL:

My sympathies - take it as a compliment to your ability and knowledge, though, 'cause it is. 8)
 
I received some additional info from Epic:

xxx @ NVIDIA just pointed out that filling textures with a constant color/alpha might cause more pixels to be written to the framebuffer (passing the alpha test), which could affect performance. I totally forgot about that one, and I should probably change the code to leave the alpha channel alone for our next-generation engine :)

This would explain why there's always a performance hit, even with full bilinear, and on NV40 as well.

Lars - THG
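Epic's point is easy to demonstrate numerically: an alpha-tested texture (foliage, fences) normally discards most of its pixels, but a constant alpha of 1.0 lets everything through to the framebuffer. A toy calculation - the 30% coverage and the greater-than-0.5 test are assumptions purely for illustration, not UT2004's actual settings:

```python
import numpy as np

def alpha_test_pass_rate(alpha, alpha_ref=0.5):
    """Fraction of texels passing a GREATER-style alpha test, i.e.
    the pixels that actually get written to the framebuffer."""
    return float((alpha > alpha_ref).mean())

rng = np.random.default_rng(0)
# Foliage-style texture: roughly 30% of texels opaque (assumed coverage).
foliage = (rng.random((256, 256)) < 0.3).astype(np.float32)
# Coloured-mip replacement that fills alpha with a constant 1.0:
solid = np.ones((256, 256), dtype=np.float32)

print(alpha_test_pass_rate(solid))    # 1.0 - every pixel gets written
print(alpha_test_pass_rate(foliage))  # roughly 0.3
```

More pixels passing the test means more fill and blending work, which would slow coloured-mip runs down on any card, independent of the filtering question.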
 
Quasar said:
1. But we had published a news item one week ago regarding exactly the same thing (colour mips did "cripple" performance on a R9600XT in CoD) - but we weren't calling it anything, just asking what was going on.

2. No reaction.

3. We just expanded this matter and tried to find some proof for or against our suspicion.

1. Did you contact ATI directly, or just post a news item and hope someone would alert them to it?

2. Do you realize that ATI was represented at E3 this past week? It is therefore very likely that someone in a position of knowledge might not even have been around to answer.

3. I think releasing something of this magnitude on a weekend is not in the best interests of anyone involved as it seems to have caused a great deal of commotion/speculation that might have been avoided if you had waited for an appropriate response. Then, if the response from ATI supports your news item, THEN paste it all over the internet if that is your intent. If I were with ATI, I would almost feel like I had been "set up" simply from a timing standpoint....
 
They should have explained their stand on the situation from the very beginning instead of waiting, as they have done, for the rumours to spread.
 
Evildeus said:
Yes, of course; the question was more like, why did you make those comparisons? But I realise that's a bit dumb - if you found some differences, of course you would investigate more and more to see what's going on. Sorry for asking something so obvious :oops:

There are no dumb questions - only silly answers.
Initially what struck me were benchmarks where both (X800 and nV40) were pretty close in normal mode, with a huge lead for nV40 with AA and an even bigger advantage for ATi with AF on. With both turned on, the lead was on ATi's side, despite the popular belief that both were mainly bandwidth limited.
Then there was the PCGH article where they measured a 20% increase of trilinear fillrate over its theoretical peak on the X800 (reproducible on both RV350/360 and R200), and of course the ColorMip thingy with CoD, which we noticed one week ago in our news item.

Then we started to dig.
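For reference, that 20% figure is roughly consistent with a partial-trilinear model. Assuming trilinear costs exactly two bilinear lookups per pixel and ignoring cache effects (a back-of-envelope simplification of mine, not PCGH's methodology):

```python
# Trilinear needs two bilinear lookups per pixel, so the theoretical
# peak trilinear fillrate is half the bilinear rate. Measuring ~20%
# above that peak suggests some fraction f of pixels received only
# one (bilinear) lookup, giving an effective rate of 1 / (2 - f)
# bilinear-rate units.
bilinear_rate = 1.0                 # normalised
trilinear_peak = bilinear_rate / 2  # 0.5
measured = 1.2 * trilinear_peak     # 20% above "theoretical peak"
f = 2 - bilinear_rate / measured    # solve 1 / (2 - f) = measured
print(round(f, 2))  # 0.33 -> about a third of lookups bilinear-only
```

In other words, a fillrate 20% over the trilinear peak already implies that roughly one pixel in three is not being filtered trilinearly under this model.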
 