ATI Filter tricks at 3Dcenter

Hellbinder said:
Supersampling or not, with both cards set to "max", the Radeon card looks better at *least* 90% of the time. I think I am being generous here. Better AF, with deeper penetration into the frame.
I think you can find a lot of people who think NVidia 8xAF in combination with supersampling looks better than ATI 16xAF. I do, at least if there's no "brilinear" optimization.
 
andypski said:
If I recall correctly, I have seen examples of applications that, when setting up anisotropic filtering, only set the MIN filter to anisotropic while leaving the MAG filter as Linear. When this happens you will get significant aliasing at the point where the hardware switches from minification to magnification. It seems that some hardware always sets the MAG filter to anisotropic when the MIN filter is anisotropic.

It might be that applications that do this are the source of some reports of aliasing.

Interesting. I noticed texture shimmer with the Tiger Woods 2003 demo as well, IIRC, but only on textures at certain angles, and nowhere near as prevalent as in BF1942.
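For illustration only, here is a minimal Direct3D 9 sketch of the setup pattern andypski describes. The device pointer, sampler stage and 8x degree are placeholders, not taken from any particular application:

Code:
#include <d3d9.h>

void SetupFilteringForStage0(IDirect3DDevice9* device)  // 'device' assumed to be valid
{
    // Problematic pattern: only the MIN filter is made anisotropic while the
    // MAG filter stays plain linear. Where the hardware crosses from
    // minification to magnification the filtering quality changes abruptly,
    // which can show up as aliasing/shimmering.
    device->SetSamplerState(0, D3DSAMP_MINFILTER,     D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER,     D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER,     D3DTEXF_LINEAR);  // trilinear blend between mip levels
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);

    // Matching pattern: also request anisotropic magnification (supported where
    // the caps report D3DPTFILTERCAPS_MAGFANISOTROPIC) - effectively what
    // hardware that "always sets MAG to anisotropic" does for you anyway.
    device->SetSamplerState(0, D3DSAMP_MAGFILTER,     D3DTEXF_ANISOTROPIC);
}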

---

Saw this Quadro FX 1100 PR at NVNews:
• Industry-leading 12 bits of subpixel precision helps to ensure high geometric accuracy and the elimination of sparkles, cracks, and other rasterization anomalies.
I wonder if this particular item was instigated by the 3DC article, or if nV has touted this feature before? Sorry for my lazy intellect WRT this topic, but is this even the same issue investigated by the 3DC article in question (as I noticed the nV hardware there offered 8-bit precision)? Is it possible for nV to offer 8-bit on the consumer FX line, and up that to 12-bit on their Quadro FX line?
 
HB,

"I am a little baffled by the "Max IQ" not worth the difference in performance comment"

When you compare the max IQ of Nvidia to ATI, and then compare the performance both offer at max IQ, it is quite clear that the performance lost by nVidia compared to ATi is not worth the minuscule gain in IQ that you get with the Nvidia card.

Sorry, I should have made that a little clearer.

Calnoth,

Please do us all a favor and actually read about what the R3X0 can do before you even dare to tell me what it cannot do... THANKS FOR YOUR COOPERATION.

ATi DOES INDEED offer trilinear on ALL texture stages if an application asks for it. Unlike your lovechild Nvidia, ATi does what the application asks instead of doing whatever it wants to increase performance.

Nvidia's NV3x has already shown us what its DX9 performance is like BEFORE NVIDIA GETS A CHANCE TO CHEAT AND SUBSTITUTE CODE...

3DMark accurately predicted the NV3x's lackluster performance long before any DX9 titles came out, and then TRAOD and HL2 just confirmed it.....

Why is it that ATI also has a DX9 runtime compiler, yet it did not break when Futuremark patched their benchmark?
 
Xmas said:
Hellbinder said:
Supersampling or not, with both cards set to "max", the Radeon card looks better at *least* 90% of the time. I think I am being generous here. Better AF, with deeper penetration into the frame.
I think you can find a lot of people who think NVidia 8xAF in combination with supersampling looks better than ATI 16xAF. I do, at least if there's no "brilinear" optimization.
Actually, I find that pretty hard to believe, especially if people were put on the spot in a public forum and forced to be honest about it.

Point one.

Ati's AF at 16x has much deeper coverage than Nvidia's (notwithstanding the rare cases where the angle issue comes to bear).

Point Two

Even Nvidia's highest AA mode displays more jagged edges on horizontal and near-horizontal angles than Ati's 4x FSAA mode.

Point three

Nvidia cuts corners on their AF in application mode even on primary angles, resulting at times in noticeable banding.

No more points.

Those are the facts and the truth. People can complain all day about Ati's adaptive AF, *BUT* it never affects the primary, most noticeable angles, whereas what Nvidia is doing DOES, even with their latest drivers. And no "application detection" to override the AF tester is going to disprove what anyone can see for themselves.
 
Very subjective view:

I also think nVidia's anisotropy combined with super-sampled AA offers a cleaner texture (while moving) than ATI's anisotropy. I enjoy the option of a mixed AA sample and anisotropy from nVidia in a few titles, but it still doesn't come close to matching the overall IQ of ATI's 16x anisotropy and sparse-grid multi-sampling in most titles.
 
I remember when nVIDIA's NV20 came out brand new and there were no optimisations whatsoever on AF. You could enable 8xAF and trilinear filtering and get a near-perfect picture, similar to how my R300 looks under 16xAF brilinear [Quality].

You'd get about the exact same IQ; the only difference is that Ati's 16xAF brilinear pushes mipmap boundaries further, IIRC.
In games like HL, NV's unoptimised 8xAF + trilinear was enough.
 
Pete said:
Saw this Quadro FX 1100 PR at NVNews:
• Industry-leading 12 bits of subpixel precision helps to ensure high geometric accuracy and the elimination of sparkles, cracks, and other rasterization anomalies.
I wonder if this particular item was instigated by the 3DC article, or if nV has touted this feature before? Sorry for my lazy intellect WRT this topic, but is this even the same issue investigated by the 3DC article in question (as I noticed the nV hardware there offered 8-bit precision)? Is it possible for nV to offer 8-bit on the consumer FX line, and up that to 12-bit on their Quadro FX line?
That's a different topic. Subpixel precision is the precision used when rasterizing a triangle and generating the interpolation values.
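Roughly speaking, n bits of subpixel precision means vertex screen positions are carried in fixed point with n fractional bits, i.e. snapped to a 1/2^n-pixel grid before edge setup and interpolation. A tiny illustrative sketch (the bit counts below are arbitrary examples, not vendor figures):

Code:
#include <cmath>
#include <cstdio>

// Quantize a screen-space coordinate to 'bits' of subpixel precision.
// Purely illustrative - real rasterizers work in fixed point internally.
double snap(double coord, int bits)
{
    const double steps = double(1 << bits);   // subpixel steps per pixel
    return std::floor(coord * steps + 0.5) / steps;
}

int main()
{
    const double x = 100.123456;  // an arbitrary vertex position, in pixels
    std::printf("4-bit  subpixel grid: %.6f\n", snap(x, 4));   // steps of 1/16 pixel
    std::printf("12-bit subpixel grid: %.6f\n", snap(x, 12));  // steps of 1/4096 pixel
    return 0;
}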


Hellbinder said:
Point one.

Ati's AF at 16x has much deeper coverage than Nvidia's (notwithstanding the rare cases where the angle issue comes to bear).
The rare cases? Are you kidding?

Point Two
Even Nvidia's highest AA mode displays more jagged edges on horizontal and near-horizontal angles than Ati's 4x FSAA mode.
We were talking about AF. At least I was.

Point three

Nvidia cuts corners on their AF in application mode even on primary angles, resulting at times in noticeable banding.
As I said, 'if there's no "brilinear" optimization'.
 
dan2097 said:
I think you're wrong. Application preference does what the application asks for on ATI cards, and if the program requests trilinear AF on everything, it gets it on everything. That "trilinear only on the first texture stage" thing only happens with Quality AF (Performance AF does bilinear on the first texture stage as well).
Hrm, sorry, I was very certain that it did this even when I set the anisotropic degree within the D3DAFTester program that I used to discern this. This doesn't seem to be the case right now, so this may have been a recent change in the drivers.

But still, forcing anisotropic, which is the only way to get anisotropic in most games, forces the use of bilinear (even if trilinear filtering is selected by the program) for all texture stages but the base texture.
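To make concrete what "the application asks for trilinear AF on every stage" means, here is a hedged Direct3D 9 sketch; the device pointer, stage count and 16x degree are placeholders. The texture-stage optimization being discussed amounts to the driver behaving, on stages above 0, as if the mip filter had been dropped to point (i.e. bilinear) despite these requests:

Code:
#include <d3d9.h>

void RequestTrilinearAFOnAllStages(IDirect3DDevice9* device)  // 'device' assumed valid
{
    for (DWORD stage = 0; stage < 4; ++stage)  // 4 stages purely as an example
    {
        device->SetSamplerState(stage, D3DSAMP_MINFILTER,     D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER,     D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER,     D3DTEXF_LINEAR);  // linear blend between mips = trilinear
        device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, 16);
    }
    // The "only stage 0 gets trilinear" behaviour described above is as if the
    // driver silently used D3DTEXF_POINT for D3DSAMP_MIPFILTER on stages > 0.
}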
 
Chalnoth said:
dan2097 said:
I think you're wrong. Application preference does what the application asks for on ATI cards, and if the program requests trilinear AF on everything, it gets it on everything. That "trilinear only on the first texture stage" thing only happens with Quality AF (Performance AF does bilinear on the first texture stage as well).
Hrm, sorry, I was very certain that it did this even when I set the anisotropic degree within the D3DAFTester program that I used to discern this. This doesn't seem to be the case right now, so this may have been a recent change in the drivers.
The drivers never behaved as you claim in "Application Preference". Likely you had the control panel set to "Quality" or "Performance" aniso.
 
YeuEmMaiMai said:
ATi DOES INDEED offer trilinear on ALL texture stages if an application asks for it. Unlike your lovechild Nvidia, ATi does what the application asks instead of doing whatever it wants to increase performance.

If you use application preference on gfFX cards there is no texture stage optimization either with 52.xx and up.
 
StealthHawk said:
YeuEmMaiMai said:
ATi DOES INDEED offer trilinear on ALL texture stages if an application asks for it. Unlike your lovechild Nvidia, ATi does what the application asks instead of doing whatever it wants to increase performance.

If you use application preference on gfFX cards there is no texture stage optimization either with 52.xx and up.

That's odd - I could have sworn 51.XX and up forced the pseudo-trilinear filter on all Direct3D applications.... (apart from the AF tester app that was app detected to work correctly).
 
Read Stealthy's Insane Filtering Analysis Again.

When the driver is set to let the application decide what AF to use, there will be no texture stage optimizations by default. You can see that in all three Intellisample modes the quality of each texture stage is the same. The trilinear filtering done in Quality mode is much worse than the IQ we are used to seeing from Quality mode. In fact, the IQ in 52.13's Quality mode is only slightly better than that of the optimized "ut2003" filtering, as will be illustrated better later. Performance and High Performance are identical to previous Performance and High Performance modes; there are no changes in these settings.
Seems to be talking about AF rather than just brilinear.
 
jimbob0i0 said:
StealthHawk said:
YeuEmMaiMai said:
ATi DOES INDEED offer trilinear on ALL texture stages if an application asks for it. Unlike your lovechild Nvidia, ATi does what the application asks instead of doing whatever it wants to increase performance.

If you use application preference on gfFX cards there is no texture stage optimization either with 52.xx and up.

That's odd - I could have sworn 51.XX and up forced the pseudo-trilinear filter on all Direct3D applications.... (apart from the AF tester app that was app detected to work correctly).

Texture stage optimizations and brilinear filtering are two different things.

In D3D, you are completely right. Brilinear is the best filtering quality you can get.

The AF tester is not being detected. It shows brilinear. And it shows texture stage optimizations when AF is forced in the control panel. If you take a screenshot in UT2003 with AF set to application, it will have much better IQ (depending on the scene) than when AF is forced in the control panel and texture stage optimizations are present.

The most interesting thing is that something is going on with colored mipmaps, so that when AF is forced in the control panel the colored mipmaps show "full quality" and look the same as when texture stage optimizations are not present. In other words, whether AF is set to application or forced on, the colored mipmaps look the same. Whether this is due to NVIDIA detecting UT2003's colored mipmaps or due to a change in filtering in the 52.xx drivers is a lingering question. It seems extremely silly for NVIDIA to take the time to detect the colored mipmaps though, since you can just compare the actual IQ on screen with colored mipmaps off and tell the difference easily :)
 
StealthHawk said:
The most interesting thing is that something is going on with colored mipmaps, so that when AF is forced in the control panel the colored mipmaps show "full quality" and look the same as when texture stage optimizations are not present. In other words, whether AF is set to application or forced on, the colored mipmaps look the same. Whether this is due to NVIDIA detecting UT2003's colored mipmaps or due to a change in filtering in the 52.xx drivers is a lingering question. It seems extremely silly for NVIDIA to take the time to detect the colored mipmaps though, since you can just compare the actual IQ on screen with colored mipmaps off and tell the difference easily :)

Yep.. check this video. The question is what happened under the colored mipmaps....

ut2003_aniso1.avi 3.1MB - 800x600 - xvid
 
StealthHawk said:
The most interesting thing is that something is going on with colored mipmaps, so that when AF is forced in the control panel the colored mipmaps show "full quality" and look the same as when texture stage optimizations are not present. In other words, whether AF is set to application or forced on, the colored mipmaps look the same. Whether this is due to NVIDIA detecting UT2003's colored mipmaps or due to a change in filtering in the 52.xx drivers is a lingering question. It seems extremely silly for NVIDIA to take the time to detect the colored mipmaps though, since you can just compare the actual IQ on screen with colored mipmaps off and tell the difference easily :)
It's more likely that nVidia simply does not do anything to the LOD of the base texture (which would make sense). Colored mipmaps usually only color the mipmaps of the base texture.
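For anyone unfamiliar with the debug trick being discussed: "colored mipmaps" just means filling each mip level of a texture with a distinct solid color, so the colors visible on screen reveal which mip levels (and which transitions between them) the hardware actually samples. A minimal Direct3D 9 sketch, assuming a lockable D3DFMT_A8R8G8B8 texture (function and names are illustrative, not from UT2003's own implementation):

Code:
#include <d3d9.h>

// Fill every mip level of 'tex' with a distinct solid color so that mip
// transitions become visible on screen. Assumes an A8R8G8B8, lockable
// (e.g. managed) texture - a sketch, not production code.
void ColorMipmaps(IDirect3DTexture9* tex)
{
    static const D3DCOLOR colors[] = {
        D3DCOLOR_XRGB(255, 0, 0),   D3DCOLOR_XRGB(0, 255, 0),
        D3DCOLOR_XRGB(0, 0, 255),   D3DCOLOR_XRGB(255, 255, 0),
        D3DCOLOR_XRGB(255, 0, 255), D3DCOLOR_XRGB(0, 255, 255),
    };
    const DWORD numColors = sizeof(colors) / sizeof(colors[0]);

    for (DWORD level = 0; level < tex->GetLevelCount(); ++level)
    {
        D3DSURFACE_DESC desc;
        tex->GetLevelDesc(level, &desc);

        D3DLOCKED_RECT lr;
        if (FAILED(tex->LockRect(level, &lr, NULL, 0)))
            continue;

        const D3DCOLOR color = colors[level % numColors];
        for (UINT y = 0; y < desc.Height; ++y)
        {
            D3DCOLOR* row = (D3DCOLOR*)((BYTE*)lr.pBits + y * lr.Pitch);
            for (UINT x = 0; x < desc.Width; ++x)
                row[x] = color;   // one flat color per mip level
        }
        tex->UnlockRect(level);
    }
}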
 
Here is my small analysis of the angle dependency of ATi's anisotropic filtering. You might find it amusing :D

http://www.testiryhma.net/art_anisovertailu.html

The article is in Finnish, but I guess the images are pretty understandable. (Though bear in mind that I use animated .gifs there, so the limited colour count affects the mip colors and that should be discounted. More colorful .jpg images are linked at the end.)

My main point here is pretty much this image:
kolmiocolor.gif



I find it disturbing I get only 4x aniso for walls when I want 16x aniso :(
 
Well, goddamn three and two thirds aniso for walls then... doesn't make it any less disturbing though :D
 
Mendel said:
Well, goddamn three and two thirds aniso for walls then... doesn't make it any less disturbing though :D
no, but honestly, please find me many triangular hallways in FPS games....
A square/rectangular hallway would be a much more common experience.
 