ATI Filter tricks at 3Dcenter

Chalnoth said:
Point one:
Aliasing is always much more visible in motion. If you disagree with this, you have no idea what aliasing is. Granted, not all types of motion exaggerate aliasing, but my beef with aliasing has always been that occasional situation where it looks really bad (But with the R300, I've had games where if I forget to turn Anisotropic filtering on, there's aliasing all over the place, not just in a few specific areas).

Point two:
With insufficient "LOD fraction" between MIP levels, there will always be more shimmering in motion on the R3xx. The inaccuracies will simply manifest themselves as shimmering when in motion. Hard to explain, easy to visualize. First a pixel will be too bright. The texture moves slightly. Now it's too dim. That's shimmering for you.
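To make the mechanism Chalnoth is describing concrete, here is a minimal, purely illustrative C++ sketch (not tied to any vendor's actual hardware; the 3-bit fraction and sample values are made up): trilinear filtering lerps between two adjacent mip levels using a fractional LOD term, and if that fraction is stored with only a few bits, the blended result jumps in steps as the true LOD drifts instead of changing smoothly. In motion, those jumps would read as the "too bright, then too dim" flicker described above.

[code]
// Illustrative only: compare a full-precision trilinear lerp against one whose
// LOD fraction has been rounded to a small number of bits.
#include <cmath>
#include <cstdio>

float trilinear(float texelLower, float texelUpper, float lodFrac)
{
    // Blend the samples from the two mip levels by the fractional LOD.
    return texelLower + (texelUpper - texelLower) * lodFrac;
}

float quantize(float lodFrac, int bits)
{
    // Round the fraction to an N-bit fixed-point value in [0, 1].
    const float steps = float((1 << bits) - 1);
    return std::floor(lodFrac * steps + 0.5f) / steps;
}

int main()
{
    const float lower = 0.75f, upper = 0.25f;              // samples from adjacent mip levels
    for (float frac = 0.0f; frac <= 1.0f; frac += 0.05f)   // texture slowly drifting
    {
        const float ideal  = trilinear(lower, upper, frac);
        const float coarse = trilinear(lower, upper, quantize(frac, 3));
        std::printf("lod %.2f  ideal %.3f  quantized %.3f  error %+.3f\n",
                    frac, ideal, coarse, coarse - ideal);
    }
    return 0;
}
[/code]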
Point one: I wasn't disagreeing with this, thanks (until the part where you began injecting your personal and extremely biased experiences).
Point two:
You didn't read what I said, did you? Restating over and over that "now it makes sense, this causes shimmering" DOESN'T MAKE IT SO.
PROVE IT.
 
I don't understand why people think this issue is "anti" anyone. All HW design is about tradeoffs -- making the right ones. I was rummaging through some boxes the other day and ran across my old C64 and Apple II manuals. I'm still amazed at some of the tricks and tradeoffs employed by the designers of these systems to save costs.

Purists love zooming in on pixels and complaining about rendering artifacts in filtering, AF, AA, etc. But from ATI's point of view, there are marginal returns on IQ improvement, and those transistors could be spent elsewhere where they would have far more visual impact. In the future, when they have an increased transistor budget, and when 8-bit lerps are an even smaller fraction of the total transistors, they'll probably toss them in. But for now, they were probably trying to squeeze as much as possible onto .15um, and features like PS2.0 were simply more important than 8-bit lerps.
 
Althornin said:
You didn't read what I said, did you? Restating over and over that "now it makes sense, this causes shimmering" DOESN'T MAKE IT SO.
PROVE IT.
Yes, I did read it. You said I didn't know what I was talking about. Now you say I do. Interesting.

And, as I said, the impact of this is primarily going to be visible only in motion. If you want proof of the deficiencies of ATI's LOD fraction, you have to go to a synthetic scenario.

So, the best way to see this texture aliasing is to point out a place in-game. I believe I've stated many situations in the past where I've seen much more aliasing on the R300. A couple off the top of my head:

1. Everquest: When anisotropic is off, there is aliasing all over the place (which I notice because I sometimes forget to turn aniso back on after playing, say, Diablo II).
2. NWN: Some of the more regular textures exhibit far more aliasing on the R300 than they did on my Ti 4200 (with anisotropic, this time). One example is the stone texture used on many bridges.
 
I saw a huge amount of texture shimmer in the BF1942 demo with my Radeon 9100 at 10x7. I think an easy way to verify one card exhibits greater texture shimmer would be to walk around some of the wide, uniformly textured hills in the demo with an NV25, R(V)3x0, and NV3x.

I can't remember if the onboard GF2MX on my single-channel nF1 exhibited texture shimmer, but then I was running the demo at minimal detail at 640x480 just to get something approaching a playable framerate. My 9100 was at 10x7 with mostly high detail, IIRC.
 
If I recall correctly, I have seen examples of applications that, when setting up anisotropic filtering, only set the MIN filter to anisotropic while leaving the MAG filter as Linear. When this happens you will get significant aliasing at the point where the hardware switches from minification to magnification. It seems that some hardware always sets the MAG filter to anisotropic when the MIN filter is anisotropic.

It might be that applications that do this are the source of some reports of aliasing.
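For reference, a rough Direct3D 9 sketch of the setup being described (the device pointer and helper function are hypothetical, and whether anisotropic magnification is accepted depends on the device caps):

[code]
// Hypothetical helper: the problematic pattern is anisotropic minification with
// linear magnification; some hardware behaves as if MAG were anisotropic anyway.
#include <d3d9.h>

void SetupAnisoSampler(IDirect3DDevice9* dev, DWORD stage, DWORD maxAniso)
{
    // The pattern that can alias at the minification/magnification crossover:
    dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);

    // What an application could request explicitly instead, if the device
    // reports support for anisotropic magnification:
    dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_ANISOTROPIC);

    dev->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);   // trilinear between mips
    dev->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, maxAniso);
}
[/code]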
 
andypski said:
If I recall correctly, I have seen examples of applications that, when setting up anisotropic filtering, only set the MIN filter to anisotropic while leaving the MAG filter as Linear. When this happens you will get significant aliasing at the point where the hardware switches from minification to magnification. It seems that some hardware always sets the MAG filter to anisotropic when the MIN filter is anisotropic.

It might be that applications that do this are the source of some reports of aliasing.
This is certainly a problem with TRAOD.
 
Chalnoth said:
Yes, I did read it. You said I didn't know what I was talking about. Now you say I do. Interesting.
ummm...WTF ??? What are you going on about? I still think you are clueless :?
And, as I said, the impact of this is primarily going to be visible only in motion. If you want proof of the deficiencies of ATI's LOD fraction, you have to go to a synthetic scenario.

So, the best way to see this texture aliasing is to point out a place in-game. I believe I've stated many situations in the past where I've seen much more aliasing on the R300. A couple off the top of my head:

1. Everquest: When anisotropic is off, there is aliasing all over the place (which I notice because I sometimes forget to turn aniso back on after playing, say, Diablo II).
2. NWN: Some of the more regular textures exhibit far more aliasing on the R300 than they did on my Ti 4200 (with anisotropic, this time). One example is the stone texture used on many bridges.
and the point goes whizzing far over your head.
You have yet to prove/show/demonstrate that "insufficient 'LOD fraction' between MIP levels" is the cause of any of this.
Again, please respond with more than your typical anti-ATI conjecture.
All of this goes back to this statement of yours (which you'd know if you read posts properly):
I had just always thought it had to do with the use of aggressive texture LOD, but this lack of precision for texture filtering makes more sense.
You jump onto an idea with nothing but conjecture behind it! I repeatedly ask for more than that, and get repeated ramblings about how "it just makes sense." PROVE IT.
 
From my own experience, the Radeon 9700 Pro exhibits more texture aliasing than my GeForce4 Ti 4200 did.

What more do you want?
 
Some screenshots and analysis to back up that claim?
That's what he means by PROVE IT.
I think you did post a few screenshots a while back, but I got some serious aliasing in GP4 when I used a Ti 4200 as well. Do you remember that?
 
Chalnoth said:
From my own experience, the Radeon 9700 Pro exhibits more texture aliasing than my GeForce4 Ti 4200 did.

What more do you want?
You have a problem (aliasing). What's the cause? That's what Althornin is asking. So far you've done nothing to show that any of these effects are causing aliasing. Does the RefRast show texture aliasing? If so, then maybe you're onto something. If not, then you're wrong. There are other ways to isolate the problem as well.

-FUDie
 
This thread definitely is entertaining....Chalnoth, why not just continue to use that overrated thing they call a 5950U and be happy? For the rest of you in the "ATI cut corners and sucks" crowd....consider this:

If you really want to know who made the best design decisions, just take a look at the IQ vs. performance ratio of each card.

ATi's offering is 2 to 4x faster than the NV3x when you enable max IQ, and the difference in IQ does NOT justify the difference in performance.

Funny how you conveniently overlook all of nVidia's shortcomings in their current hardware:

FP32: sure, they can do it, but it definitely is a slideshow.
PS 2.0: sure, they can do it, but it definitely is a slideshow.
FSAA: why is it that nVidia's 4x looks like ATi's 2x?
ATi's 6x MSAA looks better than nVidia's 8X SSAA (not to mention ATi is way faster at it).
Vertex power: ATi is the CLEAR and UNDISPUTED leader here as well.
Design process: ATi's .15um chips outperform and run cooler than nVidia's .13um models, thus allowing ATi to have a quieter cooling solution.
Trilinear filtering: at least ATi offers the user a choice....whereas nVidia is so slow that they disabled it and substituted an inferior implementation.

ATi has stated that if the application has the option built in and requests trilinear, it will get it; if the app does not have the option, then you must use the control panel to get it.

Nvidia has basically said screw you....

The funniest thing, though, is that running nVidia's Dawn demo on ATi's hardware through a wrapper is FASTER than running it natively on nVidia's hardware, all the while with ATi using HIGHER precision.

If you want to argue about the tradeoffs ATi made and how nVidia does it better, then you need to look at the WHOLE picture and see what nVidia traded off on and what ATi does better.

Most importantly, whose DX9 hardware can actually run DX9 games without having to wait for the driver team to hand-code shaders? Answer: ATi....nVidia will not be able to hand-tune every game that comes out.......
 
YeuEmMaiMai said:
This thread definitely is entertaining....Chalnoth, why not just continue to use that overrated thing they call a 5950U and be happy?
Um, because it's too expensive? Anyway, I am considering going back to nVidia soon...but I'd really like to try to wait for the NV40 to be released.

ATi's offering is 2 to 4x faster than the NV3x when you enable max IQ, and the difference in IQ does NOT justify the difference in performance.
Nice comparison, when ATI doesn't offer supersampling.

FP32: sure, they can do it, but it definitely is a slideshow.
32-bit FP is not intended for use throughout an entire shader, and using it where it is necessary will definitely not turn games into a slideshow.

PS 2.0: sure, they can do it, but it definitely is a slideshow.
Um, no.

FSAA: why is it that nVidia's 4x looks like ATi's 2x?
Yes, the AA of the NV3x is disappointing.

Vertex power: ATi is the CLEAR and UNDISPUTED leader here as well. Design process: ATi's .15um chips outperform and run cooler than nVidia's .13um models
Cooler? It runs at a much lower clockspeed.

thus allowing ATi to have a quieter cooling solution
Trilinear filtering: at least ATi offers the user a choice....whereas nVidia is so slow that they disabled it and substituted an inferior implementation.
ATI disabled trilinear filtering on all but the first texture stage.

ATi has stated that if the application has the option built in and requests trilinear, it will get it
Except it doesn't, not for anything but the first texture stage.

Most importantly, whose DX9 hardware can actually run DX9 games without having to wait for the driver team to hand-code shaders? Answer: ATi....nVidia will not be able to hand-tune every game that comes out.......
And they won't hand-tune every game that comes out. That's why the improved compiler (that will continue to improve in the future) was released.
 
ATi has stated that if the application has the option built in and requests trilinear, it will get it
Except it doesn't, not for anything but the first texture stage.

I think you're wrong. Application preference does what the application asks for on ATI cards, and if the program requests trilinear AF on everything, it gets it on everything. That "only on the first texture stage" thing applies only to Quality AF (Performance AF does bilinear on the first texture stage as well).
 
Chalnoth said:
Nice comparison, when ATI doesn't offer supersampling.
Any application that wants to can do supersampling. In fact, using current D3D interfaces, I believe an application can even do mixed multisampling and supersampling if it wants. Simply render to a higher-resolution multisample buffer and then blit to your final destination. No problem.

We don't currently offer a forced supersampling control panel option, however.
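A minimal sketch of the approach being described, in Direct3D 9 (the surface format, the 2x factor, and the names are assumptions; error handling is omitted): render into an oversized render target, then filter it down to the back buffer.

[code]
// Hypothetical example: application-side supersampling via an oversized
// render target and a filtered StretchRect down to the back buffer.
#include <d3d9.h>

IDirect3DSurface9* CreateOversizedTarget(IDirect3DDevice9* dev, UINT w, UINT h)
{
    IDirect3DSurface9* rt = NULL;
    // 2x2 oversized target; a non-zero multisample type here would give the
    // mixed multisample + supersample case mentioned above.
    dev->CreateRenderTarget(w * 2, h * 2, D3DFMT_A8R8G8B8,
                            D3DMULTISAMPLE_NONE, 0, FALSE, &rt, NULL);
    return rt;
}

void RenderSupersampledFrame(IDirect3DDevice9* dev, IDirect3DSurface9* bigRT)
{
    IDirect3DSurface9* backBuffer = NULL;
    dev->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer);

    dev->SetRenderTarget(0, bigRT);
    // ... draw the scene at the higher resolution here ...

    // Restore the back buffer and downsample ("blit") with linear filtering.
    dev->SetRenderTarget(0, backBuffer);
    dev->StretchRect(bigRT, NULL, backBuffer, NULL, D3DTEXF_LINEAR);
    backBuffer->Release();
}
[/code]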

Trilinear filtering: at least ATi offers the user a choice....whereas nVidia is so slow that they disabled it and substituted an inferior implementation.
ATI disabled trilinear filtering on all but the first texture stage.
Wrong.

ATi has stated that if the application has the option built in and requests trilinear, it will get it
Except it doesn't, not for anything but the first texture stage.
Wrong.

And they won't hand-tune every game that comes out. That's why the improved compiler (that will continue to improve in the future) was released.
I expect you're right - we won't see hand-tuning for every game. I expect an application would have to have some special significance to really warrant hand-tuning by an IHV...

...oh, and a benchmark mode of course. ;)
 
Chalnoth said:
From my own experience, the Radeon 9700 Pro exhibits more texture aliasing than my GeForce4 Ti 4200 did.

What more do you want?

There might be a slight chance that your Ti 4200 didn't have the signal quality (to the monitor) to let you see any aliasing. The 4200s were the worst of the GF4 Ti line from a 2D quality point of view.

And all GeForces (1 through 4) looked like they were using mipmaps one or two levels lower.
 
Chalnoth said:
K.I.L.E.R said:
Am I the only person with an R3xx card who plays every game with trilinear on every texture stage?
Do you use Rivatuner to enable this?

Chalnoth, I have yet to see any texture aliasing in any game I played over the last year with my R300. That may be because I use trilinear filtering on every texture stage. :D
Some people just aren't as sensitive to texture aliasing as I am. I see it all too frequently. Though I will have to admit that with some games, it's the fault of the game developer (some developers have this strange idea that setting a negative LOD bias globally is a good thing).
I think it's more true to say that *some people* only have an eye for anything they can come up with, manufacture, inflate, exaggerate, misrepresent, or invent about non-nVidia hardware. Specifically ATi hardware. While stuffing giant black balls into their eye sockets when playing games on their nVidia hardware.
 
DemoCoder said:
I don't understand why people think this issue is "anti" anyone. All HW design is about tradeoffs -- making the right ones. I was rummaging through some boxes the other day and ran across my old C64 and Apple II manuals. I'm still amazed at some of the tricks and tradeoffs employed by the designers of these systems to save costs.

Purists love zooming in on pixels and complaining about rendering artifacts in filtering, AF, AA, etc. But from ATI's point of view, there are marginal returns on IQ improvement, and those transistors could be spent elsewhere where they would have far more visual impact. In the future, when they have an increased transistor budget, and when 8-bit lerps are an even smaller fraction of the total transistors, they'll probably toss them in. But for now, they were probably trying to squeeze as much as possible onto .15um, and features like PS2.0 were simply more important than 8-bit lerps.
Good post.
 
I am a little baffled by the comment that "max IQ" is not worth the difference in performance.

Supersampling or not, with both cards set to "max" the Radeon card looks better at *least* 90% of the time, and I think I am being generous here. Better AF with deeper penetration into the frame. Much better AA coverage, with much smoother surfaces at more angles.

There are a couple of RARE cases where the nVidia card will look better. However, these usually only last for a brief time even in games where they exist, like heavy alpha-texture areas or the case of the odd outdoor-angle issue.
 
John Reynolds said:
K.I.L.E.R said:
With my R300 I find it impossible to find any texture aliasing.

I have no problem finding texture aliasing with my 9800 Pro. Just depends on the game, the texture(s), and the angle.
I agree, at least with one specific game: the ever-popular BF1942. That game is a texture-aliasing mess. I can't really think of another game where texture aliasing stands out.
 