ATi is ch**t**g in Filtering

Grestorn said:
jvd said:
That is exactly why people are turning it off: to prove that the optimizations are making the shimmering worse.
If you're implying that the FarCry videos of the X800 were made with AF off, you're dead wrong.

Another effect of 1xAF (no AF) is that textures become blurry in the distance (mipmap filtering). That is an obvious enough effect to prove that neither of the videos was created with AF off.
I never claimed you were, nor was I implying it.
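
For context on the mipmap-blur point above: with plain trilinear filtering, the mip level is picked from the longer of the two screen-space texture gradients, so a surface seen at a grazing angle gets pushed to blurry mips very quickly. A minimal sketch using the textbook isotropic LOD formula (not any particular card's exact hardware math):

```python
import math

def trilinear_lod(dudx, dvdx, dudy, dvdy):
    """Textbook isotropic LOD: log2 of the longer screen-space
    texture gradient. An oblique, distant surface has a huge
    gradient along one axis, so the selected mip gets blurry fast."""
    len_x = math.hypot(dudx, dvdx)  # texels per pixel, x direction
    len_y = math.hypot(dudy, dvdy)  # texels per pixel, y direction
    return math.log2(max(len_x, len_y, 1e-8))

# A floor seen at a grazing angle: 2 texels/pixel across the screen,
# but 16 texels/pixel into the distance.
print(trilinear_lod(2.0, 0.0, 0.0, 16.0))  # 4.0 -> a very blurry mip
# Anisotropic filtering instead takes several samples along the long
# axis and picks the LOD from the short one (~1.0 here), which is why
# turning AF off makes distant textures visibly blur.
```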
 
Grestorn said:
jvd said:
Grestorn said:
jvd said:
I never claimed you were, nor was I implying it.
good... :)

Now, if you'd send me the damn save :) since the hacks keep breaking FarCry. (BTW, check out WoW on a 9700 Pro, then an X800 Pro with no aniso, and then an X800 Pro with aniso.)
All right, I'll send it to you. Give me a couple of hours, till I'm back at home, k?
Take all the time you want. It's 6:40 AM and I still haven't gone to bed :)
 
Chalnoth said:
It doesn't take "super eyes." It does take a crisp display. If you're used to seeing the problem, of course, you may not notice it. After having played games for so long with relatively poor visual quality, we learn to forgive a heck of a lot of graphical glitches.

I dunno, not many people are seeing problems.

From Tom's review:

A close look at the images shows that the Radeon X800 enjoys a small advantage over the Radeon 9800XT. The results achieved with the GeForce 6800 Ultra are better still in this special case (visible in the white speck).
anisotropic filtering


And again:
Aside from the GeForce 5950 Ultra, all cards offer similar image quality



http://www6.tomshardware.com/graphic/20040504/ati-x800-33.html

With standard trilinear filtering selected, ATi's X800 sticks very close to Refrast. The image produced by NVIDIA's GeForce 6800 Ultra clearly shows that the card uses a different LOD calculation at 45-degree angles. You can see it by looking at the indented ellipse. Imagine a perfectly round tunnel: the ideal LOD pattern would be a perfectly round circle, so even Refrast is not correct here.
on colored mipmaps
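
For anyone wondering what a "different LOD calculation at 45-degree angles" means in practice: a rotation-invariant ("round tunnel") LOD uses the Euclidean length of the texture gradients, while cheaper per-component approximations are exact on the axes but off on the diagonals. A minimal sketch of that effect (illustrative formulas only; neither THG nor Microsoft publishes the actual hardware equations):

```python
import math

def lod_exact(du, dv):
    # Rotation-invariant LOD: gradient length. In the tunnel test
    # this produces perfectly circular mip transitions.
    return math.log2(math.hypot(du, dv))

def lod_max_approx(du, dv):
    # A cheaper per-component approximation: exact along the axes,
    # but it underestimates by up to 0.5 mip on the diagonals, so
    # 45-degree directions get a more detailed (sparkle-prone) mip.
    return math.log2(max(abs(du), abs(dv)))

g = 8.0                    # same gradient magnitude, two directions
print(lod_exact(g, 0.0), lod_max_approx(g, 0.0))  # ~3.0 ~3.0 (on-axis)
d = g / math.sqrt(2.0)
print(lod_exact(d, d), lod_max_approx(d, d))      # ~3.0 ~2.5 (45 deg)
```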

trilinear filtering in black and white

Here is where it gets tricky:

Looking at the textures, the effects are much less noticeable than the colored mipmap comparison might suggest. Nonetheless we can see the differences in the GF 6800 Ultra's image. The interesting thing is that X800 and Refrast look better on the screenshots in a head to head comparison. This is because of the higher level Mipmaps at 45-degree angles. Although this looks better on screenshots, it can cause sparkle in motion.

Here's what Microsoft has to say about it:

Quote: "The DX9 reference rasterizer does not produce an ideal result for level-of-detail computation for isotropic filtering, the algorithm used in NV40 produces a higher quality result. Remember, our API is constantly evolving as is graphics hardware, we will continue to improve all aspects of our API including the reference rasterizer."
So it seems that here the 6800 Ultra displays a better result than the reference rasterizer, but ATI matches the rasterizer.

When anisotropic filtering is selected, both NVIDIA's GeForce 6800 Ultra and ATi's Radeon X800 XT produce an image that greatly differs from the one produced by Refrast. This is a direct result of the aggressive adaptive angle-based optimization. You can see that the filter of the GeForce 6800 Ultra (without trilinear optimization) is obviously less exact around the horizontal plane, while ATi's X800 optimizes at the sides of the tunnel.
8x aniso, colored mipmaps.
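
To make "adaptive angle-based optimization" concrete: the hardware lowers the maximum anisotropy degree depending on how the surface is oriented, which is what produces the flower-shaped pattern in tunnel tests. The real per-angle tables aren't public, so the thresholds below are purely made up to illustrate the idea:

```python
def max_aniso_for_angle(angle_deg, requested=8):
    """Hypothetical angle-adaptive clamp: full anisotropy near the
    0/90-degree axes, cut back towards the 45-degree diagonals.
    The thresholds here are invented for illustration only."""
    # Angular distance from the nearest diagonal, in 0..45
    a = abs((angle_deg % 90.0) - 45.0)
    if a > 30.0:                       # close to an axis
        return requested
    if a > 15.0:                       # in between
        return max(requested // 2, 2)
    return 2                           # near a diagonal

for ang in (0, 22, 45, 68, 90):
    print(ang, max_aniso_for_angle(ang))   # 8, 4, 2, 4, 8
```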

Since the Radeon X800 and the MS reference rasterizer show better results in our screenshots, it's hard to understand why the GeForce 6800 offers better image quality, or let's say it renders more correctly - as Microsoft confirms. However, in motion, the 45-degree deflection can cause texture sparkles in a moving image due to the higher Mipmap levels in that area - since they are only used in that area!

Eek, so it looks like the 45-degree deflection on the 6800s can cause sparkles?

Let the witch hunts begin :rolleyes:


I actually thought it was a good review from Tom's.

But that is just one of many sites that haven't found a problem with ATI's image quality in games.

But as I said, we are going to be in for a lot of witch hunts.
 
If you're just looking at screenshots, ATI's filtering problems typically won't show up at all (unless you know exactly what to look for and the screenshot was taken at the proper position). The problems are much more visible in motion. So, to put it bluntly, I just don't care what reviewers say, because they talk about how they compared screenshots. I never hear any talk about bothering to look for texture aliasing.

And no, anisotropic filtering doesn't eliminate the problem. It just reduces it, which means that the number of situations in which the problem occurs is reduced. It's not eliminated, and I still notice it.
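
It's worth spelling out why this aliasing shows up in motion but not in stills: when the chosen mip is too detailed for the pixel footprint, neighbouring pixels skip over texels, and tiny camera movements flip each pixel's value from frame to frame. A toy 1-D sketch of that undersampling (nothing card-specific, just the sampling theory):

```python
def checker(u):
    # 1-D stand-in for a high-frequency texture: stripes of width 0.5.
    return 1 if u % 1.0 < 0.5 else 0

# The mip is too detailed: one pixel step covers 3.3 stripes.
# Drift the camera a little each frame and watch pixels flip:
for frame in range(4):
    u0 = frame * 0.07                 # sub-pixel camera motion
    print([checker(u0 + px * 3.3) for px in range(6)])
# Each single frame ("screenshot") looks like a crisp pattern, but
# the values jump between frames -> shimmering/sparkling in motion.
```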
 
Chalnoth said:
If you're just looking at screenshots, ATI's filtering problems typically won't show up at all (unless you know exactly what to look for and the screenshot was taken at the proper position). The problems are much more visible in motion. So, to put it bluntly, I just don't care what reviewers say, because they talk about how they compared screenshots. I never hear any talk about bothering to look for texture aliasing.

And no, anisotropic filtering doesn't eliminate the problem. It just reduces it, which means that the number of situations in which the problem occurs is reduced. It's not eliminated, and I still notice it.

And what is your take on the X800's image quality?

How about the fact that NVIDIA just recently reduced its 8x AA quality?
 
Chalnoth said:
It doesn't take "super eyes." It does take a crisp display. If you're used to seeing the problem, of course, you may not notice it. After having played games for so long with relatively poor visual quality, we learn to forgive a heck of a lot of graphical glitches.

My monitor displays the image quite clearly and sharply. I update my video card frequently, and I always play with my card's details set to high.

By the look of it, you are playing games at 320x240 with 256 colors.
Either that, or you're simply Superman.
 
jvd said:
Here is where it gets tricky:

THG said:
Looking at the textures, the effects are much less noticeable than the colored mipmap comparison might suggest. Nonetheless we can see the differences in the GF 6800 Ultra's image. The interesting thing is that X800 and Refrast look better on the screenshots in a head to head comparison. This is because of the higher level Mipmaps at 45-degree angles. Although this looks better on screenshots, it can cause sparkle in motion.

Here's what Microsoft has to say about it:

Quote: "The DX9 reference rasterizer does not produce an ideal result for level-of-detail computation for isotropic filtering, the algorithm used in NV40 produces a higher quality result. Remember, our API is constantly evolving as is graphics hardware, we will continue to improve all aspects of our API including the reference rasterizer."
So it seems that here the 6800 Ultra displays a better result than the reference rasterizer, but ATI matches the rasterizer.

The really tricky thing here is that the Microsoft quote used here by THG in the context of its ATi-nVidia IQ comparison has absolutely nothing to do with any ATi-nVidia IQ comparison. The unattributed quote from Microsoft used by THG was cut & pasted from a Tech Report article in which "Microsoft" was quoted, but only in the context of comparing nVx drivers to each other by way of the DX rasterizer IQ differences between the nV3x and nV4x hardware generations (TR also failed to attribute the M$ employee who furnished them with the quote.)

"Microsoft" was talking in the TR article about the fact that "our API is constantly evolving as is graphics hardware, [and] we will continue to improve all aspects of our API including the reference rasterizer," which clearly says to me that M$ has yet to update the DX rasterizer to reflect the capabilities of all the new hardware, not simply nVidia's. In short, regardless of the context THG used for this quote, what "Microsoft" was talking about had nothing to do with what THG was talking about. (We covered this in another thread somewhere, with links to the source of the THG "Microsoft" quote at TR, but there are so many of these danged "filtering" threads that I'm darned if I can remember which one the info is in.)

Good general advice for reading all printed articles, regardless of the subject or the media, is to be very leery of direct quotes attributed generally to corporate shells which, unfortunately, cannot speak. Anytime a person reads a direct quote like "nVidia says," or "ATi says" or "Intel says," etc., without specific attribution to the human being who actually provided the quote (name, job title) this should raise a giant red flag in the mind of the reader. All such quotes could well be entirely fabricated, or else used out of context (as is the case here.) Despite THG's assertion of "Here´s what Microsoft has to say about it," Microsoft actually said nothing at all about the "it" the THG article is concerned with (ATi-nVidia IQ, 45-degree mipmaps, or "sparkling"--Heh...;) Whatever that is supposed to be.)
 
jvd said:
And what is your take on the X800's image quality?
Disappointing. No improvement over the R3xx, and a possible reduction in image quality that the user cannot turn off.

How about the fact that NVIDIA just recently reduced its 8x AA quality?
Um. It's a different AA algorithm. I really just don't care.

It is, however, disappointing that the NV40 now has an angle-dependent LOD selection approximation, and I really hate all the idiots who went around proclaiming that screenshots which were taken in simple scenarios are representative of the majority of gaming situations. They're not. Image quality screenshots are typically taken in simple scenarios that don't show the problems of the angle-dependent anisotropy, and so all the reviewers claimed it's a great "optimization." Thanks a lot.
 
Chalnoth said:
jvd said:
And what is your take on the X800's image quality?
Disappointing. No improvement over the R3xx, and a possible reduction in image quality that the user cannot turn off.

How about the fact that NVIDIA just recently reduced its 8x AA quality?
Um. It's a different AA algorithm. I really just don't care.

It is, however, disappointing that the NV40 now has an angle-dependent LOD selection approximation, and I really hate all the idiots who went around proclaiming that screenshots which were taken in simple scenarios are representative of the majority of gaming situations. They're not. Image quality screenshots are typically taken in simple scenarios that don't show the problems of the angle-dependent anisotropy, and so all the reviewers claimed it's a great "optimization." Thanks a lot.

LOL. Different AA algorithm. Some people never learn. ;)

Anyway, this generation will be over before you know it. Mark my words. :p
 
OK, guys, I have a GF6800GT now. It shows the SAME noise problems in MP2 as the X800 Pro (and no, they don't get any better if you switch the optimizations off).

It's really a pity that I'm getting the best image quality on a GF4 Ti, a card released two years ago... :( I think it's just plain bad for us all...
 
Which driver are you using? The ForceWare 61.11 drivers are known to have a non-working "Trilinear Optimization" switch.
 
DegustatoR said:
OK, guys, I have a GF6800GT now. It shows the SAME noise problems in MP2 as the X800 Pro (and no, they don't get any better if you switch the optimizations off).

It's really a pity that I'm getting the best image quality on a GF4 Ti, a card released two years ago... :( I think it's just plain bad for us all...

Where did you find a GT?

Anyway, it could just be a problem with the game.
 
Bjorn said:
jvd said:
Anyway, it could just be a problem with the game.

It could. But why doesn't the GF4 have these issues, then?

I don't own the game, so I can't really say. But perhaps it's a feature that can't be used on the GeForce 4? Or maybe it's doing something to increase performance that gets broken after a certain FPS?

Anyway, he doesn't make it clear whether the GT is using brilinear or trilinear. If it's using tri, then it has to be the engine. If it's bri, then it is most likely the filtering.
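
Since bri vs. tri keeps coming up, here is roughly what the difference is: full trilinear blends the two nearest mips across the whole LOD range, while "brilinear" snaps to the nearest mip and only blends in a narrow band around each transition. A sketch of that shape below; the band width is a made-up knob, since drivers don't document theirs:

```python
def trilinear_weight(lod):
    # Full trilinear: blend weight between the two nearest mips
    # varies smoothly across the whole LOD range.
    return lod % 1.0

def brilinear_weight(lod, band=0.25):
    # "Brilinear": mostly plain bilinear from the nearest mip, with
    # only a narrow blend band around each mip transition. The band
    # width here is invented for illustration.
    f = lod % 1.0
    lo, hi = 0.5 - band / 2.0, 0.5 + band / 2.0
    if f < lo:
        return 0.0                # pure finer mip
    if f > hi:
        return 1.0                # pure coarser mip
    return (f - lo) / band        # short blend in between

for lod in (2.1, 2.4, 2.5, 2.6, 2.9):
    print(lod, round(trilinear_weight(lod), 2),
          round(brilinear_weight(lod), 2))
# The skipped blending is the performance win; the visible cost is
# sharper mip transitions ("banding") that can shimmer in motion.
```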
 
jvd said:
But perhaps it's a feature that can't be used on the GeForce 4? Or maybe it's doing something to increase performance that gets broken after a certain FPS?

I'm not sure that I understand. Do you mean that it lacks all the "be able to lower the IQ" features that the new cards have?
 
Bjorn said:
jvd said:
But perhaps it's a feature that can't be used on the GeForce 4? Or maybe it's doing something to increase performance that gets broken after a certain FPS?

I'm not sure that I understand. Do you mean that it lacks all the "be able to lower the IQ" features that the new cards have?

No, I'm sure that NVIDIA could add it to the GeForce 4 if they wanted to.

If the GeForce 6800 GT is running with full trilinear, then it's not an issue of lowering IQ. It's an issue with the game.
 
Brilinear requires additional logic transistors in the TMU - not even nVidia could do this in software, AFAIK.

If the GF4 does not exhibit this artifact-producing behaviour, one can assume that:
a) it is not related to the game itself, and
b) the reasons lie either in the rendering path the game uses or in the texture filters applied.
 