ATI Filter tricks at 3Dcenter

My only suggestion is to turn down the brightness and contrast of your monitor. I really don't see how you can spot texture aliasing as easily as you guys say you do.

For all I know, all you see is this small part of the game which has shimmering in one spot at a certain angle, and you are all making it out to be something really massive.

If I wanted to go out and search for texture aliasing, I would disable AF and trilinear filtering, decrease my LOD bias to an insane amount, and crank my monitor's brightness, the in-game gamma, and the control panel gamma to the maximum possible, and I'm sure I could find lots of texture aliasing all over the place too.
 
andypski wrote:
I dumped out the shot from patch 340 on a 5200Ultra and it looked like the R300 and refrast image; I dumped it out from patch 320 and it seems to look like the 5800Ultra image that you linked to (skenegroup's image). This is using driver version 52.16. If someone else could confirm this, that would be good - I think you might need to use a 5200/5600/5800 and not a 5700/5900/5950.

If you get that image with the new ForceWare drivers AND if the difference is indeed caused by a static clip plane cutting per-pixel lights...

Doesn't that mean that either:

A) Static clip planes were never wholly removed. They still exist for at least the 5200 series, but they are defeated by the Futuremark patch.

B) Static clip planes were removed at some point but are now back, only defeated by the Futuremark patch.

Either way makes me puke.
 
Mendel said:
If you get that image with the new ForceWare drivers AND if the difference is indeed caused by a static clip plane cutting per-pixel lights...
How could a clip plane possibly cut per pixel lights?
 
K.I.L.E.R said:
My only suggestion is to turn down the brightness and contrast of your monitor. I really don't see how you can spot texture aliasing as easily as you guys say you do.
I set contrast and brightness to whatever fits my eyes, not to what hides artifacts best.

For all I know, all you see is this small part of the game which has shimmering in one spot at a certain angle, and you are all making it out to be something really massive.
I think the quality of a texture filtering method is inversely proportional to the obtrusiveness of the artifacts it exhibits. Sometimes aliasing really stands out.
 
K.I.L.E.R, I think you have B3D confused with a gaming site. B3D is a 3D hardware site. The fact that you can't see it in most games is not the point. Some people, like me, still want to know these details to better understand the HW. If some people think they see shimmering, then they will want to know, from a hardware point of view, what could be causing it. Even if they might be wrong, they should still be free to consider all possibilities.

I think it's quite an interesting subject actually, and it would be cool if someone wrote a program to automatically calculate these things (texture interpolation precision, that is).

I remember the first computer with 3D acceleration my parents bought many years ago. It had some ATI card that only had a 3-bit fraction for the bilinear lerps (if I remember correctly). Minified textures still looked good, but magnified textures were heavily banded. I believe the GS (in the PS2) also only has a very limited number of fractional bits for the bilinear lerps.

(Note to K.I.L.E.R: I'm not in any way comparing the R3xx with the GS or the early ATI card. They are clearly not comparable in image quality. But as I said, I think the subject is interesting, and I'm just thinking out loud, okay?)
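The banding from a low-precision lerp weight is easy to simulate. Here is a minimal Python sketch of the idea; the 3-bit case mirrors the description above, and the texel values are hypothetical, not measurements of any chip:

```python
# Simulate a bilinear lerp whose blend weight is quantized to a fixed
# number of fractional bits. Illustrative only.

def lerp_quantized(a, b, t, frac_bits):
    """Interpolate a..b with the weight t snapped to 1/2^frac_bits steps."""
    steps = 1 << frac_bits
    tq = round(t * steps) / steps
    return a + (b - a) * tq

# Magnify a single texel pair (values 0 and 255) across 32 screen pixels.
full = [lerp_quantized(0, 255, i / 32, 16) for i in range(32)]
low = [lerp_quantized(0, 255, i / 32, 3) for i in range(32)]

# With only 3 fractional bits there are just 2^3 + 1 possible weights, so
# the 32 output pixels collapse into 9 levels: visible bands on screen.
print(len(set(low)), "levels at 3 bits vs", len(set(full)), "at 16 bits")
```

With plenty of fractional bits every magnified pixel gets its own shade; with 3 bits the gradient collapses into a handful of flat bands, which is exactly the banding described for magnified textures.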
 
Xmas said:
How could a clip plane possibly cut per pixel lights?

I didn't come up with that idea, but the idea behind static clip planes is to cut out areas that are not visible and render only part of the scene, right? Well, suppose that the light source is outside of the area that is being shown and calculated... Maybe Tertsi would like to explain this better if I'm wrong here?
 
Thowllly said:
I believe the GS (in the PS2) also only has a very limited number of fractional bits for the bilinear lerps.

This would seem to make sense. The worst example of this is in Ace Combat 4 (great game, BTW) when the camera zooms in real close to the skybox and all you see are about 4-5 solid color bands running across the screen in the background behind the plane. What's worse, in the replays Namco coded in some type of pseudo-"motion blur" (by blending old frames with the current front buffer) that perhaps more than anything else defines the lower bound of possible IQ. One big splotchy mess.

What's funny is that the PS2 has a 16-pipe design, more than any consumer card now on the market. :oops: They achieved this through shallow pipelining (2 cycles for texture lookups), low precision, and the absence of blend ops. Yet all 16 need to work on one primitive at once, and on a system that can transform ~15M triangles/sec, many triangles actually cover fewer pixels than that, so those pipes go to waste. I would love, more than for any consumer part, to read a post-mortem on the design of the GS by the dev team. I would just like to know what they were thinking at the time.
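The replay "motion blur" described above is just recursive frame blending. A rough Python sketch, with a made-up 50/50 blend weight (the weight Namco actually used is not stated anywhere in this thread):

```python
# Pseudo motion blur by blending the previous displayed frame into the
# current one. The 50/50 weight is a hypothetical illustration.

def blend_frames(current, previous, keep=0.5):
    """Mix the new frame with the old; bright pixels leave fading trails."""
    return [keep * c + (1 - keep) * p for c, p in zip(current, previous)]

shown = [0.0, 0.0, 0.0, 0.0]            # first displayed frame (all black)
for frame in ([255, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]):
    shown = blend_frames(frame, shown)  # each frame drags the old ones along

# A one-frame flash at pixel 0 is still faintly visible two frames later.
print(shown)
```

Because each output frame carries a fraction of every previous frame, fast motion smears into the splotchy trails described above.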
 
You know, it comes down to this very simple question:

What would you rather have: the GFFX's superior filtering at 10-20 FPS in any DX9 game, or ATI's trade-offs that hardly impact IQ and give you excellent playability, with IQ that is overall better than what the GFFX offers at framerates that are close to, but not equal to, ATI's?
 
YeuEmMaiMai said:
You know, it comes down to this very simple question:

What would you rather have: the GFFX's superior filtering at 10-20 FPS in any DX9 game, or ATI's trade-offs that hardly impact IQ and give you excellent playability, with IQ that is overall better than what the GFFX offers at framerates that are close to, but not equal to, ATI's?

I would like to have both, and above all I would like to have a perfect understanding of the inner workings of both IHVs' hardware and their drivers :D

AND whenever possible, I would like to nag about any issues in performance and/or image quality, theoretical or practical.

As my primary gaming card of choice though, I have an ATi solution.
 
akira888 said:
Yet all 16 need to work on one primitive at once, and on a system that can transform ~15m triangles/sec many triangles have actually less pixels than that so those pipes go to waste.
That would be 8 pipes (4x2) when texturing.
Either way, as long as the chip fills tris as fast as or faster than it can set them up (or receive them from T&L) - which is almost always the case with the GS - it makes no difference whether there is waste on the pixel pipes or not.
 
Fafalada said:
That would be 8 pipes (4x2) when texturing.
Either way, as long as the chip fills tris as fast as or faster than it can set them up (or receive them from T&L) - which is almost always the case with the GS - it makes no difference whether there is waste on the pixel pipes or not.

As a practical matter for a developer, certainly. I was just pondering why Sony designed a rasterizer with more pipes than strictly necessary, when their GS silicon budget could have been spent more efficiently on functionality such as better texture formats than 4bpp or 8bpp LUT, or expanding the range of blend modes to encompass Dot3, CubeEnvMap, or EMBM.

Maybe I'm the stupid one, but I have never been able to understand what the design philosophy was behind the GS. :?
 
Mendel said:
Xmas said:
How could a clip plane possibly cut per pixel lights?

I didn't come up with that idea, but the idea behind static clip planes is to cut out areas that are not visible and render only part of the scene, right? Well, suppose that the light source is outside of the area that is being shown and calculated... Maybe Tertsi would like to explain this better if I'm wrong here?
You cannot simply cut away a light source. You'd have to drop the shader parts that calculate the lighting.
 
Xmas said:
You cannot simply cut away a light source. You'd have to drop the shader parts that calculate the lighting.
Unless the application is doing multiple passes.

-FUDie
 
FUDie said:
Xmas said:
You cannot simply cut away a light source. You'd have to drop the shader parts that calculate the lighting.
Unless the application is doing multiple passes.

-FUDie
Then you have to drop one pass ;) But it's still not related to static clipping planes.
 
Xmas said:
FUDie said:
Xmas said:
You cannot simply cut away a light source. You'd have to drop the shader parts that calculate the lighting.
Unless the application is doing multiple passes.

-FUDie
Then you have to drop one pass ;) But it's still not related to static clipping planes.
Well, one wonders then why you didn't correct Tertsi...
 
Xmas said:
FUDie said:
Xmas said:
You cannot simply cut away a light source. You'd have to drop the shader parts that calculate the lighting.
Unless the application is doing multiple passes.
Then you have to drop one pass ;) But it's still not related to static clipping planes.
Sure it is. On one pass you render the whole thing, on a later pass your clipping plane (which wasn't enabled earlier) prevents some pixels from being updated. I'm not saying that's what's happening, but it's a possibility.

-FUDie
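FUDie's scenario can be sketched in a few lines of Python: the clip plane is off for the base pass and on for the additive lighting pass, so clipped pixels simply never receive the light term. This is purely illustrative, not a claim about what any driver actually did:

```python
# Toy multipass framebuffer: each pass adds its contribution, and a clip
# plane (reduced here to an x threshold) can discard pixels per pass.

WIDTH = 8

def render_pass(framebuffer, contribution, clip_x=None):
    """Additively blend one pass; pixels at x >= clip_x are discarded."""
    for x in range(WIDTH):
        if clip_x is not None and x >= clip_x:
            continue  # clipped: this pass never touches the pixel
        framebuffer[x] += contribution

fb = [0.0] * WIDTH
render_pass(fb, 0.2)            # base/ambient pass, no clipping
render_pass(fb, 0.5, clip_x=5)  # per-pixel light pass, clipped

# Pixels 0-4 got both passes; pixels 5-7 silently lost the light term.
print(fb)
```

The geometry itself is never cut; the later pass just skips some pixels, which is why the lost contribution looks like a light being "cut" in the output image.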
 
Althornin said:
Well, one wonders then why you didn't correct Tertsi...
You must be a blessed man, having the time to read and reply to every single post on this forum :D
 
FUDie said:
Sure it is. On one pass you render the whole thing, on a later pass your clipping plane (which wasn't enabled earlier) prevents some pixels from being updated. I'm not saying that's what's happening, but it's a possibility.

-FUDie
Hm, that is a possibility (with drawbacks), but I very much doubt it applies here. By the way, I wouldn't call that static ;)
 
Althornin said:
Chalnoth said:
Aliasing is always much more visible in motion. With insufficient "LOD fraction" between MIP levels, there will always be more shimmering in motion on the R3xx.
I disagree. I think you have no idea what you are talking about. Offer some proof, not your usual half assed anti ATI assumptions, thanks.
What I'm talking about is not that hard to understand.

Point one:
Aliasing is always much more visible in motion. If you disagree with this, you have no idea what aliasing is. Granted, not all types of motion exaggerate aliasing, but my beef with aliasing has always been that occasional situation where it looks really bad (But with the R300, I've had games where if I forget to turn Anisotropic filtering on, there's aliasing all over the place, not just in a few specific areas).

Point two:
With insufficient "LOD fraction" between MIP levels, there will always be more shimmering in motion on the R3xx. The inaccuracies will simply manifest themselves as shimmering when in motion. Hard to explain, easy to visualize. First a pixel will be too bright. The texture moves slightly. Now it's too dim. That's shimmering for you.
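Point two can be put in numbers. Suppose the trilinear blend weight between two MIP levels only has a couple of fractional bits; the values below are hypothetical, not measurements of any card:

```python
# Trilinear blend between two MIP levels with the LOD fraction quantized.
# A slowly receding surface should fade smoothly from the bright level to
# the dim one; with few fractional bits it instead jumps in coarse steps,
# which reads as shimmering once the surface is in motion. Illustrative.

def trilinear_blend(level_hi, level_lo, lod_frac, frac_bits):
    steps = 1 << frac_bits
    f = round(lod_frac * steps) / steps  # quantized LOD fraction
    return level_hi * (1 - f) + level_lo * f

bright, dim = 200.0, 100.0  # this pixel's value in the two MIP levels

# Sweep the LOD fraction 0 -> 1 over 64 frames of slow movement.
smooth = [trilinear_blend(bright, dim, i / 64, 16) for i in range(64)]
coarse = [trilinear_blend(bright, dim, i / 64, 2) for i in range(64)]

# The coarse blend holds one value for many frames, then steps abruptly.
print(sorted(set(coarse), reverse=True))
```

The high-precision sweep gives a new shade every frame; the 2-bit sweep gives only five plateaus, so a pixel sits "too bright" for a while, then snaps "too dim", which is the shimmering described above.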
 
I have found where aliasing and texture shimmering occur in the real world.
Perhaps there is a glitch in the reality rendering engine? ;)
 