8500 banded with alpha blending and fog

Bambers

I've noticed since I've had this card that transparency has looked banded even in 32-bit colour, so I've done some comparisons with a G400 :D in another machine.

For alpha blending, this is the G400:

alphag400.jpg


and this is the 8500:

alpha8500.jpg

The smoke puff for the 8500 seems to stop rather abruptly.

It also suffers in fog. This is the 8500 and this is the G400.

This problem used to be quite a bit worse in the 7206/3864 drivers and improved to its current state with the first 60xx and 90xx drivers.

The quality of the video overlay also improved by the same amount between these drivers but even now DVDs are still blotchy in dark areas.

I was wondering if these problems are linked, and whether they occur on all ATI cards or just the 8500?
 
A couple comments:

- Did you have texture compression enabled? The G400 doesn't support this feature, but the 8500 does. Using compression on one and not the other will lead to visual differences because compression loses texture quality. Make sure you're testing the same features on both cards (there's a rough sketch of requesting compression just after this list).
- These screenshots are not identical. Can you do an experiment with _exactly_ the same data on each card? This would also help for comparison. One way to do this would be to create a demo of what you want to show, then take screenshots at exactly the same place. Another way is to create an OpenGL wrapper (there are sample ones around... that's how people made the OpenGL wrapper cheats for games like Half-Life =P ) and capture the data for the frame you want to show. Then you can play back this data on each card.
- Lastly, any chance you could improve the brightness with Photoshop or something? I can't see _any_ fog in either of the screenshots you gave links to.
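
On the first point, requesting compression is an application/driver negotiation; a minimal OpenGL sketch (assuming ARB_texture_compression and EXT_texture_compression_s3tc show up in the extension string, as they should on the 8500) looks something like this:

    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Sketch: upload the same texture compressed vs. uncompressed.
       Check the GL_EXTENSIONS string for the two extensions above
       before relying on these tokens. */
    void upload_texture(const GLubyte *pixels, int w, int h, int use_s3tc)
    {
        GLint compressed = GL_FALSE;
        GLint internal = use_s3tc ? GL_COMPRESSED_RGBA_S3TC_DXT5_EXT
                                  : GL_RGBA8;

        glTexImage2D(GL_TEXTURE_2D, 0, internal, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);

        /* Ask the driver whether it really compressed the level. */
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                 GL_TEXTURE_COMPRESSED_ARB, &compressed);
    }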
 
Hmmm, I never really noticed any banding with my 8500 and OpenGL/WinXP combo, then again I don't have Quake 3 installed anymore. Here is a shot of Unreal Tournament, and the smoke trail looks fine??

Shot0020.jpg
 
Bambers said:
EvilEngine said:
I notice it too with my GF3 with 28.32 and Q3 version 1.31
Texture compression is disabled.

How does the fog look on a GF3?
fog.txt

It doesn't look as bad as the gun smoke we both noticed, but Quake skies and fog have always looked weird, even on a GF2 & 3 and my Voodoo3 & 5. If these are "driver optimizations" we should be given the choice of turning them on or off (ATI and NVIDIA). This is almost as bad as the previous beta DET4 drivers that forced any resolution over 1024x768x32 with any AA (1600x1200x32 with 2xAA, for example) into dithered 16-bit color; glad to say that doesn't happen in current drivers.
But having a little freedom to choose between a "max quality mode" and an "optimized benchmark mode" would be nice. Quake 3 is the only game in my collection that does this, by the way, and the gun smoke thing I think only appeared recently... possibly in the early 27.+ drivers.
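
For context on what the application side even controls here: standard OpenGL fog exposes just a mode, a colour, a density (or range), and a quality hint that drivers are free to ignore, which is partly why fog quality varies so much between cards. A minimal sketch, with all values purely illustrative:

    #include <GL/gl.h>

    void setup_fog(void)
    {
        const GLfloat fog_colour[4] = { 0.5f, 0.5f, 0.5f, 1.0f };

        glEnable(GL_FOG);
        glFogi(GL_FOG_MODE, GL_EXP2);       /* or GL_LINEAR / GL_EXP */
        glFogfv(GL_FOG_COLOR, fog_colour);
        glFogf(GL_FOG_DENSITY, 0.02f);

        /* The only quality knob: GL_NICEST asks for per-pixel fog,
           GL_FASTEST permits per-vertex. Drivers may ignore this. */
        glHint(GL_FOG_HINT, GL_NICEST);
    }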
 
It seems to me that, as well as cutting the transparent textures off early, the 8500 uses less colour depth than it should.

Transparent textures used to be much more banded in an earlier driver version and the video overlay was also much more banded.
 
easyride said:
Could be a driver "optimization". Maybe they skip pixels that are almost completely transparent?

That's my thought, but I don't really see it improving speed by any significant amount.

It could be that the second image is what it's supposed to look like, but previous cards just couldn't do the function. I believe it's a compare function on the alpha channel, which is why you get the hard edges. I know some games use it to get hard edges on leaves for a tree or other sprites without using a truckload of polygons.

This explanation seems a bit weird (why would the programmer, i.e. Carmack, want this effect?), and the cards don't have any problem with this in any other game. But if NVidia is doing it in their latest drivers, I think it makes sense.
 
Mintmaster said:
That's my thought, but I don't really see it improving speed by any significant amount.
Why not? This saves quite a bit of transparent overdraw.

It could be that the second image is what it's supposed to look like, but previous cards just couldn't do the function. I believe it's a compare function on the alpha channel, which is why you get the hard edges. I know some games use it to get hard edges on leaves for a tree or other sprites without using a truckload of polygons.
This comparison is called alpha test, and ANY 3d chip I know of is capable of this "free" operation. But it is the responsibility of the application to set this up correctly.
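
A minimal OpenGL sketch of both paths, to show how little the app controls beyond the threshold (the 0.5 cut-off is just an example value):

    #include <GL/gl.h>

    /* Alpha blending: fragments are mixed with the framebuffer,
       giving soft edges (and banding when precision or dithering
       is lacking). */
    void use_alpha_blend(void)
    {
        glDisable(GL_ALPHA_TEST);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    }

    /* Alpha test: fragments at or below the reference are discarded
       outright -- the hard-edge trick used for leaves and sprites. */
    void use_alpha_test(void)
    {
        glDisable(GL_BLEND);
        glEnable(GL_ALPHA_TEST);
        glAlphaFunc(GL_GREATER, 0.5f);   /* example threshold */
    }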
 
I'm of the opinion that the issue described here is a driver bug associated with dithering with alpha, and it's been broken and re-broken throughout the 8500's drivers.

Direct3D is nice, as it has not only the ability to enable/disable dithering but a choice of which dithering to use with alpha. In OGL, the only flag we have is enable/disable, and it appears to be broken in current drivers, at least with Quake 3.
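
For the record, that flag is the whole of the dithering control OpenGL gives an application; everything beyond on/off is up to the driver (Direct3D's D3DRS_DITHERENABLE render state is the rough equivalent):

    #include <GL/gl.h>

    void set_dither(int on)
    {
        /* OpenGL's entire dithering API: one flag, on by default.
           How (or whether) the hardware dithers is up to the driver. */
        if (on)
            glEnable(GL_DITHER);
        else
            glDisable(GL_DITHER);
    }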

A good example is map Q3DM11. The light/lamp rays are choppy and ugly on the 8500. This is with TC disabled:
8500
q3-8500.txt


GF3
q3-gf3.txt


It wasn't always like this, and it just appears to be (yet another) registry/driver setting that is broken or otherwise being ignored at this time. The key is OGLDisableDitherWhenAlphaBlending, and regardless of its value, dithering is disabled, at least in Q3. LOD bias and several others are also ignored, and instead hard-set to different values.
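
For anyone who wants to poke at that value themselves, something like the sketch below should flip it, though the "0000" subkey index and the value's data type vary per machine and driver revision, so treat the whole thing as a hypothetical illustration (the GUID is just the standard display-adapter device class):

    #include <windows.h>

    /* Hypothetical sketch: the display-class subkey ("0000") and the
       REG_SZ data type are assumptions -- check your own registry. */
    int set_ogl_dither_key(const char *data)   /* e.g. "0" or "1" */
    {
        HKEY key;
        LONG rc = RegOpenKeyExA(HKEY_LOCAL_MACHINE,
            "SYSTEM\\CurrentControlSet\\Control\\Class\\"
            "{4D36E968-E325-11CE-BFC1-08002BE10318}\\0000",
            0, KEY_SET_VALUE, &key);
        if (rc != ERROR_SUCCESS)
            return 0;

        rc = RegSetValueExA(key, "OGLDisableDitherWhenAlphaBlending",
                            0, REG_SZ, (const BYTE *)data,
                            (DWORD)(lstrlenA(data) + 1));
        RegCloseKey(key);
        return rc == ERROR_SUCCESS;
    }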

Cheers,
-Shark
 
Wow, I wasn't aware how crappy screenshots are for depicting this difference. Here's a pair where it might be a little easier to see. It's NOT this subtle on-screen side by side, and the "clumps" near the bottom of the light ray stick out quite visibly.

Look at the bottom edge of the alpha for the missing dithering:

8500
q3-8500-2.txt


GF3
q3-gf3-2.txt
 
Sharkfood: I am not so sure those shots are of equal settings. That GF3 shot has considerably less texture definition. Maybe the problems with the light are because the Radeon's LOD bias is too aggressive.
 
Nothing can be done about the LOD bias. ATI has it locked on sharp ever since the Quack debacle. It's totally ignored in the drivers.

If I had time, I would go through drivers to show it's simply the dithering that is missing. What you are seeing has *no* relation to LOD Bias. It's the big chunks in the alpha and it's much more apparent on-screen than in these screenshots.

In past Radeon drivers, these were dithered nicely, much like the GF3 does. It simply appears the "dither while alpha" setting is just another registry value the drivers are (now) ignoring, and hopefully this is only a temporary issue.

I think the original poster brought this up as they noticed the difference. I did immediately as well. It's that obvious and no amount of LOD Bias is going to add dithering. :)
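
For clarity on what "ignored" means here: the only standard way an application requests a LOD bias is the EXT_texture_lod_bias extension (where the driver exposes it), and the driver can clamp or discard whatever is asked for. A minimal sketch:

    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Requires EXT_texture_lod_bias in the extension string.
       Negative = sharper (more aliasing), positive = blurrier. */
    void set_lod_bias(GLfloat bias)
    {
        glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT,
                  GL_TEXTURE_LOD_BIAS_EXT, bias);
    }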
 
I just played UT (OpenGL) and noticed that none of the problems occur there (smooth, unbanded alpha textures and no abrupt cut-off), so is this a game-specific problem, or is it just because UT is a dark game? :-?
 