More ATI Driver News from Derek Smart

Status
Not open for further replies.

antlers

Regular
I'm hoping one of our many valued ATI driver devs here can contradict his assertions about the drivers for 9000 and 9700.

1. Multi-texturing is broken in the current drivers (he gives screenshots as an example of a quad-textured scene where only the first texture is visible)--actually, I'm perfectly willing to accept that this might be his mistake.

2. No w-buffer support, and none is ever likely to be added

3. Z-buffer resolution capped at 24 bits.

The combination of 2 and 3 is most distressing, since it seems there will be no solution for games (like flight sims) that might rely on high-precision Z-buffers.
 
antlers4 said:
1. Multi-texturing is broken in the current drivers (he gives screenshots as an example of a quad-textured scene where only the first texture is visible)--actually, I'm perfectly willing to accept that this might be his mistake.
That'd be a reasonable assumption. If multitexturing were broken, wouldn't you expect problems on the 3DMark multitexture test, which uses eight (8) textures at once? Also, I know of other games using multitexturing that work fine.
2. No w-buffer support, and none is ever likely to be added
Rightly so.
3. Z-buffer resolution capped at 24 bits.

The combination of 2 and 3 is most distressing, since it seems there will be no solution for games (like flight sims) that might rely on high-precision Z-buffers.
Show me it's a problem. Things that "might" be a problem don't concern me: I want real problems.
 
OpenGL guy said:
If multitexturing were broken, wouldn't you expect problems on the 3DMark multitexture test, which uses eight (8) textures at once? Also, I know of other games using multitexturing that work fine.
While multi-texturing in general might not be broken, obviously the path that Derek has been using, which works fine with other cards, is broken. I suppose it might be a path that was never guaranteed to work anyway. It would be nice if you could point to some ambiguity in the DirectX spec, as you did with the Z bias issue, that would explain this.

OpenGL guy said:
Show me it's a problem. Things that "might" be a problem don't concern me: I want real problems.

OK, a game I play a lot, WW2OL, has a draw distance out to several kilometers. A 16-bit Z buffer causes obvious Z errors. A 24-bit Z buffer would have Z errors of about 1 meter at maximum draw distance, which would cause noticeable artifacts.

Note: I calculate Z buffer error (with a formula I laboriously worked out) as

e = p * z^2 / (c - p * z)

where p = Z buffer precision step (1 / 2^bits), z = draw distance, c = distance to the near clipping plane

Imagine a Space Sim where the draw distances are even larger--it would be worse.

You could get rid of the artifacts by supporting a W buffer or a 32 bit Z buffer.
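To put numbers on that formula, here's a quick Python sketch. The 1 m near plane and 5 km draw distance are illustrative guesses, not figures from WW2OL, and the finite-far-plane form is my own rearrangement of the standard perspective depth mapping:

```python
import math

def z_error(z, near, bits, far=float("inf")):
    """Worst-case world-space error of one depth quantization step at
    view distance z.  With the far plane at infinity this reduces to
    the formula above: e = p * z^2 / (near - p * z), p = 2**-bits."""
    p = 2.0 ** -bits
    if math.isinf(far):
        return p * z * z / (near - p * z)
    # General form for a finite far plane:
    return (far - near) * p * z * z / (far * near - p * z * (far - near))

# Assumed numbers: 1 m near plane, 5 km maximum draw distance.
for bits in (16, 24, 32):
    print(f"{bits}-bit Z: ~{z_error(5000.0, 1.0, bits):.3g} m error at 5 km")
```

With these assumptions a 16-bit buffer is off by hundreds of meters at maximum range, 24-bit by roughly a meter (matching the estimate above), and 32-bit by a few millimeters.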
 
antlers4 said:
OK, a game I play a lot, WW2OL, has a draw distance out to several kilometers. A 16-bit Z buffer causes obvious Z errors. A 24-bit Z buffer would have Z errors of about 1 meter at maximum draw distance, which would cause noticeable artifacts.

I think the point is: fire up the game and see if you see any errors. The way I took it is he's saying 32 bit Z-buffer is working.

As far as the W-buffer, do any Nvidia cards even support it? It seems to me that the W-buffer on the R8500 caused only problems. The driver exposes it, but it's generally left disabled, seemingly for the better.
 
Draw distance doesn't matter to z-precision. It's all in the near clipping plane distance. 24bit z is plenty, and it's not unique to ATI. NVidia only supports 24z/8s as well.

W-buffering? Who the hell still uses that? I thought it was removed from DX?
 
fresh said:
Draw distance doesn't matter to z-precision. It's all in the near clipping plane distance.

Specifically, it's related to the ratio of the far clipping plane distance to the near clipping plane distance.

And you can't just increase the near clipping plane all you want, either...too far, and things just won't look right.
 
Exactly. With a game like WW2OL, which supports a range of draw distances from a few centimeters to a few kilometers, lack of Z precision is a problem.

I didn't know that other cards had the same 24 bit limitation on Z buffering, so that shouldn't be a problem with the 9700 relative to the others.

WW2OL gives you the option of specifying W buffering instead of Z buffering if your card supports it in order to avoid artifacts with distant objects. Other cards do support W buffering.

I imagine this could be a problem in any flight simulator type game.
 
Yes, pretty much every consumer video card supports precision no higher than 24 bits in the z-buffer. The only card that I'm aware of currently that supports 32-bit precision is the 3DLabs P10.

32-bit precision for the z-buffer is one of the things on my wishlist for the next video card that I purchase.

Update:

24-bit z-buffer doesn't seem to me to be much of an issue for flight sims. In particular, I don't see much reason why you need to display 3D geometry very close to the viewpoint in flight sims, meaning you can have a significantly further near clipping plane than with an FPS.
 
1. Multi-texturing is broken in the current drivers (he gives screenshots as an example of a quad-textured scene where only the first texture is visible)

Almost certainly him screwing up. Chances are that if there were a driver bug here, the result wouldn't be something as clean as only the first layer being visible, but something a lot worse :) I encountered one such bug on a quite old Detonator driver where half the world was rendered correctly and the other half had all its textures heavily distorted. The problem in that case was me asking for more than the hardware allowed; the driver was validating it anyway and just rendering it really badly. The newer drivers made it fail validation correctly :)

The likely cases are: a) him intentionally screwing up to try and make ATI look bad, or b) him unintentionally screwing up, making him look not so bright.

2. No w-buffer support, and none is ever likely to be added
w-buffering is like z-bias (and EMBM): it never really worked correctly (it worked in some scenarios, fell apart in others, didn't work at all on some drivers/cards, etc.) and was never used by anyone who actually knew what they were doing, hence its dying off. (Just a little nudge at the z-bias issue: it's well known that you should never rely on z-bias working on ANY card :p There is a paper floating around about how bad it really is, and how just modifying the near clip plane achieves much better, predictable results.)

3. Z-buffer resolution capped at 24 bits.

Again, not a problem. In most games coming out you are going to want an 8-bit stencil to go with your depth buffer, which rules out 32-bit depth. 24-8 is the best combo for a 32-bit depth/stencil format. 24 bits of precision is plenty as long as you push the near clip plane out far enough. As many others have said, this is the same on nVidia's cards as well (though nVidia's drivers seem to randomly allow/disallow 32-bit depth with each new release... ran into a problem there a while ago).

Imagine a Space Sim where the draw distances are even larger--it would be worse.
You could put the far clip plane at infinity (see nVidia's infinite shadow volume paper) and still get good precision for everything that's visible (read: takes up a pixel or more on the screen). Near clip plane is the all important factor :)
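A sketch of that infinite-far-plane trick, using the standard left-handed D3D-style depth mapping d = Q * (1 - near/z) with Q = far/(far - near); the helper names are mine, and this isn't code from any game discussed here. As far goes to infinity, Q simply goes to 1, and every finite z still maps into [0, 1):

```python
def d3d_depth(z, near, far):
    """Normalized depth written by a left-handed D3D-style projection:
    d = Q * (1 - near / z), with Q = far / (far - near)."""
    q = far / (far - near)
    return q * (1.0 - near / z)

def d3d_depth_infinite(z, near):
    """The same mapping in the limit far -> infinity (Q -> 1)."""
    return 1.0 - near / z

# With the far plane "at infinity", finite depths never reach 1.0:
for z in (1.0, 100.0, 1e6):
    print(z, d3d_depth_infinite(z, near=1.0))
```

Since d depends only on near/z, everything closer than the point where it takes up a pixel still gets distinct depth values; the far plane really doesn't matter.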
 
Ilfirin said:
Again, not a problem. In most games coming out you are going to want an 8-bit stencil to go with your depth buffer, which rules out 32-bit depth. 24-8 is the best combo for a 32-bit depth/stencil format. 24 bits of precision is plenty as long as you push the near clip plane out far enough. As many others have said, this is the same on nVidia's cards as well (though nVidia's drivers seem to randomly allow/disallow 32-bit depth with each new release... ran into a problem there a while ago).

I really don't see why you need to pack the stencil data with the z-buffer. I really hope that some game-oriented hardware coming out in the near future supports a full 32-bit z-buffer, in addition to an 8-bit stencil.
 
Chalnoth said:
Ilfirin said:
Again, not a problem. In most games coming out you are going to want an 8-bit stencil to go with your depth buffer, which rules out 32-bit depth. 24-8 is the best combo for a 32-bit depth/stencil format. 24 bits of precision is plenty as long as you push the near clip plane out far enough. As many others have said, this is the same on nVidia's cards as well (though nVidia's drivers seem to randomly allow/disallow 32-bit depth with each new release... ran into a problem there a while ago).

I really don't see why you need to pack the stencil data with the z-buffer. I really hope that some game-oriented hardware coming out in the near future supports a full 32-bit z-buffer, in addition to an 8-bit stencil.

I'm not a hardware guy, so I don't know the reasoning behind it, but DirectX requires this. The DepthStencil surface is 32-bit; how those bits are used is decided by the format (32D, 24D-8S, 24D-8X, 16D-16X, etc.). It always seemed to me a bit of a hacked-up way of doing things, but I'm sure there was reasoning behind it. I too expect this to change, though.
 
Well, the obvious reason is simply memory bandwidth efficiency. The fewer different places in memory you need to access, the better your performance will be.

For this reason, it may be possible that we'll see, instead of 32-bit z-buffer and a separate 8-bit stencil, simply a 64-bit packed format (32-bit z/32-bit stencil, 48-bit z/16-bit stencil, etc.). I don't really know, though. I figure it's up to hardware developers to optimize, but greater than 24-bit z accuracy is most certainly going to be needed.
 
The problem with WWIIOL (and, I imagine, Battlecruiser :) ) is that it is a flight sim seamlessly attached to an FPS. For the FPS part to work right, the near clip plane has to be decently close (maybe 20 cm). To get the long draw distances for the flight sim (and for the FPS when you are on a hill...), you need the far clip plane some kilometers away. A 24-bit W buffer would have plenty of precision, but a 24-bit Z buffer lacks precision for distant objects: you can get Z artifacts.
 
A W-buffer does NOT have more accuracy than a z-buffer. The deficiencies are just sort of moved around.

Regardless, why should the game need to set the same near/far clipping planes for the different parts of the game? If the programmers haven't done this already, they're obviously not the best in the world...
 
Well, the programmers aren't the best in the world, I think that's pretty well known :rolleyes:

However, there aren't really different parts of the game that you can set different clipping planes for. It's an MMOG. While you are sitting in a plane, a soldier could walk up and shoot you with a pistol. When you are a soldier, a plane 3 km up or an artillery piece 3 km away can shoot you.

A W buffer has less precision up close, but more precision farther away. If the precision up close is superfluous but you need precision farther away, a W buffer is preferable.

With WW2OL's draw distances, a W buffer could have Z errors of < 1 mm throughout. A Z buffer would have essentially no Z error up close, but Z errors of about a meter (or more) at range. Z buffer Z errors will cause artifacting on distant objects.
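The contrast is easy to sketch: a W buffer stores eye-space depth linearly over [near, far], so its quantization step is the same size everywhere, while the Z buffer's step grows roughly with z². The 20 cm near plane and 5 km far plane below are taken from the WWIIOL description earlier in the thread; the exact Z-error expression is my rearrangement of the standard depth mapping:

```python
def w_buffer_error(near, far, bits):
    """W buffer: depth stored linearly over [near, far], so one
    quantization step is the same size at every distance."""
    return (far - near) / 2.0 ** bits

def z_buffer_error(z, near, far, bits):
    """Z buffer: world-space size of one quantization step at view
    distance z (grows roughly with z^2)."""
    p = 2.0 ** -bits
    return (far - near) * p * z * z / (far * near - p * z * (far - near))

# Assumed setup from the thread: 20 cm near, 5 km far, 24-bit buffers.
print(w_buffer_error(0.2, 5000.0, 24))          # sub-millimeter, everywhere
print(z_buffer_error(0.2, 0.2, 5000.0, 24))     # negligible up close...
print(z_buffer_error(5000.0, 0.2, 5000.0, 24))  # ...meters at max range
```

With these numbers the W buffer's error stays under a millimeter throughout, as the post above says, while the Z buffer's error at maximum range is several meters.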
 
It should still be possible to transparently change the near distance at, say, takeoff. When walking around on foot, I doubt you'll often be looking at things that far away...

But, I haven't played the game, so I don't really know for sure how well drawing in the far clipping plane would mesh with fighting on foot (it could be put further out again when piloting a tank, for example).
 
Stop using meters, kilometers and all that - please! :)

Scale is dependent on a game, some games might have 1 unit to be 1km, some might have 1 unit be 12.7ASDFs (ie: no relation to real-world units), some might have 1 unit be 1cm.. it changes with every game.

Setting the near plane 1 unit away is the [bare] minimum in any game; the further out you can get away with pushing it, the better. Setting it to 1 unit with the far plane at infinity usually still leaves plenty of precision... the far plane isn't what matters :)
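The point about units can be made concrete: divide the error formula from earlier in the thread by z (with the far plane at infinity) and the units cancel, leaving a relative error that depends only on the ratio z/near. A small sketch, with the helper name my own:

```python
def relative_z_error(z_over_near, bits):
    """e / z with the far plane at infinity: p*r / (1 - p*r), where
    r = z / near.  Only the ratio matters, so whether a unit is a
    meter, a kilometer, or 12.7 ASDFs makes no difference."""
    p = 2.0 ** -bits
    r = z_over_near
    return p * r / (1.0 - p * r)

# 24-bit buffer, farthest point 5000 near-plane distances away:
print(relative_z_error(5000, 24))   # ~0.0003, i.e. 0.03% of the distance
```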
 
....LET'S get ready to ruuumble !!

:LOL:
 