MAX depth-buffer supported by current card?

DOOM III

I know I can look for the answers in the specs, but laziness makes me ask here :LOL:

Please tell me the max depth buffer supported by the GF3/GF4 and the 9700/9500.
 
I assume you mean depth buffer precision (bits per depth value).

Okay, not an authoritative answer. I only have the GF3 at hand, and its caps show it supports 24-bit depth. I also know that ATI cards have traditionally (since Rage 128 days) had support for 32-bit depth. It's not enabled as standard, but can be enabled from the control panel. I'd assume that's the case with the 9500/9700, but I won't be able to check until Monday. I don't know about the GF4.
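
For what it's worth, here is a minimal DX9 sketch of that kind of caps check. The d3d pointer and the X8R8G8B8 display format are my assumptions, not anything from the posts above:

Code:
    // Ask the driver whether it exposes a given depth/stencil format.
    // Assumes 'd3d' is a valid IDirect3D9* (e.g. from Direct3DCreate9).
    #include <d3d9.h>

    bool DepthFormatSupported(IDirect3D9* d3d, D3DFORMAT depthFmt)
    {
        return SUCCEEDED(d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8,         // adapter (display) format
            D3DUSAGE_DEPTHSTENCIL,   // we want a depth/stencil surface
            D3DRTYPE_SURFACE,
            depthFmt));
    }

DepthFormatSupported(d3d, D3DFMT_D24S8) should come back true on all the cards mentioned; DepthFormatSupported(d3d, D3DFMT_D32) is what the 32-bit question hinges on.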
 
The maximum Z-buffer depth for these cards is 24-bit. The maximum resolution of the depth buffer is, of course, identical to the maximum framebuffer resolution. I guess these chips support a maximum resolution of 2048x1536, and as long as you don't hit the memory limit, you can still enable AA.
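
To put rough numbers on that memory limit (my assumptions: 32-bit colour, a 24+8 depth/stencil buffer, double buffering): at 2048x1536, each buffer is 2048 x 1536 x 4 bytes, about 12 MB, so front + back + Z is already around 36 MB. With 4x multisampling the back and Z buffers grow fourfold, giving roughly 12 + 48 + 48 = 108 MB, which is why AA at the top resolution won't fit on a 64 MB card but can still fit on a 128 MB one.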
 
Don't forget integer versus float depth buffers, which have different advantages/disadvantages.

The ATI 9700 Pro reports a 24-bit integer depth buffer under DX9; I believe the GeForce FX also supports a 24-bit float depth buffer. The ATI 9700 can also (using a binding that is slightly out of DX/OpenGL spec) support rendering to and from a 32-bit float buffer, which could use a pixel/fragment program to emulate a 24-bit float depth buffer (but losing any depth acceleration hardware).
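
As a sketch of what the emulation path looks like on the DX9 side (the device, width, and height names are mine, and the shader doing the actual depth compare is omitted):

Code:
    // Create a single-channel 32-bit float render target to hold depth.
    // Assumes 'device' is a valid IDirect3DDevice9*.
    IDirect3DTexture9* fpDepthTex = 0;
    HRESULT hr = device->CreateTexture(
        width, height, 1,
        D3DUSAGE_RENDERTARGET,   // depth values get rendered into it...
        D3DFMT_R32F,             // ...as 32-bit floats
        D3DPOOL_DEFAULT,
        &fpDepthTex, 0);

Later passes would bind this texture and have the pixel shader compare each fragment's depth against the stored value by hand, which is exactly why all the hierarchical-Z/early-Z hardware is lost.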

Resolution should be any render-target size, which should be 2048x2048 on ATI. Yep, I just edited the code I'm playing with to check, and it supports a 2048x2048 D24S8 surface and can use it perfectly.
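
A minimal sketch of that kind of check under DX9 (the device pointer is assumed):

Code:
    // Try to create a 2048x2048 D24S8 depth/stencil surface.
    // Assumes 'device' is a valid IDirect3DDevice9*.
    IDirect3DSurface9* ds = 0;
    HRESULT hr = device->CreateDepthStencilSurface(
        2048, 2048,
        D3DFMT_D24S8,            // 24-bit depth + 8-bit stencil
        D3DMULTISAMPLE_NONE, 0,
        TRUE,                    // Discard flag
        &ds, 0);
    // On success the surface can be set with SetDepthStencilSurface()
    // and rendered to like any other Z buffer.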
 
So no FP depth buffer without using shaders? That's a little disappointing. I thought it would come naturally with the 128-bit frame buffer.
 
antlers4 said:
So no FP depth buffer without using shaders? That's a little disappointing. I thought it would come naturally with the 128-bit frame buffer.
Why? The FP render targets are only useful for storing intermediate results. What do they have to do with depth buffering?
 
antlers4 said:
So no FP depth buffer without using shaders? That's a little disappointing. I thought it would come naturally with the 128-bit frame buffer.

If the card supports floating-point depth natively, it will work without shaders. But if not, you may be able to emulate it via shaders (though that involves binding the texture and the render target at the same time, which is 'technically' out of spec for both GL and DX).
 
antlers4 said:
So no FP depth buffer without using shaders? That's a little disappointing.

The question, IMO, is rather: just when is any FP buffer useful without the use of shaders (PS 2.0 and up, of course)?
 
You can use a 1-Z floating-point depth buffer using just VS 1.1, which can be handy (it has many of the same properties as a W-buffer).
 
DeanoC said:
You can use a 1-Z floating-point depth buffer using just VS 1.1, which can be handy (it has many of the same properties as a W-buffer).
It's only 1-Z because "someone" decided to put a subtract into the 'standard' projected-depth calculation. It would have been cheaper and more accurate to leave it out in the first place!
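
To make the subtract concrete: in a standard D3D-style projection, the post-divide depth is z' = f/(f-n) - (f*n)/((f-n)*z), and the second term is the subtract in question. The 1-Z ("reversed") mapping falls out of simply swapping the near and far planes, which needs nothing beyond a different projection matrix fed through an ordinary VS 1.1 transform. A rough C++ illustration (my own sketch, not anyone's actual code):

Code:
    // Fill in the depth-related entries of a row-major, left-handed,
    // D3D-style projection matrix (clip-space z in [0, 1]).
    // Standard:  z' = f/(f-n) - (f*n)/((f-n)*z)  -> near maps to 0, far to 1
    // Reversed:  swap n and f                    -> near maps to 1, far to 0
    void SetProjectionZ(float m[4][4], float n, float f, bool reversed)
    {
        float zn = reversed ? f : n;       // swap planes for the 1-Z mapping
        float zf = reversed ? n : f;
        m[2][2] = zf / (zf - zn);          // scale term
        m[3][2] = -zn * zf / (zf - zn);    // the subtracted bias term
        m[2][3] = 1.0f;                    // copy view-space z into w
    }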
 
Just checked the 9500. It doesn't have the control panel option that the 7500, for example, has to allow a 32-bit Z buffer.
 
Regardless of whether they have a check-button to enable 32-bit Z-buffer support, both cards only support a 24-bit Z buffer (24-bit depth + 8-bit stencil, I believe).
 
Tagrineth, can you tell me where it is?

antlers4, why do you say that? AFAIK, at least for the Radeon 7500, it does have a true 32-bit Z buffer. ATI has always (that is, since Rage 128 days) had very flexible Z-buffer support (16-bit, 24-bit, 24-bit + 8-bit stencil, 32-bit, freely mixed with 16-bit and 32-bit frame buffers). NVIDIA, on the other hand, had either 16-bit display + 16-bit Z or 32-bit display + 24-bit Z + 8-bit stencil, until the GeForce3, which still has only these two Z-buffer modes but allows mixing the display and Z depths.
 
Xmas said:
ET said:
antlers4, why do you say that?
Because it is true. The R300 does not support a 32-bit Z buffer.

The R300 isn't "both cards." As I said, I haven't seen the 32-bit switch on a 9500 Pro, so I'm inclined to believe it doesn't support 32-bit Z unless Tagrineth shows me where that switch is. But I'm also inclined to believe that the Radeon 7500 does have 32-bit Z support.
 