# draw distance/fog problem

Discussion in 'Beginners Zone' started by ballsweat, Jul 8, 2007.

1. ### ballsweat Banned

Joined:
Jul 8, 2007
Messages:
71
I might sound completely clueless in this post, but that's why I'm asking people who know a great deal more about this than I do.

Anyways:

1. If the w-buffer were to make a comeback (it won't, unfortunately; this is just a "what-if"), could it be mixed with the z-buffer in any possible way? I know there are z-near and z-far clip planes, but could a z near clipping plane and a w far clipping plane be used simultaneously?

2. What could be done in future GPU architectures (if anything) to get equal precision both close up and far away, and also an effectively infinite draw distance (no fog and no z-fighting artifacts)?

3. Is the z-buffer capable of world projection, or can only the w-buffer do that?

#1
2. ### Simon F, Tea maker (Moderator, Veteran, Subscriber)

Joined:
Feb 8, 2002
Messages:
4,524
Location:
In the Island of Sodor, where the steam trains lie
By "w-buffer" do you mean where one does the per-pixel divide to compute a linear distance from the eye, or do you mean simply using the 1/w value to do depth comparisons?

The former is more expensive and, unless you absolutely have to have a certain absolute Z precision (i.e. using a fixed-point depth buffer), is probably of no real benefit. The latter, on the other hand, if coupled with a floating-point depth buffer, gives fixed relative precision, which is far more useful (IMHO).
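Since the thread is about precision, a small sketch may help (my own illustration, not part of the post above): a binary floating-point number has roughly constant *relative* spacing between neighbouring representable values, so storing 1/w in a floating-point depth buffer gives roughly the same fractional depth precision at any distance.

```python
import numpy as np

# Sketch: the relative spacing of float32 values is roughly constant
# (between 2^-24 and 2^-23), regardless of the magnitude being stored.
# So a float32 buffer holding 1/w resolves "1 part in ~8 million" at
# every distance, rather than a fixed number of metres.
for dist in (1.0, 1_000.0, 1_000_000.0):
    d = np.float32(1.0 / dist)                        # candidate 1/w depth value
    step = np.nextafter(d, np.float32(np.inf)) - d    # gap to next representable float32
    print(f"dist={dist:>10.0f}  1/w={d:.6e}  relative step={step / d:.3e}")
```

The exact relative step wobbles by a factor of two within each binade, but it never depends on whether the surface is 1 m or 1,000 km away.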

I don't understand what you are asking. You can set your clipping planes quite flexibly, but I don't see that you are going to achieve anything different in this instance.

Do you mean equal absolute precision (i.e. 1 mm precision at both 1 m and, say, 10^6 m), or do you mean equal relative precision (e.g. 1 part in 1 million at a 1 metre distance and again 1 part in 1 million at 10^6 m)? The former would require a large number of bits per pixel. The latter is simple to achieve with 1/w floating-point buffers. A certain PowerVR-based console, for example, supported this.
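To put numbers on the "large number of bits" remark, here is a back-of-envelope sketch (my own arithmetic, not from the post): a fixed-point buffer needs roughly log2(range / precision) bits to hold a given absolute precision everywhere.

```python
import math

# Hypothetical helper (my naming, not from the thread): bits a
# fixed-point depth buffer needs to resolve `precision_m` at every
# point across a depth range of `range_m`.
def bits_for_absolute(range_m: float, precision_m: float) -> int:
    return math.ceil(math.log2(range_m / precision_m))

print(bits_for_absolute(1e6, 1e-3))  # 1 mm over 10^6 m  -> 30 bits
print(bits_for_absolute(1e9, 1e-3))  # 1 mm over 10^9 m  -> 40 bits
```

So millimetre precision out to 10^6 m already needs a 30-bit integer buffer, and every extra factor of 1000 in range costs another ~10 bits.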

"World projection"? You'll have to explain, sorry.

#2
3. ### ballsweat Banned


Anyway, yes, I did mean absolute precision.

Also, by world projection I meant a huge scene visible from above with no fog.

So I have a few more questions.

1. I remember the Dreamcast had a unique tile-based rendering setup, and that many Dreamcast games had rather large draw distances without the need for fog. Why can't today's graphics cards do that? What do they do in place of 1/w depth comparisons? Does the Dreamcast have better potential for huge draw distances than today's hardware does? I'm worried that it might.

2. Anyway, what else besides the w-buffer could give absolute depth precision? The games I know of from several years ago that used the w-buffer have much longer fog-free draw distances than today's games, and than the upcoming DX10 games, which I know use the z-buffer and have to use lots of fog to hide z-fighting artifacts far away.

3. Could you mix absolute precision for one clip plane and relative precision for the other? That was kind of what I meant. It's kind of a dumb question, I know, but I'm really wondering, because I miss the absolute precision of w-buffer games.

Once again, thanks for answering. Please get to these questions when you have the time.

#3
4. ### Simon F, Tea maker (Moderator, Veteran, Subscriber)

No worries.
The question is: is absolute precision really what you want? Given that most things are displayed with a perspective projection and use LOD to adjust the detail of objects, I would think that relative precision is more useful.

From above? Do you mean like the view of a planet from orbit?

It may be that people are either using fixed-point depth buffers or, worse, floating-point ones but mapping near=0.0f and far=1.0f. The latter is the wrong way round to make effective use of a floating-point buffer.
I'm not sure that many systems actually did use a (true) w-buffer, since that requires a per-pixel divide at the Z-test stage... but I may be wrong. I've actually only seen the w-buffer mentioned in (IIRC) one of Jim Blinn's "Corner" articles.
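To illustrate the "wrong way round" point, here is a sketch (my own, with made-up near/far values, not from the post): with the usual near→0.0, far→1.0 mapping, two distant surfaces can collapse to the same float32 depth, while flipping the mapping so the far plane lands at 0.0 — where float32 spacing is finest — keeps them distinct.

```python
import numpy as np

# Standard hyperbolic (1/w-style) depth in [0, 1], optionally flipped
# so that far maps to 0.0 instead of 1.0. The maths is done in double
# precision and only the final stored value is rounded to float32,
# mimicking a float32 depth buffer.
def depth(z, near, far, flipped=False):
    d = (1.0 / near - 1.0 / z) / (1.0 / near - 1.0 / far)  # 0 at near, 1 at far
    return np.float32(1.0 - d) if flipped else np.float32(d)

near, far = 0.1, 100_000.0
a, b = 99_990.0, 100_000.0  # two distant surfaces, 10 m apart

# Near 1.0 the float32 spacing (~6e-8) swallows the ~1e-10 difference,
# so both surfaces round to the same stored depth: z-fighting.
print(depth(a, near, far), depth(b, near, far))
# Near 0.0 the spacing is vastly finer, so they stay distinct.
print(depth(a, near, far, True), depth(b, near, far, True))
```

Real implementations get the flipped mapping from the projection matrix (plus a GREATER depth test) rather than a subtraction, but the precision argument is the same.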

Clip planes typically work on floating point data prior to projection to screen coordinates, so I'm not sure your question makes sense.

#4