Old 08-Jul-2007, 20:47   #1
ballsweat
Naughty Boy!
 
Join Date: Jul 2007
Posts: 71
Default draw distance/fog problem

I might sound completely clueless in this post, but that's why I'm asking people who know a great deal more about this than I do.

Anyways:

1. If the w-buffer were to make a comeback (it won't, unfortunately; this is just a "what-if"), could it be mixed with the z-buffer in any way? I know there are z-far and z-near clips, but could a z near clipping plane and a w far clipping plane be used simultaneously?

2. What could be done in future GPU architectures (if anything) to get equal precision both close up and far away, and also an effectively infinite draw distance (no fog and no z-fighting artifacts)?

3. Is the z-buffer capable of world projection, or can only the w-buffer do that?
Old 09-Jul-2007, 09:00   #2
Simon F
Tea maker
 
Join Date: Feb 2002
Location: In the Island of Sodor, where the steam trains lie
Posts: 4,447
Default

Quote:
Originally Posted by ballsweat View Post
I might sound completely clueless in this post, but that's why I'm asking people who know a great deal more about this than I do.

Anyways:

1. If the w-buffer were to make a comeback (it won't, unfortunately; this is just a "what-if")
By "w-buffer" do you mean where one does the per-pixel divide to compute a linear distance from the eye, or do you mean simply using the 1/w value to do depth comparisons?

The former is more expensive and, unless you absolutely have to have a certain absolute Z precision (i.e. using a fixed-point depth buffer), is probably of no real benefit. The latter, OTOH, if coupled with a floating-point Depth buffer, gives fixed relative precision, which is far more useful (IMHO).
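
Just to put some rough numbers on that difference, here's a quick sketch (plain C++; the near/far planes are made up, and I'm treating w as the eye-space depth z, which it is under the usual perspective projection):

Code:
// Illustrative only: near/far are made-up values and "w" is taken to be
// the eye-space depth z, as under the usual perspective projection.
#include <cmath>
#include <cstdio>

int main() {
    const double n = 0.1, f = 100000.0;           // hypothetical near/far planes (metres)
    const double lsb24 = 1.0 / ((1 << 24) - 1);   // one step of a 24-bit fixed-point buffer

    const double zs[] = {0.1, 1.0, 100.0, 100000.0};
    for (double z : zs) {
        // (a) true w-buffer: store (z - n)/(f - n) in a fixed-point buffer.
        //     One LSB spans the same eye-space distance everywhere: constant absolute precision.
        double step_abs = lsb24 * (f - n);

        // (b) store 1/w (= 1/z here) in a 32-bit float buffer and compare on that.
        //     One ULP of the stored value corresponds to an eye-space step proportional
        //     to z: constant relative precision, roughly 1 part in 2^23.
        float  d        = float(1.0 / z);
        double ulp      = d - std::nextafter(d, 0.0f);   // float spacing just below d
        double step_rel = ulp * z * z;                    // |dz| = |d(1/z)| * z^2

        printf("z = %9.1f m: fixed-point w step %.3g m, float 1/w step %.3g m (1 part in %.0f)\n",
               z, step_abs, step_rel, z / step_rel);
    }
    return 0;
}

The fixed-point "true w" column gives the same step everywhere, while the float 1/w column gives a step that scales with distance, i.e. constant relative precision.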

Quote:
could it be mixed with the z-buffer in any way? I know there are z-far and z-near clips, but could a z near clipping plane and a w far clipping plane be used simultaneously?
I don't understand what you are asking. You can set your clipping planes quite flexibly, but I don't see that you are going to achieve anything different in this instance.


Quote:
2. What could be done in future GPU architectures (if anything) to get equal precision both close up and far away, and also an effectively infinite draw distance (no fog and no z-fighting artifacts)?
Do you mean equal absolute precision (i.e. 1 mm precision at both 1 m and, say, 10^6 m), or do you mean equal relative precision (e.g. 1 part in 1 million at a 1 metre distance, and again 1 part in 1 million at 10^6 m)? The former would require a large number of bits per pixel. The latter is simple to achieve with 1/w floating-point buffers. A certain PowerVR-based console, for example, supported this.
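
For concreteness, the arithmetic behind those two options (the distances and the "1 part in 1 million" target are the ones from the post above; the 23-bit mantissa is that of a standard 32-bit float):

Code:
// Back-of-the-envelope: the distances/precisions are the ones from the question,
// the 23-bit mantissa is that of a standard 32-bit float.
#include <cmath>
#include <cstdio>

int main() {
    // Equal absolute precision: 1 mm steps from 1 m out to 10^6 m.
    const double range_m = 1.0e6 - 1.0;
    const double step_m  = 1.0e-3;
    printf("bits needed for 1 mm steps over 10^6 m: %.1f\n",
           std::log2(range_m / step_m));          // ~30 bits of fixed point, per pixel

    // Equal relative precision: 1 part in 1e6 at every distance.
    // A float buffer holding 1/w has a ~23-bit mantissa, i.e. about 1 part in
    // 8.4 million at any distance, so it already clears that bar.
    printf("relative precision of a float 1/w buffer: 1 part in %.0f\n", std::exp2(23.0));
    return 0;
}

So equal absolute precision over that range wants roughly a 30-bit fixed-point buffer, whereas a float 1/w buffer gives better than 1 part in a million at every distance with the usual 32 bits.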

Quote:
3. Is the z-buffer capable of world projection, or can only the w-buffer do that?
"World projection"? You'll have to explain, sorry.
__________________
"Your work is both good and original. Unfortunately the part that is good is not original and the part that is original is not good." -(attributed to) Samuel Johnson

"I invented the term Object-Oriented, and I can tell you I did not have C++ in mind." Alan Kay
Old 10-Jul-2007, 00:13   #3
ballsweat
Naughty Boy!
 
Join Date: Jul 2007
Posts: 71
Default

Thanks for replying =)

I'm a beginner, though, so I really don't know a whole lot about this.

Anyways, yes, I did mean absolute precision.

Also, by world projection I meant a huge scene visible from above with no fog.

So I have a few more questions.

1. The Dreamcast, as I remember, had a unique tile-based rendering setup, and I also remember that many Dreamcast games had some rather large draw distances without the need for fog. Why can't today's graphics cards do that? What do they do in place of 1/w depth comparisons? Does the Dreamcast have better potential for huge draw distances than today's hardware does? I'm worried that it might.

2. Anyways, what else besides the w-buffer could give absolute depth precision? The games I know of from several years ago that used the w-buffer have much longer draw distances without any fog than the games of today, and also than the upcoming DX10 games, which I know use the z-buffer and have to use lots of fog to prevent z-fighting artifacts far away.

3. Could you mix absolute precision for one clip plane and relative precision for the other? That was kind of what I meant. It's probably a dumb question, I know, but I'm really wondering, because I miss the absolute precision of w-buffer games.

Once again, thanks for answering. Please reply to these questions when you have the time.
Old 10-Jul-2007, 13:53   #4
Simon F
Tea maker
 
Join Date: Feb 2002
Location: In the Island of Sodor, where the steam trains lie
Posts: 4,447
Default

Quote:
Originally Posted by ballsweat View Post
Thanks for replying =)
No worries.
Quote:
Anyways, yes, I did mean absolute precision.
The question is, "Is absolute precision really what you want?" Given that most things are displayed with a perspective projection and use LOD to adjust the detail of objects, I would think that relative precision is more useful.

Quote:
Also, by world projection I meant a huge scene visible from above with no fog.
From above? Do you mean like the view of a planet from orbit?

Quote:
So I have a few more questions.

1. ...I also remember that many Dreamcast games had some rather large draw distances without the need for fog. Why can't today's graphics cards do that? What do they do in place of 1/w depth comparisons? Does the Dreamcast have better potential for huge draw distances than today's hardware does? I'm worried that it might.
It may be that people are either using fixed-point Depth buffers or, worse, floating-point but putting near=0.0f and far=1.0f. The latter is the wrong way to make effective use of a floating-point buffer.
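
To illustrate what I mean (nothing below describes any particular game or API, just the arithmetic with made-up planes): mapping the near plane to 1.0 and the far plane to 0.0 puts the small, densely spaced float values where 1/z bunches up, which is what makes a floating-point buffer worth having.

Code:
// Illustrative only: planes and distances are made up. It compares the
// standard float mapping (near -> 0, far -> 1) with the reversed one
// (near -> 1, far -> 0) for the same projection.
#include <cmath>
#include <cstdio>

int main() {
    const double n = 0.1, f = 100000.0;           // hypothetical near/far planes (metres)

    const double zs[] = {10.0, 1000.0, 99000.0};
    for (double z : zs) {
        // both mappings have the same |slope| with respect to eye-space z
        double dd_dz = (f * n) / ((f - n) * z * z);

        // near -> 0, far -> 1: distant depths land near 1.0, where float spacing is coarsest
        float  d_std    = float(f * (z - n) / (z * (f - n)));
        double step_std = (std::nextafter(d_std, 2.0f) - d_std) / dd_dz;

        // near -> 1, far -> 0: distant depths land near 0.0, where float spacing is finest
        float  d_rev    = float(n * (f - z) / (z * (f - n)));
        double step_rev = (std::nextafter(d_rev, 1.0f) - d_rev) / dd_dz;

        printf("z = %8.0f m: near=0/far=1 step %.3g m, near=1/far=0 step %.3g m\n",
               z, step_std, step_rev);
    }
    return 0;
}

In practice that reversed mapping also means clearing the depth buffer to 0 and flipping the depth test to "greater than or equal".
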
Quote:
2. Anyways, what else besides the w-buffer could give absolute depth precision? The games I know of from several years ago that used the w-buffer have much longer draw distances without any fog than the games of today, and also than the upcoming DX10 games, which I know use the z-buffer and have to use lots of fog to prevent z-fighting artifacts far away.
I'm not sure that many systems actually did use a (true) w-buffer, since that requires a per-pixel divide at the Z-test stage... but I may be wrong. I've actually only seen the w-buffer mentioned in (IIRC) one of Jim Blinn's "Corner" articles.
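
For what it's worth, the reason for that per-pixel divide: across a triangle only 1/w interpolates linearly in screen space, so recovering the linear eye-space distance at each pixel means inverting the interpolated value. A toy example along one edge of a triangle (the 2 m and 10 m vertex depths are made up):

Code:
// Toy example along one edge of a triangle whose endpoints sit at eye-space
// depths of 2 m and 10 m. Only 1/w interpolates linearly in screen space, so a
// true w-buffer has to invert the interpolated value at every pixel.
#include <cstdio>

int main() {
    const double w0 = 2.0, w1 = 10.0;               // hypothetical vertex depths (metres)

    for (int i = 0; i <= 4; ++i) {
        double t       = i / 4.0;                    // screen-space interpolation factor
        double inv_w   = (1.0 - t) / w0 + t / w1;    // what the rasteriser can interpolate
        double true_w  = 1.0 / inv_w;                // per-pixel divide -> true w-buffer value
        double naive_w = (1.0 - t) * w0 + t * w1;    // linear in w: perspectively incorrect
        printf("t=%.2f  correct w=%.3f  naive linear w=%.3f\n", t, true_w, naive_w);
    }
    return 0;
}

Comparing directly on the interpolated 1/w (with the comparison direction flipped) skips that divide, which is why it's the cheaper variant I mentioned earlier.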

Quote:
3. Could you mix absolute precision for one clip plane and relative precision for the other? That was kind of what I meant.
Clip planes typically work on floating point data prior to projection to screen coordinates, so I'm not sure your question makes sense.