3D aliasing

Just curious, so I thought I'd take the lazy way and posit the question before I really take the time to think it through...

When rendering to a 3D display device, such as shutter/polarized glasses or any other type, is there a new type of aliasing introduced in the depth direction? Or maybe a better way of asking the question is whether antialiasing the L/R 2D renders optimally takes care of the aliasing as perceived in 3D?

Or does something analogous to gamma-corrected AA happen, or need to happen, where "traditional" AA works but doesn't produce the optimal visual result? If it does, what kinds of tweaks could be made to the way we render to improve the perception of smooth lines along the Z axis, or in all axes for that matter? For instance, an alteration such that each L/R 2D render no longer has optimal-looking AA on its own, but when the two are visually combined the result is optimal. A sort of "overlapping of the eyes" approach?

Am I speaking complete nonsense?
 
I'm not really an expert on the topic, so my guess is that resolution (both screen and Z) plays the main role in avoiding the rendering artifacts that prevent the user from experiencing immersion. The need for Z resolution is obvious (one wants an object to be seen exactly where it should be). Screen resolution helps with edge aliasing and also reduces depth discontinuities (when neighbouring pixels have different depths).

Again, it's a wild guess, and the real problems may be somewhere else :)
 
The problem with aliasing is the appearance of certain types of undesired features (jaggies, popping, moiré patterns, etc.) in the picture. With stereoscopic displays, the images sent to the left and right eye will have these undesired features placed differently, in a manner that is very different from the depth-related differences the brain normally expects. This could presumably undermine the intended perception of depth, although I am not at all sure how big a problem it is in practice.

Other than that, the rendering is just that of two 2D images instead of one. AFAIK, the brain performs a fair bit of feature extraction before the two images are combined; if this is correct, any antialiasing method that cannot fully remove perceptible aliasing artifacts in each image considered separately will also fail to remove them in the combined image. As such, standard 2D antialiasing methods are likely both necessary and sufficient to get an antialiased stereoscopic image.
 
Well, that's interesting. I think it's a given that 2D antialiasing in some form would be needed; it's the "does it need to be different?" part that seems most interesting. Your first paragraph was a good way of stating what I was curious about: are there peculiarities of the way we build a 3D scene that might steer antialiasing in a certain direction, just as the way our eyes see brightness levels on a roughly logarithmic scale steered antialiasing towards gamma correction?

I can hardly imagine what direction that might be, but it seems an interesting question. I guess one possibility would be a different sample pattern for the L and R renders, so that when combined the perceived AA level is higher, or maybe the difference in camera angle already takes care of that effect? Would there be an optimal sample pattern that, for whatever reason, helps our brains assemble a smooth 3D image? Would that necessarily be the best 2D pattern? Would any alterations in the gamma correction be needed?
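To make the speculation concrete, here's a toy sketch of what complementary per-eye sample patterns could look like. This is purely hypothetical; the rotated-ring pattern, the 0.4-pixel radius, and the half-step phase offset are all invented for illustration.

```python
import math

# Hypothetical idea: give the left and right renders complementary jitter
# patterns so that, IF the visual system fused them, the combined set of
# sample positions would be denser than either eye's pattern alone.

def sample_pattern(n, phase):
    # n sample offsets on a ring inside a unit pixel, rotated by `phase`.
    return [(0.5 + 0.4 * math.cos(2 * math.pi * i / n + phase),
             0.5 + 0.4 * math.sin(2 * math.pi * i / n + phase))
            for i in range(n)]

left_samples = sample_pattern(4, 0.0)
right_samples = sample_pattern(4, math.pi / 4)  # rotated by half a step

# Together the two eyes' patterns cover 8 distinct positions per pixel.
combined = left_samples + right_samples
```

Whether the brain actually fuses sample positions this way is exactly the open question in the thread; the sketch only shows what "complementary patterns" would mean mechanically.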
 
I don't think you'd need anything more than good 2D antialiasing. It could be distracting if the number of samples is low, though, because quality differences as you see the same edge from two different angles may become obvious. The requirement to linearize colors before blending them together doesn't change.
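The linearization requirement mentioned above can be shown with a small sketch. The piecewise sRGB transfer function below is the standard one; the half-covered edge pixel is a contrived example.

```python
# Why AA samples should be averaged in linear light, not in sRGB space.

def srgb_to_linear(c):
    # Standard sRGB electro-optical transfer function.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Inverse transfer function.
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# An edge pixel half-covered by white (1.0) over black (0.0):
naive = (1.0 + 0.0) / 2  # averaged directly in sRGB space: 0.5, too dark
correct = linear_to_srgb((srgb_to_linear(1.0) + srgb_to_linear(0.0)) / 2)
# `correct` comes out around 0.735, which is perceptually halfway
```

Since both eyes' images go through the same display transfer function, this step is identical for stereo rendering; nothing about it is per-eye.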
 
Not that it has much to do with this topic, but thinking about this made me realize just how cool effects could be in 3 dimensions. Many games use a "warp" effect when an explosion happens - waves propagating out - and I was just thinking about how much more realistic that would be in the depth direction. Or when the screen "shakes" from an explosion... the L/R renders could be shifted several pixels relative to one another and "shaken" so that you got the effect of your eyes vibrating and losing the ability to focus... just like you really experience when your head is jarred around!

What else might be cool when you have depth to work with that is only "meh" in 2D?
 
Well, there's "aliasing" in the depth axis, mainly governed by depth-buffer precision, which is exacerbated if you have large draw distances. When this sort of thing becomes a problem, you'll see z-fighting, weird self-shadowing artifacts with shadow maps, and so on. However, none of these things are really a "new" wrinkle that appears as a result of using 3D or stereo display mechanisms. If you combat the problem in both of your stereo images and apply basic 2D AA to each, you're really just fine.
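As a rough sketch of why the far end of the depth range fights: with a standard perspective z/w depth buffer, the view-space distance covered by one depth code grows quadratically with distance. The near/far planes and the 24-bit buffer below are just assumed values for illustration.

```python
# How a standard perspective depth buffer spends its precision near the
# near plane, leaving little for distant geometry (hence z-fighting).

near, far = 0.1, 1000.0  # assumed clip planes

def ndc_depth(z):
    # Depth value in [0, 1] for a view-space distance z
    # (D3D-style z/w projection: 0 at the near plane, 1 at the far plane).
    return (far / (far - near)) * (1.0 - near / z)

eps = 1.0 / (2 ** 24)  # one code of an assumed 24-bit depth buffer

def resolvable_step(z):
    # d/dz of ndc_depth is (far * near / (far - near)) / z**2; inverting it
    # gives the view-space distance covered by one depth code at distance z.
    return eps * z * z * (far - near) / (far * near)

print(resolvable_step(1.0))    # tiny step close to the camera
print(resolvable_step(900.0))  # a far coarser step near the far plane
```

Tricks like pushing the near plane out, or reversed-Z with a floating-point buffer, attack exactly this quadratic growth.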

Now if you were talking about volume rendering, that's a different matter because there's actually a 3d raster grid, but I'm assuming that's not what you're talking about.

Not that it has much to do with this topic, but thinking about this made me realize just how cool effects could be in 3 dimensions. Many games use a "warp" effect when an explosion happens - waves propagating out - and I was just thinking about how much more realistic that would be in the depth direction.
Well, when you start getting into stereo-viewed warp effects and blur effects, you also have to start thinking about point of view, and about maintaining a lot of quantities in world space rather than view space. For instance, if you do a regular 2D screen-space warp on both of the two stereo images, the two eyes see an effect which is independent of the "3D-ness" of what they see, which can cause confusion or a loss of depth perception. But if the warp effect is based on the location of a 3D object in space and projected into each view accordingly, then you have an effect that doesn't confuse the screen for the scene.
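A minimal sketch of that idea, using a toy pinhole projection: project the effect's world-space centre into each eye separately, so its screen position carries the same disparity as the rest of the scene. The eye separation, focal length, and effect position are all made-up numbers.

```python
# Anchor an effect in world space by projecting its 3D centre into each
# eye's view, rather than warping both images at the same 2D position.

eye_sep = 0.065  # assumed interocular distance (world units)
focal = 1.0      # assumed focal length

def project(point, eye_x):
    # Toy pinhole camera at (eye_x, 0, 0) looking down +z, no toe-in.
    x, y, z = point
    return (focal * (x - eye_x) / z, focal * y / z)

explosion = (0.5, 0.0, 5.0)  # world-space effect centre

left = project(explosion, -eye_sep / 2)
right = project(explosion, +eye_sep / 2)

# The warp would be centred at `left` in the left image and `right` in the
# right image; the horizontal disparity between them is focal * eye_sep / z,
# exactly what a real object at that depth would have.
print(left, right)
```

Centring the screen-space warp at these two different positions is what keeps the effect attached to the scene instead of to the screen.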

I've seen some interesting things with games inside CAVE setups. For instance, a game that features the floor waving around, or even the bobbing up and down that old FPS games did, proves disorienting because your eyes are telling you something different from what gravity is telling you.
 