3D Glossary Update

Dave Baumann

Our 3D Glossary is woefully out of date and is feeling mighty rejected, so it's probably about time we got around to updating it. So, if you fancy helping out at all then please put down some of the recent terms that are missing, and if you want to suggest explanations as we go along then feel free to do so. Let's see if we can drag it into 2002/3 and the DX9/OpenGL2 era!

Cheers.
 
Some random terms (mix of old & new stuff; the glossary is OLD):

Double-buffering: The use of 2 buffers for rendering; at any given time, one buffer is being displayed on the screen while the other buffer is being rendered to. Whenever rendering to a buffer completes, that buffer is then shown on the screen, and the renderer starts rendering to the other buffer. This way, the screen will only show frames that are finished drawing, and never frames that are in the process of being drawn.
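
A rough C++ sketch of the flip logic, in case it helps the entry (renderFrame() and display() are made-up stand-ins, not any real API):

```cpp
#include <cstdint>
#include <utility>
#include <vector>

struct Buffer { std::vector<uint32_t> pixels = std::vector<uint32_t>(640 * 480); };

void renderFrame(Buffer&)   { /* draw the scene into this buffer */ }
void display(const Buffer&) { /* scan this buffer out to the screen */ }

int main() {
    Buffer buffers[2];
    int front = 0, back = 1;          // 'front' is shown, 'back' is drawn to
    for (int frame = 0; frame < 3; ++frame) {
        renderFrame(buffers[back]);   // render into the hidden buffer
        std::swap(front, back);       // flip: the finished frame becomes visible
        display(buffers[front]);      // the screen only ever shows complete frames
    }
}
```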

Alpha-test: For each pixel, an alpha value is normally computed. Alpha-test refers to the process of conditionally either drawing or rejecting each pixel based on a comparison between its alpha value and an alpha reference value. It is generally used to give the illusion of high geometric detail, particularly for trees and other plants.
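
For illustration, the per-pixel test boils down to something like this (names invented, not a real API):

```cpp
// A sketch of the comparison that alpha-test performs for each pixel.
struct RGBA { float r, g, b, a; };

// Returns true if the pixel survives. The comparison function is selectable
// state in real APIs (greater, less, equal, always, never, ...).
bool passesAlphaTest(const RGBA& pixel, float alphaRef) {
    return pixel.a > alphaRef;  // the pixel is rejected entirely when this fails
}
```

Rejected pixels are simply never written, which is how a leaf texture with transparent "holes" can cut a detailed silhouette out of a couple of big polygons.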

Multiple Render Targets: Certain renderers allow the data computed for one pixel to be written to multiple different buffers at the same time - these buffers are typically frame buffers and/or floating-point buffers with the same resolution as the frame buffer (Z and stencil buffers are not counted). The pixel is written to the same position in all the buffers, but the data written need not be the same for all buffers.
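
A rough C++ illustration of "same position, (potentially) different data" - the structures are invented for the example:

```cpp
#include <array>
#include <cstddef>
#include <vector>

struct RenderTarget {
    int width = 640, height = 480;
    std::vector<float> data = std::vector<float>(640 * 480 * 4);  // 4 floats per pixel
};

// One pixel's shading produces one RGBA value per target; every target is
// written at the same (x, y), but the data written can differ per target.
void writePixel(std::vector<RenderTarget>& targets, int x, int y,
                const std::vector<std::array<float, 4>>& outputs) {
    for (std::size_t i = 0; i < targets.size(); ++i) {   // outputs.size() == targets.size()
        float* dst = &targets[i].data[(y * targets[i].width + x) * 4];
        for (int c = 0; c < 4; ++c)
            dst[c] = outputs[i][c];
    }
}
```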

Frame buffer: A data buffer holding RGB color values for each pixel on the screen, for the purpose of being actually displayed on said screen.

Alternate Frame Rendering: A setup where two renderers alternate on drawing frames. First, one of them starts rendering one frame. Then after some time, the other one starts drawing the next frame. The first renderer will then complete its frame, display it, and start working on another frame. And so on. By alternating frames between two renderers this way, one gets about twice the frame rate that one would get with only 1 renderer, although the delays from one frame to the next tend to be somewhat uneven. The most well-known implementation of this technique is ATI's Rage Fury MAXX.
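
The dispatch logic itself is trivial - a toy C++ sketch, with renderFrameOn() as a made-up stand-in for kicking a frame off on one of the chips:

```cpp
#include <cstdio>

void renderFrameOn(int renderer, int frame) {
    std::printf("renderer %d starts frame %d\n", renderer, frame);
}

int main() {
    const int numRenderers = 2;
    for (int frame = 0; frame < 6; ++frame)
        renderFrameOn(frame % numRenderers, frame);  // 0,1,0,1,... so each chip
                                                     // gets ~2 frame-times per frame
}
```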
 
Some multisampling-related stuff (I've got too much spare time on my hands):

Multisample Buffer: Frame buffer capable of holding more than one set of RGB (+ often Alpha) values per pixel - each RGB(A) value set is then referred to as a "subpixel" or "sample". Normally used for Multisampling or Supersampling anti-aliasing techniques, but can be used for other effects as well if Multisample Masking is supported. When displayed on the screen, the average of the subpixel values is typically the color that ends up being displayed.

Multisample Masking: Let the subpixels for each pixel in a multisample buffer be numbered from 0 upwards (0, 1, 2, 3, etc.). Multisample Masking then refers to the capability to restrict rendering to subpixels with specific indices (e.g. only subpixel 2 in each pixel). Multisample Masking can then be used to achieve e.g. depth of field or motion blur effects. Multisample Masking is an important part of the "T-Buffer" functionality of 3dfx Voodoo5 cards.

Multisampling: Rendering technique that uses a multisample buffer to perform full-screen anti-aliasing. For each pixel, a single RGB(A) color is computed and then written to each subpixel that is deemed to be inside the area covered by the current polygon. The resulting effect is essentially that of a sort-independent edge anti-aliasing.
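
Since these three entries fit together, a toy C++ sketch of a 4-sample pixel may help: one computed colour, a coverage mask, an optional sample mask, and the averaging resolve (all names invented):

```cpp
#include <array>

struct Pixel4x { std::array<std::array<float, 3>, 4> samples{}; };  // 4 RGB subpixels

void shadePixel(Pixel4x& px, const float rgb[3],
                unsigned coverageMask,  // which subpixels the polygon covers
                unsigned sampleMask)    // multisample mask set by the application
{
    unsigned mask = coverageMask & sampleMask;
    for (int s = 0; s < 4; ++s)
        if (mask & (1u << s))                            // the single computed colour
            px.samples[s] = { rgb[0], rgb[1], rgb[2] };  // goes to every covered,
}                                                        // enabled sample

void resolve(const Pixel4x& px, float outRGB[3]) {  // done when displaying the frame
    for (int c = 0; c < 3; ++c) {
        float sum = 0.0f;
        for (int s = 0; s < 4; ++s) sum += px.samples[s][c];
        outRGB[c] = sum / 4.0f;  // the displayed colour is the subpixel average
    }
}
```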
 
arjan de lumens said:
Frame buffer: A data buffer holding RGB color values for each pixel on the screen, for the purpose of being actually displayed on said screen.
IIRC, the OpenGL definition of a frame buffer differs from this. I think it defines it as something like "the colour buffer, and other associated buffers (depth, stencil etc)". Too lazy to look it up =[
I find the 'standard' definition more useful, but OTOH 'colo(u)r buffer' is a better name.
Also, isn't "RGB color values" narrowing it a little too much? "Color values" would suffice.
 
IIRC the frame buffer can actually be the term for the entire pool of memory available locally to the card - so this covers all the extra buffers.

Anyway, thanks for the suggestions so far - keep 'em rolling in, as I will be doing some updates as we go along.

BTW - do you think the term 'loopback' should go in as a reference to being able to apply more textures per pass than the number of physical texture sampling units available to a pixel pipeline (as in GF3/4 with their 4 texture layers but only 2 texture units, and 9700 with 16 layers but only one sampling unit)? 'Loopback' has kinda been the commonly given term for the process.
 
Considering how many people have been way off base with DX9 TMU requirements, I believe that loopback should indeed be included. Seems not too many people know what it is or what it enables.

I haven't looked at the 3D Glossary in a long while... Perhaps "single pass" and "single cycle" should be included as well? Or is that something more fitting for a 3D FAQ?


--|BRiT|
 
Yes, reading the OpenGL standard, I find that the term "frame buffer" indeed covers the RGB(A) color buffer, Z-buffer and stencil buffer, as long as it is all per-pixel data (data areas used for vertices or textures would presumably not count as part of the "frame buffer"). The definition I gave is the definition OpenGL gives for "color buffer".

Also, a few issues that I noticed on present glossary entries:

AGP: The sideband is an address bus that can be used to pass memory addresses and read/write commands from the video card to the northbridge even during data transfers. It is entirely possible for the video card to read/write textures to/from system memory even if the sideband is missing.

Megapixel: 1 megapixel = 1,000,000 pixels, not 100,000.

Pipeline: the definition given looks rather confused.

Texture compression: The entry ends mid-sentence.

VRAM: looks like there is a formatting error there - the VRAM description appears in the middle of the volumetric lighting entry.
 
arjan de lumens said:
Also, a few issues that I noticed on present glossary entries:

AGP: The sideband is an address bus that can be used to pass memory addresses and read/write commands from the video card to the northbridge even during data transfers. It is entirely possible for the video card to read/write textures to/from system memory even if the sideband is missing.

Megapixel: 1 megapixel = 1,000,000 pixels, not 100,000.

Pipeline: the definition given looks rather confused.

Texture compression: The entry ends mid-sentence.

VRAM: looks like there is a formatting error there - the VRAM description appears in the middle of the volumetric lighting entry.

i'd like to add here the 'gouraud shading' entry - come on, we're all (more or less) tech-minded people here, that 'chicken wire' counter-analogy is not particularly, erm, educational. neither is it true - one can produce gouraud-quality output with flat shading only, it'd just take larger polycounts.
 
Some additional random terms:

Rasterization: The process of converting a primitive (line, polygon) from a geometric description to a sequence of pixels for further per-pixel processing later on. (? - want input on this one)
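
Since input was requested: one concrete way to picture it is the classic bounding-box/edge-function approach. A toy C++ sketch (coverage only - no shading, Z, or fragment-vs-pixel subtleties):

```cpp
#include <algorithm>
#include <cstdio>

struct Vec2 { float x, y; };

// Signed area test: > 0 when p is to the left of edge a->b (CCW winding assumed).
float edge(const Vec2& a, const Vec2& b, const Vec2& p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

void rasterizeTriangle(const Vec2& v0, const Vec2& v1, const Vec2& v2) {
    int minX = (int)std::min({v0.x, v1.x, v2.x}), maxX = (int)std::max({v0.x, v1.x, v2.x});
    int minY = (int)std::min({v0.y, v1.y, v2.y}), maxY = (int)std::max({v0.y, v1.y, v2.y});
    for (int y = minY; y <= maxY; ++y)
        for (int x = minX; x <= maxX; ++x) {
            Vec2 p = { x + 0.5f, y + 0.5f };  // sample at the pixel centre
            if (edge(v0, v1, p) >= 0 && edge(v1, v2, p) >= 0 && edge(v2, v0, p) >= 0)
                std::printf("pixel (%d, %d) is covered\n", x, y);  // emit for shading
        }
}

int main() {
    rasterizeTriangle({1, 1}, {8, 2}, {4, 7});  // prints the covered pixels
}
```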

Frame buffer (take 2): A data buffer that contains a set of data for each pixel on the screen - includes RGB color data and may include additional data, like Alpha, Z and Stencil, as well.
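
Purely as an illustration of this wider definition, the per-pixel contents could be pictured as a struct (the layout here is made up; real hardware keeps these in separate buffers/planes):

```cpp
#include <cstdint>

struct FramebufferPixel {
    uint8_t  r, g, b;       // colour (the "color buffer" part)
    uint8_t  a;             // optional destination alpha
    uint32_t depth   : 24;  // optional Z value
    uint32_t stencil : 8;   // optional stencil value
};
```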

Pixel Shader: A program that is run for each pixel that is rendered. Also used for a rendering system capable of running such programs. The pixel shading functionality normally appears as a replacement for multitexturing in the 3d pipeline. Under the DirectX API, version 8 and up, pixel shaders are given version numbers to indicate their exact capabilities.
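
As a made-up C++ analogy (real pixel shaders execute on the GPU, and sampleTexture() here is hypothetical):

```cpp
// A pixel shader is a small routine run once for every pixel that is rendered.
struct PixelInput { float u, v, r, g, b, a; };  // interpolated per-pixel inputs
struct Color      { float r, g, b, a; };

Color sampleTexture(float u, float v);  // hypothetical texture fetch, assumed elsewhere

// A trivial "shader": modulate the texture by the interpolated vertex colour,
// taking the place of the fixed-function multitexturing stage.
Color myPixelShader(const PixelInput& in) {
    Color t = sampleTexture(in.u, in.v);
    return { t.r * in.r, t.g * in.g, t.b * in.b, t.a * in.a };
}
```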

Triple-buffering: The use of 3 buffers for rendering. Triple buffering allows the frame being displayed to be swapped on V-sync only, while not stalling the renderer when waiting for V-sync. To fully understand how triple-buffering works, each buffer can be considered to be in one of 4 states:
  • Being rendered to
  • Waiting to be displayed
  • Currently being displayed
  • Invalid (contains stale data)
At any given time, one of the 3 buffers is being rendered to, and another one is being displayed. The third buffer may be either waiting or invalid. If there is a waiting buffer at the time of V-sync, then the waiting buffer is changed to being displayed, whereas the buffer that was being displayed is put in the 'invalid' state. Now, when the renderer is finished rendering a frame, it will put that frame in the 'waiting' state, then grab the buffer that was in the 'invalid' state and start rendering to it. (If the renderer is faster than the monitor display rate, there will at times be multiple 'waiting' buffers and no 'invalid' buffers for the renderer to grab. In this case, the renderer can either wait or reuse one of the 'waiting' buffers)
This way, one can achieve the same framerate as when doing double-buffering without waiting for V-sync without suffering the frame tearing that would result from swapping displayed buffers mid-frame.
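
For what it's worth, here is a small C++ state-machine sketch of the above - invented names, not any real driver's logic:

```cpp
enum class BufState { Rendering, Waiting, Displayed, Invalid };

BufState state[3] = { BufState::Rendering, BufState::Displayed, BufState::Invalid };

int find(BufState s) {
    for (int i = 0; i < 3; ++i) if (state[i] == s) return i;
    return -1;
}

void onVSync() {                // called once per monitor refresh
    int waiting = find(BufState::Waiting);
    if (waiting >= 0) {                                       // a finished frame is queued:
        state[find(BufState::Displayed)] = BufState::Invalid; // old frame is now stale
        state[waiting] = BufState::Displayed;                 // show the new one
    }                           // else: keep displaying the current frame
}

void onFrameFinished() {        // called when the renderer completes a frame
    state[find(BufState::Rendering)] = BufState::Waiting;     // queue it for display
    int next = find(BufState::Invalid);
    if (next >= 0) state[next] = BufState::Rendering;         // start on the stale buffer
    // else: the renderer is ahead of the display - wait, or reuse a waiting buffer
}
```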

Gamma correction: The color intensity that is displayed on a standard monitor is related to the video signal from the RAMDAC in a non-linear manner. The relationship is typically roughly like this: output = constant * input^2.2. Gamma correction is the process of correcting for this nonlinearity before the video signal leaves the RAMDAC - it is typically implemented as part of the RAMDAC functionality. Gamma correction tends to reduce the effective color precision slightly.
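
A sketch of how such a correction might be applied with a lookup table (illustrative only; actual RAMDACs do this in hardware):

```cpp
#include <cmath>
#include <cstdint>

uint8_t gammaLUT[256];

void buildGammaLUT(double gamma = 2.2) {
    for (int i = 0; i < 256; ++i) {
        // pre-distort with the inverse power, so the monitor's ^2.2 cancels out
        double corrected = std::pow(i / 255.0, 1.0 / gamma);
        gammaLUT[i] = (uint8_t)(corrected * 255.0 + 0.5);
    }
    // Note the precision loss: several input values can map to the same output
    // value, which is the "reduced effective color precision" mentioned above.
}
```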
 
Gouraud Shading: lighting method where a color is computed using the full set of standard lighting equations at each vertex of each polygon, then the color is linearly interpolated across the polygon. [do a google search for gouraud shading, and this is what you will find all over the place]

[Note: if you try to approximate gouraud shading by flatshading lots of really small polygons, you end up doing the full set of lighting equations for nearly each pixel, approximating Phong Shading instead.

And "chicken wire"? Does that refer to wireframe models?]
 
arjan de lumens said:
Some additional random terms:

Rasterization: The process of converting a primitive (line, polygon) from a geometric description to a sequence of pixels for further per-pixel processing later on. (? - want input on this one)

Being pedantic, shouldn't rasterization be converting a primitive to fragments? And then define 'fragment' ;).

arjan de lumens said:
Pixel Shader: A program that is run for each pixel that is rendered. Also used for a rendering system capable of running such programs. The pixel shading functionality normally appears as a replacement for multitexturing in the 3d pipeline. Under the DirectX API, version 8 and up, pixel shaders are given version numbers to indicate their exact capabilities.

Rather than 'rendering system' (it sounds like a whole machine :p), perhaps it would be better to say something like 'unit' or 'GPU processing unit'.
 
uhh, folks, you are being waaaay too technical.
users don't need definitions - those are for tech-savvy people. what users need are explanations. we, the ones discussing here, are quite familiar with those terms; that's why definitions are what we want. but the average user wants to know what some fancy term actually does, and what he gains by enabling it.

Or, my second suggestion would be that each term has some "stupid" explanation and an "advanced" one. For example:

Gouraud Shading

for dummies:
Quite a bit better than flat shading, but still a fast method for shading polygons. It is done by taking the most extreme brightness values on opposite edges of the polygon and "fading" from one to the other in between.

for experts:
lighting method where a color is computed using the full set of standard lighting equations at each vertex of each polygon, then the color is linearly interpolated across the polygon.


quite a difference, isn't it?

(edit: typos)
 
arjan de lumens said:
[Note: if you try to approximate gouraud shading by flatshading lots of really small polygons, you end up doing the full set of lighting equations for nearly each pixel, approximating Phong Shading instead.

yes, of course, my bad. must have been thinking of something else then.
 
Arjan:
I would say that the framebuffer only contains RGB + possibly A. Other information would belong to its/their own buffer/s.

Gamma correction: Surely gamma correction is more like the process of knowing the gamma of the image-capturing device and compensating the values so that they match the overall gamma of the output display system.
 
Now, we have a problem: some terms seem to have multiple conflicting definitions, with people disagreeing on which one is 'right':
  • Does framebuffer include RGB(A) color data only, or Z and stencil data also? (The OpenGL standard says the latter, and uses "color buffer" for the RGB(A) data.)
  • The process of rasterization includes determining which pixels are covered by a polygon. Does it also include determining a color and Z value for each of those pixels? (The OpenGL standard counts gouraud shading, texturing and fog as part of 'rasterization', but not Z/stencil/alpha test or framebuffer blend.)
As for gamma correction: you would still need to define "gamma".

Point taken that my gouraud shading definition (and possibly others?) was overly technical.

As for pixel shaders: replace "rendering system" with "renderer" or "GPU".
 
OK ... try adding this to the pixel shader definition: Pixel shaders can be used to perform a wide range of per-pixel effects, such as bumpy reflections, refraction, Phong Lighting, procedural texturing, and so on - the exact set of effects possible to program will depend on pixel shader version. [should probably also have a few links to or pictures of some examples of what can be done with pixel shaders]
 
arjan de lumens said:
Gouraud Shading: lighting method where a color is computed using the full set of standard lighting equations at each vertex of each polygon, then the color is linearly interpolated across the polygon.

Arghhh. Gouraud shading is not a lighting method, it's a shading method. Nothing to do with lighting. The confusion tends to come from Phong Shading and Phong lighting.

Gouraud Shading: Shading method where the colour is computed by linearly interpolating the vertex colours across the polygon.

Phong Shading: Shading method where the normal vector is computed by interpolating the vertex normals across the polygon.

Lighting Model: A set of equations which describe how to calculate the colour of a point given information about the light position, geometry and material. A Lighting Model may be used to calculate the colour at each vertex (e.g. as input to a Gouraud Shader), or for each fragment (e.g. driven by a Phong Shader).

Phong Lighting: A specific, commonly used, Lighting Model.
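
If a code-flavoured illustration helps the glossary, here is a small C++ sketch of the distinction (lightingModel() stands in for whatever Lighting Model is in use; interpolation is shown along one edge for brevity):

```cpp
struct Vec3 { float x, y, z; };

Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return { a.x + t * (b.x - a.x), a.y + t * (b.y - a.y), a.z + t * (b.z - a.z) };
}

Vec3 lightingModel(const Vec3& normal);  // e.g. Phong Lighting; assumed elsewhere

// Gouraud: the lighting model ran at the vertices; only colours are interpolated.
Vec3 gouraudFragment(const Vec3& colorA, const Vec3& colorB, float t) {
    return lerp(colorA, colorB, t);       // cheap, but can miss sharp highlights
}

// Phong Shading: normals are interpolated, lighting is evaluated per fragment.
Vec3 phongFragment(const Vec3& normalA, const Vec3& normalB, float t) {
    return lightingModel(lerp(normalA, normalB, t));  // renormalize in practice
}
```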
 