32 bit RGBA frame buffer with the Dreamcast

Wikipedia. Don't worry about reading it. Just look at the example images near the bottom, and you'll see dithering in action. It basically gives the impression of more shades than the actual number of colours used, which in turn means less RAM and bandwidth for drawing, although the quality isn't as good as if you spare the bits for more colour/transparency resolution.

It's still used in games today. Wii maintains GC's 16 bit dithered colours, while even 32 bit platforms like PS3 and 360 have dithering when using the alpha-to-coverage technique to draw transparencies.
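
(Not from the original posts, just to make the dithering idea concrete: a minimal sketch in C of ordered/Bayer dithering when 8-bit-per-channel colour is quantized down to RGB565, roughly what happens when a 32-bit render is resolved into a 16-bit framebuffer. The 4x4 matrix and rounding here are illustrative, not the actual hardware pattern of any of these consoles.)

```c
#include <stdint.h>

/* 4x4 Bayer matrix, values 0..15 (illustrative, not real hardware's pattern). */
static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Quantize an 8-bit channel down to 'bits' bits with an ordered dither.
 * The per-pixel bias nudges values up or down so that, averaged over a
 * 4x4 area, the result approximates the original 8-bit shade. */
static uint8_t dither_channel(uint8_t v, int bits, int x, int y)
{
    int step = 256 >> bits;                          /* size of one quantization step */
    int bias = (bayer4[y & 3][x & 3] * step) / 16;   /* 0 .. step-1 */
    int q    = (v + bias) / step;
    int maxq = (1 << bits) - 1;
    return (uint8_t)(q > maxq ? maxq : q);
}

/* Pack one RGB888 pixel into RGB565 with dithering. */
static uint16_t rgb888_to_565_dithered(uint8_t r, uint8_t g, uint8_t b, int x, int y)
{
    uint16_t r5 = dither_channel(r, 5, x, y);
    uint16_t g6 = dither_channel(g, 6, x, y);
    uint16_t b5 = dither_channel(b, 5, x, y);
    return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
}
```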
 
wtf is 32 bit color?

8 bits for red
8 bits for green
8 bits for blue

24 bits...

32 - 24 = 8 bits for alpha?

EDIT: nvm i answered my own question lulz.
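
(As a quick illustration of that breakdown, not something from the thread: 32-bit colour is just the four 8-bit channels packed into one 32-bit word. The ARGB byte order below is an assumption; actual layouts vary by platform.)

```c
#include <stdint.h>

/* Pack/unpack 8 bits each of alpha, red, green, blue into one 32-bit word
 * (ARGB layout assumed; the real channel order is platform-specific). */
static uint32_t pack_argb8888(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16) | ((uint32_t)g << 8) | b;
}

static uint8_t argb_alpha(uint32_t c) { return (uint8_t)(c >> 24); }
static uint8_t argb_red(uint32_t c)   { return (uint8_t)(c >> 16); }
static uint8_t argb_green(uint32_t c) { return (uint8_t)(c >> 8); }
static uint8_t argb_blue(uint32_t c)  { return (uint8_t)c; }
```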
 
The way I remember it, tiles always used 32bit internally, and the FrontBuffer was usually 16bit to save VRAM (which was a precious resource shared with display geometry and textures, so it was likely common to use a 16bit FB).

Because the downsample to 16bit only happens when resolving to the FrontBuffer, color loss artifacts are minimal (alpha blended stuff doesn't get screwed up and so forth), but yes, there will still be dithering of course.
To be fair, 16bit looks virtually perfect on SDTVs done this way (it only gets ugly if you do the entire rendering in 16bit, which was more common with GC/Wii and PSP games).
But moving to fixed pixel HDTVs, things tend to look ugly any time you use dithering and a pixel resolution that isn't display native.
 
Just to add to Faf's reply, according to my spec, the DC had 7 framebuffer pixel formats:
  • 0555
  • 565
  • 4444
  • 1555 (the 1-bit alpha is mapped from the internal 8888 alpha via a threshold register setting, I think; see the sketch after this list)
  • 888 (packed 24-bit RGB)
  • 0888 (32-bit RGB)
  • 8888 (32-bit ARGB)
For the 16-bit modes, there was a setting to enable/disable dithering.
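
(To make the 1555 case above concrete, here's a sketch of how an internal 8888 pixel could resolve to 1555 and 4444. The threshold comparison models the threshold-register behaviour mentioned in the list, but the exact rule the DC uses is an assumption on my part, as are the helper names.)

```c
#include <stdint.h>

/* Resolve an internal ARGB8888 pixel to ARGB1555: the single alpha bit is set
 * when the 8-bit alpha meets a threshold (modelled on the threshold-register
 * behaviour described above; the actual comparison the DC performs is assumed). */
static uint16_t argb8888_to_1555(uint32_t c, uint8_t alpha_threshold)
{
    uint16_t a1 = ((c >> 24) & 0xFF) >= alpha_threshold ? 1 : 0;
    uint16_t r5 = ((c >> 16) & 0xFF) >> 3;
    uint16_t g5 = ((c >> 8)  & 0xFF) >> 3;
    uint16_t b5 = ( c        & 0xFF) >> 3;
    return (uint16_t)((a1 << 15) | (r5 << 10) | (g5 << 5) | b5);
}

/* ARGB4444: keep the top 4 bits of each 8-bit channel. */
static uint16_t argb8888_to_4444(uint32_t c)
{
    uint16_t a4 = ((c >> 24) & 0xFF) >> 4;
    uint16_t r4 = ((c >> 16) & 0xFF) >> 4;
    uint16_t g4 = ((c >> 8)  & 0xFF) >> 4;
    uint16_t b4 = ( c        & 0xFF) >> 4;
    return (uint16_t)((a4 << 12) | (r4 << 8) | (g4 << 4) | b4);
}
```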
 