30-bit picture output for PC

Similar in principle to ATI's temporal AA (temporal jittering): switching between different colors to create the appearance of a distinct intermediate color, in this case one with an intensity the hardware isn't actually capable of producing directly.

Seems like a pretty simple idea for still images given the general power of graphics hardware, except maybe for calibrating for accurate reproduction. The talk of a commercial product, etc. seems to indicate the latter is more of a problem than I realize...?
 
I don't think calibration is a problem, but the technique has to be integrated into the graphics application or the graphics driver. IMO it's not viable as a commercial product by itself.
 
Hmmm ... interesting. The idea sounds very simple. I feel tempted to try writing a demo for it. :)
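Something along these lines, maybe: a minimal sketch in Python, assuming a 10-bit source channel shown on an 8-bit display over four frames (the four-frame duty-cycle scheme and the names are my own illustration, not necessarily what the commercial demo does).

Code:
import numpy as np

def frames_for_10bit(channel10, num_frames=4):
    """Yield 8-bit frames whose temporal average approximates a 10-bit channel.

    channel10: array of values in [0, 1023]. Each value v = 4*q + r, so we
    show q+1 on r of the four frames and q on the rest; the average is q + r/4.
    """
    channel10 = np.asarray(channel10, dtype=np.uint16)
    q, r = np.divmod(channel10, 4)           # base 8-bit level and remainder
    for i in range(num_frames):
        bump = (i < r).astype(np.uint16)     # raise the level on r of the frames
        yield np.minimum(q + bump, 255).astype(np.uint8)   # clip the very top codes

# A 10-bit gradient averaged over the four frames comes back as v/4.
grad = np.arange(0, 1021, 5, dtype=np.uint16)
avg = sum(f.astype(np.float64) for f in frames_for_10bit(grad)) / 4
print(np.allclose(avg, grad / 4))            # True

In a real demo you would do this per colour channel on the whole image and flip the frames at the display's refresh rate, which is where the driver/application integration mentioned above comes in.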
 
Why are they acting like they discovered a new technique? It's often been used on anything that didn't have many colors. Goodness, people did it with the TI calculators, and it's been done on the Game Boy Color as well. Plus all the old computers that did similar things.
 
The classic Game Boy did something like this to fake 50% transparency. Because of LCD lag, it worked really well! :) Some emulators have an LCD lag option too.

The ghosts in Wario Land and Link's Awakening, among others.
 
Actually I believe my LCD monitor already utilizes this technique :) It's a so-called fast-response LCD (16 ms), but it lacks color depth (256K colors, I think). However, it switches between shades quickly to create the illusion of more shades.
 
The point not discussed here is what the possible implications of 30-bit color are for image quality. This quote clearly indicates that the human visual system is capable of distinguishing more colors than a 24-bit color system can provide. Is there any truth to this?

"Experts believe that the human eye can distinguish between 600 and 700 steppings within a color. This explains why steppings can be visible even in high-resolution 24-bit images. 30-bit is beyond that level, steppings are not visible anymore." While 30-bit does not often provide a dramatic increase in picture quality for everyday applications, especially graphic artists can take advantage of this technology to approach a more perfect image with no noise and purer colors.

Then again, the inventor(?) thinks that 30-bit doesn't have many everyday uses; read: no "more colorful" games. Maybe people should discuss this aspect more.

"I've found the differences between 24- and 30-bit color tend to only be noticeable in certain classes of images and, even then, tend to be quite subtle, so I expect serious interest in 30-bit color to mostly be limited to digital artists and those similarly involved with computer graphics," Silvas said. "I think graphics enthusiasts, digital artists and graphics professionals would enjoy the opportunity to use the demo to experience and evaluate 30-bit color for themselves, on their own machines, with the built-in samples and their own 48-bit TIFFs."
 
Not like HAM on the Amiga (which was a framebuffer compression trick), but like the Atari ST art program Quantum Paint.

Draw one frame with one set of colours, then draw the second frame with another set of colours; the eye will integrate the two colours and perceive a third...

Great way of inducing sickness as an added bonus.

http://membres.lycos.fr/abrobecker/STart/STart.html
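In code the idea would look roughly like this: a toy sketch assuming an Atari ST style 3-bits-per-channel palette and a brute-force pair search, which is my own illustration rather than how Quantum Paint actually chose its colours.

Code:
from itertools import combinations_with_replacement

# All displayable colours: 3 bits per channel, i.e. R, G, B each in 0..7.
palette = [(r, g, b) for r in range(8) for g in range(8) for b in range(8)]

def best_pair(target):
    """Pick two displayable colours whose frame-to-frame average best matches
    'target', an RGB tuple that may have fractional channels, e.g. (3.5, 0, 6.5).
    The small second term penalises pairs that differ a lot, because big
    frame-to-frame jumps are exactly what causes the visible flicker."""
    def cost(a, b):
        avg_err = sum(((ca + cb) / 2 - t) ** 2 for ca, cb, t in zip(a, b, target))
        flicker = sum((ca - cb) ** 2 for ca, cb in zip(a, b))
        return avg_err + 0.01 * flicker
    return min(combinations_with_replacement(palette, 2), key=lambda p: cost(*p))

print(best_pair((3.5, 0, 6.5)))   # e.g. ((3, 0, 6), (4, 0, 7))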
 
This is nothing new.

The Indy Presenter 1280 display has 18-bit color resolution. It has an option to switch to 24-bit mode, which works by alternately displaying two images and letting the LCD lag and your eye do the integration ;-)


Presenter 1280 user manual here said:
The default color mode on Presenter 1280 is 18-bit. In this mode, some smooth shaded images may appear banded. If you wish to decrease this banding and if you have 24-bit graphics, you can switch to 24-bit mode. In that mode, however some colors may flicker.

When you choose 24-bit mode, the display creates additional colors by averaging the true colors in 18-bit mode. The result is a palette of 15,900,784 colors, not quite the 16,777,216 in true 24-bit.
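As a toy illustration of the two-frame averaging the manual describes (only the general idea, in Python; the Presenter's actual scheme, and the quoted 15,900,784-colour figure, is presumably more involved than this):

Code:
def split_8bit_to_two_6bit(v8):
    """Split an 8-bit channel value (0..255) into two 6-bit levels (0..63)
    whose average over two alternating frames best approximates it."""
    halves = round(v8 * 63 / 255 * 2)          # nearest half-step on the 6-bit scale
    return halves // 2, halves - halves // 2   # level for even frames, odd frames

print(split_8bit_to_two_6bit(0))      # (0, 0)
print(split_8bit_to_two_6bit(128))    # (31, 32) -> perceived as level 31.5
print(split_8bit_to_two_6bit(255))    # (63, 63)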
 
I seem to remember that the Acorn BBC Micro had something called blinking colours: eight normal and eight blinking. Same idea (temporal dithering)?
 
phenix said:
The point not discussed here is what the possible implications of 30-bit color are for image quality. This quote clearly indicates that the human visual system is capable of distinguishing more colors than a 24-bit color system can provide. Is there any truth to this?

Yes. While the eye's ability to detect 30 bits of YUV color is pretty poor, the eye is very good at detecting gradations caused by the limited range per channel of a 24-bit display. A 24-bit display only allows 256 levels of intensity per channel.

I can't remember the exact averages, but we can detect something like 300-400 shades of red, 400-500 shades of blue, 700-800 shades of green, and something close to 900 shades of gray.

Aaron
 
aaronspink said:
I can't remember the exact averages, but we can detect something like 300-400 shades of red, 400-500 shades of blue, 700-800 shades of green, and something close to 900 shades of gray.

And the worst part is that those shades aren't spread out nice and linearly either, so unless you encode with a curve that reflects how densely packed the shades are in places, even 10 bits isn't enough to get those 900 shades of gray.
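A back-of-the-envelope illustration of that point, assuming a plain gamma-2.2 transfer curve rather than any particular display's real response: the same 10 bits spend far more codes on the dark shades, where the eye is most sensitive, when the encoding follows a curve instead of being linear in light output.

Code:
LEVELS = 1024          # 10 bits per channel
GAMMA = 2.2

def linear_light(code, use_gamma):
    """Light output (0..1) for a given code, under linear or gamma-2.2 encoding."""
    x = code / (LEVELS - 1)
    return x ** GAMMA if use_gamma else x

# Count how many of the 1024 codes land in the darkest 1% of light output.
dark_linear = sum(linear_light(c, False) < 0.01 for c in range(LEVELS))
dark_gamma  = sum(linear_light(c, True)  < 0.01 for c in range(LEVELS))
print(dark_linear, dark_gamma)        # about 11 vs about 127 codes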
 
pcchen said:
Actually I believe my LCD monitor already utilizes this technique :) It's a so-called fast-response LCD (16 ms), but it lacks color depth (256K colors, I think). However, it switches between shades quickly to create the illusion of more shades.

I believe that's also how 1-bit audio DACs work.
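Something like a first-order delta-sigma modulator, which is the usual trick there: the output is only ever 0 or 1, but its running average tracks the input. A toy sketch (my own simplification, not any real DAC design):

Code:
def delta_sigma_1bit(samples):
    """Yield a 0/1 bitstream whose local average follows 'samples' (each in 0..1)."""
    acc = 0.0
    for x in samples:
        acc += x                     # integrate the input
        bit = 1 if acc >= 1.0 else 0
        acc -= bit                   # feed the quantised output back
        yield bit

bits = list(delta_sigma_1bit([0.3] * 1000))
print(sum(bits) / len(bits))         # 0.3 once low-pass filtered / averaged

Same averaging trick as the displays, just over audio samples instead of frames.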
 