Max resolution

AAlcHemY

Newcomer
Can the human eye see the difference between 32bit color and higher? Can a TFT or CRT display more than 24bit colors?
/edit topic title is not correct :?
 
That's debatable. Every individual will have a slightly different perception.

I don't know offhand if any monitor can truly represent 24bit color, but I believe Windows XP does.

sRGB was designed when the color gamut of monitors and printers was more limited than it is today, so a need developed for an enhanced color space that can encompass the entire range of human color perception. This enhanced version of sRGB began life as sRGB-64 and has been adopted by Microsoft as the color space for the GDI+ graphics API (present in Windows XP), becoming international standard IEC 61966-2-2. Now called scRGB, it uses a 64-bit encoding with 16-bits for each channel. It can present any color that can be reproduced by the best CRT and LCD monitors as well as the best color printers (with the exception of fluorescent colors).

quote taken from
http://www.cadenceweb.com/newsletter/sheerin/1202_1.html
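
Just to put a number on the "16-bits for each channel" part: that's 65,536 levels per channel instead of the 256 you get at 8 bits. A rough packing sketch in Python (purely illustrative -- this is not the actual scRGB integer encoding, whose exact details I don't know):

Code:
# pack four assumed 16-bit channels (0..65535) into one 64-bit pixel
def pack64(r, g, b, a=0xFFFF):
    return (a << 48) | (r << 32) | (g << 16) | b

def unpack64(pixel):
    return ((pixel >> 32) & 0xFFFF,  # r
            (pixel >> 16) & 0xFFFF,  # g
            pixel & 0xFFFF)          # b

print(unpack64(pack64(65535, 32768, 0)))  # (65535, 32768, 0)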
 
I believe any monitor that takes an analog signal and does not do a digital conversion on it should be able to output a relatively arbitrary number of colors. Digital displays, on the other hand, are limited by a number of factors. The first is the native format of the image, based on its resolution and color depth. The second is the physical capabilities of the display itself. An analog display (a CRT, for instance) would output an image close to the actual digital picture, limited only by interference on the analog signal. LCDs that can display a true 24bit image (assuming we are talking about a 24bit digital input) should display the image exactly. Some LCDs are not capable of displaying a 24bit image and might dither it or use other techniques to display an approximation.
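
To illustrate the dithering trick in that last sentence, here's a rough sketch in Python (purely illustrative -- real panels do this in hardware, and the exact dither matrix here is just an assumption): quantize each 8bit channel to 6 bits and use a small ordered-dither pattern so the missing levels get approximated across neighbouring pixels.

Code:
# 2x2 ordered-dither thresholds (0..3), one per pixel position
BAYER_2x2 = [[0, 2],
             [3, 1]]

def dither_channel_8_to_6(value, x, y):
    threshold = BAYER_2x2[y % 2][x % 2]
    frac = value & 0x03      # the 2 bits an 18bit panel has to throw away
    q = value >> 2           # 0..63
    if frac > threshold and q < 63:
        q += 1               # push some pixels up one level to fake the lost bits
    return q

# a smooth 8bit ramp becomes a mix of two adjacent 6bit levels
print([dither_channel_8_to_6(v, x, 0) for x, v in enumerate([100, 101, 102, 103])])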

Currently, to display an image with an RGB color range greater than 24bit, you would need A) a greater-than-24bit output from the computer, B) either an analog signal or a greater-than-24bit digital signal, and C) an analog monitor, or a digital monitor capable of displaying the desired color depth.

The human eye is an odd thing. We are better at seeing variations in much darker regions than we are at seeing variations in light regions. Thus, for a linear spread of color variations over a certain depth (say 8 bits per channel), you will likely see more banding in the darker regions than in the lighter ones. Choosing a non-linear spread may alleviate this without increasing the bit depth of the image. Another solution might be to choose a linear spread at a higher bit depth to increase the number of samples in the darker range of colors. There are studies out there that document this, and they probably give some idea of the actual number of colors the human eye can generally perceive. My guess is that the difference between 24bit and 30bit color is fairly minimal. Anything over 30bit seems like it would probably be overkill.
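
A quick back-of-the-envelope illustration of that point (Python; 2.2 is just the usual display-gamma assumption): count how many of the 256 codes land in the darkest 1% of light output. A linear spread wastes nearly all of its codes on the bright end, while a gamma-style non-linear spread puts far more of them where the eye is pickiest.

Code:
GAMMA = 2.2

def light_linear(code):
    return code / 255.0               # codes spread linearly in light output

def light_gamma(code):
    return (code / 255.0) ** GAMMA    # non-linear spread: more codes in the dark

dark_linear = sum(1 for c in range(256) if light_linear(c) <= 0.01)
dark_gamma = sum(1 for c in range(256) if light_gamma(c) <= 0.01)
print("codes in the darkest 1%:", dark_linear, "linear vs", dark_gamma, "gamma")
# prints 3 linear vs 32 gamma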

Nite_Hawk
 
Ok, thx both :)
One last question: on the current video cards (NV30/R350), is the output 24bit on the DVI connector, and analog on the 'normal' output?
 
An old estimate I've heard repeatedly is that the human eye can distinguish around 10 Mcolours (new unit!), but I have no idea how that has been calculated or indeed defined.

I'm under the impression that laptop screens don't go above 18 bits (16 in many cases) and resort to dithering. I have assumed that this applies to desktop LCDs as well.
 
Most new displays of any quality will show true 24bit color. Some new displays do dither down to 18bit, but that's becoming rarer. The 16ms panel used in the new Hitachi and Planar 17" displays is a good example: they both use the same panel, which I've heard is only capable of 18bit.

Nite_Hawk
 
Nebuchadnezzar said:
32bit tft's are still a few years off. I read somewhere they're only coming around 2006.

OK, I'm a bit dense...
Does that contradict Nite_Hawk's post?
I mean, are we talking 32 bits as in 32 bits of actual colour, or as in today's CRTs with 24 bits of actual colour fetched in 32 bits for memory efficiency?
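
What I mean by "fetched in 32 bits" is something like this (illustrative Python, assuming the usual XRGB-style layout): the extra byte is just padding (or alpha), not extra colour.

Code:
def pack_xrgb8888(r, g, b):
    return (0x00 << 24) | (r << 16) | (g << 8) | b  # pad byte, then 8 bits each of R, G, B

print(hex(pack_xrgb8888(255, 128, 0)))  # 0xff8000 -- still only 2^24 = ~16.7M colours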
 
horvendile:

My guess is he means 32bit actual color (though this is a bit odd). I've heard that there are 30bit displays on the horizon in the 2004-2005 timeframe, but nothing terribly substantial. A couple of people thought that Sharp had a 30bit display out now, but it's actually a 24bit display with some electronics to mimic a 30bit display (I can't remember exactly what it does, but I think it either downsamples a 30bit signal, or tries to muck with the gamma to get 30bit "effective").
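
If it really does downsample, it would be something along these lines (just my guess at the mechanism, not Sharp's actual electronics): squeeze each 10bit channel of the 30bit signal down to the 8 bits the panel can actually show.

Code:
def downsample_10_to_8(value10):
    # 0..1023 -> 0..255, rounding to the nearest 8bit level
    return (value10 * 255 + 511) // 1023

print(downsample_10_to_8(0), downsample_10_to_8(512), downsample_10_to_8(1023))  # 0 128 255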

Nite_Hawk
 