RGB Signal range question

Colourless

Probably not the best place to ask this sort of question, but does anyone have any ideas (or guesses) about what effect sending an RGB signal that is 1.3 times out of spec would have on a monitor?

i.e. the signal being sent is peaking at +1.3 instead of +1.0.

Would something like colours leaking across the screen be an example of a possible effect?
 
What, changing the analog stage to output 1V instead of the spec'ed 750mV? I guess on an LCD you'd lose detail in bright areas, as anything above the 750mV level will be the same full white (depending on how it handles gamma - in the analog or digital domain, etc.). On a CRT you'd likely get some degree of blooming (a glow around bright objects). Again, depending on the monitor and its settings, of course. Since CRTs are analog, I presume it will be easier to compensate for it there than with LCDs.
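
As a very rough sketch of that clipping idea (the clamp behaviour and the normalised 0..1 scale are just assumptions to illustrate the point, not measured from any real monitor):

Code:
#include <stdio.h>

/* Illustrative only: an LCD input stage that simply clamps anything above full scale. */
static float lcd_level(float signal_volts, float full_scale_volts)
{
    float level = signal_volts / full_scale_volts;   /* normalise to 0..1 */
    if (level > 1.0f) level = 1.0f;                  /* everything above spec becomes the same full white */
    if (level < 0.0f) level = 0.0f;
    return level;
}

int main(void)
{
    /* 0.75V full scale as above; a 1.3x overdriven "white" and a mid grey */
    printf("overdriven white: %.2f\n", lcd_level(0.975f, 0.75f));  /* clamps to 1.00 */
    printf("mid grey:         %.2f\n", lcd_level(0.375f, 0.75f));  /* 0.50, unaffected */
    return 0;
}

With a 1.3x overdriven signal, roughly the top quarter of the original range collapses to the same full white - that's the lost highlight detail.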

Regards / ushac
 
That looks like your convergence is way off.
You can think of your monitor (in this case) like a printing press - there is a red layer, a green layer and a blue layer applied to the screen.
From your "simulations" it appears that the red "layer" is shifted off from the others. This is called convergence (or lack thereof).
There should be a setting in your monitor's controls to adjust convergence.
 
It's not a convergence problem; it very much appears to be a signal problem. I'm 99% sure the card is outputting signals way out of range - the output is far brighter than it should be, and adjusting gamma to 1.0 (flat) gets rid of the problem.

I actually know why the signals would be out of range (it's partially related to gamma correction, but not entirely); I just want to know whether what I observe could be caused by the signals being out of range.

-Colourless
 
I remember once having a 15" CRT monitor where I could adjust the intensity of the R, G and B color components - if I tried to set the intensity too high, I would get bleeding of bright color components similar to what those two pictures show. Dunno the exact mechanism, but I would guess that some of the circuitry in the monitor, when presented with a larger voltage than it is designed for, accumulates charge and then, when the voltage drops again, releases that charge - with the color bleeding to the right (following the direction in which the electron beam sweeps) as the result. So I'd say that the color bleeding sounds like a plausible result of signals being out of range.
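
As a toy model of that accumulate-and-release guess (all the numbers and the decay factor are made up; it's only meant to show the shape of the effect, i.e. a bright trail smeared to the right of an overdriven region):

Code:
#include <stdio.h>

int main(void)
{
    /* Toy model only: one scanline where samples 10..19 are driven to 1.3 (out of range). */
    float charge = 0.0f;
    for (int x = 0; x < 40; x++) {
        float in     = (x >= 10 && x < 20) ? 1.3f : 0.1f;
        float excess = (in > 1.0f) ? in - 1.0f : 0.0f;

        charge = charge * 0.8f + excess;          /* overdrive builds up charge, which then leaks away */
        float shown = (in > 1.0f) ? 1.0f : in;    /* the overdriven pixel itself just clips...           */
        shown += charge;                          /* ...while the leaking charge brightens pixels to its right */
        if (shown > 1.0f) shown = 1.0f;

        printf("%2d: %.2f\n", x, shown);
    }
    return 0;
}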

Although, if you want to be 100% sure, have the card display a 100% white picture and probe its outputs with an oscilloscope or something.

Gamma correction should not affect the maximum signal level - it should, however, raise the signal level for the parts of the frame that do not already have extreme color values, thus potentially lifting a substantial amount of the scene out of the normal signal range and producing this color bleeding effect. If this is correct, a 100% black/white checkerboard pattern should look really crappy on your system regardless of gamma settings.
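
For comparison, a plain gamma curve of the usual out = in^(1/gamma) form lifts the mid-tones but never pushes an in-range input above 1.0 (the gamma value below is just an example):

Code:
#include <math.h>
#include <stdio.h>

int main(void)
{
    const float gamma = 2.2f;                   /* example value, not from the thread */
    for (float in = 0.0f; in <= 1.001f; in += 0.25f) {
        float out = powf(in, 1.0f / gamma);     /* mid-tones are lifted...              */
        printf("%.2f -> %.3f\n", in, out);      /* ...but 1.0 still maps to exactly 1.0 */
    }
    return 0;
}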
 
arjan de lumens, that is correct. Gamma correction should never increase the maximum signal level. However, I have managed to find a hardware bug which, under certain (very specific) conditions, allows gamma correction to do just that. It can in fact set the maximum level to 2.0 (assuming 1.0 is the normal maximum)! Way, way above normal.

A white/black checkerboard pattern will look fine if gamma correction is 1.0. As soon as gamma correction is increased, the bleeding starts to occur.

Here's a hint as to what's going on:
Code:
output = 2*gamma_table[output/2];
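
A sketch of how that expression can go out of range (the table contents and the gamma value are assumptions, just to illustrate): if the table covers the full 0..1 range but the value is halved before the lookup and doubled afterwards, any gamma curve above 1.0 pushes the doubled result past full scale - up to 2.0 in the extreme.

Code:
#include <math.h>
#include <stdio.h>

int main(void)
{
    const float gamma  = 2.2f;                        /* example value only */
    float       output = 1.0f;                        /* a normal full-scale input */

    /* gamma_table[x] stands in for x^(1/gamma) over the full 0..1 range */
    float looked_up = powf(output / 2.0f, 1.0f / gamma);
    float result    = 2.0f * looked_up;               /* 2 * 0.5^(1/2.2) ~= 1.46, i.e. out of range */

    printf("result = %.3f\n", result);
    return 0;
}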

-Colourless
 