10M HDTVs sold, 15.5M by end of 2005

LB,

The confusion for me and KnightBreed stems from this:

Let's compare a 5000:1 PDP to a 40,000:1 HDR display. On a PDP, you can display 5000 different brightness levels. On an HDR display, you can display 40,000.

Because AFAIK, Contrast Ratios (the 5000:1 or 40000:1 he used) do not imply how many steps there are (unlike varying bit depths for color palettes). To see how CRs are calculated, read KB's post.


KnightBreed said:
Where did this information come from, that a contrast ratio of 10,000:1 has 10,000 brightness gradients to "choose from"? The contrast ratio is a measured value that is determined in one of two ways:

That was my next question, as neither the Full ON/OFF method (manufacturers prefer this because it gives them higher numbers) nor the ANSI method has anything to do (or so I thought) with 'steps of contrast'.

In fact, digital equipment will have a certain number of shades of gray (2048, 4096), which has nothing to do with its CR.

That is why I didn't understand his analogy either, but I really am trying to.
 
The LED-backlit LCD screens have inherently higher contrast; being able to show more shades is just gravy (or the other way around, depending on your preferences).
 
Glad to see there are some people here who have not been marketing-hyped into the whole 40,000:1 scene (no pun intended). I had never heard of the 100:1 thing, but it makes a lot of sense. Once you sit down in a single viewing "setup", your pupil is going to acclimate to a particular level, after which the rods and cones in the retina will only be able to distinguish so many levels between full on and off. A display that can wildly blow past this 100:1 range from one scene to the next will end up causing just as many problems, as impressive as 40,000:1 may sound. The pupil just has to readjust to scale to the higher levels. Then if the next scene is back down to the lower range, you "blind-out" until your pupil can adjust again and your retina can re-sensitize. So what good is that? Now you can argue that maybe a scene contains whiteness and darkness levels that contrast at 40,000:1 (as an example), in and of itself. Your pupil is still going to adjust to an average brightness range for the entire picture. You'll get a 100:1 contrast range somewhere in the middle, and stuff outside of that range will either wash out to full white or crush to full black. So, again, this capability ends up being a rather dubious claim, taking into account what human eyes are capable of and how they work.

To add to this, consider the 24-bit color system: 16 million colors, or so? Yeah, but that's a "marketing" number. For any one color, the most it can achieve is 256 possible shades between full-on and off. That includes white, it includes red, it includes blue, it includes green, and it includes any combined color that you could achieve in an additive color system. So realistically, the most you can genuinely get out of that 24-bit color system is 256 shades of any specific color. It's not a matter of being able to discern 16.7 million colors. It's a matter of being able to distinguish a pattern of colors that only has the resolution of 256 unique shades. That is often what you are seeing when you see banding in digital video. The image had to jump from one shade to the next adjacent shade, where it could have used 3 or 4 finer shades in between to really make a seamless transition. You don't have those in-between shades, because you only have the 256 shades supported by the system.
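
Just to put some numbers to that, here's my own quick Python sketch (not anything from the video spec): quantize a smooth ramp to 8 bits per channel and count how many distinct shades survive.

Code:
# Rough sketch of where banding comes from: an 8-bit channel only has
# 256 levels, so any smooth gradient collapses onto those steps.
def quantize(value, bits=8):
    """Map a 0.0-1.0 intensity to the nearest of 2**bits levels."""
    levels = (1 << bits) - 1          # 255 for 8-bit
    return round(value * levels) / levels

ramp = [i / 999 for i in range(1000)]        # a slow 1000-pixel gradient
shades = sorted({quantize(v) for v in ramp})
print(len(shades))                           # 256 - the rest of the detail is gone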

Now this is not to say that you cannot achieve contrast ratios higher than 256:1 on these "high contrast" displays (say, anything that claims over 300:1). You can certainly goose the contrast control. However, you then enter the realm of making the picture look more unnatural than natural, not to mention less accurate/faithful to the original signal the digital stream has delivered to you. 24-bit digital video was intended to accurately deliver 256:1-ish sorts of images, so boosting it to 40,000:1 after the fact is really begging for additional problems. Now recall how banding on certain 24-bit video content is already just on the verge of perceptibility. That 256-shades-per-color resolution is already falling short. Now imagine arbitrarily blowing out the contrast control on a scene such as that, and you will have a real problem with banding becoming more than just on the verge of perceptible. Is that quality, high-performance video? The CR is mind-boggling, right? However, it also succeeded in pushing the capabilities of 24-bit video into a zone where it does not belong.

A 400/500/600/1 million:1 CR system sounds great, but no existing digital video format we use today will be able to drive it to its full potential. It's just not up to the task.

...and as LB correctly surmised, yes, we most certainly would have to worry about "tanning" and snow blindness if we had TVs that actually displayed the full contrast level of real life (it's why we have adjusting pupils, so we are not terminally white-blinded or black-blinded by mere daily life on Earth). That could really break the bounds of what is considered enjoyable viewing. Suffice it to say, "real" is good, but there is also a point where it is "too real" for its own good.

Where we could really find some benefit is a new digital video model that supports greater color shade resolution than 256. That would address the banding, and hopefully alleviate artifacting in low-light scenes. For example, if we could have a 32/33-bit color system and then use a contrast ratio filter that squeezes the CR back down to what we already have with the 24-bit model (i.e., same ultimate CR, more shades/finer shades), banding should become seamless (or very nearly so). Oh, that, and perfecting a way to do artifact-free lossy compression. That would handle the lion's share of what ails current digital video, rather than worrying about 40,000:1 CR displays.
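
A back-of-the-envelope sketch of what that buys you (the bit depths here are just examples, not any real standard): same overall range, but the step between adjacent shades shrinks as you add bits.

Code:
# Same overall contrast range, finer steps as bits per channel go up.
def step_count(bits_per_channel):
    return 2 ** bits_per_channel      # distinct shades per primary

for bits in (8, 10, 12):
    shades = step_count(bits)
    print(f"{bits}-bit channel: {shades} shades, "
          f"smallest step = 1/{shades - 1} of the same range")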
 
Heh, given the feuding rivalry between me and him, what do you think I would say? :D Just take my post as you will. Accept or reject, and I thank you for the ability to allow me to express my point. ;)
 
randycat99 said:
Heh, given the feuding rivalry between me and him, what do you think I would say? :D Just take my post as you will. Accept or reject, and I thank you for the ability to allow me to express my point. ;)


:devilish: But I'm confused. I was getting used to the idea that PCEngine was right, but then he isn't?

So, WTF is the CR? (which is the question I asked 3 pages ago)
 
Why, CR is the contrast ratio, of course! :p ...the ratio of the brightest element to the darkest element in an image (and that isn't even necessarily how TVs acquire their specs). Where I think a lot of people get misled, imo, is in overlooking that there are CR bottlenecks at various stages of the chain (from the DVD player to your brain, for instance). The 24-bit video of the DVD has its ceiling. The monitor has its ceiling. The mechanics of your eyeballs have a ceiling.

For the sake of argument, the DVD ceiling is 256:1, the monitor is 500:1, and your eyeballs are 100:1 (adaptive). Naturally, the lowest ceiling will dictate the ultimate performance of the entire chain. You can't just wildly augment one component of the chain (in this case, the monitor, and it wasn't even the weakest link to begin with) and think it will boost performance universally. So what if the human eye can adaptively adjust to accommodate tens of thousands to one of CR? Once it picks a setpoint, it is only good for 100:1 at that brightness level. It can only accommodate other ranges by picking a different setpoint, and only one setpoint can be used at a time (if you are frequently changing to wildly different setpoints, you will probably end up with a headache and be blind much of the time during the transitions).

So then it falls to the 24-bit digital video. Unfortunately, we find that the problem really isn't contrast ratio, but the resolution of shades of an individual color (the banding scenario). So it turns out that venturing forth with the uber HDTV with obscene CR is hardly addressing the weakest point(s) in the chain in the first place. Can't really do much about the human eyes (barring superhuman prosthetics). That leaves the 24-bit digital video... That's where we need enhancement. Then HDTVs need to move to be compatible with whatever that enhancement is (they sure won't get there with only 24-bit digital as their standard source). Naturally, we cannot expect them to get equipped for said enhancement w/o clear indication that there is, indeed, a movement involved to surpass 24-bit digital video. What does that mean? Probably nothing will happen, and we will just get fed the "virtues" of HDTVs with CR pushed to ever more ridiculous levels...
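
Here is the weakest-link point in a nutshell (a toy sketch using the same for-the-sake-of-argument numbers as above):

Code:
# The usable contrast of the whole chain is capped by its lowest-CR stage.
chain = {
    "24-bit source": 256,        # ~256:1 per primary
    "monitor": 500,              # the display's rated CR
    "eye at one setpoint": 100,  # usable range without re-adapting
}
bottleneck = min(chain, key=chain.get)
print(f"Effective CR ~ {chain[bottleneck]}:1, limited by the {bottleneck}")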
 
Dunno. I think it's a technical explanation of something or other :?

Point is, on a TV you've got dark and light. Crank up the brightness and the darks start to become grey. Higher contrast ratios on a monitor mean you can get good contrast with a bright screen AND dark blacks.

The reason they have a number is so manufacturers can boast that their TV is better than their competition's. And as we know, such numbers get 're-specified' every once in a while, so it may well be that different manufacturers start listing CRs on different scales.
 
I dunno, a clever, systematic editorial boost of LCDs at the expense of plasmas? :devilish: (hey, a guy can have an opinion, right?)

As for the gray scale charts posted earlier, if he really wanted to illustrate something truer to life, there would be the first gray scale (the original). The second grayscale (representing an "average" monitor) would be longer, with the red line falling beyond the length of the first grayscale bar. The third grayscale (representing the "uber CR" monitor) would be much, much longer, with the red line falling well beyond the length of the "original" grayscale bar. What's the moral of this? Once you have exceeded the "original", you are pretty much covering the same contrast range regardless of the CR improvement. You can't improve upon the original w/o providing a better original in the first place. Now if monitors typically had less CR than 24-bit digital video, then the grayscales as they were presented earlier would have been more applicable. But that is not the case in reality.

You can be sure that if it was his intent to imply the "original" represents the image of actual real-world reality, with complete disregard for the medium that actually feeds HDTVs (that thing called 24-bit digital video), well, I'm sure you can see the logical problems involved with that assertion. Curiously, a 40,000:1 CR display won't be getting much use of its potential until we are running a 75-ish bit digital video system implemented from film to video input. :oops: Nutz, right?
 
I don't know what randycat is ranting and raving about, but the contrast ratio of a display is extremely important in reproducing a lifelike image. That comment about a 40,000:1 contrast ratio requiring 75-bit color rendition is complete baloney. A CR of 40,000 only means that the brightest white is 40,000 times brighter than the blackest black the display can produce. The "bit level" is irrelevant for determining CR. Obviously it's important when figuring out how many different colors the display can show.

The human eye might be capable of seeing 300:1 at any given moment, but the range of brightness levels that we can see is fantastically higher than any HDTV can reproduce. I know that is kinda vague, but consider this example. The range of brightness in real life might look like this ("lumens" being a standard unit of brightness):

Code:
|--------------------------------------------------------------|
0 lumens                                                      10,000 lumens
dark cave                                                      the sun

At any given moment, because of limitations of the eye, a human might be able to see only a limited range of brightness. The pupil will constrict or open to let a certain amount of light to the retina, like so:

Code:
|---------------------------*_____________*------------------|
0 lumens                                                      10,000 lumens
dark cave                                                      the sun

But that also means this dynamic range can be anywhere on that big graph of real life brightness levels. It could be here:

Code:
|-------------------------------------------*_____________*--|
0 lumens                                                      10,000 lumens
dark cave                                                      the sun

Or here:

Code:
|--*_____________*-------------------------------------------|
0 lumens                                                      10,000 lumens
dark cave                                                      the sun

The monitor must be able to cover this "real world" range so that the eye becomes the bottleneck, not the monitor. If I'm watching a movie on my projector and there is a scene that is absolutely pitch black, like the scene in Kill Bill Vol. 2 where the lead character is buried in a coffin underground, my projector should show a zero-lumen image. But it doesn't. Even in this scene, if I were to wave my hand in front of the projector I would still see my shadow on the screen. If the projector were truly displaying zero lumens, I shouldn't see a shadow (no light, no shadow, correct?).

This is a limitation of every digital television. Some technologies are better than others, but nothing is perfect. My projector is DLP-based and it still has this problem. LCD, for example, is the worst for black levels and contrast ratio.
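
To put the adaptation diagrams above in more concrete terms, here is a toy model (my own, and the 100:1 window is just the figure that has been tossed around in this thread): the eye settles on a setpoint, and anything outside a limited window around it clips to black or white.

Code:
# Toy model of the eye's adaptation window from the diagrams above.
def perceived(luminance, setpoint, window_ratio=100):
    """Clamp a scene luminance to a window_ratio:1 window centred
    (multiplicatively) on the adaptation setpoint."""
    lo = setpoint / window_ratio ** 0.5
    hi = setpoint * window_ratio ** 0.5
    return min(max(luminance, lo), hi)

# Adapted to a dim room (setpoint ~50 "lumens"), a 10,000 highlight
# just clips to the top of the window; deep shadow crushes to the bottom.
print(perceived(10_000, setpoint=50))   # 500.0 -> washes out to "white"
print(perceived(1, setpoint=50))        # 5.0   -> crushes to "black"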
 
Do you expect to use the existing 256-shades-per-color resolution (inherent to 24-bit digital video) to cover an entire 40,000:1 contrast range and not find severe banding problems??? It's barely capable of covering a technical 256:1, and you want to blow that up over 100x? Yeah, go do that... ;) Hence the notion that you will need a "few more bits" in the process to resolve that target range properly. That works out to over 25 bits per color, so 75+ bits for a 3-color system. Voila!

EDIT: Ok, guess I made a leeettle math error (lack of sleep makes you do strange things, I guess). That should be over 15 bits per color, so 45+ bits for a 3-color system. Sorry for the confusion. Oh, and Voila! Not quite as tremendous as 75 bits, but still pretty far out of the ballpark relative to 24 bits.
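
For anyone checking the arithmetic, the corrected figure falls straight out of a log2 (a throwaway sketch, nothing more):

Code:
# How many bits does it take to index N distinct steps per color channel?
from math import ceil, log2

for levels in (256, 40_000):
    print(f"{levels:>6} levels -> log2 = {log2(levels):.1f}, "
          f"so {ceil(log2(levels))} bits per channel")
# 256 -> 8 bits; 40,000 -> a bit over 15 (16 in practice), hence the
# "45+ bits for a 3-color system" figure above.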

The bottom line is you theoretically only really need to cover the range the eye is capable of resolving at a single setpoint (or a small range of setpoints for comfortable overkill, which is what we have now). The brain will get the cues it needs to understand whether you are watching a scene of a dark cave vs. a scene of staring at the sun (never mind why you would ever want to accurately simulate staring into the sun - can you say lawsuit fodder?).
 
lb, for all intents and purposes CR = number of brightness levels of each pixel. Also, it is NOT the number of greys. There's a difference between the number of greys and the number of brightness levels. You can have a display that's only capable of black or white, but can have each of those black or white pixels at independent brightness levels, in any arbitrary number of steps. There is ZERO point in rating a display with a CR of, say, 1000:1 if the display isn't capable of 1000 levels of brightness for each individual pixel. For example, nobody cares if a display has a 1000:1 CR if, for any particular scene, it can only display the brightest white and darkest black with nothing in between.

The number of greys is called its grey-scale range/capability. This is usually connected with the display's color range, but sometimes it's enhanced independently beyond the color range. For example, some PDPs have a color range of 1 billion colors. That usually means it has 1000 levels of grey, but some displays with 1 billion colors can have more than 1000 levels of grey for whatever reason.

Again, think of the color palette versus simultaneous colors displayed analogy from my previous post. For example, compare the SEGA Genesis to the TG16. Both have a global color palette of 512 colors, but the Genesis can only display 64 at any one time while the TG16 can display 256 or more. Now substitute total brightness palette and simultaneous brightness levels into that situation and you'll have an idea of what I'm talking about.

CR!=DR, but for all intents and purposes they're about the same when talking about displays.

Another interesting example is this. If you displayed a bunch of black lines 1 pixel in width on a white background, close to each other, you would see grey at a certain distance. Does this mean humans can't see the difference between the black and white? I don't think so, because when you get closer you can see the individual white lines interleaved with the black ones.

Another interesting example. Take 1000 levels of grey and multiply that by 40,000 levels of brightness for EACH of those grey levels, and you have even more shades to paint with. Keep in mind I'm talking about the brightness levels of each individual pixel, NOT the brightness adjustment on your TV, which changes the brightness of all the pixels simultaneously.

lb if you want to believe I'm wrong then go ahead, but I'm just giving you something to think about. I'm just trying to help you make some sort of sense out of all of this.
 
The eye sees a subset of the brightnesses. You can't see, in the same view, sun glaring off white snow at 5,000 lumens AND details on a guy's dark jacket in the shadow at 1,000 lumens. The 256 greys only need to cover the area of detail that the eye can perceive, from 'everything-below-this-light-level-is-black' to 'everything-above-this-level-is-white'.

Personally I can't see what the big deal is with HDR TVs. I don't really want a TV that can output solar brightness levels. I don't want to pay the electricity bill and I don't want to be blinded when the film cuts from a dark, gritty night-time scene to outside in the desert to looking up at the sun! If a TV picture showed the full, absolute dynamic range of 0-40,000 lumens (or whatever the range is), when shown on screen my eyes would adapt to the brightest parts (so as not to be blinded) and the darkest parts would appear black. By instead limiting the display's visible dynamic range, the photographer selects the detail pertinent to the scene. He/she can choose exposure to highlight whatever part of a scene is most appropriate.

Trickery can reproduce accurate optical phenomena if required, e.g. when driving towards the sun in a driving game, reduce the brightness of the scene relative to the sun object. This simulates the eye adapting to a bright source, shrinking the pupil, and reducing receptiveness to details in the darker areas.
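
In code terms that trick is just a global exposure scale (a toy sketch with made-up numbers, not how any particular game does it):

Code:
# Scale the whole scene relative to the brightest source to mimic
# the pupil stopping down when you face the sun.
def apply_exposure(scene_luminances, bright_source_luminance):
    """Map the brightest source to 1.0; darker detail compresses toward black."""
    exposure = 1.0 / bright_source_luminance
    return [min(l * exposure, 1.0) for l in scene_luminances]

# Driving toward a "sun" 1000x brighter than the dashboard: dash detail
# drops toward black, just as it would for an adapted eye.
print(apply_exposure([1.0, 5.0, 1000.0], bright_source_luminance=1000.0))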

What would be useful is HDR cameras, where the filming can take in the whole scene and select from/compress the dynamic range to present more detail than would naturally be viewable.
 
(to pce) Are you perhaps suggesting a more adaptive digital video model that has a typical 100:1 CR but also includes a global brightness scaling factor that can move that 100:1 range from the very bottom of the target 40,000:1 range to the very top (along with setpoints in-between, of course)? I guess I could see that (no pun intended). The 100:1 part of the digital code would save a few bits, and then you have a single brightness scaler to specify a gain range of 400:1? So maybe you need 4 bits for the first part and 9 bits for the 2nd part? ==> 4x3 primary colors + 9 ==> 21 bits?
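
Something like this toy decode is what I have in mind (purely illustrative - the bit split and the gain range are whatever you decide to spend, not a real format):

Code:
# Per-pixel code covers a small local range; one shared scale factor
# slides that window up and down a much larger brightness range.
def decode(local_code, scale_code, local_bits=7):
    local = local_code / (2 ** local_bits - 1)   # 0.0 .. 1.0 inside the window
    gain = 1 + scale_code                        # shared multiplier, e.g. 1..400
    return local * gain

# The same pixel pattern at two different scene-level gains:
print(decode(64, scale_code=0))     # dim scene
print(decode(64, scale_code=399))   # same pattern, pushed ~400x brighter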
 
Shifty Geezer said:
Personally I can't see what the big deal is with HDR TVs. I don't really want a TV that can output solar brightness levels. I don't want to pay the electricity bill and I don't want to be blinded when the film cuts from a dark, gritty night-time scene to outside in the desert to looking up at the sun! If a TV picture showed the full, absolute dynamic range of 0-40,000 lumens (or whatever the range is), when shown on screen my eyes would adapt to the brightest parts (so as not to be blinded) and the darkest parts would appear black. By instead limiting the display's visible dynamic range, the photographer selects the detail pertinent to the scene. He/she can choose exposure to highlight whatever part of a scene is most appropriate.

Trickery can reproduce accurate optical phenomena if required, e.g. when driving towards the sun in a driving game, reduce the brightness of the scene relative to the sun object. This simulates the eye adapting to a bright source, shrinking the pupil, and reducing receptiveness to details in the darker areas.

FAQs:

What does high dynamic range mean?
The dynamic range of a display is the maximum range of brightness that it can produce. The peak white and black are usually measured separately, and reported as a ratio of white-level/black-level. Using this technique, our displays have an infinitely high dynamic range, because our white-level is so high and our peak black is 0 cd/m2. HDR signifies a dynamic range significantly greater than those of most current displays.


What is the contrast ratio?
This is the range of brightness that a display can produce in a single image. Often dynamic range is used as a product spec and inappropriately labelled as contrast ratio; however, the true contrast ratio of any display device is lower than the dynamic range. It is measured by displaying a black and white checkerboard pattern and observing the brightness at the centres of the white squares and black squares.


How are your displays 10x darker and 30x brighter?
Most displays today cannot produce a true black. This is for varying reasons, such as a constant backlight for flat-panel displays, but our displays can produce a true black. A full black screen will emit no light at all, 0 cd/m2, and even a black and white checkerboard pattern produces very little light in the black regions. At the same time, our HDR displays can produce a white which is 30 times brighter than that produced by a conventional display. This is accomplished by a high-quality LCD and a new type of backlight that is far brighter than traditional backlights.


How much power does your HDR display use in comparison to a traditional LCD display displaying typical interior and exterior video footage?
Power consumption in the HDR display is first-order linearly related to the average image brightness. We are using 759 LEDs at 1 Watt each, so the highest total power consumption of the display would be approximately 759W (plus some for the drive electronics, etc). The system is not designed to do this and in fact has built-in methods to prevent this from happening.
Almost all images have most of their content in the luminance range of normal displays (which corresponds to the bottom 5-10% or so of the HDR display) and only a few highlights and bright areas. We have taken random samples of images from the web (mixed indoor and outdoor) and established the average power consumption to be in the range of 150-80W. Outdoor scenes are generally at the higher end of this range and indoor at the lower end. Of course, a lot of images are lower than this range, such as dim indoor images and all conventional 8-bit images (unless extended in some way). Few images exceed this range, as they would be unpleasant to the eye.
This is comparable to CRTs of the same size and slightly higher than LCD power consumption. Obviously, reducing the peak luminance, currently 30x higher than any other conventional display, can shift this relationship.
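
Running the FAQ's own figures gives a sense of scale (a rough sketch; only the LED count and per-LED wattage come from the FAQ, the rest is assumption):

Code:
# Power scales roughly linearly with average image brightness, up to
# the backlight's peak draw (drive electronics not included).
NUM_LEDS = 759
WATTS_PER_LED = 1.0
PEAK_BACKLIGHT_W = NUM_LEDS * WATTS_PER_LED      # ~759 W worst case

def backlight_power(avg_image_brightness):
    """avg_image_brightness: 0.0 (all black) .. 1.0 (all peak white)."""
    return avg_image_brightness * PEAK_BACKLIGHT_W

# Typical footage sits mostly in the bottom 5-10% of the range:
print(backlight_power(0.075))   # ~57 W from the LEDs alone
print(backlight_power(1.0))     # ~759 W, which the system is built to avoid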

randycat99 said:
(to pce) Are you perhaps suggesting a more adaptive digital video model that has a typical 100:1 CR but also includes a global brightness scaling factor that can move that 100:1 range from the very bottom of the target 40,000:1 range to the very top (along with setpoints in-between, of course)? I guess I could see that (no pun intended). The 100:1 part of the digital code would save a few bits, and then you have a single brightness scaler to specify a gain range of 400:1? So maybe you need 4 bits for the first part and 9 bits for the 2nd part? ==> 4x3 primary colors + 9 ==> 21 bits?

Yes something like that.
 
PC-Engine said:
lb, for all intents and purposes CR = number of brightness levels of each pixel. Also, it is NOT the number of greys. There's a difference between the number of greys and the number of brightness levels. You can have a display that's only capable of black or white, but can have each of those black or white pixels at independent brightness levels, in any arbitrary number of steps. There is ZERO point in rating a display with a CR of, say, 1000:1 if the display isn't capable of 1000 levels of brightness for each individual pixel. For example, nobody cares if a display has a 1000:1 CR if, for any particular scene, it can only display the brightest white and darkest black with nothing in between.
Telling somebody that a display is capable of 40,000 different brightness levels just because of its dynamic range is very wrong and highly misleading. The "dynamic range" is also known as the on/off contrast ratio. I described the process to measure it some posts before. The definition of "contrast ratio" from your link (actually, where is your link?) is a standard specified by ANSI. To avoid confusion this is usually referred to as ANSI CR. By neither definition does the contrast ratio make any claims or references to the levels of brightness the display is capable of.

Obviously nobody would buy a television that could only display full-bright and full-off pixels, but a contrast ratio of 1000:1 doesn't tell the user anything about the panel's ability to vary the brightness levels between full on and full off. The contrast ratio (on/off or ANSI) is simply a measurement of maximum and minimum brightness. Nothing more, nothing less.
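
For what it's worth, the checkerboard (ANSI) measurement boils down to this (luminance readings invented purely for illustration):

Code:
# Average the white patch readings, average the black patch readings,
# and report the ratio - that's the ANSI checkerboard contrast ratio.
white_patches = [180.0, 178.5, 181.2, 179.8]   # cd/m2 at white square centres
black_patches = [0.45, 0.50, 0.48, 0.47]       # cd/m2 at black square centres

avg_white = sum(white_patches) / len(white_patches)
avg_black = sum(black_patches) / len(black_patches)
print(f"ANSI contrast ratio ~ {avg_white / avg_black:.0f}:1")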
 
KnightBreed said:
Obviously nobody would buy a television that could only display full-bright and full-off pixels, but a contrast ratio of 1000:1 doesn't tell the user anything about the panel's ability to vary the brightness levels between full on and full off. The contrast ratio (on/off or ANSI) is simply a measurement of maximum and minimum brightness. Nothing more, nothing less.

There is no such display, right? Nothing that is only full on or full off?

CR may be deceptive, but as a rule a display with a greater CR has a wider range.

Nobody is really putting out other metrics to reflect the range of brightness levels.
 
The darkroom full-screen contrast ratio of a CRT is essentially infinite if you turn the brightness low enough; that's true for all active-emitter displays.
 