LCDs are somewhat poor for black levels though, so I don't think they're the best measure. Also, I'm not sure how it filters into your calculations, but 1300:1 ANSI (which, except for the Sony Bravia XBR3, no LCD has reached quite yet) on a Bravia XBR3 still reveals worse shadow detail than a Panny 8th-gen plasma with a ~1000:1 ANSI rating. AFAIK.
Yeah, when you have bad contrast you're sort of stuck between a rock and a hard place. If you want to maximize the accuracy of as many colors and shades as possible, then just going as black as possible for the shades you can't reach is optimal. Of course, that leads to black crush, so you have no difference in luminance between dark shades. If you want to maintain shadow detail (i.e. show a visible difference between shades), you have to boost the luminance of dark greys to form a ramp, forcing some colors further away from what you're capable of.
In an arbitrary example, imagine trying to get shades from 0-100, but your display can only go as low as 10. For inputs 0-9 you can just output 10, so that 10-100 come out perfectly. Of course, now you see no difference between shades 9 and below. To fix this, you can map 0 --> 10, 1 --> 10.5, 2 --> 11, ..., 19 --> 19.5, and from 20 and above you map perfectly. Now you can see the difference between low inputs, giving you shadow detail, but inputs 10-19 are brighter than they should be even though your display can show them correctly.
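If it helps to see it concretely, here's a rough Python sketch of the two strategies, using the made-up 0-100 scale and black floor of 10 from the example above (none of this is any manufacturer's actual mapping):

```python
# Two ways to handle a display that can't go darker than 10 on a 0-100 scale.
# Purely illustrative numbers, matching the example in the post above.

BLACK_FLOOR = 10  # darkest shade the hypothetical display can produce


def clamp_to_floor(level):
    """Strategy 1: keep 10-100 perfectly accurate, crush 0-9 into the floor."""
    return max(level, BLACK_FLOOR)


def ramp_near_black(level, ramp_end=20):
    """Strategy 2: spread 0-19 across floor..ramp_end so every input step stays
    visible, at the cost of making 10-19 brighter than intended."""
    if level >= ramp_end:
        return level  # 20 and above map perfectly
    # linear ramp: 0 -> 10, 1 -> 10.5, ..., 19 -> 19.5
    return BLACK_FLOOR + (level / ramp_end) * (ramp_end - BLACK_FLOOR)


if __name__ == "__main__":
    for level in (0, 5, 9, 10, 15, 19, 20, 50):
        print(level, clamp_to_floor(level), ramp_near_black(level))
```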
Each manufacturer will place a different value on each solution. Crushing blacks and losing shadow detail gives you more subjective contrast in a bright image.
edit: I realize that the above doesn't have much to do with theoretical resolution power. Do you have a link to info that correlates value scale (grey value 10 on what standard scale) with ANSI contrast ratio?
Oh, I was talking about 10 out of 255 to be consistent with my previous example. Usually they talk about IRE levels (0-100), which correspond to the voltage of the video signal. It gets kind of messy with IRE 7.5 being black for NTSC, and there are clamps on the 0-255 range too depending on how it's mapped to the video signal (i.e. 0-16 is black and 235-255 is white).
I like to look at it as a fraction to avoid this nonsense, with 0.0 being black and 1.0 being white. Gamma is just the relationship output = input^2.2. So the luminance of a signal at 0.04 should be 0.00084, i.e. about 1/1200th of full output. If you use all 8 bits per channel in the active range of a display, then a value of 10 out of 255 is a display input fraction of about 0.04. That's what I was trying to illustrate.
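For what it's worth, the arithmetic as a quick Python sketch (assuming a pure 2.2 power law and full-range code values; the function name is just for illustration):

```python
# Quick check of the gamma arithmetic above, assuming a pure 2.2 power law
# and full-range code values (0 = black, max code = white).

GAMMA = 2.2

def luminance_fraction(code, bit_depth=8):
    """Intended luminance of a code value, as a fraction of full white."""
    max_code = (1 << bit_depth) - 1  # 255 for 8-bit
    return (code / max_code) ** GAMMA

frac = luminance_fraction(10)  # 10/255 is an input fraction of ~0.04
print(frac)                    # ~0.0008
print(1 / frac)                # ~1240, roughly the 1/1200th figure above
```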
I'm not quite sure what you mean by a chart from grey scale to ANSI contrast, but hopefully this post helps. I was just saying that if you really want to accurately display 1-255 (i.e. 0.004 to 1.0) according to the standard 2.2 gamma, then you need a display with a 200,000:1 contrast ratio.
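The 200,000:1 number falls out of the same formula: take the darkest non-black code value and see what contrast ratio you'd need to land it at its intended luminance. A minimal sketch, same assumptions as above (the 10-bit figure is just the same arithmetic extended, not a claim from anyone in the thread):

```python
# Contrast ratio needed to honestly render code value 1, i.e. white luminance
# divided by the intended luminance of the darkest non-black shade.
# Assumes a pure 2.2 power law and full-range code values.

GAMMA = 2.2

def required_contrast(bit_depth):
    max_code = (1 << bit_depth) - 1
    darkest = (1 / max_code) ** GAMMA  # intended luminance of code value 1
    return 1 / darkest

print(required_contrast(8))   # ~197,000:1 -- the ~200,000:1 figure above
print(required_contrast(10))  # ~4.2 million:1 for full-range 10-bit
```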
LCDs still have two orders of magnitude to go before they legitimately need a higher bit depth in the signal. I know HDR is all the rage, but 8-bit and 10-bit have a lot more range than people give them credit for, at least as a final image format.