Idei san : Technological visionaries are screwing Sony...

So what you're saying is, you prefer to watch movies on low contrast ratio devices where you cannot see shadow detail? Where in a dark scene, everything is too dark to see detail that was on the original film print?

CR and color saturation are the biggest determinants of IQ. Brightness is also good, but can be accounted for if you can control ambient light (e.g. you have a dark theater room, black out curtains, etc). But any CR at 600:1 or lower simply stinks and no one into home theater would seriously consider it.
 
What I'm saying is that I prefer the image quality that a cinema provides. It's not razor sharp. It's not overly bright. It's not high contrast. It has nice resolution. The colors are not overly saturated.

Like someone mentioned at the AVS forums, after a display is properly calibrated, the contrast will be toned down anyway.

Anyway on my PC I prefer the Theater color setting in PowerDVD.
 
The fact is, black and shadow detail are still the problem with LCD displays and projectors. There is no true black on an LCD.

As for contrast ratio, you can always tone a display down when it has the headroom, but you cannot bump the contrast up when the panel simply cannot deliver that much.

For sure there are always bad examples in any display technology, but comparing display characteristics, the best CRT is always better than the best plasma, and the best plasma is always better than the best LCD.
 
Sure, but LCDs will continue to improve in CR while still retaining certain benefits of LCD technology, and like I said, I prefer the cinema look, which doesn't display true blacks either ;)

BTW the only instance in real life where you'll see true black is when there's no light at all. However, every film needs light, unless you're filming with DV and infrared ;)
 
Sure, but LCDs will continue to improve in CR while still retaining certain benefits of LCD technology, and like I said, I prefer the cinema look, which doesn't display true blacks either

Are we assuming that plasma and other alternatives stay stagnant in a vacuum, then?
 
notAFanB said:
Sure, but LCDs will continue to improve in CR while still retaining certain benefits of LCD technology, and like I said, I prefer the cinema look, which doesn't display true blacks either

Are we assuming that plasma and other alternatives stay stagnant in a vacuum, then?

Umm... no, but why would you need a CR that's higher than what your eyes can resolve? ;)

Similarly, why do you think there's a 16.7 million color threshold?

On an LCD you can improve CR, while you don't need to on a plasma because it has almost reached the limit. How about resolution? Have there been any plasmas that display 200 dpi? ;)

On a related note, Sharp has an 800:1 CR LCD already. Next year I predict 1000:1... so you see, LCDs are catching up fast while still retaining low power, low heat, light weight, and zero burn-in.
 
CR has nothing to do with how many colors the eye can resolve. It has to do with dynamic range. The human eye can resolve light levels beyond the range of display devices and even film. A human being can look at a brightly lit scene and resolve both the highlights and the shadows with detail, depending on where they pay attention. Our contrast ratio is in the thousands-to-one or tens-of-thousands-to-one range. Film is in the hundreds to one. The dynamic range of film is one of the main reasons for the "film" look versus video. Video cameras have a contrast ratio of about 50:1.
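To put those numbers on one scale, here's a rough back-of-the-envelope conversion in plain Python (the specific ratios are just the ballpark figures above, not measurements):

import math

# Ballpark contrast ratios quoted above (assumed representative values)
devices = {
    "human eye (attentive)": 10000,
    "film": 300,
    "video camera": 50,
}

for name, cr in devices.items():
    stops = math.log2(cr)    # photographic stops: each stop doubles the light
    orders = math.log10(cr)  # orders of magnitude
    print(f"{name}: {cr}:1 is about {stops:.1f} stops, {orders:.1f} orders of magnitude")

That works out to roughly 13 stops for the eye, about 8 for film, and under 6 for a video camera, which is exactly why video looks flat next to film.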

On an LCD, a candlelit or moonlit night scene appears too dark. It's not a matter of "true black levels", it's a matter of whether things appear without detail if they are dark.

So here's the rub. Let's say someone has filmed a scene and used the entire range of the film to capture a low-light scene. Your display device must not only be capable of displaying brightly lit scenes, it must be capable of displaying the entire range of possible scenes (low light, brightly lit, etc.). Most display devices will do well on the brightly lit scene, but very badly on the low-light scene.

This is where the subtractive technologies like LCD fall down. They cannot produce *rich blacks*. And if you don't think this matters, take a course in black and white photography to see what a difference it makes.
 
DemoCoder said:
On an LCD, a candlelit or moonlit night scene appears too dark. It's not a matter of "true black levels", it's a matter of whether things appear without detail if they are dark.

The panels are indeed improving in this area, but they still cannot match the reflective displays, and they certainly cannot compare to plasma and CRT.

Panasonic has a black-level chip that enhances the black detail before sending the image to the panel. I suppose other companies will be doing similar things to work around the limitations of the panels.

"black and shadow detail" and "true black" are two things, I had just put them into one line (but there were 2 sentenses).
 
The reason why I brought up 24-bit color is that the human eye cannot resolve more than 256 shades of any single color. Even though black and white are not colors, they still follow the same principle; therefore a 256-step greyscale bar from black to white is the maximum range the human eye can detect. However, the typical person can only differentiate between 50 shades of grey, or 50 levels of brightness. An LCD with a CR of 1000:1 can easily resolve a 50-shade greyscale bar. As a matter of fact, medical-use greyscale LCDs can display 1024 shades of grey simultaneously.

From two different reviews:

The L685 exhibited no ghosting and reproduced our complex color wheel with richer detail than most plasma monitors we’ve reviewed.

We were immediately impressed with the CG18's colour and greyscale ramp performance - the best we've ever seen from a TFT.
 
PC-Engine said:
The reason why I brought up 24-bit color is that the human eye cannot resolve more than 256 shades of any single color. Even though black and white are not colors, they still follow the same principle; therefore a 256-step greyscale bar from black to white is the maximum range the human eye can detect. However, the typical person can only differentiate between 50 shades of grey, or 50 levels of brightness. An LCD with a CR of 1000:1 can easily resolve a 50-shade greyscale bar. As a matter of fact, medical-use greyscale LCDs can display 1024 shades of grey simultaneously.

Then why do dark scenes show notable differences (personal experience with mid-range LCDs here)? What's the relation?

In other words, what's really happening here? Ideas?
 
Even if the human eye could only resolve 4 levels: 0, 256, 512, and 1024, a scene that included all 4 of them would still be beyond the CR of most displays. There is white, and then there is bright white. LCDs in grayscale simply won't reproduce the correct levels near black.

The other problem, of course, is that LCDs cannot mimic the exact gamma curve, so the colors they reproduce do not match the gamut the human eye is limited to.
 
Even if the human eye could only resolve 4 levels: 0, 256, 512, and 1024, a scene that included all 4 of them would still be beyond the CR of most displays. There is white, and then there is bright white. LCDs in grayscale simply won't reproduce the correct levels near black.

The human eye could not tell shades 512 and 1024 of the same color apart, therefore you're back to 256 and 0, which is within the capabilities of LCDs. One of the best LCDs, the EIZO Flexscan, only has a CR of 400:1, and it rivals CRTs and beats most plasmas in the reviews I've read.


We can still see banding in 24-bit images, so that's not really true.


Actually it's true, but it depends on the display device.
 
Recapping the thread thus far:

A: Sony is dying because of blockheads wasting money on PS3 instead of profitable stuff like flat panel.

B: You're nuts, Playstation is the most profitable thing they have. Besides, Sony doesn't need leadership in LCDs, it's leapfrogging to OLED and plasma.

A: But LCD rules right now, and will only get better. OLED and plasma aren't as good. What about burn-in and all that?

B: Well, what about refresh rates, CR, and bad pixels? LCD is not better.

A: Is too!

B: Is not!

(...)

As people here who have actually studied computer graphics undoubtedly know, no RGB-based screen can display the full range of color the eye can capture, since you can't draw a Maxwell color triangle on the CIE xy plot that encompasses the full plot.

Contrast Ratio - we're not even close yet.
The rods and cones in your eyes have a sensitivity difference of 1:10,000. Add the filtering of your brain and the variable aperture of your pupil, and your eye can "see" a range of 14 orders of magnitude (1:100,000,000,000,000). But not all of this range is practically usable. A good estimate of the usable range is the luminosity from starlight to sunlight. That's 8 orders, or 1:100,000,000.

Color discrimination:
There are three parts to it: spectra (hue), chroma, and luminosity.

Spectra:
The human eye can distinguish between two spectra of light that are one nanometer apart in the middle ranges of the visual spectrum, usually defined to be about 400-650nm.

Chroma:
At maximum sensitivity, at 470nm, the eye gets about 0.7 log. (I don't know much about how they test this, so don't ask me what it means.)

Luminosity:
At an average luminosity, the eye can detect changes in luminance as small as 3%. At higher luminosities, the eye's log sensitivity gets better; at lower luminosities, the sensitivity gets worse.

Conclusions:
The eye definitely can resolve more than 256 shades of one color (one hue). The photopic range (color range) of the eye is 6 orders of magnitude. At 3% apart, 256 shades only take you about 3.3 orders of magnitude (quick check at the end of this post).

The eye definitely can resolve more than 50 shades of gray. The eye's B&W range is the full 14 orders of magnitude.

It seems to me that the eye has quite a bit more range than 24-bit color, or your average plasma or OLED, not to mention LCD.
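
Here's the arithmetic behind those conclusions as a tiny Python sketch (assuming a constant 3% just-noticeable step, which is a simplification of how the eye really behaves):

import math

step = 1.03  # assumed just-noticeable brightness step of ~3%

orders_256 = 256 * math.log10(step)  # range covered by 256 such steps: ~3.3 orders of magnitude
steps_8 = 8 / math.log10(step)       # steps to span ~8 usable orders (starlight to sunlight): ~620
steps_14 = 14 / math.log10(step)     # steps to span the full ~14 orders: ~1090

print(orders_256, steps_8, steps_14)

So even the conservative "usable" range would need far more than 256 distinguishable steps.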
 
Put two consecutive shades of the same color out of a 256-shade palette next to each other and see if YOU can see the difference. The proof is in the pudding. We're not interested in how many photoreceptor cells the eyes contain, we're interested in what those can actually detect in the real world, in real conditions.

BTW SONY doesn't manufacture 40" and bigger plasma displays; they buy them from Fujitsu or NEC, therefore they're not leapfrogging into plasmas ;)
 
PC-Engine said:
Put two consecutive shades of the same color out of a 256-shade palette next to each other and see if YOU can see the difference. The proof is in the pudding.

BTW SONY doesn't manufacture 40" and bigger plasma displays; they buy them from Fujitsu or NEC, therefore they're not leapfrogging into plasmas ;)

No problem. We're not talking about two paints, we're talking about light. (In case you forgot, flat-panel TVs give out light.) Give me 256 gradations between a faint red star and standing four inches in front of a landing-approach strobe light, and I can tell the difference between any two of them.
 
nondescript said:
PC-Engine said:
Put two consecutive shades of the same color out of a 256-shade palette next to each other and see if YOU can see the difference. The proof is in the pudding.

BTW SONY doesn't manufacture 40" and bigger plasma displays; they buy them from Fujitsu or NEC, therefore they're not leapfrogging into plasmas ;)

No problem. We're not talking about two paints, we're talking about light. (In case you forgot, flat-panel TVs give out light.) Give me 256 gradations between a faint red star and standing four inches in front of a landing-approach strobe light, and I can tell the difference between any two of them.

How about I draw you a picture using shade 300 of white and shade 301 of white? You think you can see what the picture is? :LOL: :p

BTW if your idea of watching DVDs is to stare into a flashlight, then I suggest you calibrate your display device ;)
 
Look PC-Engine, it is well known that no display currently on the market can produce the gamut that the human eye sees. This is scientific fact, and the work the CIE has done since 1931 is based on this fact. This is basic computer graphics theory. You can go out and buy a colorimeter or spectrophotometer and do the tests yourself to prove it, and after you measure the range of an LCD's gamut, you'll see that it doesn't come close to matching the human gamut. No display can match the eye's color space. You are totally misunderstanding the problem by looking at RGB or how many pixel variations you can display. It's totally irrelevant.


The 16 million color argument is an urban myth. Let's say that the eye is capable of seeing colors from 0.0 to 1.0 on a real line. Your LCD display may be able to reproduce colors from 0.2 to 0.7, and it may have extremely fine detail within that range (say, 16 million different colors between 0.2 and 0.7, each color spanning a 0.5/2^24 interval), but it misses half the colors the eye can see. In truth, the eye might not even be able to see more than 7 million distinct colors, but the resolution the LCD concentrates into that narrow range is wasted, because it's a small range.

An LCD could produce over four BILLION shades of red alone (let's say, 32 bits for red, 32 bits for green, and 32 bits for blue) and you'd still be wrong, because all four billion shades would be in a narrow range.
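
To make that range-versus-resolution point concrete, here's a toy Python illustration (the 0.0-1.0 "eye" interval and 0.2-0.7 "panel" interval are just the made-up numbers from the analogy above, not real gamut data):

# Toy model: the eye sees [0.0, 1.0]; a hypothetical panel only covers [0.2, 0.7]
eye_lo, eye_hi = 0.0, 1.0
panel_lo, panel_hi = 0.2, 0.7
panel_codes = 2 ** 24  # "16.7 million colors"

step = (panel_hi - panel_lo) / panel_codes            # resolution inside the panel's range
coverage = (panel_hi - panel_lo) / (eye_hi - eye_lo)  # fraction of the eye's range the panel covers

print(f"step inside the panel's range: {step:.1e}")           # about 3e-08, far finer than anyone needs
print(f"fraction of the eye's range covered: {coverage:.0%}")  # still only 50%

Adding more codes inside the same narrow interval only shrinks the step size; it never extends the interval.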

The eye's color system might have less resolution (but greater range) than a typical 16 million color display, but the eye's luminance system has way more resolution and range. Human beings have been tested to be able to detect fewer than 10 photons, which is extremely dim light, and they can also see in extraordinarily bright light that would way overexpose any camera, even if it's been stopped down and set to a 1/4000th of a second exposure. Of course, the usable range starts well above 10 photons (otherwise, we wouldn't need military night vision), but the overall range blows away any display.

You think any display on the market can even begin to show you an image that compares to your experience of a dimly lit room or a moonlit night? No low-light scene on an LCD that I've ever seen gets even close to the rich, creamy shadows of a candlelit scene.


BTW, here is a color gamut comparison of LCD vs OLED. LCD is the solid center, OLED the mesh

[attached image: oled2003PFig1.jpg]
 
PC-Engine said:
How about I draw you a picture using shade 300 of white and shade 301 of white? You think you can see what the picture is? :LOL: :p

BTW if your idea of watching DVDs is to stare into a flashlight, then I suggest you calibrate your display device ;)

Stop putting words in my mouth.

If you had a light that changed from 300 to 301 white, I could tell that the lighting level had changed, and that's color discrimination. 300 to 301 doesn't provide a lot of contrast, so you're right, it's hard to make out an image. That's why we want more contrast. Obviously.

Anyway, I would like a screen to be as bright as the real thing they filmed. If it's a moonlit scene, then give me that level of lighting. If it's a flashlight, then I hope the screen is bright enough to accurately display that too. Whatever the movie camera captures, ideally my screen should be able to display, including staring into a flashlight.

Give it up. Instead of picking at minor details in my posts and putting words in my mouth, or, when you can't do that, cracking senseless jokes, why can't you just admit that the eye can discriminate between more than 50 grays and more than 256 shades of a color, and that the eye can handle much, much more contrast than any screen can provide? (Oh wait, never mind, that would require you to change your mind, which is about as likely as a flying rhino ;) )

Keep your LCD. If it's so great, and your eyes can't tell the difference between the LCD and real life, that's wonderful, you just saved yourself a lot of money.
 