HDR Displays?

Mintmaster said:
Brimstone said:
That is really cool. Sony already has working products based on this technology.

Let the hype begin for the ultimate home entertainment combination: a PS3 and HDR HDTV with CELL.
I don't think this TV has HDR in mind (though it seems to be a natural extension). LCDs have quite poor contrast ratios (and hence poor black levels), and this would circumvent that problem. It also helps with the colour reproduction.


It definitely uses an LED backlight, but it doesn't seem to be an HDR display, unfortunately.

At the top of the range is a 46-inch LCD model that the company will sell under its Qualia brand name. The model uses a panel produced by Samsung Electronics that is currently the world's largest in commercial production and, perhaps more importantly, 1 inch larger than the biggest panel being produced by Sharp.

The Qualia 005 TV is also the first in the world to use an LED backlight, which Sony says helps the set deliver a superior picture with truer and richer colors.

That appears to be true. At the unveiling, the new Qualia set was displayed alongside a competitor's model, unnamed but unmistakably a Sharp set because of its case style, and the Qualia set did appear to produce a better picture, at least for the sample images used in the demonstration.

"In the sense of technology we have completely caught up [with Sharp]," Kogure says.

Ken Kutaragi, executive deputy president and chief operating officer of Sony, went one further and said his company had not only caught up with Sharp, but overtaken it. With Sony committed to pushing more networking technologies and with its experience in semiconductor technologies, from now on Sharp will always be playing catch-up with Sony and not vice versa, he says.

Sony's TV plans take shape

The LED technology, called Triluminos, wasn't developed by Sony. Apparently they used Lumileds' Luxeon LEDs in the TV.

"Sony's QUALIA 005 and the Triluminos backlight are the latest examples of never-before-possible applications enabled by Luxeon," said Menko de Roos, Executive Vice President of New Business Development at Lumileds. "Consumer expectations will forever be changed by Sony's implementation of Luxeon LEDs and the on-screen performance that viewers are treated to. We are pleased to have worked with Sony on this groundbreaking technology."

Lumileds' Luxeon LEDs Power World's First LED-Based TVs from Sony and Deliver the Richest, Most Saturated Color Ever Seen on a Television


Lumileds
 
Mintmaster said:
Fred da Roza said:
Just curious, but have you actually seen an HDR display, and do you have a link? I've seen requirements for the military to output 16-bit monochrome video to simulate sensors, but I've never seen a commercially available HDR display.
No, I haven't seen one. I'm just assuming that if this technology does take off, that's probably where it'll be used first. They could probably simulate outdoor environments better for pilots. I doubt we'll see HDR in the consumer space for at least a decade, because you need a new signal standard.

No standards for fiber?
 
Fred da Roza said:
No standards for fiber?
Not sure what you mean, but let me clarify what I meant.

You need a new signal standard to be adopted by the electronics industry for recording, editing, broadcasting/transmission, and playing back at home. HDTV is just starting to enter the mainstream now, so the next major change in video signal standards will be a long way off.
 
Xmas said:
The difference between VR goggles and reality isn't perceiving two different images, but that those images are fixed on a single focal plane. You can't focus on an object to see it in more detail. All objects are as sharp as it gets, and your eyes are permanently focused on something very close.
This might be a cause of headache for some, but I don't see this being any different with any kind of single-eye dominance.
That depends on the focal length of the lens that's projecting the image into your eye. On any VR headset made by a competent person the "screen distance" will be set so that it's comfortable.
Actually, F1 drivers had a similar problem with the HUD displays in their helmets some years ago. They had to refocus every time they looked at it, thereby taking attention away from the road for a second. The problem was solved by inserting a different lens between display and eye.
 
Mintmaster said:
Fred da Roza said:
No standards for fiber?
Not sure what you mean, but let me clarify what I meant.

You need a new signal standard to be adopted by the electronics industry for recording, editing, broadcasting/transmission, and playing back at home. HDTV is just starting to enter the mainstream now, so the next major change in video signal standards will be a long way off.

I would guess that TV is the last medium that will adopt HDR, but what about monitors? I consider those mainstream, and I would guess you will see HDR there before TV.

In conjunction with 16-bit video, I've also seen requirements for video via fiber-optic cables. There are some commercially available cards that can convert DVI to fiber. Is there a standard for this?
 
Mintmaster said:
I think we're barking up the wrong tree with these HDR displays. I don't see the average computer user finding much use in them (though the contrast ratio is nice compared to today's LCDs), so it seems like it'd be a niche product.

The fact is we are very satisfied with the portrayal of reality provided by TVs and movie screens, which don't produce extremely high brightness when showing things like the sun. Sure, CRTs are indeed capable of essentially infinite contrast ratio, but other display technologies aren't, and they're generally more desirable, which shows that consumers don't put dynamic range high on their priority list. Add to that the need for a different broadcast format, and I don't see widespread usage in the near future, be it for PCs or home theatre.

We should aim to reach the realism captured by video cameras first. Both hardware and software need to make big advances here. I think HDR displays are mostly just a gimmick, and will remain so for at least a decade (aside from specialized applications, e.g. the military).

If it can reduce power, there is an instant market for them regardless of any image improvements. It could be PDAs/cell phones and laptops that drive it into the mainstream.
 
Briareus said:
If it can reduce power, there is an instant market for them regardless of any image improvements. It could be PDAs/cell phones and laptops that drive it into the mainstream.
Again, that doesn't really have anything to do with HDR. You're just talking about the use of LEDs.

The difference in power consumption would be very low, AFAICS. In cell phones you generally have a white/lit background, so most or all of the LEDs will be lit anyway. For laptops, black text on a white background is standard for word processing and web pages, so you'll only save power sometimes.

I think the contrast ratio and colour improvements are more important benefits of this technology. Not only that, but response time should be better, since the eye is much better at tracking luminance changes than chrominance changes, and LEDs change intensity rapidly. With the reduced resolution of the LED array, though, this effect is probably limited.
 
Hey, what's up. I've been lurking around these forums for some time, but never got my lazy self around to registering and commenting. Seeing this discussion on the HDR display, a topic near and dear to me (being one of the authors, and a Sunnybrook employee), I figured this would be an ideal time to register and throw what I know into the fray. It's probably best to just do one monster post replying to everyone's comments as best I can.

The basic concept described is right. We take an off-the-shelf LCD panel and stick a bunch of LEDs behind it. You can control them all separately, so you have a low-res monochrome HDR backlight and a high-res color LDR front panel. You drive these in tandem to achieve the full dynamic range. The basic idea is a solution to the fact that LCD panels can only block so much light: if you want brighter brights without what we do, you get brighter darks as well.
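
To make the tandem driving a bit more concrete, here's a minimal sketch of the dual-modulation split in Python (a toy illustration of my own, not our production algorithm: it assumes a normalized linear-light luminance image, uses a made-up LED grid size, and ignores the LED point-spread function, calibration, and gamma):

```python
import numpy as np

def split_hdr_frame(hdr_lum, led_grid=(18, 24)):
    """Split a target HDR luminance image (2D array, linear light,
    normalized so 1.0 is the combined peak) into a low-res backlight
    image and a high-res LCD image whose product approximates it."""
    h, w = hdr_lum.shape
    gh, gw = led_grid  # made-up grid; must divide the panel size

    # Low-res backlight: roughly the square root of the target
    # luminance averaged over each LED's region.
    blocks = hdr_lum.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    backlight = np.sqrt(blocks)

    # Upsample to panel resolution. (Nearest-neighbour here; the real
    # display has to model each LED's smooth point-spread function.)
    bl_full = np.repeat(np.repeat(backlight, h // gh, 0), w // gw, 1)

    # High-res LCD image: whatever correction remains, clamped to the
    # panel's achievable transmission range.
    lcd = np.clip(hdr_lum / np.maximum(bl_full, 1e-6), 0.0, 1.0)
    return backlight, lcd
```

The square root splits the dynamic range roughly evenly between the two modulators, so each one only needs its usual control precision to cover the combined range.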


Apple740 : No, it runs as fast as a normal display with GPU-based image processing.

Bjorn : That Sony display is a completely different backlight technology, though it is true that they both use LEDs. The Sony (and NEC has one as well) uses three differently colored LEDs to improve the color gamut of the display. (Note that "colors" here is not the same as in the sense that your display has 16 million "colors". I mean color gamut, the most saturated chromaticity that can be displayed. This has nothing to do with the intensity of the light.) The HDR display, on the other hand, uses LEDs to create a spatially variant backlight that can be considerably brighter (and darker) than a normal display. The LEDs used have roughly the same spectrum as a conventional LCD backlight, and do nothing to improve the gamut.

aranfell : The lens in your eye is of considerably lower quality than optical glass. If you look at a bright point (roughly 150x brighter than the surround), light leaks over into the dark surround, obscuring the detail. So you can have errors in the surround that you can't see.

Remi : Yes, there are fairly effective ways to fake it that look good.

Chalnoth : Because there are only a few LED values, we can pack the entire backlight into a single scanline of the DVI signal.
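
To give a rough idea of how that can work (a hypothetical illustration, not our actual wire protocol):

```python
import numpy as np

def pack_backlight_scanline(frame_rgb, led_values):
    """Hypothetical sketch: embed the 8-bit LED drive levels in the
    first scanline of the frame sent over DVI; the display controller
    strips them off before driving the LCD panel."""
    # A few hundred LEDs fit easily in one row of pixels.
    assert led_values.size <= frame_rgb.shape[1]
    out = frame_rgb.copy()
    # One LED level per pixel, replicated into R, G, and B.
    out[0, :led_values.size, :] = led_values[:, None]
    return out
```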

Mintmaster : All the movie studios process in HDR. They throw away tons of data when they bake a DVD for you. For text on a white background, 100:1 contrast is great. I can assure you, at Siggraph, pretty much everyone who saw videos on it wanted one. The real world has more dynamic range than your TV. Many people want to see that.

Brimstone : It'd take any game studio with an HDR render engine less than a week to drop in the SDK and output to the display.

Mintmaster : The dynamic range of film is greater than that of an LCD monitor. Movie studios are dying for these for their compositors and lighting designers.

Fred da Roza : Several are under discussion.


Alright. Phew. That's a start on answering some questions. It's all a very simplistic view of what we are doing, but it's a start. I'll be glad to answer more questions within my ability. I'm trying to get some good comprehensive write-ups on HDR in general, the specifics of the display, and especially the psychophysics involved, since that is the most foreign to CS people. Other than that, have a look at the paper if you are interested: http://www.cs.ubc.ca/~mmt/Siggraph.04.pdf
 
squarewithin said:
Mintmaster : All the movie studios process in HDR. They throw away tons of data when they bake a DVD for you. For text on a white background, 100:1 contrast is great. I can assure you, at Siggraph, pretty much everyone who saw videos on it wanted one. The real world has more dynamic range than your TV. Many people want to see that.
Mintmaster : The dynamic range of film is greater than that of an LCD monitor. Movie studios are dying for these for their compositors and lighting designers.
Welcome to the forums, square!

I know LCDs suck at dynamic range / contrast ratio. That's why I've completely written off LCD rear-projection TVs, and am looking at DLP almost exclusively. But the popularity of RP LCD shows that many people don't care about dynamic range. Sony showed a 70" SXRD TV some time ago that uses LCOS, which doesn't have the greatest contrast ratio, yet people were drooling all over it.

For movie studios, why haven't they just been using a high brightness CRT before? They can show perfect blackness, so dynamic range should be extremely good.

Anyway, I was talking about the usefulness of HDR displays for the home consumer. We'd need a broadcasting standard revolution, and that won't happen until HDTV gets a bit old first. It might show up on home computers, but won't get mainstream for ages.
 
Thanks for the welcome. I'll be poking my head around here more often.

But the popularity of RP LCD shows that many people don't care about dynamic range.

Well, maybe when the difference is a factor of 2 or less, not a factor of 200 or more.

For movie studios, why haven't they just been using a high brightness CRT before? They can show perfect blackness, so dynamic range should be extremely good.

Film stock has a dynamic range of roughly 12-14 bits of luminance (I forget exactly). Even if you have a brighter CRT, it's still only 8 bits.
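
To put rough numbers on that (treating code values as linear luminance steps, which glosses over gamma, so this is a ballpark illustration only):

```python
# Ratio of brightest to darkest non-zero code value for a given bit
# depth, assuming linear luminance steps (real signals are
# gamma-encoded, so take this as a ballpark only).
for bits in (8, 10, 12, 14):
    print(f"{bits:2d} bits -> {(1 << bits) - 1:>6}:1")
```

Even at the low end of that film estimate, 12 bits is a 4095:1 range against 255:1 for an 8-bit signal.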
 
squarewithin said:
Alright. Phew. That's a start on answering some questions. It's all a very simplistic view of what we are doing but is a start. I'll be glad to answer more questions within my ability.

I've heard some of the new projection systems will have outrageous resolutions, but I haven't heard anything about their precision. Do you know what precision LED, LCOS and laser projection systems will support?

Will the black levels of LED and LCOS projectors be any better than present-day LCDs, and how will this be achieved?

Any details on the several standards under discussion?
 
squarewithin said:
But the popularity of RP LCD shows that many people don't care about dynamic range.

Well, maybe when the difference is a factor of 2 or less, not a factor of 200 or more.
The difference is a lot more than that, especially if you compare CRTs with LCDs. Yet most people love the picture quality of LCDs for some reason (sharpness?).

For movie studios, why haven't they just been using a high brightness CRT before? They can show perfect blackness, so dynamic range should be extremely good.

Film stock has a dynamic range of roughly 12-14 bits of luminance (I forget exactly). Even if you have a brighter CRT, it's still only 8 bits.
Not sure where you got that info from. Current CRTs are driven by an analogue signal. Their luminance resolution is limited by the signal feeding them. It's very easy to control electron beam intensity with more than 8 bits of resolution. This HDR display must have a different signal standard, so it would be very simple to put those same electronics in a CRT.

I dunno, maybe you're right about the demand for these displays in the film industry. I just don't see why they didn't do it before. The capability has probably been around for at least a decade.
 
Mintmaster said:
This HDR display must have a different signal standard, so it would be very simple to put those same electronics in a CRT.

I dunno, maybe you're right about the demand for these displays in the film industry. I just don't see why they didn't do it before. The capability has probably been around for at least a decade.

There is a standard called High Definition Serial Digital Interface that uses 10 bits. I believe this LCD can use it.

http://www.ggvideo.com/mar_vr653p-hdsdi.htm

The black level is probably still crap.

Edit

Another model from Marshall
http://www.plasma.com/marshall/lcdRack/vr171p-hd.htm
 
Fred da Roza said:
There is a standard called High Definition Serial Digital Interface that uses 10 bits.

That's still not enough. We need 16. We are looking at new extensions to the DVI spec. Also, I think there are considerable advantages in working with a scene-referred output format, where you send 32-bit FP values down the wire and let whatever display you are working with handle the tonemapping, processing, and color space conversion.
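
Display-side tone mapping in that kind of pipeline could look something like this (a toy global operator of my own, purely for illustration; "peak_nits" is a made-up parameter, and a real display would do something far more sophisticated):

```python
def display_side_tonemap(scene_lum, peak_nits=3000.0):
    """Toy sketch of display-side tone mapping for a scene-referred
    signal: the wire carries absolute scene luminance as FP values,
    and each display compresses them to its own peak capability."""
    # Simple Reinhard-style curve: maps [0, inf) onto [0, peak_nits),
    # nearly linear for luminances well below the display's peak.
    return peak_nits * scene_lum / (peak_nits + scene_lum)
```

The point is that the same scene-referred signal would look as good as each display can manage, instead of being baked down to one output-referred standard.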
 
16 bits per component is already in the DVI 1.0 spec. But that's meant to add precision, not range. And you need dual-link for it.
Leaving the tone mapping to the display device seems like a nice idea. But FP32 is still overkill for some time to come, because of memory bandwidth constraints as well as DVI bandwidth constraints.
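
Some back-of-the-envelope numbers for a 1920x1080 signal at 60 Hz (blanking ignored, so real link rates would be somewhat higher):

```python
# Raw pixel bandwidth at different bit depths per colour component.
width, height, hz, channels = 1920, 1080, 60, 3
for name, bits in (("8-bit int", 8), ("16-bit FP", 16), ("32-bit FP", 32)):
    gbps = width * height * hz * channels * bits / 1e9
    print(f"{name}: {gbps:4.1f} Gbit/s")
```

That puts FP32 around 12 Gbit/s of pixel data, against roughly 4 Gbit/s of payload on single-link DVI, so even dual-link couldn't carry it.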
 
Squarewithin, thanks for joining! Great info.

Mintmaster wrote: "I know LCDs suck at dynamic range / contrast ratio. That's why I've completely written off LCD rear-projection TVs, and am looking at DLP almost exclusively."

Good point -- thanks! I was wondering what HDTV technology to look into.

"But the popularity of RP LCD shows that many people don't care about dynamic range. Sony showed a 70" SXRD TV some time ago that uses LCOS, which doesn't have the greatest contrast ratio, yet people were drooling all over it."

70" is impressive by itself. But I bet if they saw it side-by-side with a 60" HDR, they wouldn't even look at the LCD. It's true, to see the HDR demo display at SIGGRAPH was to want it. There's a saying that people can't want what they don't know about -- not always true, but I certainly didn't see what was wrong with the non-HDR images until I looked at the HDR display.

By the way, I was just reading that some color-wheel DLP sets include a "dark green" field to eliminate banding in dark areas, and that some also adjust the brightness of the light source based on the overall brightness of the frame. This is a long way from true HDR, but it shows that the TV manufacturers see improved dynamic range as a real opportunity for improving perceived image quality.
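
Global dimming of that sort is simple to sketch (my own illustrative toy, not any manufacturer's actual scheme): run the lamp at the frame's peak level and rescale the pixel values to compensate:

```python
import numpy as np

def global_dimming(frame, min_lamp=0.1):
    """Toy global-dimming scheme: drive the light source at the
    frame's peak value (clamped to a floor) and rescale the pixels
    to compensate, deepening blacks in dark scenes."""
    lamp = max(float(frame.max()), min_lamp)
    return lamp, np.clip(frame / lamp, 0.0, 1.0)
```

The HDR display is essentially the same idea taken to its limit: many small light sources dimmed independently instead of one global one.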

Aranfell
 
Squeak said:
Xmas said:
The difference between VR goggles and reality isn't perceiving two different images, but that those images are fixed on a single focal plane. You can't focus on an object to see it in more detail. All objects are as sharp as it gets, and your eyes are permanently focused on something very close.
This might be a cause of headache for some, but I don't see this being any different with any kind of single-eye dominance.
That depends on the focal length of the lens that's projecting the image into your eye. On any VR headset made by a competent person the "screen distance" will be set so that it's comfortable.
Actually, F1 drivers had a similar problem with the HUD displays in their helmets some years ago. They had to refocus every time they looked at it, thereby taking attention away from the road for a second. The problem was solved by inserting a different lens between display and eye.

But the HUD displays are a single 2D plane. For a full 3D VR display you would need a multifocal lens system combined with an eye-tracking system that would check which point of the image the user is currently looking at and adjust the focal length accordingly. Or something like that.
 