Digital Foundry Article Technical Discussion Archive [2015]

Don't worry, all the current 4K sets are going to get pushed out (become cheaper) with the pending arrival of HDR (buzzword) 4K sets.
I do agree with him. HDR output would have a much higher impact on the image than a few extra pixels. However, camera technology needs to advance to make this possible, and we would need floating point image processing + storage in all the video/image editing software. Color correcting HDR content is also different (standard LUT techniques do not work, since both the input and output luminance data ranges are huge).
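To make the LUT point concrete, here is a minimal Python/NumPy sketch (the Reinhard-style curve and the value ranges are illustrative assumptions, not anyone's actual pipeline): an 8-bit LUT only works because SDR values live in a fixed 0-255 domain, whereas an HDR grade has to be expressed as a function over floating-point linear light.

```python
import numpy as np

def sdr_lut_grade(img_8bit, lut):
    """Classic SDR grading: index a 256-entry LUT with integer pixel values."""
    return lut[img_8bit]                      # only works because the domain is 0..255

def hdr_parametric_grade(img_linear, exposure=1.0, gamma=1.0):
    """HDR grading sketch: operate on float linear light, no fixed-size table."""
    img = img_linear * exposure               # exposure applied in linear light
    img = img / (1.0 + img)                   # simple Reinhard-style range compression
    return np.power(img, gamma)               # display-referred shaping

# SDR: 8-bit frame graded through a 256-entry (identity) LUT
sdr_frame = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
lut = np.arange(256, dtype=np.uint8)
_ = sdr_lut_grade(sdr_frame, lut)

# HDR: linear luminance spanning ~0.001 to ~4000 cd/m^2, stored as float32
hdr_frame = np.float32(10 ** np.random.uniform(-3, 3.6, (4, 4)))
_ = hdr_parametric_grade(hdr_frame, exposure=0.5)
```

The point is only that a fixed-size table indexed by integer code values has nowhere to put several extra orders of magnitude of luminance; the grade has to become a parametric function over floats.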
 
I do agree with him. HDR output would have a much higher impact on the image than a few extra pixels. However, camera technology needs to advance to make this possible, and we would need floating point image processing + storage in all the video/image editing software. Color correcting HDR content is also different (standard LUT techniques do not work, since both the input and output luminance data ranges are huge).

Dolby are trying to pull all this together with Dolby Vision.

Whitepaper
 
Don't worry, all the current 4K sets are going to get pushed out (become cheaper) with the pending arrival of HDR (buzzword) 4K sets.
Who better than us to create HDR? Shouldn't our eyes be the ones naturally creating the HDR instead of those TVs? I mean, I agree with Shifty -from another thread- that those technologies are just tricks to mimic real life, but shouldn't it be us, not the TV, performing the HDR and behaving like the human iris, I think?

Still, I am not in a hurry to get a 4K TV; patience will pay off, 'cos I got a 3D Full HD TV a year ago and I'm very happy with it until the next generation makes 4K the standard, then we can rejoice..
 
I do agree with him. HDR output would have a much higher impact on the image than a few extra pixels. However, camera technology needs to advance to make this possible, and we would need floating point image processing + storage in all the video/image editing software. Color correcting HDR content is also different (standard LUT techniques do not work, since both the input and output luminance data ranges are huge).
I couldn't imagine that this technology could be so important in the future. In fact it sounded a bit like science fiction, because I remember a fellow forumer posting a video of a 32" HDR TV that was meant to show off its HDR technology rather than compete with 4K sets. I have a 32" TV and, well, it's fine, but it's small by today's standards, so I thought it would be either a fad without even the bells and whistles, or something for a lucky minority, a too-expensive technology.
 
If HDR TV works, it won't be a gimmick. The idea is to have incredibly fine-grained local dimming/brightness, so you can have something like a candle glow on screen without it inappropriately messing up the brightness of the pixels around it. That means you get high contrast, realistic brightness and an expanded colour gamut.
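For a rough idea of what that fine-grained dimming means in practice, here is a hypothetical zone-dimming sketch (the zone size, frame values and compensation scheme are assumptions for illustration, not how any particular set works):

```python
import numpy as np

def local_dimming(frame, zone=4):
    """Drive each backlight zone by its brightest pixel, compensate in the LCD."""
    h, w = frame.shape
    backlight = np.zeros_like(frame)
    for y in range(0, h, zone):
        for x in range(0, w, zone):
            block = frame[y:y+zone, x:x+zone]
            backlight[y:y+zone, x:x+zone] = block.max()   # per-zone LED level
    # LCD transmission compensates for the lowered backlight (avoid divide-by-zero)
    lcd = np.divide(frame, backlight, out=np.zeros_like(frame), where=backlight > 0)
    return backlight, lcd

# 8x8 mostly-dark frame with one bright "candle" pixel (linear light, 0..1)
frame = np.full((8, 8), 0.01, dtype=np.float32)
frame[2, 5] = 1.0
backlight, lcd = local_dimming(frame)
# Only the zone containing the candle runs its LEDs at full power;
# the rest of the screen keeps its backlight (and therefore its black level) low.
```

The finer the zones, the closer you get to the "candle next to deep black" case without haloing the pixels around it.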

Think of what JJ Abrams will be able to do with lens flares!!!!
 
Think of what JJ Abrams will be able to do with lens flares!!!!

[attached image: flare.jpg]
 
Who better than us to create HDR? Shouldn't our eyes be the ones naturally creating the HDR instead of those TVs? I mean, I agree with Shifty -from another thread- that those technologies are just tricks to mimic real life, but shouldn't it be us, not the TV, performing the HDR and behaving like the human iris, I think?
In this particular context, we're referring to TVs actually displaying a wide range of luminances.

Imagine deep blacks and vibrant highlights. It'll be like a decade ago!
 
Is there a standard for how bright these HDR TVs can go? Am I going to have to close my eyes when there is a car scene with the camera looking directly at the headlights? Can I be blinded by a recording of a solar eclipse? Do I need to wear shades when watching those scenes? Can I record (with an HDR cam, of course!) a green laser pointed directly at the camera and play it back on the HDR TV, hoping the viewer will be blinded?
 
And where do OLED sets fit into all this?

OLED does contrast amazingly well; one set reviewed had infinite contrast, as they could not get a reading for its black. OLED as a tech has problems getting very bright at the moment, which might affect its HDR performance, but given its near perfect black levels and per-pixel brightness, this is mitigated for now. Going forward it might be harder to compete in the numbers game, but it is certainly the tech to beat at a nerd level.

Is there a standard for how bright these HDR TVs can go? Am I going to have to close my eyes when there is a car scene with the camera looking directly at the headlights? Can I be blinded by a recording of a solar eclipse? Do I need to wear shades when watching those scenes? Can I record (with an HDR cam, of course!) a green laser pointed directly at the camera and play it back on the HDR TV, hoping the viewer will be blinded?

Dolby are pushing their "Vision" HDR technology for the new 4K standard, which will include HDR as well as a new, expanded colour space. For brightness, I believe current sets are 300 or 400 nits; the upcoming HDR sets this year are more like 2,000, I believe, but Dolby has said that in customer focus groups up to 10,000 was well received, so I think that is the current "cap".
That is bright, and yes, direct sun shots could be uncomfortable if they wanted them to be. This will require direction from the director, I assume, but with subtle use it could make scenes far more lifelike.
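For reference, the transfer curve behind Dolby Vision is PQ (standardized as SMPTE ST 2084), which is defined with an absolute ceiling of 10,000 cd/m^2; that is where the "cap" figure comes from. A small sketch of the decode (EOTF) side, using the published ST 2084 constants:

```python
import numpy as np

M1 = 2610 / 16384            # SMPTE ST 2084 constants
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Decode a PQ signal (0..1) to absolute luminance in cd/m^2 (nits)."""
    e = np.power(signal, 1.0 / M2)
    y = np.power(np.maximum(e - C1, 0.0) / (C2 - C3 * e), 1.0 / M1)
    return 10000.0 * y

# Full-scale signal hits the 10,000-nit ceiling; a mid-range signal of 0.5
# decodes to only ~92 nits, leaving lots of code values for dark detail.
print(pq_eotf(np.array([0.0, 0.5, 0.75, 1.0])))
```

Unlike gamma, the curve is absolute: a given code value always means the same number of nits, whatever the panel's peak happens to be.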
 
That's over nine thousand!!!!
Anyway, while I don't mind a display that can output 10,000 nits, I do wonder how much electricity it will use, especially if the whole screen becomes bright. If current displays are 300-400 nits, that's a big difference in luminance, and thus a big difference in power consumption. It might trip my house breaker if the TV blasted one of those bright 10,000-nit images... And I imagine in the theater, when there is a very bright scene, it will illuminate the studio brighter than the studio lights themselves.
Of course a good director would limit the effect to localized things like a street bulb, a candle, far-away lightning, etc., but I don't think the masses would be that bright... :)
 
OLED does contrast amazingly well; one set reviewed had infinite contrast, as they could not get a reading for its black. OLED as a tech has problems getting very bright at the moment, which might affect its HDR performance, but given its near perfect black levels and per-pixel brightness, this is mitigated for now.
Seems like it'll depend on the environment it's in. In terms of the display results, LCDs do have one advantage in that they can sometimes be cranked up to be viewable in bright rooms. In a darker environment where people's eyes can acclimate to the level of the screen, a CRT or plasma or OLED can achieve a much more lively picture, even if the maximum light output isn't as high.
 
That's over nine thousand!!!!
Anyway, while I don't mind a display that can output 10,000 nits, I do wonder how much electricity it will use, especially if the whole screen becomes bright. If current displays are 300-400 nits, that's a big difference in luminance, and thus a big difference in power consumption. It might trip my house breaker if the TV blasted one of those bright 10,000-nit images... And I imagine in the theater, when there is a very bright scene, it will illuminate the studio brighter than the studio lights themselves.
Of course a good director would limit the effect to localized things like a street bulb, a candle, far-away lightning, etc., but I don't think the masses would be that bright... :)

I was curious about power as well. I read that one of the HDR TVs had something like thousands of LEDs that can be addressed individually, so they can do incredibly fine-grained brightness control. But if you have a scene that's mostly bright, for example a large fire that takes up most of the screen, how much power is the TV drawing? It's a real consideration.
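A back-of-the-envelope sketch of why that matters; every number below (panel size, LED efficacy, LCD stack transmission) is a rough assumption for illustration, not a spec of any real set:

```python
import math

diag_in      = 55        # assumed screen diagonal, inches
nits         = 10000     # assumed full-screen luminance, cd/m^2
efficacy     = 100       # assumed LED efficacy, lm/W
transmission = 0.07      # assumed LCD stack transmission (~5-10%)

diag_m = diag_in * 0.0254
width  = diag_m * 16 / math.hypot(16, 9)
height = diag_m * 9 / math.hypot(16, 9)
area   = width * height                       # ~0.83 m^2 for a 55" 16:9 panel

flux = math.pi * nits * area                  # lumens, assuming Lambertian emission
backlight_watts = flux / (efficacy * transmission)

print(f"screen area: {area:.2f} m^2")
print(f"light output: {flux:,.0f} lm")
print(f"rough backlight power: {backlight_watts:,.0f} W")
# => several kilowatts, which is exactly why real HDR sets only hit peak
#    brightness on small highlights rather than across the whole frame.
```

Under these assumptions a sustained full-frame 10,000-nit image lands in multi-kilowatt territory, so highlight-only peaks (the candle, the headlights) are the realistic use of that headroom.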
 
Seems like it'll depend on the environment it's in. In terms of the display results, LCDs do have one advantage in that they can sometimes be cranked up to be viewable in bright rooms. In a darker environment where people's eyes can acclimate to the level of the screen, a CRT or plasma or OLED can achieve a much more lively picture, even if the maximum light output isn't as high.

One thing I'd like to see is for higher-end TVs to start including auto-calibration features similar to what is now common with AV receivers. They would include an image sensor, similar to the microphones included with the receivers, that would be held towards the screen while test patterns were generated. I'd also like to see an ambient light meter built into the display that would, on startup and on user command, check the ambient light in the room and adjust the display output to compensate.
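Something like the ambient-light behaviour described above could be as simple as the following sketch (the sensor reading and the lux-to-backlight mapping are entirely made up for illustration):

```python
def backlight_for_ambient(lux, min_level=5, max_level=100):
    """Map an ambient illuminance reading (lux) to a backlight percentage."""
    # Dark room (~0 lux) -> dim backlight; bright room (>=500 lux) -> full output.
    level = min_level + (max_level - min_level) * min(lux, 500) / 500
    return round(level)

# Re-run on startup or on user command with a fresh sensor reading
for lux in (0, 50, 150, 500, 2000):
    print(lux, "lux ->", backlight_for_ambient(lux), "% backlight")
```

A real implementation would presumably also shift gamma and colour temperature with the room, not just peak brightness.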
 
I'd also like to see an ambient light meter built into the display that would, on startup and on user command, check the ambient light in the room and adjust the display output to compensate.
My current and last Sony TVs have had this. But it is crazy that it isn't standard by now; it's been common in laptops for the best part of a decade and in most smartphones for years.
 
Seems like it'll depend on the environment it's in. In terms of the display results, LCDs do have one advantage in that they can sometimes be cranked up to be viewable in bright rooms. In a darker environment where people's eyes can acclimate to the level of the screen, a CRT or plasma or OLED can achieve a much more lively picture, even if the maximum light output isn't as high.

Is dimness a limitation of OLED, or a power-use issue with OLED? I know Samsung AMOLED phones used to not get as bright as the top phone LCDs (although they were fine). But their newest AMOLED screens seem able to equal or surpass the brightness of LCDs in phones.

Also, in the past I read that it wasn't the OLEDs' fault they were dimmer per se; Samsung would clamp the max brightness down to save battery life. Although of course that would be because the OLEDs were using too much power at top brightness compared to LCDs. One would think this would be less of an issue on TVs. Anyway, with all the environmental regulations getting stricter, power use would be a factor even in HDTVs. It partly helped do away with plasma, for example.
 