Don't worry, all the current 4K sets are going to get pushed out (become cheaper) with the pending arrival of HDR (buzzword) 4K sets.

I just.
I can't even.
Just.
Can't.
I do agree with him. HDR output would have a much higher impact on the image than a few extra pixels. However, camera technology needs to advance in order to make this possible. And we would need floating-point image processing and storage in all the video/image editing software. Color correcting HDR content is also different (standard LUT techniques do not work, since both the input and output luminance ranges are huge).
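To illustrate the LUT point: a conventional 1D LUT is indexed by an 8-bit, display-referred code value, which simply doesn't have the resolution to address scene luminances that run from fractions of a nit up to thousands of nits. A minimal sketch of the idea (the numbers and the log shaper are illustrative assumptions, not any particular grading pipeline):

```python
import numpy as np

# Hypothetical linear-light pixel values from an HDR frame, in nits.
nits = np.array([0.05, 1.0, 100.0, 1_000.0, 4_000.0, 10_000.0])

# An SDR-style 256-entry LUT indexed linearly over the full range gives
# one entry per ~39 nits, so everything below ~39 nits lands in bin 0.
lut_step = 10_000 / 255
print("one LUT step covers about", round(lut_step, 1), "nits")
print("LUT bin per pixel:", (nits / lut_step).astype(int))

# A floating-point, scene-referred pipeline keeps precision at both ends,
# e.g. by grading through a log (or PQ-style) shaper instead of an 8-bit index.
shaped = np.log2(nits / 0.05)   # simple log encoding, purely for illustration
print("log-shaped values:", np.round(shaped, 2))
```

The exact shaper doesn't matter; the point is that the working data has to stay floating point until the final encode, which is why the existing 8-bit, LUT-based grading tools struggle.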
Who better than us to create HDR? Shouldn't our eyes be the ones naturally creating the HDR instead of those TVs? I mean, I agree with Shifty - from another thread - that these technologies are just tricks to mimic real life, but it should be us, not the TV, performing the HDR and behaving like the human iris, I think?
I couldn't imagine that this technology could be so important in the future. In fact it sounded a bit like science fiction, because I remember a fellow forumer posting a video of a 32" HDR TV which was meant to show off its HDR technology rather than compete with 4K sets. I have a 32" TV and, well, it's fine, but it's small by today's standards, so I thought it would be either a fad without even the bells and whistles, or something for a lucky minority - a too expensive technology.
In this particular context, we're referring to TVs actually displaying a wide range of luminances.
And where do OLED sets fit into all this?
Is there a standard for how bright these HDR TVs can display? Am I going to have to close my eyes when there's a car scene with the camera looking directly at the headlights? Can I be blinded by a recording of a solar eclipse? Do I need to wear shades when watching those scenes? Can I record (with an HDR cam, of course!) a green laser pointed directly at the camera and play it back on the HDR TV, hoping the viewer will be blinded?
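There is at least a standard for the signal side: SMPTE ST 2084, the "PQ" curve, encodes absolute luminance up to 10,000 nits (how much of that a given panel can actually reproduce is another matter). A quick sketch of the decode, mostly to show the range it covers - the constants are from the spec, the code itself is just an illustration:

```python
# SMPTE ST 2084 ("PQ") EOTF: decode a normalised code value (0..1)
# to absolute luminance in cd/m^2 (nits). Constants per the spec.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    p = code ** (1 / m2)
    return 10_000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0.25, 0.5, 0.75, 1.0):
    print(code, "->", round(pq_to_nits(code), 1), "nits")
# roughly 5, 92, 980 and 10,000 nits - most of the code range is spent
# on highlights rather than on blinding full-screen brightness
```

In practice, sets will top out far below the top of that curve for a long time, and only on small highlight areas.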
OLED does contrast amazingly well - one set reviewed had infinite contrast, as they could not get a reading for its black level. OLED as a tech has problems getting very bright at the moment, which might affect its HDR performance, but given its near-perfect per-pixel black levels this is mitigated for now.
that's over nine thousand!!!!
Anyway, while I don't mind a display that can output 10,000 nits, I do wonder how much electricity it will use, especially if the whole screen becomes bright. If current displays are 300-400 nits, that's a big difference in luminance, and thus a big difference in power consumption. It might trip my house breaker if the TV blasted a full-screen 10,000-nit image... And I imagine that in a theater, a very bright scene would illuminate the auditorium brighter than the house lights themselves.
Of course a good director would limit the effect to localized highlights like a street lamp, a candle, far-away lightning, etc., but I don't think the bulk of the image would ever be that bright...
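Back-of-envelope on the power question, assuming emitter power scales roughly linearly with full-screen luminance (the wattage figures are made-up ballparks, and real sets would apply brightness limiting long before this point):

```python
# Rough scaling estimate: treat backlight/emitter power as roughly
# proportional to full-screen luminance. Ignores efficiency gains,
# local dimming and automatic brightness limiting.
current_nits = 400       # assumed full-white level of a typical LCD today
current_watts = 150      # assumed panel draw at that level
target_nits = 10_000

watts = current_watts / current_nits * target_nits
print(f"~{watts:.0f} W for a sustained full-screen {target_nits}-nit white")
# ~3750 W - more than a typical 15 A household circuit can supply,
# so full-screen peak white at that level really isn't going to happen.
```

Which is probably why the proposals talk about 10,000 nits for small specular highlights, not for the whole frame.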
Seems like it'll depend on the environment it's in. In terms of the display results, LCDs do have one advantage in that they can sometimes be cranked up to be viewable in bright rooms. In a darker environment where people's eyes can acclimate to the level of the screen, a CRT or plasma or OLED can achieve a much more lively picture, even if the maximum light output isn't as high.
I'd also like to see an ambient light meter built in to the display that would, on startup and on user command, check the ambient light in the room and adjust the display output to compensate.

My current and last Sony TVs have had this. But it is crazy that it isn't standard by now - it's been common in laptops for the best part of a decade and in most smartphones for years.
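The control logic itself is trivial; a hypothetical sketch of the kind of mapping a set could use (the lux anchors and nit range are invented, not taken from any real TV):

```python
import math

def target_nits(ambient_lux: float, min_nits: float = 80.0, max_nits: float = 400.0) -> float:
    """Map measured room illuminance to a panel brightness target.

    Roughly logarithmic, since perceived brightness tracks log luminance;
    clamped to the panel's usable range. All anchor values are assumptions.
    """
    dark, bright = 5.0, 500.0   # ~dark home theatre vs ~bright living room
    t = (math.log10(max(ambient_lux, 1.0)) - math.log10(dark)) / (math.log10(bright) - math.log10(dark))
    t = min(max(t, 0.0), 1.0)
    return min_nits + t * (max_nits - min_nits)

for lux in (2, 50, 200, 800):
    print(lux, "lux ->", round(target_nits(lux)), "nits")
```

Run it on startup or whenever the user asks, and the picture tracks the room instead of needing a manual picture-mode change.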