It can't do it all or it can't do it above a certain frame rate?
HDMI 2.0 supports 4K RGB and 4:4:4 at up to 16 bits per channel, but only up to 30 fps. At 50 or 60 fps with more than 8 bits per channel, only 4:2:2 or 4:2:0 chroma subsampling is supported.
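The limitation falls out of simple bandwidth arithmetic. Here's a rough sketch; it counts active pixels only and ignores blanking intervals (real HDMI timing is larger, so actual margins are even tighter), and uses 14.4 Gbps as HDMI 2.0's payload rate (18 Gbps raw minus 8b/10b encoding overhead):

```python
# Rough HDMI 2.0 bandwidth check (simplified: active pixels only;
# real HDMI timing includes blanking, so actual margins are tighter).
HDMI20_PAYLOAD_GBPS = 14.4  # 18 Gbps raw minus 8b/10b encoding overhead

def required_gbps(width, height, fps, bits_per_channel, chroma="4:4:4"):
    """Approximate video data rate in Gbit/s for a given format."""
    # Effective channels per pixel: luma is always full resolution;
    # chroma subsampling halves (4:2:2) or quarters (4:2:0) the
    # two chroma channels on average.
    chroma_factor = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * bits_per_channel * chroma_factor / 1e9

def fits_hdmi20(fmt_gbps):
    return fmt_gbps <= HDMI20_PAYLOAD_GBPS

# 4K60 10-bit 4:4:4 needs ~14.93 Gbps -> over budget
print(required_gbps(3840, 2160, 60, 10))           # 14.92992
# 4K60 8-bit 4:4:4 (~11.94 Gbps) and 10-bit 4:2:2 (~9.95 Gbps) fit
print(required_gbps(3840, 2160, 60, 8))            # 11.943936
print(required_gbps(3840, 2160, 60, 10, "4:2:2"))  # 9.95328
```

This is why 4K60 HDR over HDMI 2.0 has to drop to 4:2:0 or 4:2:2 rather than full RGB.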
Ehh, better LCD panels have a native contrast ratio of about 3000:1, right... and HDR TVs with LCD panels are supposed to do 20,000:1 to get Ultra HD Premium certification. IMO that's not possible without local dimming, or should I say local brightening?
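The 20,000:1 figure follows directly from the published Ultra HD Premium numbers for the LCD tier (peak at or above 1000 nits, black level at or below 0.05 nits), and a quick back-of-the-envelope check shows why a 3000:1 native panel can't get there without local dimming:

```python
# Ultra HD Premium LCD tier (as published): >= 1000 nits peak,
# <= 0.05 nits black level.
uhd_premium_contrast = 1000 / 0.05
print(uhd_premium_contrast)  # 20000.0

# A native 3000:1 panel driven to a 1000-nit peak with the whole
# backlight at full blast (no local dimming):
native_contrast = 3000
black_level = 1000 / native_contrast
print(black_level)  # ~0.33 nits -- nearly 7x the 0.05-nit requirement
```

Only by dimming (or switching off) backlight zones behind dark areas can the set hit both numbers at once.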
Ha, the 930D is hardly a high-end model these days, specifically due to its edge-lit nature. Full-array local dimming sets such as Sony's 940D and Z9D are vastly superior at producing a better HDR picture. People also have to remember that HDR's extreme peak brightness is a torture test of a TV's ability to control blooming and haloing, so an edge-lit TV is by nature very limited in how faithfully it can deliver a convincing HDR picture.
Just to say...
After reading this thread I've changed my mind: instead of buying 4K+HDR I'll get a decent Full HD set (Sony WD75) or even a lower-end one.
HDR Premium will probably never come down to the level of mid-budget TVs.
I can definitely understand why the quality or wow factor of HDR has been an ongoing argument; it's mainly down to the set people have been viewing it on. All I can say is that, on the right kind of TV, an HDR picture is vastly superior to anything an SDR picture could ever dream of. And for the best HDR experience currently available, you should be looking at this:
http://www.hdtvtest.co.uk/news/kd65zd9-201610164372.htm
A "bad" 4K HDR TV can be a very good HD SDR TV.
The question must be: given this realistic budget, is a 1080p set with good contrast better, or a 4K set whose low contrast and low nits make 8-bit HDR indistinguishable from SDR?
My other option was a Samsung EU40KU6000, which I discovered is too bad for sub-4K content and not that good for 4K either.
(By yesterday evening I knew every 2015/2016 Samsung model, plus some from LG and Hisense.)
In the sub-€500 tier I can't find anything good enough, or anything without edge-lit LED.
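On the "8-bit HDR" point above: HDR signals use the PQ (SMPTE ST 2084) transfer curve over a 0-10,000 nit range, and quantizing that curve to 8 bits leaves luminance steps roughly 4x coarser than 10-bit, which shows up as banding. A sketch using the standard PQ EOTF constants (real systems mitigate this somewhat with dithering):

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalized code value in [0, 1]
# to absolute luminance in nits (0 to 10,000).
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(e):
    ep = e ** (1 / m2)
    num = max(ep - c1, 0.0)
    den = c2 - c3 * ep
    return 10000.0 * (num / den) ** (1 / m1)

# Luminance jump between two adjacent mid-range code values,
# comparing 8-bit and 10-bit quantization of the same signal.
step8 = pq_eotf(129 / 255) - pq_eotf(128 / 255)
step10 = pq_eotf(513 / 1023) - pq_eotf(512 / 1023)
print(step8 / step10)  # roughly 4x coarser steps at 8 bits
```

The coarser steps are most visible in smooth gradients like skies, which is why 10-bit is part of the HDR10 baseline.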
> Given the choice I think I'd still go for an LG OLED. Most of my serious film watching is done in the dark anyway. But as mrcorbo says, now's a good time to wait, as difficult as that may be. 1 more year, 2 at max...

Sure, you can view it in the dark, but that doesn't mean you're getting the intended minimum 1000 nits that all 4K HDR Blu-rays are mastered at; OLED clips highlights at 700 nits and over, so you lose detail. You also don't get that trademark HDR pop from the highlights, so to me it's kind of pointless to go OLED if the HDR experience is what you're after.
> IMHO spending that much money on a TV at this particular time would be foolish, though. There seems to be way too much movement in the tech at the moment. Because of this, I expect that the best HDR experience that's currently available will be a very average HDR experience in fairly short order and isn't worth paying a premium for. For me to pay a premium for an HDR TV, that HDR TV needs to deliver an HDR experience that's caveat-free, and we're not there yet.

The grail for the UHD HDR experience is pretty much 4000 nits and 100% Rec. 2020. No way in hell OLED could reach that brightness any time soon; they're still sitting at 600-700 nits. Sony's Z9D is halfway there, and its prototype has already reached that brightness. Both would take a while to reach 100% Rec. 2020, though. QLED is promising, combining the high brightness of quantum-dot LED with the self-emissive nature of OLED, but it's a new, untested tech that could bring a slew of issues, not to mention the premium price. At this rate it's going to take at least 2-3 years to find an affordable, near-grail-quality HDR set. But when you do get one, the next best thing is only a year away anyway. So what are you going to do?
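For scale on the "100% Rec. 2020" goal: comparing the areas of the gamut triangles on the CIE 1931 xy chromaticity diagram (a crude metric; u'v' area is more perceptually even) shows how far even today's best sets are. The primaries below are the published xy coordinates for each standard:

```python
# CIE 1931 xy chromaticity coordinates of the R, G, B primaries.
REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def gamut_area(prims):
    """Area of the RGB triangle in xy space (shoelace formula)."""
    (xr, yr), (xg, yg), (xb, yb) = prims
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

# Fraction of the Rec. 2020 triangle covered by the smaller gamuts
print(gamut_area(REC_709) / gamut_area(REC_2020))  # ~0.53
print(gamut_area(DCI_P3) / gamut_area(REC_2020))   # ~0.72
```

So even a set with full DCI-P3 coverage (roughly where current flagships sit) spans only about 72% of Rec. 2020 by this measure.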
> Sure, you can view it in the dark, but that doesn't mean you're getting the intended minimum 1000 nits that all 4K HDR Blu-rays are mastered at; OLED clips highlights at 700 nits and over, so you lose detail.

Can't that be fixed, for practical intents and purposes, by just reducing the multiplicative factor on the TV's brightness? Then you'd get fewer nits, but 700 nits in a dark room will still look very bright. Certainly brighter than 1000 nits in a bright room.
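The "reduce the multiplicative factor" idea is essentially tone mapping: instead of hard-clipping everything above the panel's peak, scale (or roll off) the signal so highlight detail survives, just dimmer. A toy sketch of the difference, assuming a 700-nit panel and hypothetical 900- and 1000-nit highlights (real tone mapping only compresses the top end rather than dimming uniformly):

```python
PANEL_PEAK = 700.0  # nits the panel can actually show

def hard_clip(nits):
    # Clipping: everything above the peak flattens to the same value,
    # so detail between 900 and 1000 nits is lost entirely.
    return min(nits, PANEL_PEAK)

def scale(nits, mastering_peak=1000.0):
    # Uniform scaling: highlight detail survives but the whole
    # picture dims, midtones included.
    return nits * PANEL_PEAK / mastering_peak

print(hard_clip(900.0), hard_clip(1000.0))  # 700.0 700.0 -> identical
print(scale(900.0), scale(1000.0))          # 630.0 700.0 -> still distinct
print(scale(100.0))                         # 70.0 -> midtones dim too
```

That midtone dimming is the trade-off: you keep highlight detail but give up the bright average picture level that makes HDR pop.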
> Can't that be fixed, for practical intents and purposes, by just reducing the multiplicative factor on the TV's brightness? Then you'd get fewer nits, but 700 nits in a dark room will still look very bright. Certainly brighter than 1000 nits in a bright room.

Nope, HDR content automatically forces the TV to max out its light output. Dolby Vision is a workaround, but there's no physical media for it yet. Also, the thing is that 700 nits is the peak brightness at a 2-5% window size; above a 20% window you start to lose brightness, which means on average OLED is quite a bit dimmer and gets nowhere near its peak. And all that is before taking into account ABL (Automatic Brightness Limiter), which annoyingly kicks in whenever a brightish picture shows up: you'd see something like a 150-nit sky, for example. That's terrible for playing games, where graphics whores like to gawk at nicely rendered skyboxes, snow levels, stencil art, etc. It's a no-go, mate.
> So the only "real HDR" available right now is an LCD with FALD?

To the best of my knowledge it's the most qualified by raw specs. The latest HDR shootout of four flagship models awarded both first and second place to FALD LED LCDs.