> Haha, it's not as drastic as you think; your eyes would adjust to the brightness in no time, trust me.

Yeah, blast those friggin' retinas into orbit! Who needs 'em, right?!
> I know it's all jokes. But I think some people are actually worried. It's really simple: your house lights (not even the bright ones!) push out a LOT more 'nits' than a ZD9, or any TV ever will. Yet we don't spend our days or evenings wearing sunglasses in the house, complaining the lights are too bright.
>
> And that's before we talk about the sun shining, even indirectly, into your houses!

Well, we aren't staring at the lights directly for hours.
> I know it's all jokes. But I think some people are actually worried. It's really simple: your house lights (not even the bright ones!) push out a LOT more 'nits' than a ZD9, or any TV ever will. [...]

A bad comparison: you don't stare at the sun, and I'd advise against staring directly into some house lights. I actually changed one because it was too bright and retina-searing.
I agree we still have room to go on TVs ^^
Imho, average brightness doesn't need to get any higher; a 700-nit white screen can already be uncomfortable for me at times. Highlights, however, could go much higher without discomfort, and that's where the magic happens.
> Well, we aren't staring at the lights directly for hours.

You also don't stare at a TV outputting 2000-4000 nits for hours!

We do at a TV, though.
> This is very simple to prove too. Look how bright your room is when the lights are on, or worse when the sun is out. And look at how bright the room is when you watch TV in total darkness. Not even close. The room is dark, and you see the faint glow of the TV - even my ZD9 in HDR mode can't get the room as bright as lights/sun can.

Though true, there's a slight correction you need to make, and that's how adapted to the brightness your eyes are. The pupil shrinks to keep out too much light. In a dark room, the pupil is wide open, at which point there's a lot more light entering the eye from a moderate light source than from a bright light source when the pupil is constricted. It's the sudden introduction of bright light sources that's painful. It's not damaging, I don't think, but it is uncomfortable, and there's only so much discomfort that makes sense before it goes beyond entertainment. Recreating the brightness of the desert on a TV is plain wrong if the ambient lighting is a dark living room and the transition isn't that of coming out of a dark cave into that brightness.
If TV manufacturers had any sense, they'd stick a little ambient light detector in there and cap the screen brightness at a multiple of the ambient level, ensuring it's never too bright while still allowing a bazillion nits for those viewing against the backdrop of a glass conservatory in Hawaii at midday in the height of summer.
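The ambient-cap idea above can be sketched in a few lines of Python. Everything here (the lux reading, the x4 multiplier, the 100-nit floor, the function name) is an illustrative assumption for the sake of the sketch, not how any real TV firmware works:

```python
# Hypothetical sketch: cap peak panel brightness at a multiple of the
# ambient light level reported by a sensor. All numbers are made up.

def capped_peak_nits(ambient_lux: float,
                     panel_max_nits: float = 4000.0,
                     multiplier: float = 4.0,
                     floor_nits: float = 100.0) -> float:
    """Return the allowed peak brightness for a given ambient reading.

    Note: lux (illuminance) and nits (luminance) aren't directly
    comparable, so the multiplier is a tuning knob, not photometry.
    """
    cap = max(floor_nits, ambient_lux * multiplier)  # never dim below the floor
    return min(panel_max_nits, cap)                  # never exceed the panel

# Dark living room (~5 lux): peak is clamped to the comfort floor.
print(capped_peak_nits(5.0))      # 100.0
# Sunlit conservatory (~2000 lux): the full panel brightness is allowed.
print(capped_peak_nits(2000.0))   # 4000.0
```

In between the two extremes the cap scales linearly with the sensor reading, which is roughly what the post is asking for.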
> If TV manufacturers had any sense, they'd stick a little ambient light detector in there

I should think most TVs already have these; mine certainly does, and it's a good few years old at this point.
> When OLED comes down to 500 dollars I'll think about it. Until then, LCD is good enough.

LCD looks fking great these days, and undoubtedly will only get better with time. And OLED has that pixel-wear issue too (although quantum dots also wear, IIRC? Feh. Lol.)
> And hey, maybe the doubled LCD layer announced a while back, which effectively adds per-pixel local dimming, will be cheaper than OLED

I've thought of that approach in the past; maybe I should have patented it when I had the chance! It would be a smart solution, provided the stacking process doesn't greatly cut down on the amount of light that can pass through the panel, of course.
> I should think most TVs already have these; mine certainly does, and it's a good few years old at this point.

Then there should be no worry about a high maximum brightness.
> Does it work well?

My TV is a Samsung (ridiculously long model name, impossible to remember), and yes, it actually does... Turn off all the lights and it dims the screen WAY down. It's decently responsive as well, with a few seconds of delay at most. Certainly way faster than grabbing the button-riddled, overly complicated remote and trying to navigate Samsung's shitty, convoluted on-screen menus to adjust the brightness manually!