Best 4K HDR TV's for One X, PS4 Pro [2017-2020]

Status
Not open for further replies.
I know it's all jokes. But I think some people are actually worried. It's really simple: your house lights (not even the bright ones!) push out a LOT more 'nits' than a ZD9, or any TV, ever will. Yet we don't spend our days or evenings wearing sunglasses in the house, complaining the lights are too bright :)
And that's before we talk about the sun shining, even indirectly, into your houses!
 
Well, we aren't staring at the lights directly for hours :p
We do stare at a TV, though
 
You're assuming some of us turn on the house lights instead of keeping things nice and dark cave-like.
 
A bad comparison: you don't stare at the sun, and I'd advise against staring directly into some house lights too. I actually replaced one because it was too bright and retina-searing.
I agree we still have room to go on TVs ^^
IMHO, average picture brightness does not need to get higher; a 700-nit white screen can already be uncomfortable for me at times. Highlights, however, could go much higher without discomfort, and that is where the magic happens.
 
No HDMI 2.1 for LG's 2018 OLED models; it's not unexpected... And they're going back to the 2016 approach, i.e. a different SoC for the B8 compared to the C8 and above.

 
You also don't stare at the TV outputting 2000-4000 nits for hours!
My point is that even without looking directly (of course we don't), the light reflected on surfaces, the ambient light that makes you see what's around the room, is brighter than what a TV is outputting! :cool:

This is very simple to prove too. Look how bright your room is when the lights are on, or worse when the sun is out. And look at how bright the room is when you watch TV in total darkness. Not even close. The room is dark, and you see the faint glow of the TV - even my ZD9 in HDR mode can't get the room as bright as lights/sun can.
 
Though true, there's a slight correction you need to make and that's how adapted to the brightness your eyes are. The pupil shrinks to keep out too much light. In a dark room, the pupil is wide open, at which point there's a lot more light entering the eye from a moderate light source versus a bright light source when the pupil is constricted. It's the sudden introduction of bright light sources that's painful. It's not damaging, I don't think, but it will be uncomfortable and there's only so much discomfort that makes sense before it goes beyond entertainment. Recreating the brightness of the desert on a TV is plain wrong if the ambient lighting is a dark living room and the transition isn't that of coming out of a dark cave into that brightness.

If TV manufacturers had any sense, they'd stick a little ambient light detector in there and cap the peak brightness at a multiple of the ambient level, ensuring it was never too bright while still allowing a bazillion nits for those viewing against the backdrop of a glass conservatory in Hawaii at the height of summer at midday.
 
The 77-inch C8 seems the most interesting one to me, but outdated, muddy 24 fps movies are going to get very annoying. I finally want 48/60 fps in movies.
 
When OLED comes down to 500 dollars I'll think about it. Until then, LCD is good enough.

And hey, maybe the doubled LCD layer announced a while back, which effectively adds per-pixel local dimming, will be cheaper than OLED
 

A too-bright light source in a dim room, like the TV suddenly cutting from a night exterior to a desert at noon in HDR at super-high nits, would cause random flashes and lingering afterimages in your eyes for several seconds (sorry, I don't know the correct term for that).

You can try it yourself by closing your eyes and covering them with your palms for 30 seconds. Then open your eyes to the sun, a bright torch, or a bright desk lamp.

About automatic adjustment based on ambient light: dunno why, but it seems to be a very hard thing to do. None of my screens (tablet, phone, TV) with ambient light sensors are able to do it smoothly.

There's always something wrong with them (changing too quickly, haphazardly going dimmer or brighter, screwing up the colors (Intel only), etc.), making the backlight changes noticeable.

Although if the changes are combined with HDR, I think it should work better.

Like... the TV senses you are watching in pitch black. It also knows the scenes where HDR wants to go into torch mode, so it cleverly caps the max HDR brightness. That way the changes will be invisible to the viewer.
 
I imagine the fault with the brightness is they're trying to be too clever. The moment the screen goes bright, the ambient light will increase. If they respond to that, they'll be constantly changing. All they want is a starting brightness for when you're in a dark room, and then cap the brightness and leave it at that level. They can also average over time to account for changes (sun setting) and slowly change the cap. But there's no reason why a brightness sensor shouldn't work and allow healthy brightness levels in all conditions.
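The scheme described above can be sketched out. This is a minimal, hypothetical illustration (the sensor read, constants, and class are all made up for the example): smooth the ambient reading slowly over time so the screen's own glow doesn't create a feedback loop, then cap peak output at a multiple of that smoothed level.

```python
# Hypothetical sketch of the ambient-light brightness cap described above.
# All names and constants are invented for illustration; a real TV would read
# its built-in ambient light sensor instead of being fed values directly.

AMBIENT_MULTIPLIER = 4.0   # allowed peak = 4x ambient (arbitrary choice)
SMOOTHING = 0.001          # tiny alpha: tracks a sunset, ignores scene flashes
FLOOR_NITS = 100.0         # never cap below this, so dark-room HDR stays watchable
PANEL_MAX_NITS = 4000.0    # panel hardware limit

class BrightnessCapper:
    def __init__(self, initial_ambient_nits):
        self.smoothed_ambient = initial_ambient_nits

    def update(self, ambient_nits):
        # Exponential moving average: responds to gradual changes
        # (sun setting) while barely reacting to momentary spikes
        # caused by the screen's own light bouncing around the room.
        self.smoothed_ambient += SMOOTHING * (ambient_nits - self.smoothed_ambient)
        return self.cap()

    def cap(self):
        # Peak brightness allowed right now, clamped to sane limits.
        target = self.smoothed_ambient * AMBIENT_MULTIPLIER
        return max(FLOOR_NITS, min(PANEL_MAX_NITS, target))
```

Starting in a dim 50-nit room this would cap peaks at 200 nits, a brief bright spike barely moves it, and only a sustained change (lights on, sun up) slowly raises the cap toward the panel maximum, which is exactly the "average over time, change the cap slowly" behaviour suggested above.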
 
If TV manufacturers had any sense, they'd stick a little ambient light detector in there and cap the brightness at a multiple of that brightness ensuring it was never too bright while allowing for a bazillion nits for those viewing against the backdrop of a glass conservatory in Hawaii in the height of summer at midday.

This is also why people should calibrate their TVs.
 
If TV manufacturers had any sense, they'd stick a little ambient light detector in there
I should think most TVs already have these; mine certainly does, and it's a good few years old at this point.

When OLED comes down to 500 dollars I'll think about it. Until then, LCD is good enough.
LCD looks fking great these days, and undoubtedly will only get better with time. And OLED has that pixel wear issue too (although quantum dots also wear IIRC? Feh. Lol.)

And hey, maybe the doubled LCD layer announced awhile back that's effectively adding per pixel local dimming will be cheaper than OLED
I've thought of that approach in the past, maybe I should have patented it when I had the chance! ;) Would be a smart solution, if the stacking process doesn't greatly cut down on the amount of light that can pass through the panel of course.
 
I should think most TVs already have these; mine certainly does and it is a bunch years old at this point.

Does it work well? On my TV it's still too bright and the response is very slow. It's much better to just adjust it manually via the backlight slider.
 
Does it work well?
My TV is a Samsung (ridiculously long model name impossible to remember), and yes. It actually does... Turn off all lights and it dims the screen WAY down. It's decently responsive as well, like a few seconds of delay at most. Certainly way faster than grabbing the button-riddled overly complicated remote and trying to navigate Samsung's shitty convoluted on-screen menus to adjust brightness manually! :p
 
Sounds like adaptive brightness on smartphones. I hate that feature on phones and always turn it off, so I suppose I would on TVs too. Plus it has tons of potential to not work correctly or have bugs.

The phone manufacturers want to prioritize battery life above all, so IMO they always set the adaptive brightness too low indoors or in a dim room. My phone screen is way too dark for my liking on adaptive brightness. I typically set the brightness manually to just over half, so it's nothing crazy, but it's still WAY brighter than adaptive brightness in 90% of scenarios.
 