Best 4K HDR TVs for One X, PS4 Pro [2017-2020]

Status
Not open for further replies.
If you have 10k LEDs, even tiny ones, they're likely going to draw several hundred watts going full blast. The TV might even need active cooling.
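The "several hundred watts" guess can be sanity-checked with rough photometry. A minimal sketch in Python, where every figure (panel area, LCD transmission, LED efficacy) is an illustrative assumption, not a real spec:

```python
# Back-of-envelope power estimate for a hypothetical 10,000-zone
# mini-LED backlight driving a 75" panel to 10,000 nits.
# All figures below are assumptions for illustration only.
import math

target_nits = 10_000            # cd/m^2 full-screen peak
screen_area_m2 = 1.55           # approx. area of a 75" 16:9 panel
lcd_transmission = 0.07         # assumed: LCD stack passes only ~7% of backlight
led_efficacy_lm_per_w = 100.0   # assumed: typical white-LED efficacy

# A Lambertian emitter of luminance L over area A radiates pi*L*A lumens.
lumens_at_screen = math.pi * target_nits * screen_area_m2
full_screen_watts = lumens_at_screen / lcd_transmission / led_efficacy_lm_per_w
window_watts = full_screen_watts * 0.10   # peak held only in a 10% highlight window

print(f"full-screen 10k nits: ~{full_screen_watts:.0f} W")
print(f"10% highlight window: ~{window_watts:.0f} W")
```

Under these assumptions a sustained full-screen 10,000 nits lands in the kilowatt range, which is why such panels would realistically hold peak brightness only in small highlight windows - where the draw falls back to the "several hundred watts" ballpark.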
 
Guess that answers the question once and for all. Yes we do need more nits. A lot more.
If the goal is to imitate reality, of course we do. Even more than 10k nits.

The important thing then is to properly capture reality with cameras and editing [or properly generate gameworlds so that we don't get blinded all the time].
 
We don't need to recreate reality. We don't need bright sunlight in dim living rooms. Typically when it's that bright outside (direct sunshine) we draw the curtains or wear sunglasses. Again, the pupil dilation thing - if ambient light is low, peak light needs to be lower to be relatively bright enough without being painful.

Set your phone to maximum brightness, sit in a darkened room with the phone off for a while, and then set it playing a desert movie. The brightness relative to your pupil size will be plenty bright enough at a few hundred nits.

The only time 10000 nits is needed on a display is when the ambient is bright enough that you need 10000 nits to be relatively bright enough. It's a lot cheaper and more environmentally friendly to just draw the curtains and dim the lights than crank up the display brightness.
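The relative-brightness argument above can be put in numbers. A toy comparison, with invented ambient and peak figures chosen only to illustrate the ratio:

```python
# Toy numbers: what matters for perceived "punch" is peak luminance
# relative to the level the eye has adapted to, not the absolute peak.
# Both the adaptation levels and the peaks below are made up.
scenes = {
    "curtains drawn, 2,000-nit peak": (5.0, 2_000),
    "bright room, 10,000-nit peak": (25.0, 10_000),
}
for name, (adaptation_nits, peak_nits) in scenes.items():
    ratio = peak_nits / adaptation_nits
    print(f"{name}: peak is {ratio:.0f}x the adaptation level")
```

Under these (invented) figures both setups deliver the same 400:1 peak-to-adaptation ratio, which is the point being made: dimming the room is the cheap way to make highlights feel bright.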
 

But that’s really missing the point, and it’s understandable if you haven’t had much experience with HDR viewing or with trying to understand why it looks so much better. Which is also why it’s a bit of a hard sell to begin with.

I like to think of GOOD HDR as the way to get rid of a kind of uncanny valley in how we watch content on TVs.

Let me explain that. When we watch a movie, for example, we know that what we’re seeing is ‘reality’. BUT! The TV only shows a representation of the reality that was shot with a camera. Colours aren’t really what they look like in reality, because in reality there’s a LOT more luminosity that has nothing to do with what we call brightness and more to do with how light behaves. In SDR or on low-brightness TVs, “light” doesn’t look like LIGHT. It looks like a white surface, and that’s not what reality looks like.

Now, try to think of HDR as a way to get rid of the limitations of the TV. A way to make the TV look more like a window into reality than a picture on the wall where light=white.

This has - again - nothing to do with blinding light, or looking directly into light sources, which will always look WAY brighter than 10,000 nits. You see the difference in every surface that bounces light. That’s where the magic happens. HDR makes the image REAL and 3D just from the surfaces reflecting or bouncing any kind of light: walls, metallic surfaces, any surface really. And as soon as you open this Pandora’s box, you can see exactly where the limitations are: any bright part of the image that doesn’t feel right fails because there isn’t enough luminance (nits), so it isn’t represented right.

That of course assumes that the content itself is mastered right in HDR (it’s not always a given).

So the WOW relating to this crazy 10,000 nits TV is due to the fact that finally a video of a car under the sun looks like A REAL CAR UNDER THE SUN and not a TV showing a car under the sun with colour and white surfaces trying to replicate the reflections and specular highlights.

This applies to ALL images, no need to have such a bright source of light as the sun- that was just an example. Everything in HDR elevates the picture from a ‘picture’ to reality.

I honestly have no more ways to explain it.
 
The thing with the above demo is that Sony may have cheated with the comparison, i.e. the 75" screen may not have been set up optimally; there have been many cases of companies doing this before.
My gut feeling, though, is that 10k is too much (though looking at your phone outdoors will finally be a non-issue). I'd have to see this IRL to actually tell; perhaps it will blow me away.
 
"With the 10,000 cd/m2 bright LCD on the CES the highlights were so glistening that you had to look away in places - the dazzling effect of vehicles approaching the screen was similar to real life when driving on the highway."

https://www.heise.de/newsticker/meldung/Hingucker-Gleissend-helles-HDR-Display-3937562.html

Camera operators and directors will have to invent a new visual language, because fast action sequences with many cuts and glare effects might come across as more daunting than attractive. But that won't happen in films, because filmmakers don't submit to technology. Otherwise we would not have 24fps, film grain, or deliberately stylised images where pure picture quality suffers. None of this is a necessity, and no one creates storyboards according to brightness values or sets colour shades in advance.
Looking at something that bright for 15 minutes is different from being exposed to it for 3 hours. Especially since there is still the problem of an excess of blue.
 
That's an argument in favour of HDR and I'm not against that - I did not say SDR is good enough because it can be bright to the point of being painful. ;) Your point doesn't work as a justification for 10000 nits though IMO.

When you see a real car under the sun, you are typically standing in an environment with an ambient luminosity such that your pupil is already contracted and the amount of light entering your eye isn't painful. If you were to be in a black room and then suddenly open a window onto that bright, sunlit world, you'd be painfully dazzled and have to look away, and take measures to let your eyes adjust (squinting etc). If you were to wear sunglasses to reduce the total brightness and then open the window, it would look like real life only not unbearably bright.

If you were to take a 10000 nits TV in a dark room and switch it on to show a bright scene, people would need to wear sunglasses to be comfortable watching. If the viewers adjust to 10000 nits, it'll not appear any brighter than 2000 nits. If you start with a peak of 2000 nits in the first place, it'll look bright and not need the sunglasses. 120 Hz 2000 nits HDR will look more like real life than 24 Hz 10000 nits. The only thing 2000 nits can't achieve versus 10000 is the same level of transition from dark to light, but such a transition is unnatural in most cases. Short of being able to accurately recreate the sensation of looking down a long, dark cave at the mouth opening onto a tropical beach, sensible brightness caps offer everything needed for high-quality, realistic TV without burning more power than is necessary to achieve that.
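The dark-to-light transition point can be expressed in photographic stops. A quick sketch, assuming an illustrative near-black level of 0.0005 cd/m² (a made-up, OLED-like figure):

```python
import math

near_black_nits = 0.0005   # assumed near-black level, illustrative only

for peak_nits in (100, 2_000, 10_000):
    # Each "stop" is a doubling of luminance between black and peak.
    stops = math.log2(peak_nits / near_black_nits)
    print(f"{peak_nits:>6} nits peak -> {stops:.1f} stops of dynamic range")
```

Going from 2,000 to 10,000 nits adds only log2(5) ≈ 2.3 stops at the very top end, consistent with the claim that the main difference between the two is that extra dark-to-light headroom.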
 
Shifty, you don’t know what 10,000 nits looks like. Enough of this extrapolation that it makes people need to wear sunglasses. It’s based on conjecture, because you don’t know what 10,000 nits looks like. You probably don’t even know what 2000 nits looks like. All you can do is believe the experts who have seen this new Sony monster and have already categorically said that it is not in fact “too bright”, but instead that it makes HDR content look much more “real”, for the reasons I tried to explain.

That’s all!
 
We can joke about it, because it is ridiculously bright when looking at the number. Remember that it’s not a linear curve: on my ZD9 the difference in actual brightness perceived in a white test window between 1000 and its top 1800/1900 nits is actually very slight, yet that slight difference makes a world of difference in making HDR more spectacular. I tested it. Trust mama, I know what I’m talking about.
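The "not a linear curve" observation matches how HDR10 encodes luminance: the SMPTE ST 2084 (PQ) transfer function is roughly perceptually uniform, so 1,000 and 1,900 nits sit close together on it. A sketch using the published ST 2084 constants:

```python
def pq_signal(nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: absolute luminance -> 0..1 PQ signal."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    y = (nits / 10_000.0) ** m1
    return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2

for nits in (100, 1_000, 1_900, 10_000):
    print(f"{nits:>6} nits -> PQ signal {pq_signal(nits):.3f}")
```

1,000 nits lands around 0.75 on the signal scale and 1,900 nits around 0.82, so nearly doubling the luminance moves the perceptual scale by only about 7%, in line with the "very slight" difference described above.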

And I was one of those people who thought OLED would be just fine. Until I saw a ZD9 and “got it”. To the point that I cancelled my OLED order and bought one. Which is also why it is still going to be the flagship Sony model in 2018, which is crazy for a 1.5 year old TV.
 
Like I said, people had to look away in some scenes because it was too bright.

__
I have read that LG does not sell current (2018) panels to others and therefore everyone else is always a generation behind.

I don't know if that's insider info or guesswork. The manufacturers always say something different, but it was strange that in 2017 only LG could significantly mitigate ABL, while the other OLED TVs performed comparably to, or only slightly better than, in 2016. Now Panasonic officially speaks of reduced ABL behaviour for their 2018 OLEDs.
 
10000 nits is approximately the brightness of a bright day and surfaces directly lit by sunlight. Everyone knows what it looks like by looking out the window (except those living in Birmingham). It's not peak brightness that matters, but peak contrast versus the lowest brightness to which your eye is acclimatised.

And I didn't say it's too bright for use either - the viewer will adjust. I said it's wasteful of electricity. The same result, a maximum brightness relative to ambient light with a contrast that provides visual intensity, can be achieved with fewer nits and lower ambient light.

In basic terms, it's the problem of the crowded room. People start with quiet voices. Because of the ambient volume, people start talking louder to be heard over the top. Eventually everyone's shouting at the person standing right next to them because the ambient noise is so high because everyone's shouting trying to be heard. The end result is lots of noise and no more clarity. If you can provide a quiet room, the quiet voices are perfectly acceptable for conversation and the loud voices are unpleasant.
 
But is it? How do you know that sunlight reflected off a car's surface in reality is 10,000 nits?
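That's a fair question, and basic photometry gives a rough answer. A back-of-envelope check, assuming typical direct-sun illuminance (~100,000 lux) and a matte white surface; both figures are order-of-magnitude assumptions:

```python
import math

sun_illuminance_lux = 100_000   # typical direct sunlight, order of magnitude
reflectance = 0.9               # assumed: bright white, fully diffuse paint

# A Lambertian (perfectly diffuse) surface has luminance L = rho * E / pi.
luminance_nits = reflectance * sun_illuminance_lux / math.pi
print(f"diffuse white surface in direct sun: ~{luminance_nits:.0f} cd/m^2")
```

So a sunlit diffuse white surface sits near 30,000 nits and a mid-grey (18% reflectance) near 5,700. 10,000 nits is the right order of magnitude for sunlit surfaces, while actual specular glints off chrome or glass are far brighter still.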
 
I wonder how cinemas will deal with the new HDR paradigm. Is there even technology to display it?
I'd prefer 60fps mastered content before 10,000 nits though...
Whether it's on an OLED or a ZD9, 24fps content is just horrible and needs to die; I hope everyone can agree on that.
 
24p is a bit painful at times.

But to answer your question: there are so-called Dolby Cinemas in the US where you can watch HDR movies; a 'normal' cinema is not even HDR.
 