Best 4K HDR TVs for One X, PS4 Pro [2017-2020]

If you are buying into 1080p, maybe, if you can find a hell of a bargain on a used plasma set. I highly doubt those exist, as prices on current 1080p sets are insanely cheap, so it's almost not worth selling one. I imagine most plasma sets are put to use in a second room or handed down to family/friends.
 
The thing is that my PC is only capable of 1080p (I've got a 1060 6GB that could get me to 4K, but only by lowering the settings a lot), so I worry it will look bad on a 4K TV.
1080p won't look bad on a 4K set. In fact, at the same screen size it should look better than on a native 1080p set. As others say, though, it's probably worth saving your money, because new tech like real HDR and decent-quality 4K will be available in a couple of years.
 
I just played Andromeda at native 4k on my 1060 at High. Not 60 fps but not unplayable either. 1060 can do a lot, I would say.

That is actually very good. I will try using DSR to see where the card can go. Thanks.

1080p won't look bad on a 4K set. In fact, at the same screen size it should look better than on a native 1080p set. As others say, though, it's probably worth saving your money, because new tech like real HDR and decent-quality 4K will be available in a couple of years.

The thing is, I may be able to get a TV (an HD Ready plasma) from my brother, as he is moving to another house and may not need it, but otherwise I have no TV. I really was thinking of spending quite some money on an LG OLED, but after reading about HDMI 2.1 I really don't want to make that type of purchase now. Mmm... decisions. xD
 
IMHO there's just no pressure to commit to a TV now. Even once the X1X is available, waiting for HDMI 2.1 products makes the most sense to me for a long-term investment like a new AVR/TV, or buy a cheap 4K LCD if you really "believe" you "need" one. At least HDMI 2.1 finally defines a standard that should hold for at least another decade, while HDMI 1.4/2.0 were just short-term fixes.
 
Those pics don't look different because of HDR. They look different because one badly needs to be calibrated, assuming the picture looks the way it does on your TV.


Neither set is calibrated; they are both straight out of the box. Also, the difference between HDR and SDR is similar on the same TV (the HDR one) without changing any setting but HDR (so obviously the same calibration settings, I guess).
 
Yikes, such an eye-opener to look at more than just peak brightness scores. Sets that behave like the Vizio are entirely backwards and broken from an HDR perspective.

The way that article describes what HDR content is and what it should be used for also made a lot of sense without getting into technical terms.


Rtings' 10% window brightness ratings should cover that, no? Vizio does fine. https://www.google.com/search?q=viz...0j69i57j0l4.2583j0j7&sourceid=chrome&ie=UTF-8

The old 2016 P Series hit 487 cd/m^2 in 10% windows.

http://www.rtings.com/tv/reviews/vizio/m-series-2017

The 2017 Vizio M series can hit nearly 800 cd/m^2 in a 10% window.

Perhaps the author tested an old firmware. In many set reviews you see Rtings note that local dimming actually dims small highlights; maybe that is the same thing the article is talking about. But it's hardly an exclusively Vizio problem (if at all?); it shows up across many brands (Samsung, LG) on their lower-end sets. I assume it's an algorithm issue, not a hardware one, and could be "fixed" with a firmware update for sets that behave this way.
 
Perhaps the author tested an old firmware. In many set reviews you see Rtings note that local dimming actually dims small highlights; maybe that is the same thing the article is talking about. But it's hardly an exclusively Vizio problem (if at all?); it shows up across many brands (Samsung, LG) on their lower-end sets. I assume it's an algorithm issue, not a hardware one, and could be "fixed" with a firmware update for sets that behave this way.

Nope. This is a function of the number of local dimming zones. Algorithms could yield some improvement, but nothing significant.
 
Just ordered the TCL 55P605 from Best Buy and it's scheduled to be available at my local store on Saturday. I'm going to move the Hisense 50H8C I'm currently using as a main TV to another room to replace a 42" Panasonic Viera Plasma.

I could have continued to live with the Hisense, but that TV didn't support WCG and I wanted to have that part of the HDR experience.

Come HDMI 2.1 (whenever there's a TV model that checks all of my boxes), I'll upgrade from the TCL TV in the main setup and also upgrade the Denon receiver to an HDMI 2.1 model and both the TCL and my existing Denon will move to the second setup.
 
For 10-bit, I think HDR is mastered at 64-940 and not 0-1023, so limited versus full range RGB still applies.
This legacy is the new "interlaced" versus "progressive";
Rec.2020 is really limited, it doesn't even allow full range.

BT.2100 allows full range, but again the HDR curve salad means HDR10, HLG and Dolby's PQ each make different choices on this.

Narrow range was used to calibrate noise in analog camera pipelines, with "blacker than black" and superwhite test signals;

Of course some type of test signal is needed for calibration!
My issue is why throw away so many code words on this instead of simply keeping the 32 codes from 8-bit.
(It quantizes linear sensor noise, of course, but that is not usable range.)

Then PQ (ST 2084) would have had JND performance near 12-bit narrow range at only a 10-bit "cost". In a couple of years Dolby will "launch" 12-bit PQ to solve the very issues they've chosen to maintain...

Sent from my Moto G using Tapatalk
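
As a rough Python sketch of the code-word point above (the 64-940 levels are the standard 10-bit narrow-range luma limits; the rest is just arithmetic):

# How many 10-bit code values survive legacy "narrow"/limited range (64-940 for luma)
# versus full range (0-1023), and how that compares to the old 8-bit reservation.
BITS = 10
full_codes = 2 ** BITS                             # 1024 code values
narrow_black, narrow_white = 64, 940               # legacy video levels for 10-bit luma
narrow_codes = narrow_white - narrow_black + 1     # 877 usable codes
wasted = full_codes - narrow_codes                 # 147 codes spent on foot/headroom

eight_bit_wasted = 256 - (235 - 16 + 1)            # 36 codes reserved at 8-bit (16-235)

print(f"10-bit narrow range: {narrow_codes} of {full_codes} codes usable ({wasted} reserved)")
print(f"8-bit narrow range reserved only {eight_bit_wasted} codes")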
 
Dolby Vision movies are mastered for a 4,000-nit peak, I think. HDR10 movies can be mastered for 1,000 or 4,000.

For games, I'm not really sure how it would work.

At least gamma seems to be greatly simplified with HDR.
Dolby Vision is PQ (ST 2084); the max is 10k nits but it's very coarse, and it only helps highlights.

HLG is 5,000 nits max but has a nicer curve in the shadows.


Sent from my Moto G using Tapatalk
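
For reference, a minimal Python sketch of the PQ (SMPTE ST 2084) EOTF mentioned above; the constants are the published ones, the sample signal values are just for illustration:

# SMPTE ST 2084 (PQ) EOTF: normalized signal (0..1) -> absolute luminance in nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK = 10000.0             # PQ is defined up to 10,000 nits

def pq_eotf(signal):
    p = signal ** (1.0 / M2)
    return PEAK * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

# Half the signal range only reaches ~92 nits; the top half of the curve is spent
# almost entirely on highlights, which is the "only helps highlights" point above.
for s in (0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f} -> {pq_eotf(s):8.1f} nits")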
 
Is anyone else worried by the fact that the most active thread by far in this forum on console gaming is about TVs? :-|
Maybe devs and gamers are trying to find the best hardware to calibrate for...

There's a story about how BT.1886 was standardized... IIRC:

A famous director/colorist had a professional CRT calibrated for the best filmic look, used for many years in the industry on analog film productions across a bunch of films; "de facto standard" color grading.

Then an issue came up at the premiere of a film using a new digital projector that assumed sRGB 2.2 gamma. After many days of discussion and trying to please the creator at the venue, the solution was to measure the director's CRT.

It turned out the device was actually gamma 2.37; an easy fix at the venue, and a few more measurements later BT.1886 is now the "standard".

So in a way sRGB needs revision too...

Maybe some studio/dev with a color grading room could test whether full-range mode enabled on the TV/console, combined with the now-common in-game brightness calibration patterns, resolves more detail in shadows/highlights.

Sent from my Moto G using Tapatalk
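
A minimal Python sketch of the BT.1886 EOTF next to a plain 2.2 power gamma, to show where the two diverge; the 100-nit white / 0.1-nit black display figures are assumed example numbers, not from this thread:

# BT.1886 EOTF: L = a * max(V + b, 0) ** 2.4, with a and b set by the display's
# measured white (LW) and black (LB) luminance.
LW, LB = 100.0, 0.1          # assumed example display: 100-nit white, 0.1-nit black
GAMMA = 2.4

a = (LW ** (1 / GAMMA) - LB ** (1 / GAMMA)) ** GAMMA
b = LB ** (1 / GAMMA) / (LW ** (1 / GAMMA) - LB ** (1 / GAMMA))

def bt1886(v):
    return a * max(v + b, 0.0) ** GAMMA

def pure_gamma(v, g=2.2):
    return LW * v ** g           # the simple power-law (sRGB-2.2-ish) assumption

# The curves differ most near black, which is exactly where the 2.2 vs 2.37 vs
# BT.1886 grading arguments bite.
for v in (0.05, 0.1, 0.25, 0.5, 1.0):
    print(f"V={v:.2f}  BT.1886={bt1886(v):7.3f} nits   gamma 2.2={pure_gamma(v):7.3f} nits")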
 
Dolby Vision is PQ (ST 2084); the max is 10k nits but it's very coarse, and it only helps highlights.

HLG is 5,000 nits max but has a nicer curve in the shadows.


Sent from my Moto G using Tapatalk


I read something about movies being mastered for 4,000 nits in Dolby Vision and 1,000 nits for HDR10, but both allowing for screens up to 10,000 nits. I'm not sure what that means in practice. Future room for growth as TV capabilities improve?
 
I mean, I'm no doctor, but given that my 350-nit screen looks very bright to me (brighter than my old 250-nit SDR set, of course), 1,000 must be crazy, and 4,000, like, must literally threaten vision I would think, heh. 10,000? Are we going to be blinding people when the sun comes onscreen?

Since few HDR TVs seem to even hit 1,000 nits currently, it seems we'd have a very long way to go. Not sure it would be worth the presumably extreme expense of pumping out laser-esque light beams from small clusters of LEDs, either.
 
Nope. This is a function of the number of local dimming zones. Algorithms could yield some improvement, but nothing significant.


Some very simple math says a 55" TV with 32 local dimming zones could have a zone for roughly every ~38 square inches of area, so about a 6" x 6" patch. That seems like a pretty small area compared to the van-in-a-tunnel example in the original article. 72 zones on the TCL would be much better still.
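
The same arithmetic as a small Python sketch (using the full 55-inch diagonal and 16:9 geometry, so the per-zone area comes out slightly larger than the ~38 sq in quoted above):

import math

# Area of a 16:9 panel from its diagonal, then area per local dimming zone.
def zone_size(diagonal_in, zones):
    width = diagonal_in * 16 / math.hypot(16, 9)
    height = diagonal_in * 9 / math.hypot(16, 9)
    area_per_zone = width * height / zones
    return area_per_zone, math.sqrt(area_per_zone)   # area and side of an equivalent square patch

for zones in (32, 72):
    area, side = zone_size(55.0, zones)
    print(f"{zones} zones -> ~{area:.0f} sq in per zone (~{side:.1f} in square patch)")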
 
I mean, I'm no doctor, but given that my 350-nit screen looks very bright to me (brighter than my old 250-nit SDR set, of course), 1,000 must be crazy, and 4,000, like, must literally threaten vision I would think, heh. 10,000? Are we going to be blinding people when the sun comes onscreen?

Since few HDR TVs seem to even hit 1,000 nits currently, it seems we'd have a very long way to go. Not sure it would be worth the presumably extreme expense of pumping out laser-esque light beams from small clusters of LEDs, either.
It's really not that bad. 1,800 nits can be quite dazzling but in no way what I'd call damaging. In fact I'm pretty sure perceived brightness doesn't increase in a straight line, it's logarithmic or whatever. I think. So 4,000 nits does not look 10 times brighter than 400.
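
As a hedged back-of-the-envelope check in Python, using Stevens' power law with a ~1/3 exponent as one common approximation of brightness perception (the exponent is an assumption, not something from this thread):

# Perceived brightness grows roughly with luminance ** 0.33 (Stevens' power law),
# so a 10x jump in nits reads as only about a 2x jump in apparent brightness.
def perceived(nits, exponent=0.33):
    return nits ** exponent

ratio = perceived(4000) / perceived(400)
print(f"4000 nits vs 400 nits: ~{ratio:.1f}x brighter perceptually, not 10x")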
 
I read something about movies being mastered for 4,000 nits in Dolby Vision and 1,000 nits for HDR10, but both allowing for screens up to 10,000 nits. I'm not sure what that means in practice. Future room for growth as TV capabilities improve?
Theoretically, but other aspects are not future-proof, like the change from 10-bit to 12-bit.
The trade-offs are not clear for the early adopter.
Most likely UHD Premium "2" will change everything, given the short window of firmware support for TVs.

Effective long-term support is slim. As an example, x.v.Color and the 10/12/16-bit HDMI formats are badly supported on the desktop. Even with pro cards there is no battle-tested environment of tools that lets you get the most out of today's displays.

A possible solution would be to enable the HDMI 1.3 features and extend Dolby Vision/HDR10 to linear 16/12-bit with a LUT, to provide compatibility for 1080p and early 4K displays.


Sent from my Moto G using Tapatalk
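
Purely as an illustration of the LUT idea in Python (a generic 10-bit-to-8-bit gamma table, not the specific Dolby Vision/HDR10 mapping the post imagines):

# A 1D LUT is just a precomputed table: one output code per possible input code,
# applied per pixel by indexing. The 2.2 gamma here is an assumed example transfer.
IN_BITS, OUT_BITS, GAMMA = 10, 8, 2.2
in_max, out_max = (1 << IN_BITS) - 1, (1 << OUT_BITS) - 1

lut = [round(((c / in_max) ** (1 / GAMMA)) * out_max) for c in range(in_max + 1)]

sample_pixels = [0, 64, 512, 940, 1023]            # a few 10-bit code values
print([lut[p] for p in sample_pixels])             # per-pixel application is a lookup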
 
Some very simple math says a 55" TV with 32 local dimming zones could have a zone for roughly every ~38 square inches of area, so about a 6" x 6" patch. That seems like a pretty small area compared to the van-in-a-tunnel example in the original article. 72 zones on the TCL would be much better still.

But within each dimming zone there only needs to be one high-contrast feature to break the TV's ability to represent it properly. You either try to represent the highlight properly by cranking up the backlight in that zone, which raises the brightness of any darker features there, or you darken the backlight in that zone to show the darker feature properly, which dims the highlight, or some of both.
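
A toy Python sketch of that trade-off, with one backlight level per zone and an assumed 5000:1 native panel contrast (both numbers are illustrative, not measured):

# Within one zone the backlight is a single value; the LCD can only attenuate it
# down to backlight / native_contrast, so highlight and shadow fight over it.
NATIVE_CONTRAST = 5000.0      # assumed VA-like panel contrast
TARGET_HIGHLIGHT = 1000.0     # nits wanted for the small bright feature
TARGET_SHADOW = 0.05          # nits wanted for the dark area in the same zone

def zone_output(backlight_nits, target_nits):
    floor = backlight_nits / NATIVE_CONTRAST           # darkest achievable level
    return min(max(target_nits, floor), backlight_nits)

for backlight in (1000.0, 250.0, 50.0):
    hi = zone_output(backlight, TARGET_HIGHLIGHT)
    lo = zone_output(backlight, TARGET_SHADOW)
    print(f"backlight {backlight:6.0f} nits -> highlight {hi:7.1f}, shadow {lo:6.3f}")
# Raising the backlight lifts the shadow floor; lowering it crushes the highlight.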
 
I mean, I'm no doctor, but given that my 350-nit screen looks very bright to me (brighter than my old 250-nit SDR set, of course), 1,000 must be crazy, and 4,000, like, must literally threaten vision I would think, heh. 10,000? Are we going to be blinding people when the sun comes onscreen?

Since few HDR TVs seem to even hit 1,000 nits currently, it seems we'd have a very long way to go. Not sure it would be worth the presumably extreme expense of pumping out laser-esque light beams from small clusters of LEDs, either.

Your eyes adjust, just as they do to natural light. Bias lighting behind the screen helps.
 
I mean, I'm no doctor, but given that my 350-nit screen looks very bright to me (brighter than my old 250-nit SDR set, of course), 1,000 must be crazy, and 4,000, like, must literally threaten vision I would think, heh. 10,000? Are we going to be blinding people when the sun comes onscreen?

Since few HDR TVs seem to even hit 1,000 nits currently, it seems we'd have a very long way to go. Not sure it would be worth the presumably extreme expense of pumping out laser-esque light beams from small clusters of LEDs, either.
Agree there... I was thinking the same. This KU7000 tires my eyes much faster than my previous monitor, and my wife keeps telling me to turn down the brightness as her eyes start hurting quickly. I wonder how people with 1,000-nit sets play!?

Sent from my SM-N920G using Tapatalk
 