Best 4K HDR TVs for One X, PS4 Pro [2017-2020]

Just give me something with OLED-style true blacks without all the limitations (mostly burn-in, but also full-screen ABL) please! (and do it at a mass-market price point)

I mean, dang, with all the science going on these days it's pretty amazing this isn't solvable.
 
LCDs have ABL as well. You won't get full-screen maximum brightness on a TV. The peak brightness values are often taken from a 10% window; at 100% you would get much lower numbers than the ones you always hear.

LG put a lot of effort into reducing burn-in and the results are looking good so far.
 
HDR has clawed APL right back to where it started. Back in the SDR days, we used gamma, a relative luminance transfer function, to display an image. Gamma was usually set at 2.2~2.4; set it too low and it would blow out highlight detail, set it too high and it would crush shadow detail. What's truly important about gamma is that because it's a relative luminance system, you can map it to pretty much any brightness on any display. The nominal value is 100 nits per the SDR standard, but it can be set to 200 nits or more on LCDs, around 50~120 nits on plasmas, or even lower on some projectors.

While a spectacularly flexible standard, it did have a caveat. For example, an 80-nit plasma that couldn't hit the nominal 100 nits would lose 20% of the SDR dynamic range. The reverse was also true: an LCD putting out 200 nits would also lose dynamic range, but for a different reason. Dynamic range starts to get compressed once you go over 100 nits, similar to the "loudness wars"; playing back at 400 nits meant compressing the dynamic range to 25%. You can test this easily on any LCD and watch night scenes gradually start to look like daylight scenes.

BTW, nominal SDR dynamic range required a full-field white luminance of 100 nits, and many LCDs could simply hold that kind of APL even at higher brightness settings. This is when the "plasmas/OLEDs have too strong ABL" meme started. It wasn't really their fault: OLEDs especially could also hold 100 nits at 100% APL, which is more than enough for reference SDR luminance, but because people were so used to LCDs having much more relaxed ABL, they started complaining, not realizing that what they were doing to video was equivalent to record labels compressing music to hell.
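
To make the "relative luminance" point concrete, here's a minimal sketch (assuming a plain 2.4 power gamma rather than full BT.1886, with hypothetical display peaks) showing how the same SDR code value lands on whatever peak brightness the display happens to have:

```python
# A minimal sketch of SDR's relative luminance model. Assumptions: a plain 2.4
# power gamma (not full BT.1886), and the display peaks below are just examples.

def sdr_gamma_decode(code, gamma=2.4):
    """Map a normalized SDR code value (0..1) to *relative* luminance (0..1)."""
    return code ** gamma

def to_nits(relative_luminance, display_peak_nits):
    """The same relative signal lands on whatever peak the display happens to have."""
    return relative_luminance * display_peak_nits

signal = 0.5  # 50% grey, in code values
for peak in (80, 100, 200, 400):  # plasma, reference SDR, typical LCD, bright LCD
    print(f"{peak:>4}-nit display -> {to_nits(sdr_gamma_decode(signal), peak):6.2f} nits")
```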

Today's HDR standard is the complete opposite. Instead of gamma, it uses the PQ EOTF, which is an absolute luminance transfer function. Luminance can no longer be dynamically adjusted as it could with SDR: a fixed luminance is assigned to every greyscale value. This has created a serious problem, as any display unable to reach the higher HDR values is screwed. It either has to use tone mapping to compress the dynamic range, at the expense of lowering APL, or simply clip and discard the higher luminance values, like the Sony Z9D. Sony hated the idea of tone mapping reducing APL and hurting HDR picture quality, so on the Z9D they chose to roll off values above 1500 nits and clip everything past 1800 nits. So think of tone mapping as dynamic range compression, like LCDs blaring SDR content at 200~400 nits, except without the benefit of being able to play that compressed video as bright as you like; tone-mapped HDR also ends up dimmer, much dimmer.
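
By contrast, here's a small sketch of the SMPTE ST 2084 (PQ) EOTF: the code value alone pins the luminance in absolute nits, which is exactly why a display that can't reach those nits has to tone map or clip.

```python
# A sketch of the SMPTE ST 2084 (PQ) EOTF. Constants are the published ones;
# the point is that a code value maps to absolute nits regardless of the display.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code):
    """Normalized PQ code value (0..1) -> absolute luminance in nits."""
    e = code ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# Whichever TV you own, these code values are supposed to mean the same luminance:
for code in (0.5, 0.75, 0.9, 1.0):
    print(f"PQ code {code:.2f} -> {pq_eotf(code):8.1f} nits")
```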

Before, with SDR, LCDs could enjoy 400 nits of full-field white luminance, but with HDR that has decreased to 100~220 nits, and with tone mapping applied it drops even further, making it very difficult to watch some movies with the lights on. This was done on purpose. If HDR APL had stayed at the level LCD owners were used to, many cinephiles would have been up in arms crying that HDR doesn't respect the creator's intent. So the average APL has increased over SDR's old 100 nits, but not by much. For HDR movies, many studios prefer to put nominal white at 100 nits, equal to SDR, with any increase in APL coming from the addition of specular highlights. The THX standard recommends 160 nits, and Sony's BVM-X300 OLED, the reference monitor used to master many HDR movies like Kingsman, also had a maximum full-field white luminance of 160 nits. Consumer OLEDs like the Panasonic GZ2000 can also maintain 160 nits. Other standards argue it can go slightly higher and that 200 nits is the sweet spot, and a few movies ignore the standards altogether and hit 220 nits. For this reason, I think 200 nits ought to be enough if the goal is to follow the reference standards, and future OLEDs should easily reach 200 nits full-field in the not-too-distant future.

Tone mapping is the bigger fish to fry though, and without dealing with it, the 200 nits that current LCDs and future OLEDs can reach will be for naught. As of now, static metadata is the most common form. On top of it, a TV's own dynamic tone mapping can offer different curves to satisfy different users. Want to maintain APL? You can, at the expense of highlight detail. Want to maintain highlight detail instead? You can, at the expense of APL. It's adjustable, but still not ideal.
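
Here's a toy roll-off curve (not any vendor's actual algorithm; the knee positions, content max, and display peak are made-up numbers) just to show the APL-vs-highlight-detail tradeoff described above:

```python
# A toy tone-mapping roll-off, NOT any vendor's actual curve: content max,
# display peak, and knee positions are made-up numbers to illustrate the tradeoff.

def tone_map(nits, content_max, display_peak, knee_fraction):
    """Pass values through below the knee, then compress the rest of the
    content range into whatever headroom the display has left."""
    knee = display_peak * knee_fraction
    if nits <= knee:
        return min(nits, display_peak)
    t = (nits - knee) / (content_max - knee)      # 0..1 position above the knee
    return knee + t * (display_peak - knee)       # squeezed into [knee, peak]

content_max, display_peak = 4000, 700             # hypothetical 4000-nit master, 700-nit TV
for knee_fraction in (0.9, 0.25):                 # "keep APL" vs "keep highlight detail"
    out = [tone_map(n, content_max, display_peak, knee_fraction) for n in (400, 1000, 4000)]
    print(f"knee at {knee_fraction:.0%} of peak: "
          f"400-nit scene -> {out[0]:5.1f}, 1000-nit highlight -> {out[1]:5.1f}, "
          f"4000-nit highlight -> {out[2]:5.1f} nits")
```

With the knee up high, the mid-brightness scene passes through untouched but everything above it gets crammed into the last few dozen nits; with the knee pulled down, highlight gradation survives but the whole image dims.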

Thus, the better way to deal with it is dynamic metadata like Dolby Vision, which provides metadata on a scene-by-scene basis. This is still not ideal, as not all Dolby Vision implementations perform equally. Even among OLEDs we have hardware-led (LG, Panasonic) vs player-led (Sony), and even with Dolby Vision there's still no avoiding an APL reduction. Dolby Vision isn't really important for games either, as it costs royalty fees. Yes, Dolby Vision games like Mass Effect: Andromeda were lovely on a GeForce 1080 Ti, but you can't have it anymore starting with Turing.

Some devices have started to appear to remedy this problem. Select Panasonic UHD Blu-ray players have a feature called "HDR Optimizer" which preserves as much highlight detail as possible with little to no APL drop; it still operates under static metadata though. A much better, but much more expensive, solution is the Lumagen Radiance Pro, which tone maps not scene by scene like Dolby Vision but frame by frame: it analyses each frame and applies optimal tone mapping. It's quite pricey at around $4500 though. MadVR also has a frame-by-frame tone mapping solution called the MadVR Envy (the unit itself contains a PC) which uses Turing's tensor units to run machine learning and preserve every highlight detail without any APL drop, but it comes at an even crazier price: $5500 for the budget model and $9999 for the flagship. Yes, $10000 just to do what was free back in the SDR days. PC peasants (hehe) who can't afford such expensive toys can stick with the free options, but the PC is also a PITA when it comes to dealing with HDCP 2.1.

HLG is a much more sensible way of dealing with things. It combines a gamma curve with a logarithmic segment and gives you the best of both. You can once again set any luminance level you want: nominal luminance is pegged at 1000 nits, but you can go lower, or as high as 5500 nits. And because an HDR-style transfer curve is applied on top, there is no dynamic range compression as you go over 1000 nits, unlike SDR gamma. It's a nice way to keep backwards compatibility with SDR while still providing a flexible luminance choice for HDR. Well done, BBC & NHK!
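
A minimal sketch of why HLG stays display-relative, using the BT.2100 HLG inverse OETF and the extended system-gamma formula (the simplified OOTF here only holds for a grey ramp):

```python
# A sketch of why HLG stays display-relative. Uses the BT.2100 HLG inverse OETF
# and the extended system-gamma formula; the simplified OOTF below only holds
# for a grey ramp (R = G = B).
import math

A, B, C = 0.17883277, 0.28466892, 0.55991073   # HLG constants from BT.2100

def hlg_inverse_oetf(code):
    """HLG signal (0..1) -> scene-linear light, normalized so code 1.0 -> ~1.0."""
    if code <= 0.5:
        return (code ** 2) / 3.0
    return (math.exp((code - C) / A) + B) / 12.0

def hlg_system_gamma(peak_nits):
    """Extended system gamma: 1.2 on a 1000-nit display, adapting with the peak."""
    return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)

def hlg_display_nits(code, peak_nits):
    """Simplified OOTF for a grey ramp: apply system gamma, scale to the display peak."""
    return peak_nits * hlg_inverse_oetf(code) ** hlg_system_gamma(peak_nits)

for peak in (500, 1000, 5000):   # the same HLG signal adapts itself to each display
    print(f"{peak:>4}-nit display: 75% HLG signal -> {hlg_display_nits(0.75, peak):7.1f} nits")
```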

The HDR scene right now is so f'ed up because Dolby deliberately planned for this. They weren't about to simply give away HDR10, which they thought was rightfully theirs, hence this mess. Some studios are fighting back over this: Fox, one of the arch-enemies of Dolby Vision, simply refuses to put out its movies at more than 1000 nits. Samsung's HDR10+ was supposed to be the competing standard against Dolby Vision, but it simply wasn't meant to be, as its dynamic tone mapping loses effectiveness on displays able to render more than 500 nits, making it most useful for their budget LCDs, their Micro LEDs, and their QD-OLEDs, which will be around 1/3 dimmer than LG's (the prototype unit can only hit 100 nits full-field white, and it gets very hot doing just that).
 
It sucks being trapped in Dolby's ecosystem, that's for sure. HDR10 can go up to 10,000 nits per the PQ EOTF standard, but the realistic maximum is 4000 nits, as that's where banding starts to appear and 10-bit starts to feel inadequate. If HDR10 moves up to 4000 nits, Dolby will simply move up to 12-bit 10,000 nits; it's a losing game. Dolby Vision also has a native advantage in being able to operate in the native CMU format that most movie studios use to master their content. A few movies have the wrong black level in HDR10 but not in Dolby Vision, because corrections were applied automatically. LCDs will die off as premium products in the near future, and none of the self-emissive techs like OLED, micro LED, or QNED has any short-term roadmap to hit 4000 nits either.

At least the game industry can avoid Dolby's wrath, as GPU vendors like AMD/Nvidia can build frame-by-frame tone mapping into future FreeSync HDR/G-Sync HDR standards with no latency. The current MadVR solution is unfit as its machine-learning-based tone mapping still adds latency. Console makers also have HGiG mode, from a standards body called the HDR Gaming Interest Group that includes Sony and MS, and it's supported in select PS4/Pro games, but it's still based on static metadata, and gamers are already complaining that their games look dimmer than with their display's own tone mapping.
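
As a rough sanity check on the 4000-nit/10-bit claim, this sketch just computes the luminance jump between adjacent PQ code values at 10 and 12 bits (pure quantization arithmetic, not a banding-visibility model):

```python
# A back-of-the-envelope look at PQ quantization: the luminance jump between
# adjacent code values at 10 vs 12 bits. Pure arithmetic, not a visibility model.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code):
    """Normalized PQ code value (0..1) -> absolute luminance in nits."""
    e = code ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def pq_inverse_eotf(nits):
    """Absolute luminance in nits -> normalized PQ code value (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for bits in (10, 12):
    levels = 2 ** bits - 1
    for nits in (1000, 4000, 10000):
        code = round(pq_inverse_eotf(nits) * levels)
        step = pq_eotf(code / levels) - pq_eotf((code - 1) / levels)
        print(f"{bits}-bit, around {nits:>5} nits: one code-value step ≈ {step:6.1f} nits")
```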
 
LCDs have ABL as well. You won't get full-screen maximum brightness on a TV. The peak brightness values are often taken from a 10% window; at 100% you would get much lower numbers than the ones you always hear.

Maybe on FALD sets, but standard LCDs don't have ABL.
 
I can only find reviews for Sony, LG and Samsung, and RTINGS misses a lot of brands.
Some brands like Hitachi are much cheaper, but I am not sure how they fare in terms of performance. Is there any other comprehensive website?
 
I can only find reviews for Sony, LG and Samsung, and RTINGS misses a lot of brands.
Some brands like Hitachi are much cheaper, but I am not sure how they fare in terms of performance. Is there any other comprehensive website?

Probably look at enthusiast forums: AVS Forum (US-centric) and AVForums (UK-centric). There are probably others for mainland Europe but I don't know any.
 
Considering RTINGS is already spreading itself too thin, which hurts their already mediocre review quality, I hope not. I find today's reviews leave more to be desired than before: HDTV reviews only needed to consider SDR, but with UHD TVs there are so many factors to weigh, like HDR peak luminance, tone mapping curves, dynamic tone mapping, dynamic metadata, and so on.

I do wish Panasonic would come back to the North American and Australian markets once LG Display gets into full swing with its 10.5G technology in 2024. The current 8.5G wafer is really only efficient for 55 inches: somewhat inefficient at 65 inches, very inefficient at 77 inches, slightly inefficient at 88 inches, and efficient again at 98 inches (future models incoming). The future 10.5G wafer will be efficient at 48, 65, 75, and 105 inches, which will make 75-inch panels affordable, something Panasonic could use to make cheaper premium OLEDs in the big screen sizes Americans love. More American reviews of Panasonic TVs are always a good thing.
 
So it's probably gonna be 999 after CX2 is released. Nice.

I hope it sells enough so 48" gets a permanent slot. I cannot fit anything larger than a 49" TV, so regardless of desire the high-end sets have always been out of reach. I doubt I am the only one who needs or wants smaller but fully featured sets.
 
I hope it sells enough so 48" gets a permanent slot. I cannot fit anything larger than a 49" TV, so regardless of desire the high-end sets have always been out of reach. I doubt I am the only one who needs or wants smaller but fully featured sets.


Yep, pretty much in the same spot.

I paid €1100 back in 2007 for a 26" Panasonic TV; it was really nice.
It's still up and running. Now, as a stopgap, I grabbed a cheap 4K Samsung TV while waiting for this size to happen.
 
So it's probably gonna be 999 after CX2 is released. Nice.
CX is actually read as C10.
They've just added another size to the CX range.
So the follow-up will be in a year's time. Discounts on Black Friday, I guess, though.

I've been wanting a TV (OLED) under 55" for a long time, so this is perfect. And I expect it will be for many people who are either space-limited or just want something on the smaller side.

People label it as a gaming TV, but the whole range really is; LG has been doing well supporting the HDMI feature set.
 
After missing out last week, Amazon had the LG C9 in stock again at prices comparable to the EU, so I pulled the trigger and ordered the 55". Kind of wanted the 65", but at 1000 euros more it's just too much. It's arriving tomorrow, so I'm really excited. Also got a 2070 Super the other day, so it all should be a bit of an upgrade over my 7-year-old 42" LG.

Only downside is that with most of the shops closed I haven't been able to buy a new TV stand yet, so the new TV will sit on the old crappy one I bought ages ago, which sucks. It's the last thing we still have to upgrade after moving.
 
The CX models are starting to ship now, I believe.

Still nothing in Japan. For what it's worth, I don't believe there are any big differences between the C9 and CX. One of the main reasons I went with the C9 and didn't wait for the CX (apart from budget) is that the CX can't pass through HD audio from internal apps over eARC either, and with the rest of the hardware being pretty much the same, I went with the cheaper option. If you don't need/want a TV in the next couple of months, I'd say wait for the C11 and hope LG solves that stupid design limitation.
 