HDR has clawed APL right back to where it used to be. Back in the SDR days, we used gamma, a relative-luminance transfer function, to display an image. Gamma was usually set around 2.2~2.4: set it lower and the image would wash out and lose highlight separation; set it higher and it would crush shadow detail. What's truly important about gamma is that, because it's a relative-luminance system, you can map it to pretty much any peak brightness on any display. The nominal value is 100 nits per the SDR standard, but it could be set to 200 nits or more on LCDs, around 50~120 nits on plasmas, or even lower on some projectors. While spectacularly flexible, this did have a caveat. For example, an 80-nit plasma that couldn't hit the nominal 100 nits would lose 20% of the SDR dynamic range. The reverse was also true: an LCD putting out 200 nits would also lose 50% of the dynamic range, but for a different reason. Dynamic range starts getting compressed once you go over 100 nits, similar to the "Loudness Wars". So playing at 400 nits would have meant compressing the dynamic range to 25%. You can test this easily on any LCD: night scenes gradually start looking like daylight scenes.

By the way, nominal SDR dynamic range required a full-field white luminance of 100 nits, and many LCDs could simply sustain that kind of high-APL luminance at even higher brightness. This is when the "Plasmas/OLEDs have too strong ABL" meme started. It wasn't really their fault: OLEDs especially could also hold 100 nits at 100% APL, which is more than enough for reference SDR luminance. But because people were so used to LCDs' much more relaxed ABL, they started complaining, not realizing that what they were doing to video was the equivalent of music companies compressing music to hell.
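To make that relative-luminance point concrete, here's a minimal Python sketch of a pure power-law SDR EOTF; the display peaks used below are just illustrative numbers, not measurements.

```python
# Minimal sketch of relative (gamma) SDR luminance, assuming a pure power-law EOTF.
# The same normalized code value simply scales to whatever peak the display is set to.

def sdr_luminance(code_value: float, gamma: float = 2.4, peak_nits: float = 100.0) -> float:
    """Map a normalized SDR code value (0..1) to nits for a given display peak."""
    return peak_nits * (code_value ** gamma)

# The same 50% grey signal lands at a different absolute brightness on each display:
for peak in (80, 100, 200, 400):   # plasma, reference, typical LCD, bright LCD
    print(f"{peak:>3}-nit display -> {sdr_luminance(0.5, peak_nits=peak):6.2f} nits")
```

Everything about SDR's flexibility, and about the "loudness wars" abuse of it, comes down to that single multiplication by the display's peak.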
Today's HDR standard is the complete opposite. Instead of gamma, it uses the PQ EOTF, an absolute-luminance transfer function. Luminance can no longer be dynamically adjusted as it could with SDR; a fixed luminance is assigned to every greyscale value. This creates a serious problem, because any display unable to reach the higher HDR values is screwed. It either has to use tone mapping to compress the dynamic range, at the expense of lowering APL, or simply clip and discard the higher luminance values, like the Sony Z9D. Sony hated the idea of tone mapping lowering APL and degrading the HDR picture, so on the Z9D they chose to roll off values above 1500 nits and clip everything past 1800 nits. So think of tone mapping as dynamic range compression, like LCDs blaring SDR content at 200~400 nits, but without the benefit of being able to watch that compressed video at whatever brightness would satisfy you; the tone-mapped HDR image is also dimmer, much dimmer.
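For contrast, here's a sketch of the PQ (SMPTE ST 2084) EOTF using its published constants: a given code value always decodes to the same absolute nit level, regardless of what the display can actually reach, which is exactly why tone mapping or clipping becomes necessary.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF: code values decode to absolute nits,
# whether or not the display can actually produce them.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code_value: float) -> float:
    """Normalized PQ signal (0..1) -> absolute luminance in nits (0..10000)."""
    e = code_value ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# A display topping out at a few hundred nits has no choice but to tone map or clip
# the upper part of this range:
for v in (0.50, 0.75, 0.90, 1.00):
    print(f"PQ code {v:.2f} -> {pq_eotf(v):8.1f} nits")
```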
With SDR, LCDs could enjoy 400 nits of full-field white, but with HDR that has dropped to 100~220 nits, and with tone mapping applied it drops even further, making some movies very difficult to watch with the lights on. This was done on purpose. If HDR APL had stayed at the level LCD owners were used to enjoying, many cinephiles would have been up in arms, crying that HDR doesn't respect the creator's intent. So the average APL has increased over SDR's 100 nits, but not by much. For HDR movies, many studios prefer to keep nominal white at 100 nits, equal to SDR, with any increase in APL coming from the addition of specular highlights. The THX standard recommends 160 nits, and the Sony BVM-X300 OLED, the reference monitor used to master many HDR movies such as Kingsman, also had a maximum full-field white of 160 nits. Consumer OLEDs like the Panasonic GZ2000 can also maintain 160 nits. Other standards argue it can go slightly higher and peg 200 nits as the sweet spot, while a few movies ignore such standards and hit 220 nits. For this reason, I think 200 nits ought to be enough if the goal is to follow the reference standards, and future OLEDs should easily reach 200 nits in the not-so-distant future.
Tone mapping is a bigger fish to fry though, and without dealing with it, the 200 nits that current LCDs and future OLEDs can reach will be for naught. As of now, static metadata is the most common form of tone mapping, and displays offer different tone mapping settings to try to satisfy every user: want to maintain APL? You can, at the expense of highlight detail. Want to maintain highlight detail instead? You can, at the expense of APL. It's adjustable, but still not ideal.
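To illustrate that trade-off in code (a toy example, not any TV maker's actual curve), here are the two extremes for an assumed 700-nit display showing a 4000-nit master: clip to keep APL, or compress everything to keep highlight detail.

```python
# Toy illustration of the static tone mapping trade-off (not a real vendor curve).
# Assumed numbers: a 700-nit display showing content mastered to 4000 nits.

DISPLAY_PEAK, CONTENT_MAX = 700.0, 4000.0

def keep_apl(nits: float) -> float:
    """Track the master 1:1 and clip: APL preserved, highlight detail above 700 nits lost."""
    return min(nits, DISPLAY_PEAK)

def keep_highlights(nits: float) -> float:
    """Scale the whole range into the display's range: detail preserved, everything dimmer."""
    return nits * (DISPLAY_PEAK / CONTENT_MAX)

for nits in (100.0, 500.0, 1000.0, 4000.0):
    print(f"{nits:6.0f}-nit master -> clip: {keep_apl(nits):5.0f} nits, compress: {keep_highlights(nits):5.0f} nits")
```

Real static tone mapping curves sit somewhere between those two extremes, usually tracking 1:1 up to a knee and rolling off above it; sliding that knee up or down is exactly the APL-versus-highlight-detail choice described above.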
Thus, the better way to deal with it is dynamic metadata like Dolby Vision, which provides metadata on a scene-by-scene basis. This still isn't ideal, as not all Dolby Vision implementations perform equally. Even among OLEDs we have hardware-led (LG, Panasonic) vs. player-led (Sony) processing, and even with Dolby Vision there's still no avoiding some APL reduction. Dolby Vision isn't really viable for games either, as it costs royalty fees. Yes, Dolby Vision games like Mass Effect: Andromeda were lovely on a GeForce GTX 1080 Ti, but you can't have that anymore starting with Turing.
Some devices have started to appear to remedy this problem. Select Panasonic UHD Blu-ray players have a feature called "HDR Optimizer", which preserves as much highlight detail as possible with little to no APL drop, though it still operates on static metadata. A much better, but much more expensive, solution is the Lumagen Radiance Pro, which tone maps not scene by scene like Dolby Vision but frame by frame, analysing each frame to apply optimal tone mapping. It's quite pricey at around $4,500 though. MadVR also has a frame-by-frame tone mapping solution, the MadVR Envy (the unit itself contains a PC), which uses Turing's tensor cores to run machine learning that preserves every bit of highlight detail without any APL drop, but it comes at an even crazier price: $5,500 for the budget model and $9,999 for the flagship. Yes, $10,000 just to do shit that was free back in the SDR days. PC peasants (hehe) who can't afford such expensive toys can stick with the free options, but the PC is also a PITA when it comes to dealing with HDCP 2.1.
HLG is a much more sensible way of dealing with things. It combines an SDR-style gamma curve with an HDR-style logarithmic curve and provides the best of both. You can once again set any luminance level you want: nominal luminance is pegged at 1000 nits, but you can go lower, or as high as 5500 nits. And because the HDR portion of the curve is applied on top, dynamic range isn't compressed as you go over 1000 nits the way SDR gamma compressed it over 100 nits. It's a nice way to keep backwards compatibility with SDR while also providing flexible luminance for HDR. Well done, BBC & NHK!
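Here's a rough sketch of why HLG stays flexible, assuming the BT.2100 inverse OETF and the simplified single-channel system gamma; the display peaks below are just example values.

```python
# Sketch of HLG's relative behaviour: BT.2100 inverse OETF plus the peak-dependent
# system gamma (simplified to a single luminance channel).

import math

A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_inverse_oetf(signal: float) -> float:
    """HLG signal (0..1) -> relative scene-linear light (0..1)."""
    if signal <= 0.5:
        return (signal ** 2) / 3.0
    return (math.exp((signal - C) / A) + B) / 12.0

def hlg_display_nits(signal: float, peak_nits: float) -> float:
    """Apply the peak-dependent system gamma: same signal, any nominal peak."""
    gamma = 1.2 + 0.42 * math.log10(peak_nits / 1000.0)
    return peak_nits * (hlg_inverse_oetf(signal) ** gamma)

# The same 75% signal rescales gracefully with the display instead of clipping like PQ:
for peak in (500, 1000, 5500):
    print(f"{peak:>4}-nit display -> {hlg_display_nits(0.75, peak):7.1f} nits")
```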
The HDR scene right now is so f'ed up because Dolby deliberately planned it this way. They weren't about to simply give away HDR10, which they thought was rightfully theirs, hence this mess. Some studios are fighting back over this: Fox, one of the arch-enemies of Dolby Vision, simply refuses to put out its movies at more than 1000 nits. Samsung's HDR10+ was supposed to be the standard that fought Dolby Vision, but it simply wasn't meant to be, as it loses its dynamic tone mapping effectiveness on displays able to render more than 500 nits, making it more useful for their budget LCDs, their Micro LEDs, and their QD OLEDs, which will be around 1/3 dimmer than LG's (the prototype unit can only hit 100 nits full-field white, and it gets very hot doing just that).