Really really cool info KOF! Are you the AMD guy in the video?
I'd like to seek your expertise about these new standards... did the industry just come up with all this, REC2020, 12-bit PQ EOTF, literally? I mean, they're all just numbers, but behind them what we want to see is displays being able to output the darkest blacks, the brightest whites, and the most saturated hues. Why did it take them soooo long to come up with what looks like the biggest leap in motion pictures since... motion pictures themselves? Was the technology just not feasible back when we were migrating to Blu-ray/FHD 10 years ago?
But I remember back then we already had zone-backlit CCFL LCDs, my 5-year-old laptop's FirePro could output 10-bit, we had plasmas that could do 0.001 cd/m2 and also output billions of colors, Sony had Triluminos displays years back, heck, some old Sony FHD TVs had 10-bit panels... all of these were available 5-10 years ago, so why did it take us so long to move forward? Was it a breakthrough in some field like QD that made this tech better? OLEDs still seemed some years off back then, too..
How will these new displays measure up against the Kuro? Will the 10-bit Triluminos panels of yesteryear be able to display the same levels of colour fidelity as the new ones? What was the industry doing for the last decade or two? These new standards look like they would have been feasible back then, so why only now?
Sorry, long post so I had to cut it in half and somehow I messed up
(continues)
I wouldn't truly call it immersive video without yet another crucial piece of the puzzle: OLED. You simply can't have a true-to-life reproduction with an LCD, local dimming or not. I'll explain why.
Have you ever had a moment where you heard a very finely produced piece of music on a high-end speaker, and when the guitarist plucked a string you went, "I swear I felt the string being plucked!"? I have, and it was an amazing experience. Audio is already at the "being there" level, and on a very good system you can feel the texture of the instrument despite only using your ears.
Same with our eyes. Our eyes are very complex pieces of optical machinery. There are two types of light-sensitive cells in the retina: cones and rods. Cones are further divided into three types, sensitive to red, green, and blue respectively, so they perceive color. Rods sense luminance, so they perceive a black-and-white image. However, there are roughly 120 million rods and only about 6 million cones, meaning our eyes are much more sensitive to changes in luminance than to changes in color. That's why chroma subsampling is an accepted practice in broadcast/Blu-ray, and Sebbi prefers more luminance bits for the very same reason.
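To make the subsampling point concrete, here's a minimal sketch (my own illustrative code, not any broadcast encoder) of 4:2:0 chroma subsampling: luma (Y) is kept at full resolution while each chroma plane (Cb, Cr) is averaged down to half resolution in both dimensions, exploiting exactly that lower chroma acuity:

```python
# Hypothetical sketch of 4:2:0 chroma subsampling: keep full-resolution
# luma, average each 2x2 block of chroma samples into one.

def subsample_420(y, cb, cr):
    """y, cb, cr are 2D lists (rows of samples) of equal size."""
    def half(plane):
        # Average each 2x2 block into a single chroma sample.
        return [
            [(plane[r][c] + plane[r][c+1] + plane[r+1][c] + plane[r+1][c+1]) / 4
             for c in range(0, len(plane[0]), 2)]
            for r in range(0, len(plane), 2)
        ]
    return y, half(cb), half(cr)

# A tiny 4x4 frame: full 4:4:4 needs 3 * 16 = 48 samples.
y  = [[16]  * 4 for _ in range(4)]
cb = [[128] * 4 for _ in range(4)]
cr = [[128] * 4 for _ in range(4)]

y2, cb2, cr2 = subsample_420(y, cb, cr)
samples_444 = 3 * 16
samples_420 = 16 + len(cb2) * len(cb2[0]) + len(cr2) * len(cr2[0])
print(samples_444, samples_420)  # 48 24 -> half the data, little visible loss
```

Half the raw data is thrown away, yet viewers rarely notice, because the rods' luminance channel is untouched.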
Light travels to our eyes in this manner: light source -> object surface -> observer. What we finally see is light from the source reflected off the object, so any change in these three changes the final output. Light is a very complex beast: depending on the surface and the medium, it can be reflected, refracted, absorbed, etc., and the surface matters a lot for these events as well. Is it uniform? Rough? Glossy? Diffuse? Just as with audio, our eyes can tell these various surface textures apart with great precision without ever touching them.
What are the two most famous devices for emulating the human optical system? The camera and the display. The camera acts as the observer in this case and accepts light (photons) during the exposure. Photons enter the CCD/CMOS sensor's photosites (a cavity array), and each cavity is read out according to how many photons fell into it. If numerous photons were collected in a cavity, a high luminance intensity is recorded; if only a few, a low one. Between two pixels, if one has collected one photon and the other 1000 photons, the dynamic range between the two pixels is 1000:1.
For scenes with strong incident light and a uniform surface, the number of photons at each pixel should be fairly uniform. However, with high variation in reflectivity (rough rock, water waves, etc.), the dynamic range between adjacent pixels becomes much greater. How much that dynamic range varies depends on the reflecting surface, and there is enormous variation in surfaces. This is how our 120 million rods distinguish surface textures.
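The photon-count arithmetic above can be sketched in a few lines (the counts are made up for illustration): the ratio between the brightest and darkest nonzero counts is the dynamic range between those pixels.

```python
import math

# Made-up photon counts collected per photosite during one exposure.
# Brightest-to-darkest ratio = dynamic range between those pixels.

def dynamic_range(photon_counts):
    brightest = max(photon_counts)
    darkest = min(c for c in photon_counts if c > 0)
    return brightest / darkest

# Uniform surface under even light: counts cluster tightly.
print(dynamic_range([950, 1000, 980, 990]))   # ~1.05:1, looks flat

# Rough, highly varied surface: adjacent pixels differ wildly.
print(dynamic_range([1, 250, 1000, 40]))      # 1000.0, i.e. 1000:1
print(round(math.log2(1000)))                 # ~10 stops of range
```

A uniform wall barely spans a twentieth of a stop, while the textured surface spans about ten stops, which is the variation a display then has to resolve faithfully.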
On the display side, OLEDs faithfully reproduce these variations with a 0.00001 cd/m2 black level (the lowest recorded). Samsung's top-of-the-line SUHD LCD, the JS9500, only manages 0.05 cd/m2 (at 120 cd/m2 backlight luminance), which means an OLED TV has at least 5000 times the precision when it comes to resolving luminance variation. However, with HDR content, average luminance has moved from 100-120 cd/m2 to 400 cd/m2, and LCDs, being transmissive, have to take a black-level hit. The Samsung JS9500's black level is then 0.11 cd/m2 at 400 cd/m2, which now means 11000 times less precision than the OLED TV. This is why, next to plasmas and OLEDs, LCDs look flat and have no depth: light is resolved in a flatter manner than it should be.

I own a Panasonic plasma with a 0.001 cd/m2 black level and I'm still amazed by its life-like presentation. Despite being only 8-bit, I still like it better than the top-of-the-line HDR LCDs from Samsung and Sony. Some very well-made Blu-ray movies have really good dynamic range, and in one movie I was looking at windows and was very surprised that I could actually feel the Fresnel effect (reflection and refraction occurring at the same time) in a really convincing way. I've never seen anything like that on an LCD.
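For anyone who wants to check the 5000x and 11000x figures, the arithmetic is just the ratio of black levels (luminance numbers as quoted above):

```python
# Re-running the black-level arithmetic from the measurements above.

oled_black    = 0.00001  # cd/m2, lowest recorded OLED black level
lcd_black_sdr = 0.05     # cd/m2, JS9500 at 120 cd/m2 backlight
lcd_black_hdr = 0.11     # cd/m2, JS9500 at 400 cd/m2 for HDR

print(round(lcd_black_sdr / oled_black))  # 5000: OLED's edge at SDR levels
print(round(lcd_black_hdr / oled_black))  # 11000: the gap widens under HDR
```

The notable part is the trend: pushing the backlight up for HDR more than doubles the LCD's black level, so the precision gap grows exactly when HDR needs it to shrink.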
This behaviour is also why even local-dimming LCDs are ineffective at producing a convincing, life-like image. Local dimming can only help paint black areas black; lit zones are still subject to the panel's native contrast ratio and will still look flat, and this will get worse as brightness pushes even further towards 1000 cd/m2. Which is too bad, as real-life lava not only has great luminance but also high variation in reflectivity.
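A quick back-of-the-envelope sketch of why lit zones stay flat (the 4000:1 native contrast is my assumed figure for a typical VA LCD panel, not a measured one; the 1000 cd/m2 highlight matches the HDR target above):

```python
# In a local-dimming LCD, a fully dark zone can switch its backlight
# off, but any zone containing a highlight must run its backlight up,
# and no pixel in that zone can get darker than peak / native contrast.

native_contrast = 4000      # assumed VA panel contrast ratio
zone_peak = 1000.0          # cd/m2, HDR highlight inside the zone

black_in_lit_zone = zone_peak / native_contrast
print(black_in_lit_zone)    # 0.25 cd/m2, far above an OLED's black
```

So the brighter the highlight a zone has to carry, the higher the floor under every other pixel in that zone, which is the flatness being described.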
Pioneer was correct in saying that black is the canvas of a display. A zero black level means no unwanted light is added to any pixel, while a higher black level robs saturation. This was painfully evident on Samsung's SUHD quantum-dot LCDs, as their 91% DCI coverage wasn't as saturated as I would have liked. The LG OLED, on the other hand, really brings saturation and transparency to the colors. The net result of OLED + 10-bit HDR + P3 gamut + 4K? The most harmonized picture I've ever seen in my life. All 8 million pixels, full of very colorful colors, moving smoothly! In one scene there were three water fountains, one in the back, one in the middle, one in the front, all spouting water at different times, and I could actually feel their distance! No 3D glasses needed!