@KOF So AMD's choice of 10bit will probably be a good compromise for quite a while, and the P3 colour gamut, which I was not aware of, should be available on displays sooner than later. Thanks for the summary.
Oh yes! 10bit / 1,000 cd/m2 / P3 is actually the baseline standard for UHD Blu-ray and HDMI 2.0a, so AMD is simply following the already established SMPTE standards, which is a good thing.
Improvements over P3 color will come in two forms. The first is enlarging the color gamut from P3 to Rec.2020, which most people are familiar with. The second is that even beyond Rec.2020 there's still room for improvement, and that's where color volume (AKA 3D color gamut) comes in.
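For a rough sense of the 2D part of that, here's a quick sketch using the published CIE 1931 xy chromaticities of the primaries. The triangle-area comparison is just a back-of-the-envelope measure (areas in xy space, not a perceptual metric), but it shows how much bigger each gamut is than the last:

```python
# Rough sketch: compare 2D gamut sizes as CIE 1931 xy triangle areas,
# using the standard chromaticity coordinates of the R/G/B primaries.
# This only covers the 2D gamut; it says nothing about color volume (3D).

def triangle_area(p):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
dci_p3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, aP3, a2020 = map(triangle_area, (rec709, dci_p3, rec2020))
print(f"P3 vs Rec.709 : {aP3 / a709:.2f}x larger")
print(f"Rec.2020 vs P3: {a2020 / aP3:.2f}x larger")
print(f"P3 covers ~{100 * aP3 / a2020:.0f}% of Rec.2020 (2D area only)")
```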
Simply enlarging the color primaries at the 2D color gamut level isn't enough. We also need to factor in dynamic range/contrast ratio to make colors truly independent of luminance. Without a 3D gamut, colors veering towards very bright whites (which will be common in HDR) get absorbed by that white, so saturation is lost. The same is true for black: HDR also improves the black level (current 8bit SDR source = 0.1 cd/m2, 10bit HDR source = 0.01 cd/m2, 12bit Dolby Vision = 0.005 cd/m2), so colors near black get absorbed as well. Employing a 3D color gamut allows colors to keep their saturation from the blackest blacks to the whitest whites. Currently, only Dolby Vision supports the 3D gamut, but the HDR10 baseline standard will also get it starting with HDMI 2.1. Obviously, for Rec.2020, 12bit color depth is more ideal than 10bit, because to get 100% 3D gamut coverage of Rec.2020, Dolby argues you really need 10,000 cd/m2 of dynamic range, which can only be provided by the 12bit PQ EOTF.
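To make the "absorbed by bright white" effect a bit more concrete, here's a toy sketch. The 1,000 cd/m2 display peak and the linear RGB values are made-up illustration numbers, and the per-channel clip just stands in for what a color-volume-unaware display might do:

```python
# Toy illustration (made-up numbers): why bright colors lose saturation
# when a display simply clips to its peak luminance instead of mapping
# the full color volume. Values are linear light in cd/m2 per channel.

DISPLAY_PEAK = 1000.0  # assumed 1,000 cd/m2 panel

def clip_to_peak(rgb, peak=DISPLAY_PEAK):
    """Naive per-channel clip - what a volume-unaware display might do."""
    return tuple(min(c, peak) for c in rgb)

def saturation(rgb):
    """Crude saturation measure: (max - min) / max; 0 = grey, 1 = fully saturated."""
    return (max(rgb) - min(rgb)) / max(rgb)

# A very bright, strongly saturated red as mastered for HDR (hypothetical values)
mastered = (2000.0, 400.0, 400.0)
shown    = clip_to_peak(mastered)

print("mastered:", mastered, f"saturation={saturation(mastered):.2f}")
print("shown   :", shown,    f"saturation={saturation(shown):.2f}")
# The red channel gets clamped to 1000 while green/blue are untouched, so the
# color drifts toward white/pink - exactly the "absorbed by the bright white"
# effect that proper 3D gamut mapping is meant to avoid.
```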
Speaking of the PQ (Perceptual Quantizer) EOTF (Electro-Optical Transfer Function), this is perhaps the most important factor in making HDR a reality, because the current CRT-based gamma (now more than 70 years old) is very inefficient at increasing dynamic range. 8bit's 256 steps can only provide a 1000:1 (0.1~100 cd/m2) dynamic range, and simply increasing the color depth to 10bit (1024 steps) would only have increased the dynamic range by 4 times, 12bit (4096 steps) by another 4 times, and so on. Think of the 12bit PQ EOTF as a lossy compression that squeezes 15~16bit, 20 f-stops of dynamic range down to 12bit. A Hollywood colorist named Joe Kane also argues you pretty much need 16bit color depth to use up the entire Rec.2020 gamut, which is why the 12bit PQ curve is pretty much required for full Rec.2020 coverage. If we had stayed with gamma (CRT), scaling the data infrastructure up to 16bit would have been a colossal challenge.
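For reference, this is what the PQ curve actually looks like as code. The constants come straight from SMPTE ST 2084 (the curve HDR10 and Dolby Vision use); the printout is just a sketch showing how absolute luminance maps to 10bit/12bit code values:

```python
# Sketch of the SMPTE ST 2084 PQ curve, to show how it spends code values
# across a 0.0001 - 10,000 cd/m2 range instead of the old CRT gamma's
# roughly 0.1 - 100 cd/m2.

# ST 2084 constants
m1 = 2610 / 16384            # 0.1593017578125
m2 = 2523 / 4096 * 128       # 78.84375
c1 = 3424 / 4096             # 0.8359375
c2 = 2413 / 4096 * 32        # 18.8515625
c3 = 2392 / 4096 * 32        # 18.6875
PQ_PEAK = 10000.0            # cd/m2

def pq_encode(nits):
    """Inverse EOTF: absolute luminance (cd/m2) -> nonlinear signal in [0, 1]."""
    y = (max(nits, 0.0) / PQ_PEAK) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def pq_decode(signal):
    """EOTF: nonlinear signal in [0, 1] -> absolute luminance (cd/m2)."""
    p = signal ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return PQ_PEAK * y

# Where various luminances land among the 1024 (10bit) / 4096 (12bit) code values
for nits in (0.01, 0.1, 1, 100, 1000, 10000):
    s = pq_encode(nits)
    print(f"{nits:>8} cd/m2 -> signal {s:.4f}  "
          f"(10bit code ~{round(s * 1023)}, 12bit code ~{round(s * 4095)})")
```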
AMD is going to officially support the 10bit PQ EOTF, so that's a good thing, but obviously, game creators now need to utilize the PQ EOTF before we start seeing HDR games being released. We all know today's game engines have no problem handling dynamic range in excess of 20 f-stops, but they deliberately have to tone map it down to 8bit standard dynamic range because the monitors the majority of artists currently use are... 8bit SDR. I asked one console developer if it's possible to convert Xbox 360/PS3 era games into HDR by intercepting the tone mapping call, and he said that would be extremely challenging, as the gamma and PQ EOTF curves are simply not compatible. (The artists have already done the work in tone-mapped, gamma-based graphics.)
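To illustrate why you can't just re-interpret a gamma-encoded, already tone-mapped image as PQ, here's a toy comparison. The 2.2 gamma, the 100 cd/m2 SDR white point, and the scene luminances are simplifying assumptions for illustration only, not how any particular engine does it:

```python
# Toy comparison (made-up scene values): the same scene-referred luminance
# gets very different code values under SDR gamma vs HDR PQ, which is why
# you can't simply re-label an already tone-mapped gamma image as HDR.

SDR_PEAK = 100.0   # assume SDR white mastered at 100 cd/m2
GAMMA    = 2.2     # simple power-law gamma for illustration

# ST 2084 PQ constants (same as the earlier sketch)
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def sdr_encode(nits):
    """Tone map by clipping to SDR white, then apply 2.2 gamma encoding."""
    relative = min(nits, SDR_PEAK) / SDR_PEAK
    return relative ** (1 / GAMMA)

def pq_encode(nits):
    """ST 2084 inverse EOTF: absolute cd/m2 -> signal in [0, 1]."""
    y = (max(nits, 0.0) / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (1, 18, 100, 500, 4000):       # hypothetical scene luminances
    print(f"{nits:>5} cd/m2: SDR signal {sdr_encode(nits):.3f}  "
          f"vs  PQ signal {pq_encode(nits):.3f}")

# Everything above 100 cd/m2 collapses to 1.0 on the SDR path (the highlight
# detail is already gone after tone mapping), and even below 100 cd/m2 the two
# curves assign different code values, so a simple signal-level remap of the
# finished gamma image can't recover real HDR.
```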