HDR revisited

Arwin

HDR is back in a big way. Lots of stuff is coming out that makes a big deal out of it (including the consoles now). But what does it really mean?

Personally, I don't quite understand why TVs should know anything more than how to understand a 12-bit-per-color video source and map that to their full contrast range. Everything else that gives hints about the source material's 'HDR' surely is just adding back information that has at some point gone missing from the source material? So I could understand adding HDR information for a movie that was encoded with 8-bit precision, but otherwise I don't see any advantages. Meaning that if a GPU can do 10-12 bit color and output that, games should be fine and do what they've always done?

Can anyone 'enlighten' me?
 
There are 2 components that are coming with UHD/HDR at once: color space and dynamic range.

Color space defines what the "most red" color that can be displayed is. Most displays have historically been Rec 709/sRGB, while UHD is Rec 2020, and the two color spaces are drastically different. If you take an image from one color space to the other without converting it, it'll look completely wrong as colors shift around.
https://en.wikipedia.org/wiki/Rec._709
https://en.wikipedia.org/wiki/Rec._2020
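For a concrete sense of the shift, here's a minimal Python sketch (my own illustration, not reference code from either spec) that converts linear-light Rec 709 RGB into Rec 2020 using the rounded primaries-conversion matrix from ITU-R BT.2087:

```
# Converting linear-light Rec.709 RGB into Rec.2020 RGB.
# The 3x3 matrix is the rounded Rec.709 -> Rec.2020 conversion from
# ITU-R BT.2087; it applies only to *linear* (not gamma-encoded) RGB.

REC709_TO_REC2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """rgb is a linear-light (R, G, B) triple in [0, 1]."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in REC709_TO_REC2020)

# Pure Rec.709 red lands well inside the Rec.2020 gamut:
print(rec709_to_rec2020((1.0, 0.0, 0.0)))   # ~(0.627, 0.069, 0.016)
# Sending the same (1, 0, 0) code to a Rec.2020 display *unconverted* asks for
# a far more saturated red than the source intended - that's the color shift.
```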

Dynamic range is based on the EOTF (Electro-Optical Transfer Function) that maps digital code values to displayed luminance. This is what translates what a digital 0/128/255 means into physical brightness. The current EOTF, and the concept of "gamma", was standardized based on the physics of CRTs. If you simply extend the existing SDR gamma from 8 to 10 bits you go from 6 to 10 stops of dynamic range. Using the new Hybrid Log-Gamma transfer function, the same 10 bits gives 17.6 stops of dynamic range.
https://en.wikipedia.org/wiki/Hybrid_Log-Gamma
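To make the transfer-function side concrete, here's a small Python sketch of the HLG OETF as specified in ARIB STD-B67 / BT.2100 (scene-linear light in, non-linear signal out). The lower half is a square-root curve that behaves much like conventional gamma, while the logarithmic upper half is what packs the extra highlight range into the same 10 bits:

```
import math

# Hybrid Log-Gamma OETF (ARIB STD-B67 / BT.2100):
# scene-linear light E in [0, 1] -> non-linear signal E' in [0, 1].

A = 0.17883277
B = 1.0 - 4.0 * A                 # 0.28466892
C = 0.5 - A * math.log(4.0 * A)   # 0.55991073

def hlg_oetf(e):
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)             # square-root ("gamma-like") segment
    return A * math.log(12.0 * e - B) + C     # logarithmic segment

# Half of the signal range is spent on the darker scene tones...
print(hlg_oetf(1.0 / 12.0))   # 0.5
# ...and the other half covers everything brighter, up to the scene peak:
print(hlg_oetf(1.0))          # ~1.0
```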

This presentation has some more details on the guts if you're interested:
https://www.smpte.org/sites/default/files/2014-05-06-EOTF-Miller-1-2-handout.pdf
 
Here's another really good presentation about HDR, gamma, and tone mapping in video games, from GDC 2010, about Uncharted 2. It's a long presentation, but packed with good information.
http://filmicgames.com/Downloads/GDC_2010/Uncharted2-Hdr-Lighting.pptx

The first section does a good job of walking through the need for gamma in displays (vastly better contrast) and how games need to compensate for it. Slide 33 is a really good summary of continuously applying gamma adjustments so that the final image produced is correct.
The second section walks through the process of tone mapping, which is converting a higher-dynamic-range image (most engines use fp16 internally, which has 30 stops of dynamic range) into a lower-dynamic-range image (standard SDR is 6 stops) without ugly clamping or blown-out highlights.

For SDR displays the process is: tone map to reduce the dynamic range, then gamma-correct so things look correct.
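A minimal Python sketch of that SDR path, using plain Reinhard as a stand-in tone mapper (the Uncharted 2 slides argue for a filmic curve instead) and a simple 1/2.2 gamma encode:

```
# SDR output path: tone map the engine's scene-linear value down to [0, 1],
# then gamma-encode it for the display. Reinhard is just a stand-in here.

def reinhard(x):
    """Very simple tone map: compresses [0, inf) into [0, 1)."""
    return x / (1.0 + x)

def gamma_encode(x, gamma=2.2):
    """Encode linear light for an SDR display (inverse of the display gamma)."""
    return x ** (1.0 / gamma)

def sdr_output(scene_linear):
    return gamma_encode(reinhard(scene_linear))

# A scene value well above "paper white" still lands below 1.0 instead of clipping:
print(sdr_output(4.0))   # ~0.90
```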

For HDR displays both of these steps need to be rewritten:
- HDR's dynamic range of at least 17 stops (17.6 stops for 10-bit hybrid log-gamma; 12-bit PQ is 20 stops) is wide enough that only minimal tone mapping, if any, is required
- gamma has been replaced with hybrid log-gamma, so all the gamma calculations need to be completely redone
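As an illustrative sketch (not any particular engine's approach) of how the output step changes: the heavy tone map shrinks to an exposure scale plus a clamp, and the gamma encode is swapped for the HLG OETF from the earlier snippet:

```
import math

# HDR output path, assuming scene-linear values are normalized so that 1.0
# is the signal's peak. With ~17 stops of headroom the "tone map" here is
# reduced to an exposure scale and a clamp; the HLG OETF replaces gamma.

A = 0.17883277
B = 1.0 - 4.0 * A
C = 0.5 - A * math.log(4.0 * A)

def hlg_oetf(e):
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

def hdr_output(scene_linear, exposure=1.0):
    e = min(scene_linear * exposure, 1.0)   # minimal "tone mapping"
    return hlg_oetf(e)                      # HLG encode instead of gamma encode

print(hdr_output(0.05))   # mid-tone
print(hdr_output(0.9))    # bright highlight, still well below clipping
```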

For computers / consoles there's the obvious question of why they can't just output their internal 16-bit linear-space floating point values and not have to deal with any of this going forward. Pages 10 and 14 of the PDF I linked above show why - using either a log (floating point) or gamma-based representation results in wasting a ton of bits in either the shadows or the highlights. That would require drastically more data to be transferred and processed by the display without any gain in quality.
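To put rough numbers on that, here's a Python sketch that decodes adjacent 10-bit PQ codes back to nits with the standard SMPTE ST 2084 constants and compares the step sizes against a hypothetical linear 10-bit encoding of 0-10,000 nit (the values in the comments are approximate). The perceptual curve packs most of its codes where the eye can see differences; a linear allocation of the same 10 bits would waste nearly all of them on highlights:

```
# How a 10-bit PQ (SMPTE ST 2084) signal spends its codes, versus a naive
# linear 10-bit encoding of 0..10,000 nit.

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """PQ EOTF: decode an integer code value to display luminance in nits."""
    e = code / (2 ** bits - 1)               # normalized signal, 0..1
    p = e ** (1 / m2)
    y = max(p - c1, 0.0) / (c2 - c3 * p)     # normalized luminance, 0..1
    return 10000.0 * y ** (1 / m1)

# Step size between adjacent codes near black and near peak white:
print(pq_to_nits(21) - pq_to_nits(20))       # ~0.001 nit per step in the shadows
print(pq_to_nits(1023) - pq_to_nits(1022))   # ~90 nit per step up at 10,000 nit
# A linear 10-bit encoding spends ~9.8 nit per step *everywhere*, so nearly
# the whole shadow range would collapse into a single code:
print(10000.0 / 1023)
```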
 
What I get is that

Normal display: white = max brightness set in the "brightness" TV/monitor option.
HDR: White = whatever brightness (so the white of sky can be brighter than a white piece of paper)

Or am I completely off the mark?
 
Normal display: white = max brightness set in the "brightness" TV/monitor option.
HDR: White = whatever brightness (so the white of sky can be brighter than a white piece of paper)
Not quite.

On a properly calibrated reference NTSC display pure white = 100 nit (cd/m^2). This equates roughly to a piece of white paper under normal lighting.
HDR signaling allows for pure white to be 10,000 nit, which is roughly looking directly at a fluorescent light tube.

The problem is that typical monitors are calibrated to 200 nit for comfortable working. The best consumer TV displays can output roughly 1,500 nit and commercial ones are hitting 5,000 nit.

This means when you take an NTSC SDR signal where pure white is supposed to be 100 nit and display it on a 200 nit monitor or a really bright 1,000 nit TV, the luminance of the picture gets shifted and looks wrong.

With HDR signaling the source video can say a fluorescent light fixture should be 5,000 nit, an old-time light bulb should be 1,000 nit, a piece of paper should be 100 nit, and a face should be 1 nit. It's then up to the display to tone map this huge possible dynamic range into something the TV can actually display.

With SDR signaling both lights and the piece of paper would appear as the same brightness.
On a consumer TV you might not be able to tell the difference between the fluorescent light fixture and the light bulb, but you can still tell the difference between the lights and the paper.
On a much higher grade consumer or professional TV you might be able to tell the difference between the lights.
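For illustration, here's a toy Python version of that display-side tone mapping step. Real TVs use their own proprietary curves (often guided by the stream's metadata), so the 1,000 nit panel, the knee, and the rolloff below are made-up values:

```
# Toy display-side tone map: reproduce source luminance exactly up to a knee,
# then roll off everything above it into the panel's remaining headroom.

PANEL_PEAK = 1000.0   # what this hypothetical panel can actually produce
KNEE = 500.0          # below this, track the source luminance 1:1

def display_tone_map(source_nits):
    if source_nits <= KNEE:
        return source_nits
    excess = source_nits - KNEE
    headroom = PANEL_PEAK - KNEE
    # Compress KNEE..infinity smoothly into KNEE..PANEL_PEAK.
    return KNEE + headroom * excess / (excess + headroom)

# Face, paper, light bulb, fluorescent tube from the example above:
for nits in (1.0, 100.0, 1000.0, 5000.0):
    print(nits, "->", round(display_tone_map(nits), 1))
# 1.0 -> 1.0, 100.0 -> 100.0, 1000.0 -> 750.0, 5000.0 -> 950.0
# The bulb and the tube end up much closer together than in the source, but
# both stay clearly brighter than the paper - matching the description above.
```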
 
games should be fine and do what they've always done?
That's what's happening, in a sense.

CRTs have pretty good static contrast, and the games of that era reflect this; many chose a composition that looks very dull on most LCDs.

Seventh-gen games, largely made for crappy LCDs, often output images that are very torchy when viewed on a CRT that's configured to use its full range.

In that sense, sixth-gen games treated their output as "more HDR" than seventh-gen games; they look "dull" because they expected the display to reconstruct the signal more vibrantly.

That's basically what HDR games should do. If outputting to SDR, map a smaller luminance (and chrominance) range to the signal's range. If outputting to HDR, map a bigger luminance (and chrominance) range to the signal's range.
 