HDR revisited

Discussion in 'Rendering Technology and APIs' started by Arwin, Sep 9, 2016.

  1. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    17,674
    Likes Received:
    1,194
    Location:
    Maastricht, The Netherlands
    HDR is back in a big way. Lots of stuff is coming out that makes a big deal out of it (including the consoles now). But what does it really mean?

    Personally, I don't quite understand why TVs should know anything more than how to understand a 12-bit-per-color video source and map it to their full contrast range. Everything else that gives hints about the source material's 'HDR' is surely just adding back information that at some point went missing from the source material? So I could understand adding HDR information for a certain movie that was encoded with 8-bit precision, but otherwise I don't see any advantages. Meaning that if a GPU can do 10-12 bit color and output that, games should be fine and do what they've always done?

    Can anyone 'enlighten' me?
     
  2. Rufus

    Newcomer

    Joined:
    Oct 25, 2006
    Messages:
    246
    Likes Received:
    60
    There are 2 components that are coming with UHD/HDR at once: color space and dynamic range.

    Color space defines what the "most red" color that can be displayed is. Most displays have historically been Rec 709/sRGB, while UHD is Rec 2020. The gamuts are drastically different: if you display an image encoded in one color space as if it were the other, it'll look completely wrong as colors shift around.
    https://en.wikipedia.org/wiki/Rec._709
    https://en.wikipedia.org/wiki/Rec._2020
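    To make the gamut difference concrete, here's a minimal sketch (mine, not from this post) of converting between the two color spaces, using the standard BT.709-to-BT.2020 matrix published in ITU-R BT.2087:

```python
# Standard BT.709 -> BT.2020 conversion matrix (ITU-R BT.2087),
# valid only for linear-light (not gamma-encoded) RGB values.
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """Convert a linear-light Rec.709 RGB triple to Rec.2020 primaries."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in M_709_TO_2020)

# Pure Rec.709 red lands well inside the wider Rec.2020 gamut:
print(rec709_to_rec2020((1.0, 0.0, 0.0)))  # (0.6274, 0.0691, 0.0164)
```

    Note the conversion is only valid on linear-light values; applying it (or skipping it entirely) on gamma-encoded pixels is exactly how images end up looking wrong.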

    Dynamic range is based on the EOTF (Electro-Optical Transfer Function), which maps digital code values to displayed luminance. This is what translates a digital 0/128/255 into physical brightness. The current EOTF, and the concept of "gamma", was standardized based on the physics of CRTs. If you simply extend the existing SDR gamma from 8 to 10 bits you go from 6 to 10 stops of dynamic range. Using the new Hybrid Log-Gamma transfer function, the same 10 bits gives 17.6 stops of dynamic range.
    https://en.wikipedia.org/wiki/Hybrid_Log-Gamma

    This presentation has some more details on the guts if you're interested:
    https://www.smpte.org/sites/default/files/2014-05-06-EOTF-Miller-1-2-handout.pdf
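    For the curious, the HLG OETF itself is short enough to sketch. This is my own transcription of the curve defined in ITU-R BT.2100, not code from the presentation:

```python
import math

# HLG OETF per ITU-R BT.2100: scene-referred linear light E in [0, 1]
# -> non-linear signal E' in [0, 1]. Below the knee it's a square root
# (classic gamma-like behaviour); above it, a log curve -- which is
# where the extra stops of dynamic range come from.
HLG_A = 0.17883277
HLG_B = 1.0 - 4.0 * HLG_A                    # 0.28466892
HLG_C = 0.5 - HLG_A * math.log(4.0 * HLG_A)  # ~0.55991073

def hlg_oetf(e):
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return HLG_A * math.log(12.0 * e - HLG_B) + HLG_C

# The two branches meet at E = 1/12 (signal 0.5), and reference white
# E = 1.0 maps to a signal of ~1.0:
print(hlg_oetf(1.0 / 12.0))  # 0.5
print(hlg_oetf(1.0))
```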
     
  3. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,770
    Likes Received:
    2,191
    119.88 fps wtf...
     
  4. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,582
    Likes Received:
    5,680
    Location:
    ಠ_ಠ
    29.97 *4 I think

    Thanks Legacy Support.
     
  5. Rufus

    Newcomer

    Joined:
    Oct 25, 2006
    Messages:
    246
    Likes Received:
    60
    #5 Rufus, Sep 9, 2016
    Last edited: Sep 9, 2016
  6. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,770
    Likes Received:
    2,191
    Ahh,
    why didn't they just broadcast at 30.03? Then we would have got a sensible 30fps
     
  7. Rufus

    Newcomer

    Joined:
    Oct 25, 2006
    Messages:
    246
    Likes Received:
    60
    Here's another really good presentation about HDR, gamma, and tone mapping in video games from 2010 about Uncharted 2. It's a long presentation, but packed with good information.
    http://filmicgames.com/Downloads/GDC_2010/Uncharted2-Hdr-Lighting.pptx

    The first section does a good job of walking through the need for gamma in displays (vastly better contrast) and how games need to compensate for it. Slide 33 is a really good summary of continuously applying gamma adjustments so that the final image produced is correct.
    The second section walks through the process of tone mapping, which is converting a higher-dynamic-range image (most engines use fp16 internally, which has 30 stops of dynamic range) into a lower-dynamic-range image (standard SDR is 6 stops) without ugly clamping or blown-out highlights.

    For SDR displays the process is: tone map to reduce the dynamic range, then apply gamma correction so things look correct.
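    As a sketch of that two-step SDR pipeline (my own minimal version; the Uncharted 2 talk uses a fancier filmic tone-mapping curve, Reinhard's operator is just the simplest stand-in):

```python
# SDR pipeline sketch: (1) tone map HDR scene luminance into [0, 1],
# (2) gamma-encode so the display's CRT-style EOTF undoes it.
# Reinhard's operator is used as the simplest possible tone mapper.

def reinhard(x):
    """Map linear scene luminance [0, inf) into [0, 1)."""
    return x / (1.0 + x)

def gamma_encode(x, gamma=2.2):
    """Linear [0, 1] -> gamma-encoded signal, as sent to an SDR display."""
    return x ** (1.0 / gamma)

def hdr_to_sdr(x):
    return gamma_encode(reinhard(x))

# Even very bright inputs stay below 1.0 instead of clamping:
print(hdr_to_sdr(1.0))     # ~0.73
print(hdr_to_sdr(1000.0))  # ~0.9995, never clips
```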

    For HDR displays both of these steps need to be rewritten:
    - HDR's dynamic range of at least 17 stops (for 10-bit Hybrid Log-Gamma; 12-bit PQ is 20 stops) is wide enough that only minimal tone mapping, if any, is required
    - gamma has been replaced with Hybrid Log-Gamma, so all the gamma calculations need to be completely redone

    For computers / consoles there's the obvious question of why they can't just output their internal 16-bit linear-space floating-point values and not have to deal with any of this going forward. Pages 10 and 14 of the pdf I linked above show why: using either a log (floating point) or gamma-based representation results in wasting a ton of bits in either the shadows or highlights. That would require drastically more data to be transferred and processed by the display without any gain in quality.
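    The bit-allocation argument is easy to check numerically. Here's a quick sketch (mine, not taken from the pdf) counting how many 10-bit code values land in the deep shadows under a linear encoding versus a gamma encoding:

```python
# Counting where a 10-bit encoding spends its code values: a linear
# encoding starves the shadows, while a gamma encoding concentrates
# codes where human vision is most sensitive.

BITS = 10
LEVELS = 2 ** BITS  # 1024 code values

def codes_below(threshold, decode):
    """Count code values whose decoded linear value is below `threshold`."""
    return sum(1 for n in range(LEVELS) if decode(n / (LEVELS - 1)) < threshold)

linear = lambda x: x        # code value -> linear light, linear encoding
gamma = lambda x: x ** 2.2  # code value -> linear light, gamma-2.2 encoding

# Code values available for the bottom 1% of linear luminance (deep shadows):
print(codes_below(0.01, linear))  # 11  -> shadows get almost no precision
print(codes_below(0.01, gamma))   # 127 -> over 12% of all codes cover the shadows
```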
     
    Arwin likes this.
  8. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    6,970
    Likes Received:
    1,204
    what i get is that

    Normal display: white = max brightness set in the "brightness" TV/monitor option.
    HDR: White = whatever brightness (so the white of sky can be brighter than a white piece of paper)

    or im completely off the mark?
     
  9. Rufus

    Newcomer

    Joined:
    Oct 25, 2006
    Messages:
    246
    Likes Received:
    60
    Not quite.

    On a properly calibrated reference NTSC display, pure white = 100 nit (cd/m^2). That's roughly a piece of white paper under normal lighting.
    HDR signaling allows for pure white to be 10,000 nit, which is roughly looking directly at a fluorescent light tube.

    The problem is that typical monitors are calibrated to 200 nit for comfortable working. The best consumer TV displays can output roughly 1,500 nit and commercial ones are hitting 5,000 nit.

    This means that when you take an NTSC SDR signal, where pure white is supposed to be 100 nit, and display it on a 200 nit monitor or a really bright 1,000 nit TV, the luminance of the picture gets shifted and looks wrong.

    With HDR signaling the source video can say a fluorescent light fixture should be 5,000 nit, an old-fashioned light bulb should be 1,000 nit, a piece of paper should be 100 nit, and a face should be 1 nit. It's then up to the display to tone map this huge possible dynamic range into something the TV can actually display.

    With SDR signaling both lights and the piece of paper would appear at the same brightness.
    On a consumer TV you might not be able to tell the difference between the fluorescent light fixture and the light bulb, but you can still tell the difference between the lights and the paper.
    On a much higher grade consumer or professional TV you might be able to tell the difference between the lights.
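    For reference, the transfer function that defines that 10,000 nit ceiling, the SMPTE ST 2084 "PQ" EOTF used by HDR10, is compact enough to sketch (my own transcription of the published constants):

```python
# SMPTE ST 2084 "PQ" EOTF: normalized signal in [0, 1] -> absolute
# luminance in nits (cd/m^2), with a fixed 10,000 nit ceiling.
# Constants are the published m1/m2/c1/c2/c3 values from the spec.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Normalized PQ code value [0, 1] -> display luminance in nits."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

print(pq_eotf(1.0))  # 10000.0 -- full code = the 10,000 nit ceiling
print(pq_eotf(0.5))  # ~92 nits: half the code range is spent below ~100 nits
```

    Note how perceptually the curve is allocated: fully half the signal range covers 0 to roughly 100 nits, which is why PQ can span four orders of magnitude in 10-12 bits without visible banding.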
     
    iroboto, Arwin, Scott_Arm and 4 others like this.
  10. HTupolev

    Regular

    Joined:
    Dec 8, 2012
    Messages:
    936
    Likes Received:
    564
    That's what's happening, in a sense.

    CRTs have pretty good static contrast, and the games of that era reflect this; many chose a composition that looks very dull on most LCDs.

    Seventh-gen games, largely made for crappy LCDs, often output images that are very torchy when viewed on a CRT that's configured to use its full range.

    In this sense, sixth-gen games treated their outputs as "more HDR" compared with seventh-gen; they're "dull" because they expected the display to reconstruct the signal more vibrantly.

    That's basically what HDR games should do. If sending on SDR, map a smaller luminance (and chrominance) range to the signal's range. If sending on HDR, map a bigger luminance (and chrominance) range to the signal's range.
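    A toy sketch of that mapping idea (my own framing; the 1,000 nit HDR peak is an assumed value, and a plain linear scale stands in for a real transfer function, purely for illustration):

```python
# The same scene-referred luminances (in nits), mapped to the signal
# range two ways. The SDR mapping clips everything above its 100 nit
# reference white; the HDR mapping keeps highlight detail by
# normalizing against a much larger peak.

def to_signal_sdr(nits):
    return min(nits, 100.0) / 100.0

def to_signal_hdr(nits, peak=1000.0):
    return min(nits, peak) / peak

scene = [1.0, 100.0, 500.0, 1000.0]  # face, paper, lamp, bright highlight
print([to_signal_sdr(x) for x in scene])  # [0.01, 1.0, 1.0, 1.0] -- highlights crushed
print([to_signal_hdr(x) for x in scene])  # [0.001, 0.1, 0.5, 1.0] -- detail preserved
```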
     
    DavidGraham likes this.
  11. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    14,770
    Likes Received:
    2,191
    Is that why nvidia drivers have digital vibrance control?
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.