HDMI/DisplayPort - display specs confusion for HDR gaming

That's only a difference of 30 Mbit/s. I think it's just because it's a non-standard input; regular UHD wastes a lot of bandwidth on blanking even though no UHD devices require it. I'm not sure why the Eizo would arbitrarily support one refresh rate and not the other.
 
Clearing HDR Confusion in PC Display Market
September 27, 2018
The DisplayHDR 1000 performance tier represents the current state of the art in PC displays. Beyond the significant and immediately visible luminance difference of 1000 versus 600, the specification imposes higher full screen luminance requirements upon these displays.

Furthermore, the contrast ratio requirement at the 1000 level is double that of the 600 tier. Many display manufacturers have consequently chosen to increase the number of local dimming zones to achieve this increased contrast ratio, since local dimming performance improves as the number of zones increases.

Based on the marketing message around HDR performance characteristics found in the TV market, some consumers may consider 1000 nits to be the minimum brightness level required for HDR PC displays and that anything less is not truly HDR. They might be surprised to learn that many HDR TVs would not even pass the DisplayHDR 600 spec.

Additionally, it is important to recognize that HDR PC and HDR TV displays are not viewed in the same way, and thus a comparison strictly based on brightness level is insufficient.

For example, the ideal viewing distance for display devices at 4K resolution is only 1.5X the diagonal screen size. For a monitor, this is in the sweet spot of where users actually sit. Unfortunately for TVs, users are almost always seated much more than 1.5X the diagonal screen size away from the screen and thus cannot fully benefit from the contrast and detail that high resolution screens provide.
https://www.eetimes.com/author.asp?section_id=36&doc_id=1333797&_mc=RSS_EET_EDT
 
That's really, really cool! I know I had a really difficult time evaluating various displays when I was recently purchasing a 4k HDR TV. Having a benchmark system like this in place would definitely make that process far simpler.
The DisplayHDR system isn't exactly news though; it's been in use for a while now. Not all manufacturers choose to participate, I think, but the official list is available at https://displayhdr.org/
 
What is the display bandwidth required for 4K HDR 60 Hz 4:4:4? Is it below DisplayPort 1.2 but above HDMI 2.0?
Sorry for the late answer.

HDMI devices typically use predefined fixed timings for all supported video modes; these timings are defined by the CTA-861 (formerly CEA-861) standard and are based on extrapolations of common video modes used by consumer TV/video equipment from the CRT era.

For the 2160p60 (or 59.94) mode, the pixel clock is 594 MHz, so sending RGB or 4:4:4 YCbCr data with 24 bits per pixel (8 bits per color component) requires an effective data rate of 14.26 Gbit/s - just within the maximum of 14.4 Gbit/s and 600 MHz allowed by HDMI 2.0 (the raw bandwidth maximum is 18 Gbit/s before you account for the 8b/10b TMDS character encoding).

If you need 10-bit color, each pixel takes 30 bits and you would exceed that maximum at 17.82 Gbit/s - so 10-bit is only possible with 4:2:2 subsampling, which reduces the data rate by one third, or 4:2:0 subsampling, which halves it.
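If you want to sanity-check these numbers yourself, the arithmetic is just pixel clock times bits per pixel, with chroma subsampling as a simple multiplier. Here is a rough Python sketch of that calculation (my own, not from any spec - the mode clocks and link limits are just the figures quoted above):

```python
HDMI20_EFFECTIVE_GBPS = 14.4  # 18 Gbit/s raw minus 8b/10b (TMDS) encoding overhead

def data_rate_gbps(pixel_clock_mhz, bits_per_component, subsampling="4:4:4"):
    """Effective video data rate in Gbit/s: pixel clock x bits per pixel."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    return pixel_clock_mhz * 1e6 * bits_per_component * samples_per_pixel / 1e9

# CTA-861 2160p60 timing: 594 MHz pixel clock
print(data_rate_gbps(594, 8))            # ~14.26 Gbit/s -> fits under 14.4
print(data_rate_gbps(594, 10))           # ~17.82 Gbit/s -> too much for HDMI 2.0
print(data_rate_gbps(594, 10, "4:2:2"))  # ~11.88 Gbit/s -> fits with subsampling
```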


DisplayPort timings are typically based on CVT-R2 from VESA, which uses a timing formula with minimal blanking intervals suitable for modern fixed-pixel displays. To calculate CVT timings and blanking intervals, you can download a free Excel spreadsheet from the standards section of the VESA site.

The PC 3840x2160@60Hz mode (VESA 8.29M-R) requires a pixel clock of 522.01 MHz and an effective data rate of 12.53 Gbit/s with 24-bit pixels - well below the maximum of 17.28 Gbit/s (21.6 Gbit/s raw bandwidth with 8b/10b encoding) offered by DisplayPort 1.2 HBR2. Even 30-bit color (10 bits per color component) only requires 15.66 Gbit/s, which still fits within the DP 1.2 limits.


So your "HDR 10bit/60hz/4:4:4/60fps/4K" monitor requires at least 15.66 Gbit/s which is well within the bandwidth offered by DisplayPort 1.2 but exceeds the maximum bandwidth possible with HDMI 2.0.


The HDMI 2160p100 mode has an 1188 MHz pixel clock, which requires 28.51 Gbit/s for 24-bit pixels, so it would need 4:2:0 subsampling over HDMI 2.0.

The PC 3840x2160@100Hz mode using CVT-R2 timings would use an 887 MHz pixel clock, so 24-bit pixels would require 21.29 Gbit/s, which is well below the maximum of 25.92 Gbit/s offered by DisplayPort 1.3 HBR3. 30-bit pixels, however, would need 26.61 Gbit/s, slightly above that maximum, so they would require a slightly reduced vertical refresh rate of 96/97 Hz (or chroma subsampling if you absolutely need 100 Hz).
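A rough way to see where the 96/97 Hz figure comes from: find the highest pixel clock whose 30-bit data rate still fits in HBR3, then scale the refresh rate down proportionally (the CVT-R2 pixel clock does not scale exactly linearly with refresh rate, but it is close enough for a ballpark):

```python
DP13_HBR3_EFFECTIVE_GBPS = 25.92   # 4 lanes x 8.1 Gbit/s, minus 8b/10b encoding overhead
clock_at_100hz_mhz = 887.0         # CVT-R2 3840x2160@100 pixel clock quoted above
bits_per_pixel = 30                # 10-bit RGB / 4:4:4

max_clock_mhz = DP13_HBR3_EFFECTIVE_GBPS * 1e9 / bits_per_pixel / 1e6  # ~864 MHz
max_refresh_hz = 100 * max_clock_mhz / clock_at_100hz_mhz
print(round(max_refresh_hz, 1))    # ~97.4 Hz -> hence the 96/97 Hz above
```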


Hope that clears the confusion for you.
 
Found this handy calculator... not too sure of its accuracy, but with it DP 1.2a just about makes it under the maximum bandwidth of 20 Gbit/s...
https://www.extron.com/product/videotools.aspx
This seems quite accurate and the blanking intervals seem to be correct in the advanced mode, but it only supports CVT-RB and not the latest CVT-R2 for VESA timings.

It also shows the total physical bandwidth (after 8b/10b channel encoding) - this is 25% higher (10/8 = 1.25) than the effective data rate calculated by multiplying the pixel clock by the bits per pixel.
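For example, taking the 2160p60 8-bit 4:4:4 figure from earlier in the thread (my own quick check, assuming the tool really does report the post-encoding line rate):

```python
effective_gbps = 14.26            # 594 MHz x 24 bits per pixel
physical_gbps = effective_gbps * 10 / 8
print(physical_gbps)              # ~17.83 Gbit/s on the wire, 25% higher as described
```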
 