HDR on PCs is a failure

A 27" model is absolutely needed, IMO. 30+ is too big for me. Also gets expensive with these big screens...
I thought similar until I got my 34" ultra-wide. It doesn't take up as much vertical space, which is perfect for my setup, but I just love the width of it. 34" is the minimum for 21:9 displays IMO, otherwise you lose too much vertical.
 
27" is the absolute minimum for 4K 16:9 (3840x2160) computer monitors - the sweet spot is about 32-42".


TFT Central considers 31.5" the minimum size to resolve 4K with normal 20/20 visual acuity; Eizo assumes their 31.5" 4K display resolves fine image detail when sitting 2.0-2.6 feet (60-80 cm) from the screen.
CarltonBale's 4K/8K-aware screen distance calculator (based on the THX formula, AFAIK) agrees with these figures.
Rtings.com's optimal viewing distance chart and table assume that at least a 35" 4K display is needed for a viewing distance of 2.0-2.5 feet (60-75 cm).


For ultra-wide 5K 21:9 (5120x2160), 34" would be the minimum and 42-50" the sweet spot; for 8K 16:9 (7680x4320), ultra-wide 8K 12:5 (7680x3200), and 8K 21:9 (10240x4320) displays, the usual viewing distance of 60-80 cm (2.0-2.6 feet) would require a 65" to 85" screen - yes, a 65" computer monitor!
So if you're lucky enough to own a $5000 Dell UltraSharp UP3218K, the first 32" 8K monitor in the world, you'd need to sit at 1 foot (30 cm) to actually see all the fine details it can show you.
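All these figures fall out of the same angular-resolution arithmetic: 20/20 acuity resolves roughly 1 arcminute, so each pixel has to subtend at least that much at your viewing distance. A quick sketch (the function name and the 1-arcmin threshold are my own assumptions):

```python
import math

def min_diagonal_inches(h_pixels, v_pixels, distance_cm, arcmin_per_pixel=1.0):
    """Smallest diagonal (inches) at which each pixel still subtends the
    given visual angle (1 arcmin ~ 20/20 acuity) at this viewing distance."""
    # Physical size one pixel needs to subtend `arcmin_per_pixel` arcminutes:
    pixel_cm = 2 * distance_cm * math.tan(math.radians(arcmin_per_pixel / 60) / 2)
    width_cm = h_pixels * pixel_cm
    height_cm = v_pixels * pixel_cm
    return math.hypot(width_cm, height_cm) / 2.54

# 4K UHD viewed from 70 cm (~2.3 ft):
print(round(min_diagonal_inches(3840, 2160, 70), 1))  # ~35.3", close to the Rtings figure
```

The result scales linearly with distance, which is why the 8K-at-arm's-length numbers above balloon to TV sizes.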


I'd actually expect wide 30-40" and ultra-wide 40-50" LCD monitors to become much more common in the next 3 years, once those 7G/8G/8.5G glass substrate lines from 2005-2007 are finally retired and replaced with more efficient 10G, 10.5G, and 11G lines equipped for the latest QDOG (quantum dot on glass) and QDCF (quantum dot color filter) glass panels.
 
They could also be a lot smaller... The less bulky model is over 30 inches diagonal and only WQHD res, while the 4K model is massive and basically unsuitable as a desktop monitor. It would look ludicrous, and probably stretch the screen into the periphery of your visual field. Having to turn my head to follow the mouse would get really fricken old really fricken fast.
That wide screen is the equivalent of two 27" 16:9 screens next to each other. That doesn't seem particularly unusual or unusable to me.
 
Finally connected our Sony BVM to an HDMI 2.0 graphics card and a decent PC, and enabled HDR via the Windows display settings...
A bit unsure on what EOTF I should be using on the display though? Anyone know?

Also, anyone know of any good software that shows off / demos the HDR colour space well?
 
Finally connected our Sony BVM to an HDMI 2.0 graphics card and a decent PC, and enabled HDR via the Windows display settings...
A bit unsure on what EOTF I should be using on the display though?

Try ST2084 (PQ) first, though Windows should support HLG as well.
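If you want to sanity-check the display against a pattern generator, the ST 2084 (PQ) EOTF is easy to transcribe from the spec; a sketch (constants as named in SMPTE ST 2084):

```python
def pq_eotf(signal):
    """SMPTE ST 2084 EOTF: nonlinear signal in [0,1] -> luminance in cd/m^2."""
    m1 = 2610 / 16384       # exponent constants from the spec
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(0.0))   # 0.0 - black
print(pq_eotf(1.0))   # 10000.0 - PQ peak luminance
```

Note the curve is absolute: code value 0.5 already lands around 92 cd/m², i.e. roughly SDR reference white.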

I'm not sure how your Sony BVM-X300 reports a change of its EOTF though.
AFAIK the display should indicate support for wide gamut color space and PQ/HLG curves either through relevant EDID/DisplayID extensions, or through an HDMI 2.x InfoFrame and CTA-861 EDID extensions.

Unfortunately Windows Settings do not provide detailed information or manual controls for HDR monitors. You can get some info by running DXDiag and generating a text report file, as described below:
https://support.microsoft.com/en-sg/help/4040263/windows-10-hdr-advanced-color-settings

You can also run an EDID/CEA-861/DisplayID tool to see what your monitor actually reports:
http://www.entechtaiwan.com/util/moninfo.shtm


Note that even with a supported HDR monitor, Windows 10 currently does not perform automatic color space/EOTF conversion for applications which are not color-management aware and simply assume the sRGB (BT.709) color space and Gamma 2.2 (BT.1886) curve - so the vast majority of programs will not look correct when you turn on HDR, including Windows itself...
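What the OS would have to do for such an sRGB-assuming app is decode its 8-bit values, scale them to some nominal paper-white luminance, and re-encode as PQ. A hypothetical sketch - the 80 cd/m² paper white is my assumption (the sRGB reference level; a real compositor would make it configurable):

```python
def srgb_decode(v8):
    """8-bit sRGB code value -> linear light in [0,1] (IEC 61966-2-1 curve)."""
    c = v8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def pq_encode(nits):
    """Absolute luminance (cd/m^2) -> ST 2084 signal in [0,1] (inverse EOTF)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

PAPER_WHITE = 80.0  # assumed nits for sRGB reference white

def sdr_to_hdr10(v8):
    """One channel of an sRGB-assuming app, re-encoded for an HDR10 signal."""
    return pq_encode(srgb_decode(v8) * PAPER_WHITE)
```

Without this step, sRGB content pushed straight down an HDR10 pipe gets the wrong transfer curve, which is exactly the washed-out look people complain about.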

Also, anyone know of any good software that shows off / demos the HDR colour space well?
None that I know of.

There were plans to add HDR monitor support to 3DMark, but that hasn't been released yet.

High-end PC games and performance benchmarks have supported HDR rendering (i.e. bloom/saturation effects) since the early 2000s, but very few (if any) currently support HDR monitors (i.e. a 10-bit or floating-point render target, the BT.2020 color space, and PQ/HLG transfer curves), not to mention wide-gamut color in art assets and the rendering pipeline.
For now, the final picture is clamped or tone-mapped to sRGB and Gamma 2.2.
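A minimal sketch of that last step - Reinhard's x/(1+x) operator standing in for whatever curve a real game actually ships, followed by a gamma 2.2 encode (the names and the 100-nit white point are my choices):

```python
def tonemap_to_sdr(linear_nits, white=100.0):
    """Collapse an HDR linear luminance value to an 8-bit SDR code value.
    Reinhard x/(1+x) compresses highlights smoothly instead of hard-clipping."""
    x = linear_nits / white          # normalize so `white` nits maps toward 1.0
    mapped = x / (1.0 + x)           # always < 1.0, so nothing ever clips
    return round(255 * mapped ** (1 / 2.2))   # gamma 2.2 encode to 8 bits
```

The point is that everything above ~white gets squeezed into the last few code values, which is precisely the highlight detail an HDR monitor could have shown directly.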

DXGI: High Dynamic Range and Wide Color Gamut
DXGI_COLOR_SPACE_TYPE enumeration
DXGI_HDR_METADATA_HDR10 structure

https://developer.nvidia.com/preparing-real-hdr
https://developer.nvidia.com/getting-know-new-hdr
https://developer.nvidia.com/displaying-hdr-nuts-and-bolts
https://developer.nvidia.com/rendering-game-hdr-display
 
I would like to call your attention to my sad attempt to capture Alien Isolation's deep color mode in action. I've been curious for years as to whether it does anything, or if you need specific display settings for it. I now own an LG OLED, so that has rekindled my fascination with banding.

https://forum.beyond3d.com/threads/alien-isolation-pc.59658/page-2#post-2019602

On a related note, it seems like game gamma calibration screens don't work very well with a screen that can do perfect black. Whenever I calibrate so the logo or text or whatever is supposed to be invisible, the image seems too dark. Too much black! :)
 
I would like to call your attention to my sad attempt to capture Alien Isolation's deep color mode in action

This 'deep color' mode is just 30/36/48-bit SDR rendering within the regular sRGB color space, using the standard gamma 2.2 curve. It does not provide a wide color gamut for purer colors, or HDR EOTF curves for increased highlight brightness - so the effect will be quite subtle, if noticeable at all.
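The one real benefit - reduced banding - is easy to see numerically: quantizing the same dark gradient at 8 and 10 bits gives roughly 4x as many distinct steps. A quick sketch:

```python
def quantize(value, bits):
    """Round a linear value in [0,1] to the nearest n-bit code and back."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# Count distinct output levels across a dark gradient (0.0-0.1 of full scale),
# where banding is most visible to the eye:
ramp = [i / 1000 * 0.1 for i in range(1001)]
steps8 = len({quantize(v, 8) for v in ramp})
steps10 = len({quantize(v, 10) for v in ramp})
print(steps8, steps10)  # 10-bit yields roughly 4x more steps in the same range
```

Finer steps, but the same gamut, the same peak brightness, and the same transfer curve - hence "quite subtle".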


OTOH, Rise of the Tomb Raider does support HDR10 monitors:
https://developer.nvidia.com/implementing-hdr-rise-tomb-raider
(only on Nvidia hardware at the time of writing, as the Windows 10 DXGI updates were still in development)


The effect of HDR10 displays on current generation games would be limited to better highlights and contrast, as HDR10 support is currently achieved through post-processing steps. Increased color gamut would require both rendering and art authoring in a wide gamut color space, such as scRGB.
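To see why wide gamut needs its own pipeline: a fully saturated BT.2020 primary has no non-negative representation with sRGB/BT.709 primaries - which is exactly what scRGB's out-of-range values are for. A sketch using the RGB-to-XYZ matrices from the two specs (the helper names are mine):

```python
# BT.709/sRGB and BT.2020 linear RGB -> CIE XYZ matrices (D65 white point):
M709 = [[0.4124, 0.3576, 0.1805],
        [0.2126, 0.7152, 0.0722],
        [0.0193, 0.1192, 0.9505]]
M2020 = [[0.6370, 0.1446, 0.1689],
         [0.2627, 0.6780, 0.0593],
         [0.0000, 0.0281, 1.0610]]

def inv3(m):
    """Adjugate inverse of a 3x3 matrix."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [[(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
            [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
            [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det]]

def mat_vec(m, v):
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

def bt2020_to_scrgb(rgb2020):
    """Linear BT.2020 RGB -> linear sRGB-primaries (scRGB-style) RGB.
    Components outside [0,1] are how scRGB carries the wider gamut."""
    return mat_vec(inv3(M709), mat_vec(M2020, rgb2020))

# Pure BT.2020 red lies outside the sRGB triangle:
print(bt2020_to_scrgb([1.0, 0.0, 0.0]))  # green and blue go negative, red > 1
```

An 8-bit sRGB surface simply clips those values away, which is why gamut expansion has to happen in the art assets and render targets, not as a post-process.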

Nvidia has a detailed white paper on the use of wide color gamut in game production:
https://developer.nvidia.com/high-dynamic-range-display-development
 
This 'deep color' mode is just 30/36/48-bit SDR rendering within the regular sRGB color space, using the standard gamma 2.2 curve. It does not provide a wide color gamut for purer colors, or HDR EOTF curves for increased highlight brightness - so the effect will be quite subtle, if noticeable at all.

The effect of HDR10 displays on current generation games would be limited to better highlights and contrast, as HDR10 support is currently achieved through post-processing steps. Increased color gamut would require both rendering and art authoring in a wide gamut color space, such as scRGB.

Nvidia has a detailed white paper on the use of wide color gamut in game production:
https://developer.nvidia.com/high-dynamic-range-display-development
Alien Isolation in HDR10 would be great without a doubt. RE7 in HDR is quite a visual upgrade over playing it in SDR, and the theme of darkness is shared between the games.

Alien Isolation Deep Color is rather useless indeed. The only noticeable benefit is slightly reduced banding. However, seeing how bad it looked running at YCbCr 4:2:2 8 bpc on PC makes me wonder what the console versions look like on older TVs without 4:4:4 support. Of course the game usually has a lot of film grain to help with banding too which I had disabled.
 