HDR settings uniformity is much needed

RobertR1

The HDR experience for consumers continues to be a struggle, and part of that is due to how it's implemented, the naming conventions used for the options (or lack thereof), and the varying descriptions. Without going into HGIG vs. Dynamic Tone Mapping etc., just having a basic settings menu that's unified across titles should do wonders for people getting a proper HDR experience.

Main caveat: this would mainly be useful for enthusiasts who have a basic understanding of HDR and picture quality and own a higher-end HDR-capable display. This is not going to help the person buying a random display within their budget, turning it on and leaving it on default settings.

The three main settings, and the name and use case for each setting, should be clear to the user. As an example (a rough sketch of such a settings block follows the list):

Max brightness. The brightest your display can get when displaying HDR. Check online reviews for accurate information.
EOTF/Gamma. 0 means it's following the standard. A positive value brightens the overall image and a negative value darkens it. Recommended value = 0.
HUD elements. This only impacts the brightness of the HUD elements, not the rest of the image.
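
As a concrete illustration only (the names and defaults here are hypothetical, not from any shipping game or engine), that unified block could be as small as this:

```cpp
// Hypothetical sketch of a unified in-game HDR settings block.
// Names, defaults and units are illustrative only.
struct HdrSettings {
    float maxBrightnessNits = 1000.0f; // peak luminance the display can reach in HDR
    float eotfOffset        = 0.0f;    // 0 = follow the standard curve; positive brightens, negative darkens
    float hudBrightnessNits = 200.0f;  // luminance used for HUD/UI elements only
};
```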

Using common names and descriptions will create uniformity across the board. End users can have confidence that no matter what game they pick up, their HDR tuning remains the same and they can get the best experience. Right now, reading online and seeing people's settings and, even worse, their interpretations of what the different settings mean is tragic. The industry can and needs to step up greatly.

It's low-hanging fruit with a lot of upside. So any developers on here, and those with influence, please get people on board with this. To the studios already doing something like this, thank you!
 
It's a shit show on PC, as most games won't even let you use HDR unless you have it activated in Windows itself.

I can't imagine it's much trouble to give the game the ability to use HDR without it needing to be enabled in Windows first, as some games already do.
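
For what it's worth, on the Direct3D side a game can at least ask whether HDR10 output is currently possible and opt into it itself. A rough sketch (assuming a DX11/12 flip-model swap chain; whether it actually lights up without the Windows HDR toggle is still up to the OS and driver):

```cpp
// Sketch: ask DXGI whether HDR10 (ST.2084 / Rec.2020) can be presented on the
// current swap chain, and switch to that color space if so.
#include <dxgi1_6.h>

bool TryEnableHdr10(IDXGISwapChain3* swapChain)
{
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;
    UINT support = 0;
    if (FAILED(swapChain->CheckColorSpaceSupport(hdr10, &support)))
        return false;
    if ((support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT) == 0)
        return false; // the current display/OS path can't present HDR10 right now
    return SUCCEEDED(swapChain->SetColorSpace1(hdr10));
}
```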
 
HDR as a display standard is a complete fucking mess.

We don't need max brightness as a user setting; we need it automatically detected and set. We don't need any of these things as user settings, other than for them to be automatic.
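
On Windows that information is, in principle, already there to be read. A rough sketch using DXGI 1.6, assuming you already hold an IDXGIOutput6 for the display the game is on:

```cpp
// Sketch: read the display's reported HDR luminance range via DXGI 1.6.
// The values come from the display's own metadata, so they're only as honest as the EDID.
#include <dxgi1_6.h>

bool QueryDisplayLuminance(IDXGIOutput6* output, float* minNits, float* maxNits)
{
    DXGI_OUTPUT_DESC1 desc = {};
    if (FAILED(output->GetDesc1(&desc)))
        return false;
    *minNits = desc.MinLuminance; // black level in nits
    *maxNits = desc.MaxLuminance; // rated peak for small highlights
    // desc.MaxFullFrameLuminance is the sustained full-screen peak, often much lower.
    return true;
}
```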

We also need a display standard other than Hybrid Log-Gamma that can use adaptive brightness with HDR, because viewing conditions vary, and how much screen glare there is determines how dark scenes can get before the viewer can't see them. BTW, the "infinite" contrast ratio of OLEDs is pointless: not only do bit depths and encoded darks not go down that low, meaning there's no detail or scene information there, but screen glare and viewing conditions are almost certainly blasting out any detail that could be there anyway.
We also need a way to set a dynamic "mid point" in all media formats. HDR10+ and Dolby Vision have this, but only the upcoming HDR10+ Gaming allows it in an interactive fashion.
Then we need a standard tone-mapping behavior across all displays to squeeze the given brightness and color information down into the display's actual output capabilities, one that uses the mid point to give a satisfactory tone mapping down to SDR, while also allowing an optional disable for content creators in case they want to define their own tone-mapping behavior (like when they're a major Hollywood release and have the money).
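
To make the mid-point idea concrete, here's a toy roll-off curve. It is not BT.2390, HDR10+ or Dolby Vision's actual math, just an illustration of the shape such a standard behavior could take; the 75% knee is an arbitrary choice:

```cpp
// Toy tone mapper: anchor the content's mid point to the display's preferred mid
// point, pass everything below a knee through untouched, then roll off smoothly so
// that arbitrarily bright input approaches the display's peak without clipping.
float ToneMapNits(float sceneNits, float sceneMidNits, float displayMidNits, float displayPeakNits)
{
    // Exposure scale so the content's mid point lands on the display's mid point.
    float x = sceneNits * (displayMidNits / sceneMidNits);

    float knee = 0.75f * displayPeakNits;
    if (x <= knee)
        return x;                               // below the knee: untouched

    float overshoot = x - knee;                 // how far above the knee we are
    float headroom  = displayPeakNits - knee;   // output range left above the knee
    return knee + headroom * (overshoot / (overshoot + headroom));
}
```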

There's so much wrong with HDR it's incredible, but needing to set a standard that all device manufacturers, content creators, OS and browser makers together will agree on is an equal mess, and the people that should be in charge have mostly just given up altogether at this point.
 
I'm just trying to live in the reality of what is possible. All of what you're saying is possible if all TVs shipped with a game mode that followed standards in the first place. Even Filmmaker Mode, which is supposedly following a set of standards, is poorly implemented left and right. Manufacturers have no intention of doing this, so our best course is to unify the user adjustment options, names, use cases and descriptions at the game level.
 
On this topic, DV should work though in terms of reproducing the desired colour, right? I can't change any colour settings in DV mode.
 
It's just another format, but it does dynamic tone mapping on a scene-by-scene basis. DV is a closed-box implementation; that's why you can't change much without a workflow built into something like Calman. Even then it's limited, and it's a bit of a mess with all the various profiles.
 
Yeah, it's a mess alright... I'd never have guessed that one of the most appreciated features of my Sony OLED would be its (really good) dynamic tone mapper.
 
Yup, HDR on PC is hit and miss. It's not very solid. I use the Windows HDR Calibration tool, and while it helps, for some reason my supposedly 1500-nit display can give very different results on different Windows 11 partitions. On one partition, HDR seems well calibrated at around 1400 nits; on another Windows 11 partition I have to drag the slider up to 2200 nits before HDR is well calibrated according to the program. Shrug.
 
A very recent article on this.

It's been available for quite a long time. Btw, any idea why the writer had that many issues with HDR on Windows?

It just works on my Windows 11, after proper calibration (years ago with CRU, but nowadays you can use the Windows HDR Calibration app). Except for some games with DV... then sometimes I get an all-pink or just blank image. Oh, and some games need to be manually calibrated inside the game.

Yeah... not really "just works"... but once calibrated, there's no need to disable HDR.

Windows also has Auto HDR, making SDR games look great (hi Titanfall 2). That unfortunately doesn't work for all games; fortunately there's SpecialK.

If only Windows had a "force enable Auto HDR for games" toggle somewhere...
 
I use the shortcut to toggle HDR before starting up a game. It's just how I prefer the IQ to be and calibration yielded better results that way (HCFR + xrite). Auto HDR is def not a thing for me but it's good that it exists as an option for people.
 
Does CRU have an HDR calibration tool? I didn't know until now, tbh, and I've been using it for a long while.

As for the issues with HDR in Windows, there are a few. If HDR is enabled in Windows, some games break, like Doom 3. Other times either the calibration or the HDR implementation in certain games is pretty poor. If HDR is enabled, taking screengrabs of any non-HDR app or of the Windows desktop can result in washed-out pictures. Mysteriously enough, sometimes that doesn't happen under the same conditions, and other times it does.

Doom Eternal has a fine calibration tool for HDR, but if you use Vulkan the results can be very strange. Etc., etc.
 
CRU could properly add the nits parameter to the display info before the Windows HDR Calibration tool was released.

Ideally all games should read that parameter, just like on Xbox.
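
For the curious, the values CRU edits live in the display's CTA-861 HDR static metadata block as 8-bit code values. If I remember the encoding right, they decode roughly like this (treat it as a sketch and check the spec before relying on it):

```cpp
// Rough decode of the CTA-861 HDR static metadata luminance code values
// (my recollection of the encoding; verify against the spec).
#include <cmath>

float DecodeMaxLuminanceNits(unsigned char codeValue) // also used for max frame-average luminance
{
    return 50.0f * std::pow(2.0f, codeValue / 32.0f);
}

float DecodeMinLuminanceNits(unsigned char codeValue, float maxLuminanceNits)
{
    float f = codeValue / 255.0f;
    return maxLuminanceNits * f * f / 100.0f;
}
```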
 
Pretty sure it's based on a whitelist and that there's a reason for it.
It's based on a whitelist, or on that flip-model swap chain thing (can't remember the exact name). For most incompatible games, you just switch them to that flip presentation mode and abracadabra, Auto HDR works fine.

For some that still won't trigger Auto HDR, you need to use SpecialK's Auto HDR.
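
As far as I understand it, that flip-model presentation is the key bit Auto HDR looks for in a DX11/12 game. Roughly, on the developer side, it means a swap chain set up something like this (a sketch, not a drop-in snippet):

```cpp
// Sketch of a flip-model swap chain description, the presentation mode Auto HDR
// generally wants to see from a DX11/12 game.
#include <dxgi1_6.h>

DXGI_SWAP_CHAIN_DESC1 MakeFlipModelDesc(UINT width, UINT height)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Width       = width;
    desc.Height      = height;
    desc.Format      = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc  = { 1, 0 };                      // flip model disallows MSAA back buffers
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 2;                             // flip model needs at least two buffers
    desc.SwapEffect  = DXGI_SWAP_EFFECT_FLIP_DISCARD; // the "flip" part
    return desc;
}
```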
 
Also, novideo_srgb 4.1 (Nvidia only) added support for an L* EOTF as a calibration option. I'm not sure if that's an HDR option, or how you'd take advantage of it. I use novideo_srgb to get pretty much perfect sRGB on my display, which is natively wide gamut, plus proper dithering support.


You can select a target and then select a gamma calibration. I don't know if it truly handles HDR though. I'm just not sure what L* EOTF is. I know L is luminance, but not sure what the * means.
[Screenshots of the novideo_srgb target and calibration options]
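
If "L*" there means the CIE lightness curve used as a display transfer function ("L* gamma", which is my assumption about what novideo_srgb is offering), the EOTF is simple enough to write down:

```cpp
// CIE L* used as an EOTF: normalized signal in [0,1] -> relative luminance in [0,1].
// This assumes "L*" means the CIELAB lightness curve, which may not be what the tool intends.
double LstarEotf(double signal)
{
    const double Lstar = 100.0 * signal;  // signal 0..1 maps to L* 0..100
    const double kappa = 24389.0 / 27.0;  // ~903.3, the exact CIE constant
    if (Lstar <= 8.0)
        return Lstar / kappa;             // linear toe near black
    const double f = (Lstar + 16.0) / 116.0;
    return f * f * f;                     // cube section above the toe
}
```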
 
It's not just devs that need to get their act together, it's monitor manufacturers as well. There are god knows how many standards.
 

Looks like an interesting piece of software.

Thanks, when I have the time I am going to take a look at that tutorial. I retired my 1440p 165Hz monitor from displaying HDR content, since it is an HDR 400 (or 500) monitor and the HDR is mediocre no matter what, while in SDR mode it produces a much more pleasing image.

That being said, my TV produces decent HDR (allegedly HDR 1500, but more like 900 nits in actuality), and I use the Windows HDR Calibration tool.

Btw, @Nesh, you mentioned once that it's not easy to discern HDR from SDR, and that's right: in person, if you are shown an HDR image and then an SDR image, it's not easy to notice the difference beyond the more intense colours, at least for me.

I took some pictures of an ultrawide screen, since Edge and Google's browsers accept HDR but Firefox isn't compatible with HDR, so there the video is shown in SDR.

So on the left is Edge playing an HDR video, and on the right the same video being played in Firefox.

The images were taken with my mid-to-low-range phone, so some places are overbright or the colours aren't perfect, but you get the idea.

7pumeYq.jpeg


Note the extra detail on the buildings in the bottom right of the image in HDR and how it is completely lost in SDR; also the extra detail on the tower in the middle, and so on.

H3TgHYa.jpeg


o5TYTXd.jpeg


sDcx7CR.jpeg


More intense colours, image not looking washed out.

TdFmH1i.jpeg


Much better shadowing and clouds.

9onwxaw.jpeg


While the phone doesn't do it justice, note the better shadowing, the extra detail in the sky (a bit lost because of the phone photo, but anyway) and how washed out the screengrab looks in SDR.

QGCoQBG.jpeg
 