HDR settings uniformity is much needed

The funny thing is you’d think the consoles would be better at standardizing their system/TV-specific calibration, but even there games insist on using their own sliders and tone mapping.

Maybe we just leave it to the TV? This is essentially what movies do: the filmmakers master at whatever brightness they see fit and pass that along to the TV to tone map, and if it's DV or HDR10+ then they add in metadata too. I never have to drag sliders for regular HDR10 movies haha.

That’s what should happen but it never does. I’d love nothing more than this thread to never exist.
 
I honestly do not understand why developers/gamers want tonemapping/calibration sliders in games in the first place. I suppose you could argue for multiplayer consistency but even that is a weak argument imo. Maybe there's something I'm just not getting.

What I find frustrating listening to people talk about HDR 'calibration' is that everyone seems to focus on getting more definition in the uppermost portion of the image, namely the clouds! As if that's the be-all and end-all and isn't detrimental to other aspects of the image, namely the overall brightness of the scene.
 
Right? Just master the game to 200 nits or whatever for SDR and 10,000 nits for HDR10 and let the display tone map to its own brightness range. The display is going to have a much better idea of the panel's brightness range anyway. These little sliders don't really do much because they rarely show test patterns, and people don't have a reference for a 'correct' picture, so the calibration usually ends up wrong.
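For what it's worth, HDR10 uses the PQ (ST 2084) transfer function, which encodes absolute luminance, so a game mastered to a fixed peak just writes nits into the signal and the display decides how to bring them down to whatever the panel can do. A minimal sketch in Python (the display-side roll-off here is a made-up stand-in, not any particular TV's tone mapper):

```python
# Standard PQ (SMPTE ST 2084) constants; the toy roll-off below is illustrative only.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Linear luminance in cd/m^2 (0..10000) -> PQ signal value (0..1)."""
    n = max(nits, 0.0) / 10000.0
    num = C1 + C2 * n ** M1
    den = 1.0 + C3 * n ** M1
    return (num / den) ** M2

def pq_decode(signal: float) -> float:
    """PQ signal value (0..1) -> linear luminance in cd/m^2."""
    p = signal ** (1.0 / M2)
    n = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000.0 * n ** (1.0 / M1)

def display_tonemap(nits: float, panel_peak: float = 800.0) -> float:
    """Toy roll-off: pass most of the range through, compress the top end
    so a 10,000-nit master still lands inside the panel's peak."""
    knee = 0.75 * panel_peak
    if nits <= knee:
        return nits
    overshoot = nits - knee
    headroom = panel_peak - knee
    return knee + headroom * overshoot / (overshoot + headroom)

if __name__ == "__main__":
    for mastered in (100, 203, 1000, 4000, 10000):
        sig = pq_encode(mastered)
        shown = display_tonemap(pq_decode(sig))
        print(f"{mastered:>6} nits -> PQ {sig:.3f} -> ~{shown:.0f} nits on an 800-nit panel")
```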
 
I've seen developers getting shouted at because their game doesn't have them. I imagine if you asked that person what their purpose was you'd get a blank stare. I can't help but feel that people treat them like the patterns we used to use to calibrate SDR Contrast to prevent white crush. Probably why everyone is obsessed with HDR clouds :LOL:

I'd be happy if Windows would let me play HDR games in HDR and SDR games in SDR without flipping a damn toggle.

Also, The Ascent looks incredible with RTX HDR.
 
HDR stuff on Windows can be very flimsy, and it has issues enabling and disabling itself from time to time.

That being said, I compared my HDR 500 32" VA 1440p monitor vs my allegedly HDR 1500 QLED 50" 4K TV.

The main difference I noticed is in the brighter areas, or areas where white is most present: the HDR 500 monitor lost the detail, while the QLED showed much more of it.

It's easiest to see in the gull image -though my phone doesn't take good enough pictures- and in the images where clouds are involved. The biggest differences for me were the abandoned building in the forest and the clouds over the trees. The clouds on the TV looked really good, showing the "nooks and crannies" in between them, but almost all of that detail is lost on the 1440p VA HDR 500 panel.

Still, the phone is what it is and the images don't show the differences clearly enough compared to real life. Other than that, the differences weren't staggering.

[image: P1bfgmJ.jpeg]

[image: UdpSTGN.jpeg]


This image below is one of those where the differences are more noticeable, although most of the cloud detail was lost when the phone took the photo.
[image: 7xhrB8t.jpeg]


The feathers on the gull's head, where it gets whiter, can't be seen on the HDR 500 panel but can be seen on the HDR 1500 TV.
[image: m86Dg4c.jpeg]
[image: nEgymx2.jpeg]
The sun looks brighter on the 50" TV and there is some extra detail around its orange glow.
[image: UrXEwRc.jpeg]
As usual, the clouds show the biggest differences given their white and black contrasts.
[image: fRiJ53G.jpeg]
[image: MvxbPkH.jpeg]
The girl's face below looks totally pale on the 1440p monitor, but on the TV it has more detail and doesn't look completely pale.
[image: PH1hB1L.jpeg]
Clouds and more clouds.
[image: lGnZkzf.jpeg]
[image: UNQiyxh.jpeg]
Again, the difference in the feathers is noticeable: where there is more brightness, the HDR 500 monitor loses detail.
[image: cLbVLS9.jpeg]

Still, the differences aren't staggering in real life. I mean, I expected the monitor to look a lot less colourful and so on.
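If it helps make sense of the cloud comparison: once the mastered luminance goes above what the panel can show, the remaining gradations get squashed into whatever headroom is left, and that's where the detail goes. A toy illustration (made-up numbers and a simple soft roll-off, not the actual tone curve either display uses):

```python
# A strip of mastered luminance across a bright cloud, shown on a low-peak panel
# vs. a high-peak panel using a simple soft roll-off. Illustrative numbers only.

CLOUD_NITS = [450, 520, 600, 700, 800, 900]   # gradations inside the cloud

def soft_rolloff(nits: float, peak: float, knee_ratio: float = 0.7) -> float:
    """Pass values below the knee straight through, compress everything above it."""
    knee = knee_ratio * peak
    if nits <= knee:
        return nits
    over = nits - knee
    headroom = peak - knee
    return knee + headroom * over / (over + headroom)

for peak in (500.0, 1500.0):
    shown = [round(soft_rolloff(n, peak)) for n in CLOUD_NITS]
    print(f"{peak:>6.0f}-nit panel: {shown}")
# 500-nit panel:  [410, 430, 444, 455, 462, 468]  <- 450 nits of spread squashed into ~60
# 1500-nit panel: [450, 520, 600, 700, 800, 900]  <- gradation preserved
```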
 
Btw, HDR on Android is also shitty, even on Android 15 on a Google Pixel 6.

It requires a per-app HDR flag. Even Chrome doesn't have HDR enabled.

Instagram HDR is region-locked.
 
@Cyan HDR isn’t meant to blind you. The reason HDR is for dark-room viewing is so viewers can see the delta between a dark section of the room and, for example, a light in the corner burning at high intensity, without lifting the overall scene’s APL (average picture level).

This is unfortunately why HDR doesn’t have the same 3D impact on LCD-based tech as it does on OLED. To really get this depth you need a high level of contrast in the image. With backlights and zones, there’s always a compromise in contrast when you need to display both dark and super-bright elements on the screen.
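A quick back-of-the-envelope on the APL point, since it confuses people: a tiny highlight barely moves the frame average at all, so the scene still reads as dark even with an 800-nit lamp in it. Made-up numbers, plain Python:

```python
# Toy illustration of APL (average picture level) vs. a small bright highlight.
# Values are linear luminance in nits; the numbers are made up for the example.

WIDTH, HEIGHT = 1920, 1080
SURROUND_NITS = 5.0        # dim room sitting at ~5 nits
LAMP_NITS = 800.0          # a small lamp burning in one corner
LAMP_PIXELS = 20 * 20      # the lamp only covers a 20x20 patch

total_pixels = WIDTH * HEIGHT
apl = (SURROUND_NITS * (total_pixels - LAMP_PIXELS) + LAMP_NITS * LAMP_PIXELS) / total_pixels

print(f"APL ~ {apl:.1f} nits")                                        # ~5.2 nits: scene still reads as dark
print(f"highlight vs surround ~ {LAMP_NITS / SURROUND_NITS:.0f}:1")   # the delta HDR is meant to show
```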
 
Do you only watch black and white movies?! A high-end MiniLED can provide much better contrast than an OLED when the deepest brightness level isn't zero...
My Hisense with VA (native 6000:1) and 5120 zones can display nearly 4000 nits at 10% and 1000 nits at 100%. The MiniLED backlight can go very low, too. Watched Dune: Part Two yesterday and it was fantastic.
 

Wow which TV is that?

I tried RTX HDR on my LG OLED for the first time and I have no idea how the average person figures out this stuff. There's a staggering amount of conflicting and confusing settings between the game, Windows and the display. And that's after trying to understand the basics of how HDR is supposed to work.

They really need to improve the interfaces so that devices and software can talk to each other and auto configure themselves.
 
Very true! The Windows HDR Calibration tool is a mess. It kinda works, but in the same session you can get a "calibrated" HDR experience with a value of 2700, then you run it again and the calibration now gives you 1500 as the best value for your display. It's so random.

Do you only watch black and white movies?! A high-end MiniLED can provide much better contrast than an OLED when the deepest brightness level isn't zero...
My Hisense with VA (native 6000:1) and 5120 zones can display nearly 4000 nits at 10% and 1000 nits at 100%. The MiniLED backlight can go very low, too. Watched Dune: Part Two yesterday and it was fantastic.
Very interesting... I'm undecided about what kind of display to get in the future. I'm not in a hurry.

When I go to malls, which isn't often as I live in a very rural, isolated area, I stop and spend a lot of time in the TV and display section looking at the OLED TV demos, like a kid, green with envy and truly wishing I had one.

They look good, but I can't say I feel awestruck. Maybe it's not the best context.
 
I tried RTX HDR on my LG OLED for the first time and I have no idea how the average person figures out this stuff. There's a staggering amount of conflicting and confusing settings between the game, Windows and the display. And that's after trying to understand the basics of how HDR is supposed to work.

They really need to improve the interfaces so that devices and software can talk to each other and auto configure themselves.
Configuring HDR is incredibly confusing and frustrating. RTX HDR would only work if I disconnected my second monitor. I was able to plug it into the mobo IGP and still use RTX HDR.
 
I haven’t seen this yet. Any idea which SDR content has the issue?

Almost any SDR content is going to be created to be viewed on a flat 2.2 gamma screen, with the exception of movies, which will be 2.4 or BT.1886. Not sure if anything is intended to be viewed on the sRGB tone curve. Viewing 2.2 content on the sRGB tone curve will raise blacks and make it look slightly washed out.

Windows colour management and standards are a mess.
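To put numbers on the raised-blacks point: the piecewise sRGB curve has a linear toe, while a pure 2.2 power curve keeps crushing toward zero, so sRGB decoding renders the darkest codes several times brighter. A quick comparison in Python:

```python
# Piecewise sRGB EOTF vs. a pure 2.2 gamma near black: shows why 2.2-mastered
# content looks slightly lifted/washed out when decoded with the sRGB curve.

def srgb_eotf(v: float) -> float:
    """Encoded value (0..1) -> linear light, piecewise sRGB curve."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v: float) -> float:
    """Encoded value (0..1) -> linear light, flat 2.2 power curve."""
    return v ** 2.2

for code in (8, 16, 32, 64, 128):
    v = code / 255.0
    s, g = srgb_eotf(v), gamma22_eotf(v)
    print(f"8-bit {code:>3}: sRGB {s:.5f}  vs  2.2 {g:.5f}  (sRGB is {s / g:.2f}x brighter)")
```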
 
Configuring HDR is incredibly confusing and frustrating. RTX HDR would only work if I disconnected my second monitor. I was able to plug it into the mobo IGP and still use RTX HDR.
It only needs the 2nd monitor disconnected for RTX HDR for games. RTX HDR for video works fine with multiple monitors connected to the Nvidia GPU.

Not sure why Nvidia made it like that. Windows Auto HDR works fine with a 2nd monitor connected.
 
It's even stranger that it works for some things but not for games. I think many people have multiple monitors now so not supporting it is a deal breaker.

That said, I can't use RTX HDR at night anyway. The menus and stuff in games are blindingly bright on my 1000-nit monitor. Games with proper HDR support know not to apply that super-brightness to stuff like the UI.

When HDR is properly supported it's fantastic, but had I known what a clusterfuck it is I would have gotten a faster IPS panel instead.
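On the UI point above: games with a native HDR pipeline typically composite the UI at a fixed "paper white" level rather than letting it ride up with the scene. A rough sketch of the difference (the nit values and the naive boost are illustrative, not what RTX HDR actually does internally):

```python
# Why a game's UI should be pinned to a "paper white" level in HDR instead of
# being brightened like scene highlights. Values here are assumptions for the sketch.

PAPER_WHITE_NITS = 200.0    # a common choice for UI / SDR white inside an HDR scene
NAIVE_PEAK_NITS = 1000.0    # what you get if the UI is stretched toward the panel peak

def ui_nits_pinned(ui_value: float) -> float:
    """Decode an sRGB-ish UI value (0..1) with a 2.2 curve, scaled to paper white."""
    return (ui_value ** 2.2) * PAPER_WHITE_NITS

def ui_nits_boosted(ui_value: float) -> float:
    """Same decode, but stretched to the panel peak like the rest of the image."""
    return (ui_value ** 2.2) * NAIVE_PEAK_NITS

for label, v in (("dark text", 0.1), ("grey panel", 0.5), ("white menu", 1.0)):
    print(f"{label:>10}: pinned {ui_nits_pinned(v):6.1f} nits, boosted {ui_nits_boosted(v):6.1f} nits")
```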
 
Almost any SDR content is going to be created to be viewed on a flat 2.2 gamma screen, with the exception of movies, which will be 2.4 or BT.1886. Not sure if anything is intended to be viewed on the sRGB tone curve. Viewing 2.2 content on the sRGB tone curve will raise blacks and make it look slightly washed out.

Windows colour management and standards are a mess.

Tried AC Unity on my OLED TV @ 800 nits. Turning on HDR didn't look bad at all except the sky was a bit blown out. Auto HDR fixed it a little but introduced tons of banding in the sky. RTX HDR was better and had less banding. I didn't really see any HDR pop in any of the modes though. Maybe my TV isn't bright enough. Playing in SDR looked better than all 3 of the HDR attempts. Oh well.
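One possible reason for the Auto HDR banding: the SDR sky only has 8-bit code values, and stretching them across a wider luminance range spreads adjacent codes further apart in nits, which makes the steps visible. A toy calculation (the expansion curve is made up, not what Auto HDR or RTX HDR actually use):

```python
# Adjacent 8-bit codes in an SDR gradient end up further apart in nits once the
# image is expanded into HDR. The expansion curve here is purely illustrative.

SDR_WHITE_NITS = 200.0   # nits assigned to SDR white before expansion
HDR_PEAK_NITS = 800.0    # target peak after expansion

def sdr_nits(code: int) -> float:
    return (code / 255.0) ** 2.2 * SDR_WHITE_NITS

def expanded_nits(code: int) -> float:
    # toy inverse tone map: brighten the top end toward the panel peak
    return (code / 255.0) ** 1.4 * HDR_PEAK_NITS

for c in (200, 230, 254):
    step_sdr = sdr_nits(c + 1) - sdr_nits(c)
    step_hdr = expanded_nits(c + 1) - expanded_nits(c)
    print(f"code {c}->{c + 1}: {step_sdr:.2f} nits/step in SDR vs {step_hdr:.2f} after expansion")
```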
 