HDR settings uniformity is much needed

After 5 to 6 years of using an HDR display with HDR enabled 99.9% of the time, I've now had the AW2523HF for about a month. It supports HDR but isn't HDR certified, and I don't bother using it....

The main difference I notice is that in SDR the contrast feels like the bad shadowing games used to create on Ultra settings with the most "advanced" AO of the time. The contrast, especially in dark areas, looks bad.

HDR looks better overall. That being said, I can enable HDR on the AW2523HF but I prefer not to: it adds a lot of hassle for things like taking screengrabs, and the HDR image on this particular monitor isn't great to begin with. So depending on the screen, SDR is the way to go.
 
HDR looks so good when it works, though. I didn't realize how flat SDR looks in comparison. I'm really questioning the need for super-bright displays, though: even at 500 nits I find myself squinting at highlights. I can't imagine why we need 1000 or, god forbid, 10000 nit monitors.
100% agreed. My monitor is rated for 1000 nits and in practice I think it goes even higher; it is devastating when a bright flash happens in games. Even 400 nits is too much in a dark room, and that's the lowest some games will let me go while keeping HDR on. Maybe 500+ nits is okay for a television 10 ft away, but not for a monitor right in your face.

All that said, when HDR is done right it is transformational, like when I saw an HDTV for the first time at a state fair.

Edit: somehow didn't notice trinibwoy's post was a couple of pages back when I replied to it.
 
That display doesn't have local dimming, so it can't really do HDR. It's an IPS with pretty standard contrast for that panel type, at a little more than 1000:1, so it won't be at its best in really dark games. It has a slightly wide gamut, so finding a way to clamp that will help a bit with the image. Rtings has an ICC file for it, but that won't work with fullscreen games. You may need to adjust the RGB settings a bit; they used 100-98-95 for R-G-B, which will help get the colour temperature correct. I also wouldn't crank up the brightness, because on IPS screens you'll typically see the backlight glowing. Turn it down to something reasonable, maybe 50 or slightly higher.
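For what it's worth, those OSD gain values just scale each channel down to pull the white point toward the target. The 100-98-95 numbers come from the Rtings review; the rest of this sketch is purely illustrative of what the monitor is doing internally:

```python
def apply_gains(rgb, gains=(100, 98, 95)):
    """Scale 8-bit RGB channels by per-channel OSD-style gains (0-100)."""
    return tuple(round(c * g / 100) for c, g in zip(rgb, gains))

# Full white before correction: green and blue are pulled down slightly,
# shifting the white point (here, warming it toward the D65 target).
print(apply_gains((255, 255, 255)))  # (255, 250, 242)
```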
 
Thanks! I applied those settings and downloaded the ICC profile, but only for SDR. Looks good. I tested Shadow of the Tomb Raider for a few minutes and I can't complain, tbh.

I also enabled HDR and yes, the colours look much more intense, the brightness too, but dark areas are meh. It would still be better than SDR if the monitor had decent contrast, but this one doesn't. I had an i7-7700HQ laptop with a 1050 Ti 4GB in the past and it also had an IPS panel with bad contrast... A 4K TV I have offers much, much better contrast, but it's too much screen for my PC, and for me...

By fullscreen do you mean exclusive fullscreen or borderless fullscreen? I use borderless or windowed in all my games; I don't like exclusive fullscreen that much.

How can you clamp the colour gamut? I wouldn't mind playing in HDR as long as the colours look fine. In SDR mode this monitor has nice colours: not as vibrant as in HDR mode, but really solid.
 
I would not use HDR. The monitor is not really HDR at all: it doesn't get bright enough for highlights, it has no local dimming, and the max contrast is around 1000:1. Turning on HDR is just going to look broken. Better to get a proper SDR experience.
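To put rough numbers on why 1000:1 falls short (a back-of-the-envelope sketch; the peak brightness and OLED black level are assumed typical values, not measurements of this monitor):

```python
import math

def black_level(peak_nits, contrast):
    """Darkest level a panel can show given its peak brightness and contrast."""
    return peak_nits / contrast

def dynamic_range_stops(contrast):
    """Contrast ratio expressed in photographic stops (doublings)."""
    return math.log2(contrast)

# Typical IPS panel: ~1000:1 contrast, ~350 nits peak (assumed).
print(black_level(350, 1000))       # 0.35 nits: visibly grey "blacks"
print(dynamic_range_stops(1000))    # ~10 stops of range

# HDR content is graded assuming far deeper blacks. An OLED hitting
# ~0.0005 nits at 1000 nits peak is 2,000,000:1 -- about 21 stops.
print(dynamic_range_stops(1000 / 0.0005))
```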

An ICC profile may work in borderless/windowed fullscreen, but probably not, and definitely not in exclusive fullscreen (even though true exclusive fullscreen doesn't really exist anymore). Most games bypass the window composition that would apply the colour correction from the ICC.

As for an sRGB clamp, AMD drivers have a built-in option. Nvidia requires a third-party tool, novideo_srgb. For Intel, I have no idea.
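For the curious, an sRGB clamp is conceptually just converting from the panel's wider primaries to sRGB and clipping whatever doesn't fit. A toy sketch, assuming Display P3-ish primaries as a stand-in for the monitor's native gamut (the tools above do this properly, per monitor, from the EDID):

```python
# Linear-light RGB -> XYZ matrices (both D65 white). Display P3 stands in
# for the monitor's wider native gamut in this illustration.
P3_TO_XYZ = [[0.48657, 0.26567, 0.19822],
             [0.22897, 0.69174, 0.07929],
             [0.00000, 0.04511, 1.04394]]
SRGB_TO_XYZ = [[0.41239, 0.35758, 0.18048],
               [0.21264, 0.71517, 0.07219],
               [0.01933, 0.11919, 0.95053]]

def inv3(m):
    """Invert a 3x3 matrix via the adjugate."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [[(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
            [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
            [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det]]

def matvec(m, v):
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

def matmul(m, n):
    return [[sum(m[r][k] * n[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

# Combined wide-gamut -> sRGB conversion, then clip out-of-gamut channels.
P3_TO_SRGB = matmul(inv3(SRGB_TO_XYZ), P3_TO_XYZ)

def clamp_to_srgb(rgb_wide):
    return [min(1.0, max(0.0, c)) for c in matvec(P3_TO_SRGB, rgb_wide)]

print(clamp_to_srgb([1.0, 1.0, 1.0]))  # white stays white (shared D65 point)
print(clamp_to_srgb([1.0, 0.0, 0.0]))  # P3 red clips to the sRGB red primary
```

The clipping step is what stops sRGB content from looking oversaturated on the wide-gamut panel; a real clamp also handles the transfer function, which is omitted here.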
 
It'd be nicer if Windows handled HDR better. For example, many monitors do not show SDR content well in HDR mode, so it's generally not recommended to leave HDR on all the time (some monitors are better, but some are very bad). This means that whenever you want to use HDR, you need to turn it on manually. Fortunately there's now a keyboard shortcut for it (Windows+Alt+B), but it's not ideal.
Ideally, monitors should be able to handle all content properly with HDR on. The problem seems to be that Windows can't output SDR sRGB content with HDR in mind, causing a lot of SDR content to look less bright or less saturated with HDR on. I don't know if this is strictly a Windows problem or a bit of both, but there really should be a better solution.
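Part of the underlying issue is that in HDR mode everything goes out PQ-encoded (SMPTE ST 2084, an absolute-luminance curve), so the OS has to pick a fixed brightness for SDR white and re-encode it. A minimal sketch of that encoding; the 200-nit SDR white is an assumed slider value, not a Windows default:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 0..1 signal
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Encode an absolute luminance (0..10000 nits) as a PQ signal value."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

# In HDR mode, SDR white gets pinned to whatever the "SDR content
# brightness" slider implies, e.g. 200 nits (assumed), then PQ-encoded.
print(pq_encode(200))    # SDR white lands partway up the PQ curve
print(pq_encode(1000))   # HDR highlight headroom sits above it
print(pq_encode(10000))  # 1.0: the top of the PQ range
```

If the OS picks the wrong white level, or the monitor tone-maps that absolute signal poorly, SDR content ends up dimmer or duller than in plain SDR mode, which matches what people see.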
 

Streaming devices like the Apple TV have a feature called Match Dynamic Range: when HDR content is detected, the box switches the display to HDR. That's what Windows needs.

Also, having to download the HDR Calibration app, which is essential, is a bit clowny. It should be native to the OS. It shows that HDR is tacked onto Windows, which is why it behaves the way it does.
 
I've started using a Steam "chord" so that pressing the guide button and down on the controller toggles HDR, but only really because I play on a TV with a controller and the keyboard is out of reach. But yes, it's annoying.
 
Yeah, I'm not referring to the bogus HDR displays. HDR mastering is supposedly done for dark environments, but I can't imagine looking at a 1000 nit monitor in a dark room. I'm currently running a VESA DisplayHDR 400 screen in a dark room and highlights are already blinding.
Sometimes I'll play something like Forza Horizon 5 on a 1500 nit display late at night in a dim room, and it stops me sleeping because my brain thinks it's still the afternoon :ROFLMAO:
 
After more than a month using an IPS display in SDR (it supports HDR, but it's very mediocre), I decided to plug my PC into my 4K TV, which has good HDR and much better contrast, and turned HDR on.

Well, with decent HDR content the difference is staggering. Colours are more intense and highlights are much more noticeable, not to mention the contrast (though that's to be expected compared to an IPS monitor without local dimming). It's a totally different look. Still, SDR mode on the IPS monitor is quite good, not to mention the super-high framerate.
 
The news below is related to HDR on Linux. 3 to 4 years ago, when I had Linux installed exclusively for a few months for testing purposes, they were working on implementing HDR in Wayland, but it was still experimental.

I tried Wayland when using multiple displays with different scaling factors on Manjaro, and it worked like a charm.

What surprised me is that there was an HDR checkbox; I enabled it on my TV and it worked, so they are implementing HDR natively too. Sometimes people make things better out of passion for something; maybe that will fix some of today's issues with HDR on computers.

edit: not HDR related.
 
Are you sure?

“On a similar note, hdrtest isn’t anything to do with High Dynamic Range (HDR) graphics, but is used by devs for verifying the integrity and self-containment of DRM header files.”
Gosh, you got me, my bad; I totally misinterpreted the news. Sigh.

HDR does work fine in Linux if you use a Wayland desktop, though. Sorry for any inconvenience I might have caused anyone reading my original post.
 