HDR settings uniformity is much needed

Isn't this what Auto HDR is supposed to do?
Too bad that app needs an Nvidia GPU. Auto HDR works and does an okay job, but Microsoft's HDR tools show inconsistent results.

The Windows HDR Calibration tool is an example of this. I have the exact same OS, Windows 11, installed on two partitions (one exclusively for gaming and one for productivity).

Yet the HDR Calibration results in one partition differ completely from the results in the other.

For example, in one Windows 11 partition I get 2,070 nits as the supposedly perfect calibration value, yet in the other partition Windows HDR Calibration tells me the perfect value is 2,780 nits (almost the maximum).

Also, a year ago, when the tool seemed to work better, one partition gave me 1,480 nits (more in line with the HDR 1500 rating claimed by my TV), but never again, and even then it only happened in one partition: while that one said 1,480 nits, the tool in the other partition gave me around 2,280 nits as the ideal. šŸ˜‘ So, go figure...
 

Looks like an interesting piece of software.

I followed the video step by step and downloaded the software, but then, alas, realized it's only meant for LG OLED TVs. It's good to have alternatives to Windows HDR Calibration, 'cos its numbers aren't very reliable.

Minutes ago I used Windows HDR Calibration. When I started, the max luminance it suggested was 2,180 nits, and I was thinking that wasn't bad, an improvement over the 2,700-something it had recommended before.

In the same session, without finishing or closing the tool, I went back to the main menu and started again, and then, bam, the max luminance went up to 2,780 nits (20 nits less than the 2,800 maximum). So yeah, kinda unreliable.
 
You're not getting 2,000 nits on an OLED; that should instantly tell you it's broken.

You're best off finding a few reviews, seeing what peak brightness they measure, working out the average, and setting it to that value.
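A quick sketch of that averaging approach (the numbers below are made-up placeholders, not real measurements for any display):

```python
# Average the peak-brightness figures measured by a few reviews and use
# that as the calibration target instead of trusting the Windows tool.
review_peaks_nits = [1450, 1520, 1480]  # hypothetical 10% window readings

avg_peak = sum(review_peaks_nits) / len(review_peaks_nits)
print(f"Set max luminance to roughly {avg_peak:.0f} nits")  # roughly 1483 nits
```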
 
This conversation reminds me, I have an HDR capable television (Samsung QN85B) and have simply never used it for HDR gaming. I have watched several 4K HDR movies and the difference is notable but not overwhelming.

May have to drag the PC into the media room and try out the rumored RTX game HDR mode when it arrives.
 
The RTX HDR tool is out, and DF made a great video about it. The video has already been shared in the DF 2024 thread, but it's an excellent one to have in this discussion, as it compares HDR vs. SDR in real time, which is awesome since it's hard to explain in words. It even looks great on my basic HDR 500 monitor, so I'm sharing it here too.

 
In the Nvidia Control Panel, does the Digital Vibrance slider affect HDR output?
 
It's a shame Auto PBR can't be a thing.
I think that would help older titles more than Auto HDR.
That would be interesting but surely a more complicated task. An artist knows what a surface is supposed to represent and look like, and can thus create the right materials, apply the proper maps to get them correct, and combine them in a scene to convey a composition that works for the viewer. An AI would have to interpret that scene and generate multiple maps for each material. It would have to interpret old games that didn't have PBR (say, HL1) and create materials with multiple maps for each mesh. I'm sure it will be doable at some point.

And scary, because material artists/designers will soon be replaced by AI. I'm seeing more and more examples of worthless new businessmen with zero clue about art and pipelines being insulting to artists and willing to replace them, because they now have the illusion that they possess the opinionated expertise and "eye" of an artist, thanks to how fast AI produces results for the most untalented, lazy, unskilled and ungrateful individuals, who can simply issue prompts to get results fast.

There are already many greedy new companies popping up like mushrooms, currently in the indie and mobile gaming industry, that try to rely almost entirely on AI. A friend of mine had a horrible experience yesterday at an interview because of a moron CEO who didn't even know what post-processing means and had no idea about pipelines, yet challenged her as an artist because of their use of AI. She didn't even understand why they had invited her. The industry is going to get oversaturated with AI, and a lot of artists are going to lose their jobs, especially given the state of the gaming industry right now and companies wanting to increase margins.

AI is fascinating, but it is improving at exponential rates and it is uncontrollable.
 
I think you're overlooking the "auto" part of "auto pbr".
I believe RTX Remix Studio does manage to implement "automatic PBR" through some form of AI analysis of the underlying texture versus a final rastered result. However, if you're suggesting some "magic PBR-applying wand" in the driver, I'm not sure that would ever be feasible. Even with RTX Remix and their AI to guesstimate what PBR effects might be sensible, it's still going to get it wrong a lot.

Consider this: how would any "magic PBR" understand if a surface is truly supposed to be mirror-like, versus shiny like a nice semi-gloss finish on new latex interior paint, versus shiny like a nicely polished fine-grained stainless steel, versus somewhat matte finished like an eggshell, versus matte finish like the flat ceiling paint in your house, versus matte finish bordering on rough like fine sandpaper? Or something even crazier but also more lifelike, such as a rusted pipe which has splotches of very matte-finished crusty rust between segments of the eggshell-but-slightly-shinier surface typical of iron pipes? Or maybe a dirty stainless steel countertop, where a coarse-grained but still mostly polished stainless steel table has dirt and grime splattered pseudo-randomly across it?

Most older graphics engines achieved these sorts of effects by layering textures (and not all of them RGB); newer engines were able to use those same multitexture methods with shaders on top, and the most modern engines might do nearly all of it with shaders. How does the driver actually know what a single polygon with multiple layers of textures and shaders is really meant to look like? RTX Remix will make a guess, and that guess might be reasonably close for a measurable portion of the game. Still, it will always ultimately depend on the human running the show to make the necessary tweaks and adjustments to actually get it right.

I think we forget how many tricks and hacks programmers have used throughout the decades to fake a good-looking surface, long before we had the compute power to actually properly emulate it.
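As a toy illustration of why guessing materials from pixels alone is so ambiguous, here's a minimal sketch. The textures and the variance-to-roughness heuristic are entirely made up for the example; this is nothing like what RTX Remix actually does:

```python
# Naive "auto PBR" heuristic sketch: guess a roughness value from the
# brightness variance of a grayscale texture. Purely illustrative.

def guess_roughness(texture):
    """texture: 2D list of brightness values in [0, 1]."""
    pixels = [p for row in texture for p in row]
    mean = sum(pixels) / len(pixels)
    variance = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    # Arbitrary mapping: noisy texture -> rough surface guess.
    return min(1.0, variance * 10.0)

# A uniform texture could be flat ceiling paint OR polished steel;
# the heuristic sees only pixels, so it guesses "mirror-smooth" either way.
flat_texture  = [[0.5, 0.5], [0.5, 0.5]]  # variance 0 -> roughness 0.0
noisy_texture = [[0.2, 0.8], [0.8, 0.2]]  # high contrast -> roughness 0.9

print(guess_roughness(flat_texture))   # 0.0
print(guess_roughness(noisy_texture))  # 0.9
```

The point of the sketch: two physically different surfaces with identical pixel statistics get identical guesses, which is exactly why a human still has to correct the result.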
 
Did you turn off dynamic tone mapping on the display?

Other says there is a glitch so you have to turn the monitor off and then on before running the calibration tool.
 
I didn't. Afaik there isn't a setting like that in my display's settings menu. Also, when I launch the app, a warning message appears telling me that I don't have an LG display, which is true xD
 
Lol, the RTX HDR results look horrid from what I've seen, ranging from almost the same as Auto HDR to at times literally "unplayable", and not in a nonsense internet-gamer sort of way. You're, you know, supposed to actually be able to see things in a game.
Well, fortunately it only appears exponential. What we're seeing is decades of research finally producing any consumer-facing products at all, and then hype weirdos claiming it's "exponential". Actual exponential progress would violate some principles of the universe, such as "the easier something is, the more likely you are to accomplish it first". It doesn't get easier; on average it only gets harder.
 
It's very good, but it's an interpretation of HDR, a best guess, comparable in many ways to auto-colourisation of black-and-white video.
Rather than relying on these fake methods in modern games, developers should ensure HDR in games works as intended. Perhaps this is an education problem; HDR workflows and practices have been successfully implemented in broadcast editing and colour-grading systems for years. It's a known quantity.
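To make the "best guess" point concrete, the crudest possible SDR-to-HDR expansion is just a fixed curve applied to every pixel. The power curve and constants below are arbitrary assumptions for illustration, not what RTX HDR or Auto HDR actually do:

```python
def sdr_to_hdr_nits(sdr, peak_nits=1000.0, exponent=2.0):
    """Expand a normalized SDR value [0, 1] to an HDR luminance in nits
    with a fixed power curve: the simplest possible inverse tone map."""
    return peak_nits * (sdr ** exponent)

# Midtones stay modest while highlights get boosted hard, but the curve
# has no idea whether a bright pixel was a lamp or just a white wall --
# that missing intent is exactly what native HDR grading provides.
print(sdr_to_hdr_nits(0.5))  # 250.0
print(sdr_to_hdr_nits(1.0))  # 1000.0
```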
 