HDR settings uniformity is much needed

Was the preferred look of RTX HDR over AutoHDR actually just a bug?
From the latest driver's Fixed Bugs section:
RTX HDR uses saturation and tone curve that matches Gamma 2.0 instead of 2.2
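
For anyone wondering why the bugged look read as "better": decoding the same SDR signal with gamma 2.0 instead of 2.2 lifts the mid-tones, so the image appears brighter and punchier even though nothing else changed. A quick back-of-the-envelope sketch (my own, not NVIDIA's code):

```cpp
// Illustration only: compare linear light produced by gamma 2.0 vs gamma 2.2
// decoding of the same normalized SDR signal levels.
#include <cmath>
#include <cstdio>

int main() {
    const double codes[] = {0.25, 0.50, 0.75};      // normalized SDR signal levels
    for (double code : codes) {
        double lin20 = std::pow(code, 2.0);         // what the buggy path effectively assumed
        double lin22 = std::pow(code, 2.2);         // what a gamma 2.2 display expects
        std::printf("signal %.2f -> gamma 2.0: %.3f  gamma 2.2: %.3f  (+%.0f%% light)\n",
                    code, lin20, lin22, (lin20 / lin22 - 1.0) * 100.0);
    }
    return 0;
}
```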
 
With RTX HDR, the color saturation can be maxed out without colors clipping, and the algorithm keeps up just fine. You can also crank up the brightness without washing out any colors; the blacks remain perfect.

Also, RTX Vibrance is for SDR content that you want to have an HDR-like look, without actually using HDR.



Ok, see, now people enjoying it makes sense to me; I didn't know there were user controls. If you've got user controls, then people tend to love that. Do people ruin the intended experience for themselves way more often than not? Sure, of course, but a lot of people like having a sense of control initially, or just altogether (heck, maybe the game or the intent sucks, but playing around is fun). Is auto-upconverting SDR to HDR actually the equivalent of a well-mastered HDR experience? No, of course not; it's the equivalent of exactly what it is: an automatic conversion that just follows some set of curves. But heck, if you want to play with those curves, go for it.
 

According to this article, my old 165 Hz monitor wouldn't be true HDR if we take the DisplayHDR 1.2 requirements into account. Static is more than fine, but DCI-P3 coverage is poor.

 
AnandTech also writes about the new VESA CERTIFIED DisplayHDR™ requirements here.

DisplayHDR 1.2 improves upon pretty much all the test patterns and requirements compared to the older standard.

That they are fine with a Delta-TP < 8, or even Delta-TP < 6, is mind-boggling though...
 


Yet another worthless dev who can't understand HDR basics. Stop making up terminology, ffs. Clowns.
 
GoW was my last straw; I've fully given up on devs understanding HDR. Good luck to all.

My recommendation is to test for yourself, and if a game doesn't have adequate controls, use RTX HDR.
 
GoW was my last straw; I've fully given up on devs understanding HDR. Good luck to all.

My recommendation is to test for yourself, and if a game doesn't have adequate controls, use RTX HDR.
What's wrong with it? I find most people's frustration with HDR is linked to their poor-quality monitors; I've never had an issue with game HDR.
 
What's wrong with it? I find most people's frustration with HDR is linked to their poor-quality monitors; I've never had an issue with game HDR.

Lol, you're talking to the wrong guy about having poor-quality displays. The difference is he probably has way, way, way higher standards than you do.
 
GOW 2018's default HDR settings are awful; when you calibrate it correctly, the HDR image quality improves a lot. The thing is, the calibration menu is not very good.
 
Lol, you're talking to the wrong guy about having poor-quality displays. The difference is he probably has way, way, way higher standards than you do.
I'm certainly not using Calman gear to compare; however, many people online (professional AV reviewers) have, and most people have no problems with 99% of HDR implementations. Notable exceptions were the early Starfield and early RDR2 implementations.

GOW 2018's default HDR settings are awful; when you calibrate it correctly, the HDR image quality improves a lot. The thing is, the calibration menu is not very good.
I would think all default HDR settings would be awful unless the game is reading display parameters from the Windows HDR Calibration data (on PC, I think exactly zero games do this). Since we don't have 10,000-nit displays, we kind of need per-TV calibration (relying on the display's tone mapper is fine but not optimal if we can control it ourselves; we can't for movies unless we use DV, but we can for games!). Even on Xbox (maybe PS too), very few games read the console's calibration data.
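
For reference, a game on PC could at least query the luminance range Windows reports for the panel instead of shipping blind sliders. Rough sketch using the public DXGI API (error handling trimmed); whether these values pick up the Windows HDR Calibration app's results I'm not 100% sure, so treat that part as an assumption:

```cpp
// Enumerate attached displays and print the luminance range Windows reports.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        ComPtr<IDXGIOutput> output;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            ComPtr<IDXGIOutput6> output6;
            DXGI_OUTPUT_DESC1 desc{};
            if (SUCCEEDED(output.As(&output6)) && SUCCEEDED(output6->GetDesc1(&desc))) {
                std::printf("min %.4f nits, max %.0f nits, max full-frame %.0f nits\n",
                            desc.MinLuminance, desc.MaxLuminance, desc.MaxFullFrameLuminance);
            }
        }
    }
    return 0;
}
```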
 
What's wrong with it? I find most people's frustration with HDR is linked to their poor-quality monitors; I've never had an issue with game HDR.

There are two critical flaws which negatively impact the HDR in Ragnarok.

The first is the black crush. While they've done a great job of using 0 nits as the black floor, it's clear the grading display they used was poor, with its near-black performance giving them heavily lifted blacks. Because you certainly would not see that level of black crush while grading and go "yep, that's how it should look." Studios can easily hire respected calibrators in the area for display purchasing advice and calibration to avoid this simple issue. In the scope of their budgets it's a rounding error, so that's certainly not a viable excuse.

The second problem is the arbitrary HDR slider in game. The PQ EOTF is a set standard. There's no such thing as "oh, I want my HDR to be brighter or duller, so I'll just use a slider." What they've done is effectively make an EOTF slider that lifts or lowers the curve. What should be done is a nits slider, which lets the user enter a number matching their display's specular-highlight capability; the game then tone maps to that value, and the display outputs it from there.

Neither of these things is some wild finding or new way of doing things. It's the normal workflow for HDR grading, but it's clearly been missed again.
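
To make the "nits slider, not EOTF slider" point concrete, here's a minimal sketch (my own illustration, nothing to do with Ragnarok's actual code): the slider sets a tone-mapping target in nits, scene luminance is rolled off toward that target, and only then is the result encoded with the standard SMPTE ST 2084 (PQ) curve. The EOTF itself never moves.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// ST 2084 inverse EOTF: absolute luminance in nits -> PQ signal in [0, 1].
double pq_encode(double nits) {
    const double m1 = 2610.0 / 16384.0, m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0, c3 = 2392.0 / 4096.0 * 32.0;
    double y  = std::clamp(nits / 10000.0, 0.0, 1.0);
    double ym = std::pow(y, m1);
    return std::pow((c1 + c2 * ym) / (1.0 + c3 * ym), m2);
}

// Simple highlight rolloff toward the user's peak (a placeholder curve, just
// to show where the slider value belongs in the pipeline).
double tonemap_to_peak(double scene_nits, double peak_nits) {
    return peak_nits * scene_nits / (scene_nits + peak_nits);
}

int main() {
    const double user_peak_nits = 800.0;  // what the player enters for their display's highlights
    const double scenes[] = {100.0, 1000.0, 4000.0};
    for (double scene : scenes) {
        double mapped = tonemap_to_peak(scene, user_peak_nits);
        std::printf("scene %6.0f nits -> display %6.1f nits -> PQ %.3f\n",
                    scene, mapped, pq_encode(mapped));
    }
    return 0;
}
```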
 
There are two critical flaws which negatively impact the HDR in Ragnarok.

The first is the black crush. While they've done a great job of using 0 nits as the black floor, it's clear the grading display they used was poor, with its near-black performance giving them heavily lifted blacks. Because you certainly would not see that level of black crush while grading and go "yep, that's how it should look." Studios can easily hire respected calibrators in the area for display purchasing advice and calibration to avoid this simple issue. In the scope of their budgets it's a rounding error, so that's certainly not a viable excuse.

The second problem is the arbitrary HDR slider in game. The PQ EOTF is a set standard. There's no such thing as "oh, I want my HDR to be brighter or duller, so I'll just use a slider." What they've done is effectively make an EOTF slider that lifts or lowers the curve. What should be done is a nits slider, which lets the user enter a number matching their display's specular-highlight capability; the game then tone maps to that value, and the display outputs it from there.

Neither of these things is some wild finding or new way of doing things. It's the normal workflow for HDR grading, but it's clearly been missed again.
Wow, that's really bad; I haven't come across an implementation that bad in a while.
 
I don't really know much about HDR, but it seems pretty obvious that, with implementations of varying quality and monitors of varying quality changing the equation, the best way to get any sort of standard is probably to just use AI tone-mapped HDR.

At this point I basically just want the HDR equivalent of what DLSS/FSR/XeSS are for upscaling... so essentially RTX HDR or Auto HDR, but something that developers can just plug into their games and adjust to their preference. They could ship different presets that work well depending on the classification of the connected display, while still allowing the user to fine-tune it to their liking.
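
Something like this is what I picture, purely hypothetical (every name and number below is made up for illustration): the module picks a developer-tuned preset from the connected display's class, and the player can still override individual knobs.

```cpp
#include <cstdio>

enum class DisplayClass { SDR, HDR400, HDR600, HDR1000Plus };

struct HdrPreset {
    float peak_nits;        // tone-mapping target
    float paper_white_nits; // brightness of diffuse white / UI
    float saturation_boost; // 1.0 = neutral
};

// Developer-tuned presets keyed off the display classification.
HdrPreset preset_for(DisplayClass c) {
    switch (c) {
        case DisplayClass::HDR400:      return {400.0f, 150.0f, 1.05f};
        case DisplayClass::HDR600:      return {600.0f, 200.0f, 1.05f};
        case DisplayClass::HDR1000Plus: return {1000.0f, 200.0f, 1.10f};
        default:                        return {250.0f, 100.0f, 1.00f};
    }
}

int main() {
    HdrPreset p = preset_for(DisplayClass::HDR600);  // chosen from the detected display
    p.peak_nits = 750.0f;                            // user fine-tune on top of the preset
    std::printf("peak %.0f nits, paper white %.0f nits, saturation x%.2f\n",
                p.peak_nits, p.paper_white_nits, p.saturation_boost);
    return 0;
}
```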
 
It is also a bit confusing as a consumer, since Windows has its HDR settings but NVIDIA also has RTX HDR with separate settings, and they seem to be mutually exclusive.
 
It is also a bit confusing as a consumer, since Windows has its HDR settings but NVIDIA also has RTX HDR with separate settings, and they seem to be mutually exclusive.

System-level calibration is basically HGIG, but on PC the list of games that support it is pretty short to non-existent.

NVIDIA's RTX HDR is a post-processing effect. It takes an SDR image and converts it to HDR in real time, hence the performance hit for using it.

In an ideal world, game engines would directly reference the system-level calibration and nothing more would need to be done. But when it's left to the devs to implement, you get the mess we have.
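
To illustrate what "converting SDR to HDR as a post effect" means in principle (this is not NVIDIA's actual algorithm, which is a trained model; just a hand-rolled sketch): decode the sRGB frame to linear light, pin diffuse white to a chosen paper-white level, and expand only the top of the range toward the display's peak.

```cpp
#include <cmath>
#include <cstdio>

// Standard sRGB decode: non-linear code value -> linear light in [0, 1].
double srgb_to_linear(double v) {
    return (v <= 0.04045) ? v / 12.92 : std::pow((v + 0.055) / 1.055, 2.4);
}

// Hypothetical expansion: leave everything below the knee at SDR brightness,
// boost the highlights above it toward peak_nits.
double expand_to_hdr_nits(double linear, double paper_white, double peak_nits) {
    const double knee = 0.8;                    // fraction of the SDR range left untouched
    if (linear <= knee) return linear * paper_white;
    double t = (linear - knee) / (1.0 - knee);  // 0..1 across the highlight region
    return knee * paper_white + t * t * (peak_nits - knee * paper_white);
}

int main() {
    const double codes[] = {0.50, 0.95, 1.00};
    for (double code : codes) {
        double lin = srgb_to_linear(code);
        std::printf("SDR code %.2f -> %6.1f nits in HDR\n",
                    code, expand_to_hdr_nits(lin, 200.0, 1000.0));
    }
    return 0;
}
```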
 
System-level calibration is basically HGIG, but on PC the list of games that support it is pretty short to non-existent.

NVIDIA's RTX HDR is a post-processing effect. It takes an SDR image and converts it to HDR in real time, hence the performance hit for using it.

In an ideal world, game engines would directly reference the system-level calibration and nothing more would need to be done. But when it's left to the devs to implement, you get the mess we have.
The funny thing is you'd think the consoles would be better at standardizing their system/TV-specific calibration, but even there games insist on using their own sliders and tone mapping.

Maybe we should just leave it to the TV? This is essentially what movies do: the filmmakers master at whatever brightness they see fit and pass that along to the TV to tone map; if it's DV or HDR10+, they add metadata too. I never have to drag sliders for regular HDR10 movies, haha.
 