HDR settings uniformity is much needed

Was the preferred look of RTX HDR over AutoHDR actually just a bug?
From the latest driver's Fixed Bugs section:
RTX HDR uses saturation and tone curve that matches Gamma 2.0 instead of 2.2
 
With RTX HDR the color saturation can be maxed out without colors clipping, and the algorithm keeps up just fine. You can also crank up the brightness without washing out any colors, and the blacks remain perfect.
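For anyone wondering what that fix actually means: here's a quick sketch (plain Python, values illustrative) of how much a gamma 2.0 decode lifts the image relative to the gamma 2.2 most SDR content is mastered against. That lift, strongest in the shadows, is plausibly why pre-fix RTX HDR looked brighter and punchier.

```python
# Quick illustration: the same SDR signal decoded with gamma 2.0 vs gamma 2.2.
for v in (0.05, 0.10, 0.25, 0.50, 0.75, 1.00):
    lin_20 = v ** 2.0  # what the buggy driver effectively assumed
    lin_22 = v ** 2.2  # what most SDR content is mastered against
    print(f"signal {v:.2f}: g2.0 -> {lin_20:.4f}, g2.2 -> {lin_22:.4f}, "
          f"ratio {lin_20 / lin_22:.2f}x")
```

Near black the 2.0 decode comes out nearly twice as bright; at the top of the range the two curves converge.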

Also, RTX Vibrance is for SDR content that you want to give an HDR-like look without actually using HDR.



Ok, see, now people enjoying it makes sense to me; I didn't know there were user controls. If you've got user controls, people tend to love that. Do people ruin the intended experience for themselves more often than not? Sure, of course, but a lot of people like having a sense of control initially, or just altogether (heck, maybe the game or the intent sucks, but playing around is fun). Is auto-upconverting SDR to HDR actually the equivalent of a well-mastered HDR experience? No, of course not. It's the equivalent of exactly what it is: an automatic conversion that just follows some set of curves. But heck, if you want to play with those curves, go for it.
 

According to this article, my old 165Hz monitor wouldn't qualify as true HDR under the DisplayHDR 1.2 requirements. It's more than fine on the static tests, but its DCI-P3 coverage is poor.
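If you want to sanity-check a panel yourself, one rough way to estimate DCI-P3 coverage is to intersect the monitor's gamut triangle with the P3 triangle in CIE 1931 xy space. The panel primaries below are hypothetical (roughly an sRGB-class display), and xy-area overlap is only a crude proxy for the coverage figures VESA actually certifies:

```python
# Rough sketch, not a colorimetry library: estimate DCI-P3 coverage as the
# fraction of the P3 triangle's area that the panel's gamut triangle covers
# in CIE 1931 xy space.

def area(poly):
    # Shoelace formula for a simple polygon given as [(x, y), ...].
    n = len(poly)
    return abs(sum(poly[i][0] * poly[(i + 1) % n][1]
                   - poly[(i + 1) % n][0] * poly[i][1] for i in range(n))) / 2

def clip(subject, clipper):
    # Sutherland-Hodgman: clip 'subject' against a convex, CCW-wound 'clipper'.
    def inside(p, a, b):
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0
    def cross_pt(p1, p2, a, b):
        # Intersection of segment p1-p2 with the infinite line through a-b.
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        ex, ey = b[0] - a[0], b[1] - a[1]
        t = ((a[0] - p1[0]) * ey - (a[1] - p1[1]) * ex) / (dx * ey - dy * ex)
        return (p1[0] + t * dx, p1[1] + t * dy)
    out = list(subject)
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        inp, out = out, []
        for j in range(len(inp)):
            cur, nxt = inp[j], inp[(j + 1) % len(inp)]
            if inside(nxt, a, b):
                if not inside(cur, a, b):
                    out.append(cross_pt(cur, nxt, a, b))
                out.append(nxt)
            elif inside(cur, a, b):
                out.append(cross_pt(cur, nxt, a, b))
        if not out:
            return []
    return out

P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]     # DCI-P3 primaries
panel = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # hypothetical sRGB-class panel
print(f"approx. DCI-P3 coverage: {area(clip(panel, P3)) / area(P3):.1%}")
```

For the sRGB-like primaries above this lands in the low 70% range, which is exactly the sort of panel that falls short of the newer tiers.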

 
AnandTech also writes about the new VESA CERTIFIED DisplayHDR™ requirements here.

DisplayHDR 1.2 improves upon pretty much all the test patterns and requirements compared to the older standard.

That they are fine with a Delta-TP < 8 or even Delta-TP < 6 is mind-boggling though ...
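Assuming "Delta-TP" here refers to the ΔITP color-difference metric from ITU-R BT.2124 (which is, as far as I know, what the DisplayHDR spec measures against), here's a sketch of how it's computed. On this scale a value of about 1 is roughly one just-noticeable difference, so a tolerance of 6 to 8 leaves room for several JNDs of error:

```python
# Sketch of the dITP metric (ITU-R BT.2124); constants are from BT.2100
# (ICtCp) and ST 2084 (PQ). Inputs are linear BT.2020 RGB normalized so
# 1.0 == 10,000 cd/m2.

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq(y):
    # ST 2084 (PQ) inverse EOTF: linear light -> PQ signal.
    yp = max(y, 0.0) ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

def ictcp(r, g, b):
    # BT.2100: linear BT.2020 RGB -> LMS -> PQ-encode -> ICtCp.
    lp = pq((1688 * r + 2146 * g + 262 * b) / 4096)
    mp = pq((683 * r + 2951 * g + 462 * b) / 4096)
    sp = pq((99 * r + 309 * g + 3688 * b) / 4096)
    return (0.5 * lp + 0.5 * mp,
            (6610 * lp - 13613 * mp + 7003 * sp) / 4096,
            (17933 * lp - 17390 * mp - 543 * sp) / 4096)

def delta_itp(ref, test):
    # BT.2124: I stays, T = 0.5 * Ct, P = Cp; scaled by 720.
    i1, ct1, cp1 = ictcp(*ref)
    i2, ct2, cp2 = ictcp(*test)
    return 720 * ((i1 - i2) ** 2 + (0.5 * (ct1 - ct2)) ** 2
                  + (cp1 - cp2) ** 2) ** 0.5

# A 5% luminance error on a 100-nit grey patch already lands around dITP ~4,
# so a pass threshold of 6 or 8 admits clearly visible errors.
print(delta_itp((0.01, 0.01, 0.01), (0.0105, 0.0105, 0.0105)))
```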
 


Yet another worthless dev who can't understand HDR basics. Stop making up terminology, ffs. Clowns.
 
GoW was the last straw; I've fully given up on devs understanding HDR. Good luck to all.

My recommendation: test for yourself, and if a game doesn't have adequate controls, use RTX HDR.
 
GoW was the last straw; I've fully given up on devs understanding HDR. Good luck to all.

My recommendation: test for yourself, and if a game doesn't have adequate controls, use RTX HDR.
What's wrong with it? I find most people's frustration with HDR is linked to their poor-quality monitors; I've never had an issue with game HDR.
 
GOW 2018's default HDR settings are awful; when you calibrate it correctly, the HDR image quality improves a lot. Thing is, the calibration menu is not very good.
 
Lol, you're talking to the wrong guy about having poor-quality displays. The difference is he probably has way, way higher standards than you do.
I'm certainly not using Calman gear to compare; however, many people online (professional AV reviewers) have, and most have no problems with 99% of HDR implementations. Notable exceptions were the early Starfield and early RDR2 implementations.

GOW 2018's default HDR settings are awful; when you calibrate it correctly, the HDR image quality improves a lot. Thing is, the calibration menu is not very good.
I would think all default HDR settings would be awful unless the game reads display parameters from Windows HDR Calibration (on PC, I think exactly zero games do this). Since we don't have 10k-nit displays, we kinda need per-TV calibration; relying on the display's tone mapper is fine but not optimal when we can control it ourselves (we can't for movies unless we use DV, but we can for games!). Even on Xbox (maybe PS too), very few games read the console's calibration data. A sketch of the per-display point follows below.
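To illustrate: if a game could read the panel's calibrated peak (from Windows HDR Calibration, the console's settings, or display metadata), it could roll highlights off toward that peak instead of clipping at some guessed value. The shoulder curve below is a made-up generic one, not any engine's (or BT.2390's) actual EETF:

```python
# Hypothetical per-display highlight rolloff: linear up to a knee, then a
# rational shoulder that asymptotically approaches the panel's calibrated peak.

def tonemap(nits, peak, knee=0.75):
    k = knee * peak
    if nits <= k:
        return nits            # linear section: untouched
    over, span = nits - k, peak - k
    return k + span * over / (over + span)  # approaches 'peak', never clips

for scene in (100, 500, 1000, 2000, 4000):
    print(f"{scene:>5} scene nits -> 800-nit panel: {tonemap(scene, 800):6.1f}, "
          f"2000-nit panel: {tonemap(scene, 2000):6.1f}")
```

Same scene, two panels, two different (but both sensible) output curves; without per-display data the game has to guess, which is exactly the calibration-menu mess we keep getting.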
 