HDR settings uniformity is much needed

Was the preferred look of RTX HDR over AutoHDR actually just a bug?
From the latest driver's Fixed Bugs section:
"RTX HDR uses saturation and tone curve that matches Gamma 2.0 instead of 2.2"
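
For anyone wondering what that bug actually does to the picture: decoding a signal with gamma 2.0 instead of 2.2 lifts shadows and midtones, most strongly near black, which reads as a brighter, punchier image. A quick Python sketch of the raw transfer-function arithmetic (illustration only, not NVIDIA's pipeline):

```python
# Illustration of the release-note bug: decoding with gamma 2.0 vs. 2.2.
# Not NVIDIA's pipeline, just the transfer-function arithmetic.

signals = [0.10, 0.25, 0.50, 0.75]  # normalized SDR signal levels

for v in signals:
    lin_20 = v ** 2.0  # what the buggy path produced
    lin_22 = v ** 2.2  # what a 2.2-mastered signal intends
    print(f"signal {v:.2f}: gamma 2.0 -> {lin_20:.4f}, "
          f"gamma 2.2 -> {lin_22:.4f} (+{(lin_20 / lin_22 - 1) * 100:.0f}%)")
```

At a 10% signal level the gamma 2.0 path puts out roughly 58% more linear light than a 2.2-mastered signal intends, so the "preferred look" plausibly was the bug.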
 
With RTX HDR the color saturation can be maxed out without colors clipping, and the algorithm keeps up just fine. You can also crank up the brightness without washing out any colors, and the blacks remain perfect.
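
That behavior is what a soft roll-off gives you. A hedged sketch of the general idea, using a generic Reinhard-style curve rather than RTX HDR's actual (unpublished) algorithm: values that would overshoot the display peak get compressed instead of chopped.

```python
# Generic Reinhard-style roll-off vs. a hard clip. Illustrative only; this
# is NOT RTX HDR's actual tone mapper, which NVIDIA hasn't published.

def soft(x: float) -> float:
    """Roll-off curve: approaches 1.0 asymptotically, never clips."""
    return x / (1.0 + x)

def hard(x: float) -> float:
    """Naive path: anything past 1.0 is chopped off."""
    return min(x, 1.0)

boost = 2.5  # an aggressive brightness/saturation boost
for x in [0.2, 0.5, 0.8, 1.0]:
    bx = x * boost
    print(f"in {x:.1f} -> boosted {bx:.2f}: "
          f"soft {soft(bx):.3f}, hard {hard(bx):.3f}")
```

With the hard clip, the top three boosted inputs all collapse to the same 1.0 and the detail is gone; the roll-off keeps them distinct, which is why you can push saturation and brightness further before anything visibly breaks.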

Also, RTX Vibrance is for SDR content that you want to give an HDR-like look, without actually using HDR.



Ok, see, now people enjoying it makes sense to me; I didn't know there were user controls. If you've got user controls, then people tend to love that. Do people ruin the intended experience for themselves way more often than not? Sure, of course, but a lot of people like having a sense of control initially, or just altogether (heck, maybe the game or the intent sucks, but playing around is fun). Is auto-upconverting SDR to HDR actually the equivalent of a well-mastered HDR experience? No, of course not; it's the equivalent of exactly what it is: an automatic conversion that just follows some set of curves. But heck, if you want to play with those curves, go for it.
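
For anyone curious what "follows some set of curves" means in practice, here's a minimal Python sketch of the idea: decode the SDR signal to linear light, then expand it to display nits with an adjustable curve. The parameter names (peak_nits, contrast) are hypothetical user controls for illustration, not any vendor's actual settings.

```python
# Hedged sketch of auto SDR->HDR expansion: map a normalized SDR signal to
# display nits through a simple curve. Hypothetical controls, illustration only.

def sdr_to_hdr_nits(v: float, peak_nits: float = 600.0,
                    contrast: float = 1.2) -> float:
    """Expand a 0..1 SDR signal value to HDR nits.

    A power term > 1 pushes highlights up faster than shadows, which is
    roughly what an 'HDR look' slider does to an SDR source.
    """
    lin = v ** 2.2                  # decode SDR gamma to linear light
    return peak_nits * lin ** contrast

for v in (0.1, 0.5, 0.9, 1.0):
    print(f"SDR {v:.1f} -> {sdr_to_hdr_nits(v):6.1f} nits")
```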
 

According to this article, my old 165 Hz monitor wouldn't be true HDR if we take the DisplayHDR 1.2 requirements into account. Static is more than fine, but DCI-P3 coverage is poor.
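
For reference, "DCI-P3 coverage" is usually computed as the share of the P3 chromaticity triangle that the monitor's gamut triangle overlaps in CIE xy space. A sketch using the standard P3 primaries, with sRGB primaries standing in for a narrow-gamut panel (the math is generic, not VESA's exact test procedure):

```python
# Hedged sketch: P3 coverage as the overlap of two gamut triangles in CIE xy.
# P3 and sRGB primaries are the standard published values.

P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]       # R, G, B (CCW)
MONITOR = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # sRGB primaries

def side(p, a, b):
    """> 0 if p lies left of the directed line a -> b."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def area(poly):
    """Shoelace area of a polygon [(x, y), ...]."""
    n = len(poly)
    return abs(sum(poly[i][0] * poly[(i + 1) % n][1]
                   - poly[(i + 1) % n][0] * poly[i][1] for i in range(n))) / 2

def clip(subject, clipper):
    """Sutherland-Hodgman: clip `subject` against a convex CCW `clipper`."""
    out = subject
    for i in range(len(clipper)):
        a, b = clipper[i], clipper[(i + 1) % len(clipper)]
        inp, out = out, []
        for j in range(len(inp)):
            p, q = inp[j], inp[(j + 1) % len(inp)]
            sp, sq = side(p, a, b), side(q, a, b)
            if sp >= 0:
                out.append(p)
            if (sp >= 0) != (sq >= 0):          # edge crosses the clip line
                t = sp / (sp - sq)
                out.append((p[0] + t * (q[0] - p[0]),
                            p[1] + t * (q[1] - p[1])))
        if not out:
            break
    return out

overlap = clip(MONITOR, P3)
print(f"P3 coverage: {area(overlap) / area(P3) * 100:.1f}%")  # ~73% for sRGB
```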

 
AnandTech also writes about the new VESA CERTIFIED DisplayHDR™ requirements here.

DisplayHDR 1.2 improves upon pretty much all the test patterns and requirements compared to the older standard.

That they are fine with a Delta-ITP < 8 or even Delta-ITP < 6 is mind-boggling though ...
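
For context, the metric behind those thresholds is ΔEITP from ITU-R BT.2124: a colour difference taken in ICtCp space, ΔEITP = 720·sqrt(ΔI² + ΔT² + ΔP²), where T = 0.5·Ct and P = Cp. A minimal sketch; the two ICtCp triples are made-up sample values, and the RGB-to-ICtCp conversion (PQ encode plus matrixing) is omitted:

```python
# Hedged sketch of Delta-E_ITP (ITU-R BT.2124), the colour-accuracy metric
# the DisplayHDR 1.2 thresholds refer to. Sample values are made up.

from math import sqrt

def delta_itp(ictcp_1, ictcp_2) -> float:
    """Delta-E_ITP between two colours given as (I, Ct, Cp) triples."""
    i1, ct1, cp1 = ictcp_1
    i2, ct2, cp2 = ictcp_2
    d_i = i1 - i2
    d_t = 0.5 * (ct1 - ct2)  # Ct is halved to form the T axis
    d_p = cp1 - cp2
    return 720.0 * sqrt(d_i ** 2 + d_t ** 2 + d_p ** 2)

ref  = (0.40, 0.010, 0.020)   # hypothetical reference colour
test = (0.41, 0.014, 0.024)   # hypothetical measured colour
d = delta_itp(ref, test)
print(f"Delta-E_ITP = {d:.2f} -> {'passes' if d < 8 else 'fails'} a < 8 limit")
```

Since BT.2124 scales the metric so that ΔEITP ≈ 1 is about one just-noticeable difference, a pass bar of 8 leaves a lot of visible error on the table, which is presumably the complaint above.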
 


Yet another worthless dev who can't understand HDR basics. Stop making up terminology, ffs. Clowns.
 