Curved monitors with high Hz.

Another reason not to get a 4K high-refresh-rate monitor.

This guide helps you set your monitor to the 10-bit color depth typical of HDR.

Configuring Color Depth
Configuring your monitor’s color depth is crucial to being able to display high dynamic range content, but even on an SDR monitor, proper configuration will allow your display to present rich, well-blended colors. Here’s how to get the most out of it.
  1. If you’re an Nvidia user, return to the Nvidia Control Panel and navigate back to the same screen where we adjusted our resolution and refresh rate.
  2. Scroll to the bottom of the page
  3. Under “Apply the following settings,” select the radio button for “Use NVIDIA Color Settings.”
  4. Below that you will find drop-down menus for Desktop Color Depth, Output Color Depth, Output Color Format, and Output Dynamic Range. Start by setting all of these to their highest value, choosing “Full” under the Output Dynamic Range menu and “RGB” for Output Color Format.
  5. Click Apply to save these settings.

[Screenshot: NVIDIA Control Panel color settings]



If you’re running a 4K, 144Hz panel, you may notice that the Output Color Depth value has lowered. If this occurs, set it back to its highest setting and then change Output Color Format to YCbCr444 (this will change Output Dynamic Range to “Limited”). Click Apply again. If the color depth continues to change, lower Output Color Format to YCbCr422.

At this point, you will likely notice an ugly, off-color halo appear around still images and text, so I recommend only using this mode for games and video. If that still doesn’t work, you will need to either lower the refresh rate or drop to an 8-bit Output Color Depth.
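The reason the color depth drops in the first place is link bandwidth: 4K at 144 Hz with 10-bit RGB simply needs more data than a DisplayPort 1.4 connection can carry. A rough back-of-the-envelope sketch of my own (ignoring blanking intervals and link-encoding overhead, so real requirements are somewhat higher):

```python
# Rough video data rate needed: pixels per second x bits per pixel.
# Blanking intervals and link-encoding overhead are ignored here.

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for name, bpp in [("10-bit RGB / YCbCr444", 30),   # 3 channels x 10 bits
                  ("8-bit RGB",             24),   # 3 channels x 8 bits
                  ("10-bit YCbCr422",       20)]:  # chroma at half rate
    print(f"{name}: {data_rate_gbps(3840, 2160, 144, bpp):.1f} Gbit/s")

# 10-bit RGB / YCbCr444: 35.8 Gbit/s -> exceeds DP 1.4's ~25.9 Gbit/s payload
# 8-bit RGB:             28.7 Gbit/s -> still too much at 144 Hz
# 10-bit YCbCr422:       23.9 Gbit/s -> fits, which is why the driver falls back
```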
  • Color Depth Shortcut: Right-click the desktop -> Nvidia Control Panel -> Display -> Apply These Settings -> Use NVIDIA Color Settings -> Color Drop-down Menus
If you’re an AMD user, this step is a bit simpler for you. Right-click on the desktop and select “AMD Radeon Settings.” Select the “Display” tab in the window that opens. From there, you will see boxes for “Color Depth” and “Pixel Format.” The first represents the Output Color Depth we found in Nvidia’s Control Panel. Pixel Format represents chroma subsampling (analogous to Nvidia’s Output Color Format). Set both of these to their highest settings.
  • Color Depth Shortcut: Right-click the desktop -> AMD Radeon Settings -> Display -> Color Depth and Pixel Format

https://www.ign.com/articles/2019/09/16/monitor-calibration-how-to-calibrate-monitor
 
With all these HDR postings, I decided to boot into Windows; a recent update must have fixed the washed-out color for SDR. Some of those YouTube videos look awesome, just don't drag them to an SDR monitor.
 
Quick monitor question, are 6-bit panels still a thing?
I don't think so; if they were common, the image would be colour-banding galore.

10 bits per colour (i.e. HDR10) gives you 1024 gradients of each colour when using full-range RGB.
But YCbCr uses limited range, which spans codes 64-940, giving roughly 880 gradients per colour.
8 bits per colour with full-range RGB gives you 256 gradients per colour.

If you reduce the number of bits per colour enough, the step between one gradient and the next becomes visible.
For example, the sky on a clear day in a game is displayed ranging from light to dark blue.
Without enough shades of blue you will see the colour change as bands across the sky.
This can be tested by switching from 32-bit colour to 16-bit or 8-bit.

256 shades per colour is very good for SDR, but when used for HDR you may see a lot of banding.
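To put numbers on that, here is a small illustrative sketch of my own (not from the posts above) that quantizes a smooth sky-like ramp at different bit depths and counts how many distinct shades survive:

```python
import numpy as np

def quantize(values, bits):
    """Round values in [0, 1] to the nearest of 2**bits levels."""
    levels = 2 ** bits - 1
    return np.round(values * levels) / levels

# A smooth light-to-dark ramp across a 4K-wide screen, like a clear sky.
sky = np.linspace(0.35, 0.65, 3840)

for bits in (6, 8, 10):
    shades = len(np.unique(quantize(sky, bits)))
    print(f"{bits}-bit: {shades} distinct shades across the ramp")

# 6-bit:  ~20 shades  -> each band is ~190 pixels wide, clearly visible
# 8-bit:  ~78 shades  -> usually smooth enough for SDR
# 10-bit: ~308 shades -> smooth even across HDR-style gradients
```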

So instead of having 6-bit panels, what about 12-bit panels? :mrgreen:
If you can't see banding with 10-bit HDR, 12-bit HDR isn't going to provide any benefit.
12 bits won't be necessary until we get HDR displays with extreme brightness (10,000 nits?) and 12-bit HDR media becomes a thing.

HDR10 is 10 bits per colour, so setting the output to 12 bits per colour won't make any difference.
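As a rough sanity check on the 10-bit vs 12-bit question (my own sketch, using the SMPTE ST 2084 "PQ" curve that HDR10 is built on), you can compute how many nits one 10-bit code step is worth at different brightness levels:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value to absolute nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    n = (code / (2 ** bits - 1)) ** (1 / m2)
    return 10000 * (max(n - c1, 0.0) / (c2 - c3 * n)) ** (1 / m1)

for code in (520, 769, 1023):   # roughly 100, 1000 and 10000 nits
    step = pq_to_nits(code) - pq_to_nits(code - 1)
    print(f"around {pq_to_nits(code):5.0f} nits: one 10-bit step = {step:.2f} nits")

# Each step stays around 1% of the current luminance, near the limit of
# what the eye can distinguish -- which is why 10 bits holds up at
# ~1000 nits and 12 bits only starts to matter at extreme brightness.
```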
 
Don't you mean multiples of 8 ms?
If you're only getting 30 fps, that's 1 refresh between each frame at 60 Hz and 2 refreshes (2 × 8.3 ms) at 120 Hz, surely.
If you drop a single frame from 30 fps it will be visible for 8.3 ms, instead of 16.7 ms like on a 60 Hz monitor. (The frame takes 34 ms to create and is viewed one refresh late.)
What additional stable frame rates?
Frame rates that divide evenly into the monitor's refresh rate.

for 60 Hz:
60, 30, 20, 15, 12, 10 ...

for 120 Hz:
120, 60, 40, 30, 24, 20, ~17 ...

for 240 Hz:
240, 120, 80, 60, 48, 40, ~34, 30, ~27, 24, ~22 ...
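Put differently, those lists are just the refresh rate divided by a whole number of refreshes per frame; a trivial sketch of the arithmetic:

```python
def stable_rates(refresh_hz, max_refreshes=7):
    """Frame rates where every frame is held for a whole number of refreshes."""
    return [refresh_hz / n for n in range(1, max_refreshes + 1)]

for hz in (60, 120, 240):
    pretty = ", ".join(f"{r:g}" if r == int(r) else f"~{round(r)}"
                       for r in stable_rates(hz))
    print(f"{hz} Hz: {pretty}")

# 60 Hz:  60, 30, 20, 15, 12, 10, ~9
# 120 Hz: 120, 60, 40, 30, 24, 20, ~17
# 240 Hz: 240, 120, 80, 60, 48, 40, ~34
```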
 
Quick monitor question, are 6-bit panels still a thing?
Yes they are. Very much so, although obviously never advertised as such. They use dithering (read up on 6-bit + 2-bit FRC; a sketch below shows the idea). Par for the course for TN panels, but sometimes utilised for other display technologies as well (yes, including some IPS).
How this works out visually depends on the implementation, the use case and, of course, the individual.

I wouldn’t buy such a monitor without having the opportunity to actually see it in operation, preferably for my own purposes, first.
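For illustration, here is a tiny sketch (mine, not the poster's) of what 6-bit + 2-bit FRC does: the panel flickers each pixel between the two nearest 6-bit levels over a 4-frame cycle so the temporal average lands on the intended 8-bit shade.

```python
def frc_frames(value8):
    """6-bit levels shown over 4 frames to approximate an 8-bit value."""
    base, frac = divmod(value8, 4)   # 8-bit code = 6-bit level * 4 + remainder
    high = min(base + 1, 63)         # clamp at the panel's top code
    # Show the higher level in `frac` of the 4 frames, the lower in the rest.
    return [high if i < frac else base for i in range(4)]

shade = 141                          # arbitrary 8-bit target
frames = frc_frames(shade)
print(frames)                        # [36, 35, 35, 35]
print(sum(frames) / len(frames) * 4) # temporal average back in 8-bit: 141.0
```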
 
I had a 6-bit panel years ago (yes, you could see banding, especially on skies).
I had hoped they were a thing of the past.
 
at 120 Hz, surely
Ok, I have a complaint.
What's wrong with you people? Don't you know it's internet law that if you reply to a post containing the word "surely" you have to end your post with "and don't call me Shirley"?
Just like if I post "I never expected the Spanish Inquisition" you have to reply with "Nobody expects the Spanish Inquisition".
 
Wise tip:
Keep HDR off unless the content actually supports it. I had HDR enabled after installing Wolfenstein Youngblood (Game Pass PC) and all it did was crush the blacks and whites.

Turn off HDR and you can play the game as it is meant to be. Graphically, I like it more than Gears 5 even though the latter uses HDR. With the contrast of a VA panel, Bethesda's game looks tremendous anyway.
 
Haven't tried it myself, but this is how HDR is supposed to look when you play Hellblade. The SDR image looks a bit blurred, though.

[GIF: Hellblade HDR vs SDR comparison]
 
Apple did it again: claim that you have the best monitor ever, and the most expensive too, but...

 
There is actually a rather long period of adjustment to a higher-framerate monitor if you are used to lower frame rates. At first it “merely” offers better smoothness of motion, but gradually you will adjust your game play to having better positional and, more importantly, movement-vector information. You will make quicker turns since you are less likely to get momentarily disoriented, you will be able to snap-aim (with a mouse) since the entire system of latency, opponent movement vector and your precision in angular velocity will have improved across the board, and so on.
Once that adjustment has been done and integrated into your motor memory, going down in frame rate will be ... an unhappy experience. Different genres have different needs, obviously, but I really enjoy the sense of immediate control regardless.
Oh man, the day has come. Truer words have never been spoken. I got used to playing Forza Horizon 4 at 110-140 fps on average, and Wolfenstein Youngblood at 165 fps (the max allowed by my monitor).

So yesterday I locked the framerate at 60 fps, and the game seemed choppy; I couldn't play it. I could see transitions in between frames, which is my issue with 30 fps. Not as pronounced, of course, but I don't see myself going back to 60 fps. The 60 fps dream ends here for me. It was good while it lasted, and it was the dream of people who own a console but never realised it. I did.
 
CALIBRATION AND HOW TO USE ICC FILES WITH THE CALIBRATION ALREADY DONE

There are professional sites that review monitors and calibrate them to bring out their full potential. Although individual units of the same model can differ somewhat from one another, they are usually quite uniform.

In the case of one of the models mentioned in this thread, the Dell S3220DGF, I found a website with a calibration ICC file and the calibration settings they used.

Calibration must be done with the monitor in SDR mode (at least in this case), and HDR then activated on top of that baseline.

The page scores the image quality post-calibration (performed by them) with these settings:

Brightness (luminance): 35

Contrast: 70

RGB controls (gain?): R 93, G 94, B 98


With those settings, that monitor model is essentially calibrated. The remaining values on the page show how the monitor measures in terms of color temperature and so on.

[Image: post-calibration measurements for the Dell S3220DGF]


You can download the calibration ICC file for Windows, which the creators saved after calibrating the monitor, and apply it. This is the file:

https://www.rtings.com/images/reviews/monitor/dell/s3220dgf/s3220dgf-rtings-icc-profile.icm

https://www.displayninja.com/how-to-install-an-icc-profile-on-windows-10/ Here is how to apply an ICC profile with calibration data in Windows, which is very easy. It's worth it.
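If you'd rather script the install than click through, an ICC profile on Windows is just a file in the system colour directory that you then associate with the display in Color Management (colorcpl.exe). A minimal sketch of that idea (run with administrator rights; the URL is the RTINGS file above, and the site may require downloading in a browser instead):

```python
import shutil
import urllib.request
from pathlib import Path

# Windows' standard directory for ICC/ICM profiles.
COLOR_DIR = Path(r"C:\Windows\System32\spool\drivers\color")
URL = ("https://www.rtings.com/images/reviews/monitor/dell/"
       "s3220dgf/s3220dgf-rtings-icc-profile.icm")

dest = COLOR_DIR / "s3220dgf-rtings-icc-profile.icm"
with urllib.request.urlopen(URL) as response, open(dest, "wb") as out:
    shutil.copyfileobj(response, out)
print(f"Profile copied to {dest}")

# Then open colorcpl.exe, select the display, tick "Use my settings for
# this device", add the profile and set it as the default.
```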
 
TEST TO CHECK THE ACTUAL BRIGHTNESS IN NITS OF YOUR HDR SCREEN

The test pattern displays a white box that increases in size through 4%, 9%, 25%, 49%, and 100% of the screen. For each box size, the brightness of the box steps through 100, 400, 1000, 2000, and 4000 nits, and each combination of window size and brightness stays on the screen for 5 seconds. Here is how to tell when your display starts failing the test. Displays fail in two ways:

Failure #1: At some point you will notice that, when transitioning to a new brightness, the screen does not get much brighter than last time, or does not change brightness at all. This means the panel has reached its maximum brightness and cannot go further.

Failure #2: At some point (especially at the bigger window sizes) you will see that, when transitioning to a new brightness, the screen momentarily gets brighter and then drops back down within less than a second. This happens if your TV has an Automatic Brightness Limiter (ABL), which restricts the maximum brightness in certain situations, either to save on electricity or to protect the panel.
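As I read that description, the schedule is simply every window size crossed with every brightness step, 5 seconds each; a throwaway sketch of the sequence (the exact ordering is my assumption):

```python
# Reconstruction of the test schedule described above: each window size
# cycles through every brightness level, 5 seconds per combination.
window_sizes = [4, 9, 25, 49, 100]      # percent of screen area
levels = [100, 400, 1000, 2000, 4000]   # nits

t = 0
for size in window_sizes:
    for nits in levels:
        print(f"t={t:3d}s: {size:3d}% window at {nits:4d} nits")
        t += 5
```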


My screen fails the test after 1000 nits.
 
Played Blair Witch a little and I gotta say that with HDR on the game looks TOTALLY different than in SDR mode. I'm not going to say it's the best HDR I've seen in a game, for sure. But the one that makes the biggest difference between HDR on and off? It is. It is like an entirely different game.
 