Display contrast - when does it matter?

Scott_Arm

I've got two calibrated monitors side by side. One is a VA around 2500:1 (measured) and the other is an IPS around 1000:1. The colours look pretty much identical to my eye.

When I bought the VA (Samsung Odyssey G7), I was really interested in the black level and contrast because I was moving from a TN panel that had really grey-looking blacks. Now that I have my work monitor calibrated and set up beside them, I can display the same images side by side, or a black screen, and honestly the black levels look very similar. I can see a difference, but it's small.

Now I know OLEDs with "true blacks" are on a whole different level, but when does any contrast ratio start to really matter? According to rtings.com, > 3000:1 is good and steps of 500 are noticeable. Maybe my eyes are fucked, but the difference between 2500:1 and 1000:1 is very small.
 
To be honest I think contrast ratios are pulled out of the manufacturers' asses. Here are the specs of two monitors I picked at random on overclockers.co.uk:

Monitor 1
Brightness (Typ.): 350 cd/m2
Brightness (HDR, Peak): 400 cd/m2
Contrast Ratio: 1000:1

Monitor 2
Brightness: 400 cd/m2
Dynamic Contrast Ratio: 100,000,000:1
 
Mine are contrast ratios that I measured with a colorimeter in DisplayCAL and HCFR, so they're real. Side by side, I was really expecting the black level on the IPS to look a lot more elevated compared to a VA with 2.5x the contrast.
 
Let's make sure everyone in the thread understands the definition:

Wikipedia said:
The contrast ratio (CR) is a property of a display system, defined as the ratio of the luminance of the brightest shade (white) to that of the darkest shade (black) that the system is capable of producing.

Peak brightness is the numerator, and lowest brightness (peak darkness? heh) is the denominator. Reducing an already small denominator makes a pretty significant difference in the math, even if perceptually the change is minor. Examples:

400 nits measured peak / 0.40 nits measured base = 1000:1 contrast ratio
400 nits measured peak / 0.16 nits measured base = 2500:1 contrast ratio

So a 0.34 nit difference jacks the contrast ratio by 1500... Are you really going to notice a third of a nit of difference in dark scenes? Maybe you will, at least just a tiny bit. Is it really a stark difference though? Probably not.
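If you want to play with the relationship yourself, here's a quick sketch (plain Python; the 400-nit peak is just the figure from the examples above):

```python
# Contrast ratio = white luminance / black luminance, so for a fixed
# white level the implied black level is white / ratio.
def black_level(white_nits: float, contrast_ratio: float) -> float:
    return white_nits / contrast_ratio

for ratio in (1000, 2500):
    print(f"{ratio}:1 at 400 nits white -> black = {black_level(400, ratio):.2f} nits")
# 1000:1 -> 0.40 nits; 2500:1 -> 0.16 nits
```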
 
True. I have the brightness equalized on the two monitors at 120 nits. My room is fairly dark at night.

I guess my expectation was that seeing them side by side would make the difference obvious. I wonder if it's a case of people seeing the numbers, even when they're measuring them, and rationalizing the difference. Even rtings.com treats increments of 500 as a meaningful difference in contrast. I just don't see it ... literally.
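Running the same math at my 120-nit calibration target (a quick sketch; the "stops" line leans on the usual rough assumption that perceived lightness is approximately logarithmic in luminance):

```python
import math

WHITE = 120.0  # both monitors calibrated to 120 cd/m2

blacks = {ratio: WHITE / ratio for ratio in (1000, 2500)}
for ratio, black in blacks.items():
    print(f"{ratio}:1 -> black = {black:.3f} cd/m2")
# 1000:1 -> 0.120 cd/m2; 2500:1 -> 0.048 cd/m2

# Gap between the two blacks, in absolute terms and in "stops":
print(f"delta = {blacks[1000] - blacks[2500]:.3f} cd/m2, "
      f"~{math.log2(blacks[1000] / blacks[2500]):.1f} stops")
```

So the VA's black is about 1.3 stops deeper, but in absolute terms it's a 0.07 cd/m2 gap, which a room that isn't fully dark can easily mask.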
 
Well, RTings isn't a great standard for qualitative analysis in my opinion. It's crowdsourced opinion and sometimes it's not well informed.

Anyway, the math still shows why contrast values could be significantly different and yet not perceptibly different.
 
Here's a question: if OLEDs emit zero light when displaying black, then you can claim the contrast ratio is more than a billion, because zero nits × a billion is still zero.
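Put another way, once measured black is truly zero the ratio is unbounded, so any advertised figure is technically "true". A toy sketch with made-up values:

```python
import math

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    # A true-zero black makes the ratio infinite rather than any
    # particular number, so report it as such instead of dividing by zero.
    return math.inf if black_nits == 0 else white_nits / black_nits

print(contrast_ratio(400, 0.16))  # 2500.0
print(contrast_ratio(400, 0.0))   # inf -- claim a billion if you like
```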

So a 0.34 nit difference jacks the contrast ratio by 1500.
you sure 0.16 + 0.34 = 0.40 ;)
 
Does your IPS screen have FALD? I had a standard IPS display with a 1000:1 contrast ratio, and compared to my Dell monitor, which is VA, blacks looked a bit grey on the IPS screen. Some time ago I also had a 240Hz VA monitor from Samsung that had great contrast (it wasn't HDR compatible, but still) and compared it side by side with a cheap native 4K TN; the difference was staggering. (Alas, I no longer have the photo. Windows 10 or 11, with their blue logo on a black background, are excellent for a quick side-by-side comparison, provided you turn both monitors on at the same time.)

If both displays you mention are HDR compatible, you are going to see a pretty big difference in a video like this, especially if one of the screens is FALD + VA. You'll see a lot more white light leaking around highlights on the panel with the worse contrast.


This is one of the videos I used to test the IQ of my TV when I purchased it. The better the contrast, the more realistic the explosions will look: they won't create a lit area with a halo surrounding them in HDR mode.

A tonemapped capture of how it looks on my TV. It has FALD and so on, but it should look even better on an OLED TV, where the white light from the fireworks doesn't spread too much and doesn't create a glow or halo.

 
@Cyan The IPS is just an office monitor. 60Hz, no HDR. The Samsung VA has local dimming and "HDR" but I turned it off because there are only about 8 dimming zones.
 
Are both matte?

Maybe the matte layer makes the contrast ratio difference less noticeable?

And how about in a pitch black room?
 
They're both matte. Not sure if a pitch black room would make a difference. Maybe. But I'd never use it that way. My room is pretty dark as is.
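For what it's worth, room light reflecting off the matte coating raises both black levels and compresses the measured gap. A rough model (assuming an approximately Lambertian coating and guessed values for reflectance and room light):

```python
import math

WHITE = 120.0  # calibrated white, cd/m2

def effective_contrast(native_ratio: float, ambient_lux: float,
                       reflectance: float = 0.02) -> float:
    # Reflected luminance off an (approximately Lambertian) matte coating.
    reflected = ambient_lux * reflectance / math.pi
    black = WHITE / native_ratio + reflected
    return (WHITE + reflected) / black

for lux in (0, 5, 50):  # pitch black, dim room, lit room (guessed values)
    print(f"{lux:>3} lux: IPS ~{effective_contrast(1000, lux):.0f}:1, "
          f"VA ~{effective_contrast(2500, lux):.0f}:1")
```

By this rough model, even a dim room closes most of the gap; the native ratios only fully show in a truly dark room.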
 
That's more dimming zones than my current VA monitor has, and it still has quite good contrast, tbh. That HDR video could give you an idea: the less glow surrounding the smoke clouds, the better.

Also, if you can place both of them side by side in a dark room with the Windows 10 or Windows 11 official logo up, you can easily spot the contrast differences, if there are any.
 
Not sure if the local dimming works with SDR; I'll have to check it out. There are 8 dimming zones, now that I've looked it up, and since it's edge-lit the zones are vertical bands. So the whole zone gets brighter or darker depending on the average brightness of the zone, I suppose. It will probably look awful in dark scenes, but maybe not.
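Something like this toy model, I'd guess (the brightest-pixel heuristic and the numbers are my assumptions, not Samsung's actual algorithm):

```python
import numpy as np

NATIVE_CONTRAST = 2500   # assumed native panel contrast
WHITE_NITS = 120.0       # calibrated white level
N_ZONES = 8              # edge-lit vertical bands

# A mostly-black frame (normalized 0..1) with one bright object at left.
frame = np.zeros((90, 160))
frame[40:50, 10:20] = 1.0

# Each zone's backlight follows the brightest pixel in its band
# (a simple heuristic; real firmware will be smarter than this).
zones = np.array_split(frame, N_ZONES, axis=1)
backlight = [z.max() for z in zones]

# Light leaking through "closed" LCD pixels elevates black in any lit band:
for i, b in enumerate(backlight):
    print(f"zone {i}: backlight {b:.2f}, "
          f"black ~{b * WHITE_NITS / NATIVE_CONTRAST:.3f} nits")
# Only zone 0 lights up, but its entire vertical band loses its black floor.
```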
 
My monitor doesn't have dimming of any kind, and the contrast is actually good.

Native Contrast: 2,953:1
Contrast With Local Dimming: N/A

The Dell S3220DGF has a very good contrast ratio. Thanks to its VA panel, blacks look deep in a dark room. The contrast ratio can vary between individual units.
 
The difference will be small but still noticeable. Perceived color saturation should be better on the VA panel due to its higher contrast; things such as starfields and low-APL content will have more depth to the image.

What impacts this within LCD tech is light bleed and panel uniformity, so if the IPS panel is more uniform while the VA has light-bleed issues, the perceived delta drops quickly, as you lose all depth in those areas.
 