Console Display Calibration Issues

Jogi

What on earth is the Xbox One doing with RGB range? If I run the HDTV calibration on the X1 with the console set to _limited_ range and the TV set to _full_ range, I get the whole grayscale from 0 to 255: black is black, white is white, nothing washed out at the ends, and I can see the "closed eye" against its gray background just fine (which is not what the Xbox instructions say I should see). But then if I play a DVD the picture is all washed out, and the same goes for games.
If the X1 were sending limited-range RGB, there should be no true "black" at all, because level 16 is the darkest value in that signal. Somehow the X1 shows the test image at full RGB range even in limited mode (not scaled down to limited range) and the TV displays it fine, but as soon as games or videos kick in, their output really is limited in limited mode. The other odd thing is that in full-range mode the test image must be getting scaled, because the gray background isn't visible anymore.

The normal setup is to match the X1 and the TV (limited with limited, full with full), but then you lose that "closed eye" gray background, which surely means part of the grayscale is being cut off / crushed (the gray background becomes pure black). I'm not sure whether this also crushes some grays in games?


And do games output full range or limited? Scaling in either direction can produce uneven color gradients.
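To illustrate the two mismatch cases described above, here is a minimal Python sketch. It is my own illustration of the standard 16-235 / 0-255 behaviour, not anything the Xbox actually runs:

```python
# Sketch of the two RGB range mismatches described above.
# Limited (video) range puts reference black at code 16 and reference white
# at 235; full (PC) range uses 0 and 255.

def shown_on_full_range_display(v: int) -> int:
    """A display set to FULL shows incoming code values as-is."""
    return v

def shown_on_limited_range_display(v: int) -> int:
    """A display set to LIMITED stretches 16-235 out to 0-255,
    clipping anything outside that window."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

# Case 1: console outputs LIMITED, TV is set to FULL.
# Black arrives as 16 and white as 235, and the TV shows them literally,
# so the picture looks washed out (grey blacks, grey whites).
print(shown_on_full_range_display(16), shown_on_full_range_display(235))   # 16 235

# Case 2: console outputs FULL, TV is set to LIMITED.
# Everything at or below 16 collapses to pure black and everything at or
# above 235 to pure white, crushing shadow and highlight detail.
for v in (0, 8, 16, 20, 128, 230, 235, 245, 255):
    print(f"full-range {v:3d} -> displayed as {shown_on_limited_range_display(v):3d}")
```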
 
What games did you test? Some are known to use an image 'enhancer' built into the scaling chip, whereas others don't.
 
All games look bad if the RGB range settings are mismatched, but my point was: does the X1 cut off RGB levels even when using the right settings? And do games use limited or full range RGB internally (i.e. is there even any point in using the full-range settings)?

I have noticed the sharpening from sub-full-HD scaling, but it's barely noticeable even on my 64" plasma. I was expecting it to be much worse.
 

It's not just sharpening - there's a belief that the image scaler is somehow responsible for messing up the black levels.

There's a GAF thread on the topic (unfortunately most of the images are broken :( ).
http://neogaf.com/forum/showthread.php?t=722069
 
I matched my tv and x1 at limited range. I was able to set the black level using the calibration, but white level was way off. I could turn my contrast up to 100 and I still couldn't get the sun symbol to disappear. So I just left my tv at the calibration settings I was already using.
 

I'm looking at that thread, and it's not particularly "scientific." The guy's capture card could be garbage or bugged. We've seen the latest captures for Battlefield and COD from sources like DF, and there don't appear to be any issues with black crush on the retail copies (though there were pre-release). Not sure what's going on. I don't have Dead Rising, so I can't speak to that. I'll have to look at Killer Instinct when I get home. It looked good to me. I'll check again, and maybe play around with full RGB to see how that looks.
 
Works fine for me. I knew my TV was 36-bit (69 billion colours), so I set it to that colour depth and the calibration worked like a charm every single time.

Take into account that the calibration utility of the Xbox One "recommends" choosing Standard RGB to match the usual video settings.

My only problem with the calibration utility is that my TV isn't full-RGB compatible: if I choose Full RGB I can't see the sun or the closed eye no matter what, but with Standard RGB calibration is a breeze.

Make sure you know the colour depth of your TV, just in case. The picture looks gorgeous to me (I would love for the savvy people here to be able to see it and judge). I calibrated all the picture style settings (each with its own personality), but then most Philips TVs from 2013 have outstanding picture quality. :smile2:
 
How do the other colour depths affect black and white level? I didn't think they would.

I can't find much point in using anything above 24 bit. Could be wrong. Not sure how to find out what my TV supports anyway.
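For reference, those bit depths work out to the following colour counts. This is plain arithmetic (my own illustration), and the extra depth only adds finer steps between black and white, so it shouldn't move the black or white level itself:

```python
# Total displayable colours for common HDMI colour depths.
# 24-bit = 8 bits per channel, 30-bit = 10, 36-bit = 12.
for bits_per_pixel in (24, 30, 36, 48):
    bits_per_channel = bits_per_pixel // 3
    colours = 2 ** bits_per_pixel  # same as (2 ** bits_per_channel) ** 3
    print(f"{bits_per_pixel}-bit ({bits_per_channel} bits/channel): {colours:,} colours")

# 24-bit ->         16,777,216  (~16.8 million)
# 30-bit ->      1,073,741,824  (~1.07 billion)
# 36-bit ->     68,719,476,736  (~68.7 billion, the "69 billion" figure above)
# 48-bit -> 281,474,976,710,656
```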
 
Perhaps it has to do with the contrast, who knows...

I found the colour depth of my TV (4500 to 5000 series) in this image while I was looking for a new Philips TV.

[Image: 3bafaba713838a91b2a09f3e09a2fb91_XL.jpg — Philips TV spec table listing colour depth by series]


I don't know whether there is a handshake over HDMI between the console and the TV when selecting colour depth, or whether the console detects when the wrong colour depth has been selected.

When it comes to audio it certainly finds out when your receiver doesn't support a certain audio format: I can't select 5.1 Uncompressed, 7.1 Uncompressed or DTS. It might have something to do with the console reading the EDID of my TV. And yes, the sound is rather average; it's a stereo TV and that's it.

Higher specced TVs from Philips have a subwoofer though. Not much, but it helps.
 
I can switch to 36 bit. It seems to look normal. Nothing looks wrong. Just not sure there's much point in bothering.

I did play with the calibration tool again. I put my xbox one on a different hdmi input on my tv so I wouldn't mess up the settings I had. This time I followed the instructions carefully. I still calibrated the tv to limited range, and I still have an issue with contrast where I cannot get the sun symbol to disappear. The thing is, if I look at the white gradients bar, I get crushing whites well before the sun disappears. So I scaled down the contrast a couple points from max to see each gradient again. In the bar for black gradients, the last two gradients look pretty much the same, so I'm crushing blacks a little bit I think. The last test for the colour/tint adjustment was way off on my old settings. Overall it looks very good now. Not sure how well it would hold up against a proper calibration test, but it looks good to my eyes. I tried out Killer Instinct and I do not see black crush like the screenshots in that gaf thread. Battlefield looks pretty nice.

Tomorrow I'll mess around with full range. My tv does have an option for full range. I do watch a fair amount of movies, so I don't want them to look shitty. If I set my xbox to full range, and my tv to full range, will movies encoded at limited range look washed out? I really don't want grey blacks and grey whites.
 
Not sure what the Xbox One calibration image looks like, but it sounds like it's a standard white clipping test pattern. The purpose is to tell the viewer when the TV starts to clip white detail. If the sun doesn't disappear, that's fine... that just means you're not clipping white (which is good).

The issue with clipping patterns is people assume that there's a right and wrong Contrast setting, and because many TVs don't clip white detail even if you set the Contrast to 100, people think something is wrong. As long as you're not clipping white, you can set it to whatever you want.

Setting the Contrast is a bit different on Plasmas vs LCDs because plasmas don't have a backlight control whereas LCDs do, and both the backlight and contrast controls have similar purposes... to raise or lower the light output of your display (Contrast technically sets the white level, but that is essentially the peak brightness of your display). If you have a Plasma, set the Contrast as high as you can to avoid clipping, then set it to a comfortable level avoiding eye fatigue. Just because you can set it to 100 doesn't mean that your eyes can handle that brightness. On an LCD, set the Contrast as high as you can to avoid clipping, then use the backlight to set your desired light output, again avoiding eye fatigue.
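For the curious, a white-clipping pattern of the sort described above can be mocked up in a few lines. This is just a rough sketch with Pillow using full-range values, not the actual Xbox One calibration image:

```python
# Sketch of a simple white-clipping test pattern (full-range values assumed):
# vertical bars stepping from 230 up to 254 on a 255 background. If the
# brightest bars are indistinguishable from the background, white detail
# is being clipped; lower Contrast until they reappear.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 1920, 1080
img = Image.new("L", (WIDTH, HEIGHT), color=255)   # peak-white background
draw = ImageDraw.Draw(img)

levels = list(range(230, 255, 2))                  # 230, 232, ..., 254
bar_w = WIDTH // len(levels)
for i, level in enumerate(levels):
    x0 = i * bar_w
    draw.rectangle([x0, HEIGHT // 4, x0 + bar_w - 20, 3 * HEIGHT // 4], fill=level)

img.save("white_clip_pattern.png")
```

A black-clipping pattern would be the mirror image: near-black bars (e.g. 1-25) on a 0 background, used to set Brightness so that only the bars above reference black remain visible.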
 
Very interesting, thanks for sharing. My Video Contrast sweet spot, where the sun is barely visible AND the vertical stripes between the marks (I am talking about the Xbox One built-in calibration tool) are fully discernible, is at 95 (at 98 the sun just disappears).

For Brightness the exact point where the closed eye becomes invisible is at 54.

Jogi, my advice would be to follow the calibration tool instructions carefully.

Additionally, the guys who created one of the most renowned calibration tools in the world (Spears and Munsil) have a webpage with some essential calibration tips.

Here is an extract (the full article can be found in the link below) with some particularly interesting points.

http://www.spearsandmunsil.com/?portfolio=getting-started-with-the-high-definition-benchmark-2

Preparation

Perform the calibration under the same lighting conditions you generally use to watch quality material like movies. In general, video looks best when the room is as dark as possible, but it’s most important to duplicate the real lighting conditions you will be watching under. If there are windows in the room that let in sunlight and you tend to watch movies at night, then the calibration should also be performed at night.

When you’re ready to perform the calibration, first turn on your display and Blu-ray player and let the system warm up for 15 minutes or so, preferably playing some real material.

Video mode settings

Once the system has warmed up, it’s time to start adjusting. Most modern displays have an overall “Picture Mode” setting, and several advanced picture settings. It’s important to get these set correctly first.

Picture Mode

There are no standards for what these modes do, and the names vary considerably. Generally if there is a “Movie” or “Cinema” setting, that is the one to use. On some displays, the “Movie” or “Cinema” mode is preset and locks out all the other picture controls. In that case, or if there is no “Movie” or “Cinema” mode, try using “Custom,” “Normal” or “Standard.”

Avoid anything that sounds like it makes the picture extra-bold, like “Vivid” or “Dynamic”, or modes that sound like they’re optimized for a single purpose like “Sports” or “Game”.


Advanced Video Modes

For the most part, we recommend turning special picture “enhancement” modes off. They are usually optimized for low-quality video and bright environments, and actually will harm the picture quality of high-quality video like Blu-ray Discs when watching in a low-light environment.

Set these to Off or 0 (write down the original setting first):

• Noise Reduction/Noise Filter
• Black Tone
• Dynamic Contrast
• Shadow Detail
• Flesh Tone
• Edge Enhancement
• Black Corrector
• Contrast Enhancer
• Live Color
• Smart Dimming
• Color Enhancement
• Ambient Light Sensor
• Motion Plus/Cinema Motion/Smooth Motion/Real Cinema
• Auto Iris


If you encounter a mode with a similar name to one of the above settings, or a mode that is described in the owner’s manual as a video enhancement or improvement, it’s best to turn it off.

Special cases:

Color Temperature/Color Tone. Setting this perfectly requires test equipment, but usually if there is a “Cinema” or “Neutral” option, that is often close to correct and is a good choice. If that is not offered, “Computer” or “Normal” are other good choices. “Cool” is not generally a good choice, as commonly it sacrifices color accuracy to get higher light output. It is worthwhile putting up the 11-Step Crossed Gray Scale pattern and trying the different color temperature settings. Any setting that makes any of the gray steps seem to have a colored tint is probably a bad choice. It’s not uncommon to have more than one setting that looks essentially white. Unless you have the test equipment necessary to check color temperature, just select one that looks as neutral as possible.

Backlight. If this setting is offered, a good starting point is to turn it to the middle value. Later on, when you are performing the rough Contrast adjustment (further down in this guide), if the screen seems uncomfortably bright when viewing the Contrast pattern on the Spears and Munsil HD Benchmark, 2nd Edition, turn the backlight down until the light output is comfortable to view. Then check it again when you are performing the Brightness adjustment (further down in this guide). If the screen seems notably dim while viewing the Brightness pattern, such that the right-hand bars on the Brightness pattern are very hard to see once the Brightness control is set correctly, then turn the backlight up.

Black Level or HDMI Black Level. This should be set to “Low”, “Video”, or “Standard”. It should not be set to “Normal”, “PC”, or “Extended”.
______________________________________________________________
Note how they recommend setting the TV to Standard RGB, which is the standard for video.

My TV doesn't even support Full RGB (the option is nowhere to be seen) despite being a 2013 set, and it shines at Standard RGB, so I can't complain.
 
[Image: Y03NFDP.jpg — grayscale step chart]

In the picture above you can see that step 20, which is close to step 16 (absolute black on Limited RGB), should be black using Standard RGB, but it is gray instead. :rolleyes:

Sorry, but I don't trust someone who can't tell the difference between black and yellow.

PS: I see your point; you think limited is better because your TV can't display all 256 shades, so you need to set it to only use shades 16-235.

PS: @Globalisateur, are you sure about that Lego game running at 1920x1200 on the PS4? That would be a strange decision, as I imagine very few TVs support that resolution.
 
If you are going to use a TV to play, Limited range is undoubtedly the best choice. If you are going to use a computer monitor it is a matter of preference. Still.. full Range sucks quite a bit.
Limited range is undoubtedly not the best choice in all situations. If your display supports full range, TV or monitor, full is the better choice in terms of PQ. In most cases, if everything in the chain is set appropriately and matches, there is little to no difference between RGB Full or Limited.

Microsoft recommend on their Xbox.com site to use Standard Range, ‘cos for a TV it is best, and you will never have problems with that range.

https://support.xbox.com/en-US//xbox-one/system/adjust-display-settings
They recommend Standard range for compatibility purposes, not for optimal PQ. It is true that the video standard is (YCbCr) 16-235, but that has little to nothing to do with games and what these consoles output.
Limited range works on all televisions, and basically almost all the video material you can see is created with Standard RGB in mind, usually the original format in which that video material was created. Moreover, many TVs made in 2013 and 2014 don't even support full range.
Keyword is video material, not videogames.

And your statement is wrong: many TVs made in the last 3-5 years do support full range. Most of the major flat-panel makers do: Samsung, LG, Panasonic, Toshiba and Sony. Four of those brands are in the top 5 in flat-panel sales.

Limited range (or Standard RGB) and Full range are two ways of defining the values of black and white. Full range spans 0 to 255; that is, counting 0 as the first step, there are 256 steps from black to white. :)

In contrast, Standard RGB (or limited RGB) has 36 fewer steps than full RGB: absolute black to absolute white runs from value 16 to value 235, i.e. 220 steps.

In other words, with Standard RGB the value of black is 16, which is the first step, and absolute white sits at step 235.
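A quick bit of arithmetic (my own illustration) on those step counts:

```python
# Step counts for the two ranges (counting both endpoints).
full_steps = 255 - 0 + 1        # 256
limited_steps = 235 - 16 + 1    # 220
print(full_steps, limited_steps, full_steps - limited_steps)  # 256 220 36
```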

So why choose limited range on a TV, and what problems may arise if you don't? First, basically all the movies and videos you see on DVD or Blu-ray are encoded in YCbCr and limited range...

Furthermore, the problem with choosing Full range on a TV that does not accept full RGB is that values which should look almost black end up gray instead (e.g. values 19-20-21 are almost black using Limited range, where black starts at 16, but those same steps come out grey if you use Full range).

This is an example of a full range image, represented step by step:

[Image: blacktest.png — full-range grayscale step chart]


In the picture above you can see that step 20, which is close to step 16 (absolute black on Limited RGB), should be black using Standard RGB, but it is gray instead. :rolleyes:

BUT if you display this image on a Limited range TV (steps 16 to 235) you will hardly see steps 15 and under. If that happens, no worries, it's not your fault: you are viewing a Full range picture on a Standard RGB / Limited range display.

If in doubt always use Limited range and the image will look good to everyone regardless of the TV.

That’s why Full Range sucks so much, despite DF treating it as if it was the Holy Grail, which is not the way to go.
Yes, RGB full range sucks, if and only if your display doesn't support it. Otherwise it is the optimal setting for PCs/console games.

Blu-Ray players/video players are a different story since videos use a different standard. Fortunately for the PS3/PS4 and I'm sure the XB1, you can choose to output a different colorspace for Blu-Ray, where YCbCr is in most cases optimal. On the PS3/PS4, games always output RGB, therefore Full range is optimal.

But like I said above, there's generally little to no difference between full/limited if set properly so it's not the end of the world if your display only supports limited range.
 
Full range and limited range don't look very different... as long as the full display chain knows to use the right setting. A game outputting limited to the console hardware and the TV accepting limited will look good, just as when everything is set to full. But if the console puts out limited and the TV expects full, or vice versa, it'll look bad, because you'll lose information. You can scale the information (so limited becomes full and vice versa), though that will lead to a loss of precision.
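A quick sketch of that precision loss (again my own illustration using the standard 0-255 ↔ 16-235 scaling, not anything console-specific): squeezing 256 full-range values into 220 limited-range codes means some neighbouring values collapse onto the same code, and they can't be told apart again after expanding back, which is where extra banding can come from.

```python
# Round-trip a full-range ramp through limited range and count the damage.

def full_to_limited(v: int) -> int:
    return round(v * 219 / 255) + 16                        # 0-255 -> 16-235

def limited_to_full(v: int) -> int:
    return max(0, min(255, round((v - 16) * 255 / 219)))    # 16-235 -> 0-255

ramp = list(range(256))
limited = [full_to_limited(v) for v in ramp]
round_trip = [limited_to_full(v) for v in limited]

print("distinct limited codes:", len(set(limited)))          # 220 of 256
collisions = sum(1 for a, b in zip(ramp, round_trip) if a != b)
print("values changed by the round trip:", collisions)
```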
 
Doesn't limited range use less information to represent the colour range, so that it would always lead to a loss of precision (e.g. more noticeable banding)?
 
Depends on the output device. I don't really notice much (if any) difference with the PS3/PS4 when I output limited or full range on my plasma or other flat-panel TVs. I do use full range since my displays support it, and it doesn't affect other devices sharing the HDMI port when I do. But yes, compressing the range can cause banding with certain devices, which is why RGB full is optimal if your display supports it.
 
Yes. Full range looks better if both the game and the TV set support it.
I'm assuming that devs are still thinking in terms of TV inputs - 720p or 1080p.
720p-native TV sets do not exist (except for some 720p CRT HDTVs that were sold in the US before LCD/plasma became popular). 720p will always be upscaled; nobody will see it pixel-perfect without scaling. 1366x768 is the most common "HD-ready" TV resolution, but there are many other "HD-ready" resolutions as well. It's easiest just to assume that anything below 1080p will be scaled up or down (unless you are lucky).

800p looks better than 720p (it has 22% more pixels) when scaled to 1080p, and 900p looks even better than that. Also, when downscaling to 720p/768p (or any other "HD-ready" resolution), a higher rendering resolution always looks better (more supersampling). So if you can afford to increase the resolution by any number of pixels, that's always a good choice, no matter what TV sets your customers are using.
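To put numbers on those comparisons, here is a small sketch of pixel counts for the resolutions mentioned in this thread (the exact widths for 792p and 800p are my assumption; not every game uses the same ones):

```python
# Pixel counts for some of the resolutions mentioned above, relative to 720p.
resolutions = {
    "720p  (1280x720)":  1280 * 720,
    "792p  (1408x792)":  1408 * 792,    # Titanfall's reported resolution
    "800p  (1408x800)":  1408 * 800,    # assumed width, roughly 16:9
    "900p  (1600x900)":  1600 * 900,    # Ryse's reported resolution
    "1080p (1920x1080)": 1920 * 1080,
}
base = resolutions["720p  (1280x720)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:>9,} pixels ({pixels / base - 1:+.0%} vs 720p)")
```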
Ryse's 900p was 1080p cut down a bit. Titanfall's 792p is 720p plus a bit. I don't think any dev would look at 900p and then think a drop to 792p is okay because the loss in resolution is marginal; they'll be comparing it to the 1080p benchmark.
This assumption doesn't always hold. Many developers are conservative with their early resolution settings, because they need to prioritize the game development. Running an unoptimized graphics engine at 1080p at 15-20 fps isn't what your gameplay programmers want when they are tuning the game mechanics. For development, it's much better to run the game at a lower resolution (720p) at the target frame rate and increase the resolution when the code is optimized enough to reach a higher one. That's how we do it. Going from 720p to 900p (or even 1080p) at the end of the project is a perfectly valid choice. Scaling down to 900p from 1080p is not likely if you are developing mainly on consoles. However, if PC is your main development platform and your console port doesn't perform as well as expected, I could see this being a popular choice as well. It's hard to draw any conclusions without insider knowledge (in-depth interviews with the developers themselves).
 