In game brightness/calibration screens

SeeNoWeevil

Newcomer
I really can't get my head round this: why does every game insist on showing me a (most of the time awful) calibration image and brightness/gamma slider? Why do consoles not just stick a calibration image in the OS settings and prompt users to use it to adjust their TV? Adjusting video levels in software is stupid; get the screen right once and that's the end of it. Or am I wrongly assuming developers are sticking to the same standard for video output?

Another question: can the video levels setting (i.e. Full/Limited) give correct results for one game and incorrect results for another?
 
I really can't get my head round this: why does every game insist on showing me a (most of the time awful) calibration image and brightness/gamma slider? Why do consoles not just stick a calibration image in the OS settings and prompt users to use it to adjust their TV? Adjusting video levels in software is stupid; get the screen right once and that's the end of it. Or am I wrongly assuming developers are sticking to the same standard for video output?
There is more than one setting that I use for the console; you might use different settings for game/video, 2D/3D and day/night combinations. It's not done with one setting.
Having some standard OS-level calibration would be a rather good idea, though.
Another question: can the video levels setting (i.e. Full/Limited) give correct results for one game and incorrect results for another?
Nope; if it does, you are compensating for different (broken?) in-game settings. You need to set the console and the TV to the same levels first (ideally Full if both support it) and then play with the other settings.
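
For reference, the Full/Limited difference is just a remapping of the 8-bit code range: Limited (video) levels put black at code 16 and white at 235, while Full (PC) levels use 0-255. A minimal sketch of the remapping in plain Python (illustrative only, not any console's actual pipeline; the function names are made up):

```python
def full_to_limited(v):
    """Map a full-range 8-bit value (0-255) onto limited/video range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Map a limited-range value back to full range, clipping below black
    (code 16) and above white (code 235)."""
    return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

# Mismatches are what break the picture:
#  - console sends Limited, TV expects Full  -> black shows as dark grey (washed out)
#  - console sends Full, TV expects Limited  -> everything below code 16 is clipped
#    (crushed shadows)
# That is why the console and TV must agree on levels before any in-game
# slider is touched.
```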
 
What Npl said.

My TV is calibrated to produce a good, balanced image primarily for TV and movies sourced from a Mac mini running XBMC with a capture card, a PlayStation 3 and a PlayStation 4, which between them output recorded TV, movies ripped from Blu-ray, movies direct from Blu-ray/DVD, TV direct from Blu-ray/DVD and TV received terrestrially.

On top of this are games, where the baseline brightness/contrast is all over the place because of differences in the art assets and game engines. Because not all displays - TVs (CRT, LCD, plasma, OLED), monitors, DLP and traditional projectors - are calibrated to a common reference standard, and because different people have different preferences for colour, contrast, brightness and saturation, the only real way to ensure the user will be able to discern important elements is to let the user calibrate each game.
 
Xbox One already provides OS-level display calibration steps to help the user adjust their display.
 
I think the issue is that every game comes with its own settings. There ought to be a system-wide setting that every game uses instead of one per game. That said, I find some games benefit from having their gamma ramped up more than others, especially in different lighting conditions. I'd much rather change the game setting than jiggle about with the TV setup, and anyone who's done a decent calibration of their TV will definitely be opposed to having to change their TV settings to get the picture they want from a game. I don't see anything wrong with game options myself, and would recommend a system-level reference point that games default to but which can be changed per game as the user wants.
 
I'd much rather change the game setting than jiggle about with the TV setup, and anyone who's done a decent calibration of their TV will definitely be opposed to having to change their TV settings to get the picture they want from a game.

I completely agree. Once you've had your TV professionally calibrated for your typical activities (multiple settings saved: watching movies during the day, watching at night in a dark room, and general gaming), you will never mess with your TV settings just for a single game.

Messing with TV settings is the wrong way to go about it. This typically means you've spent more on your game console than you did on your TV set. Do yourself a big favor and pick up a great TV set and have it professionally calibrated for your environment.

I prefer to have an overall OS-level setting, but still want individual game settings. Some games are mastered too dark or sometimes too bright, so slightly bumping the gamma for that one game is perfect.
 
Every game is rendered differently; I don't think there's any way to create a single "universal" gamma/brightness adjustment that will work 100% perfectly with everything.

I don't see the big deal, though... The total amount of time I've spent calibrating brightness in games over the last year or so probably amounts to about thirty seconds.
 
With gamma-correct renderers and workflows, a standard calibration should do the trick and games wouldn't need to be adjusted. This should be the case for more and more games.
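
"Gamma-correct" here just means the renderer does its lighting and blending maths in linear light and only applies the sRGB transfer curve when writing to the display. A minimal sketch of the standard sRGB encode/decode pair (plain Python; the function names are mine):

```python
def srgb_encode(linear):
    """sRGB OETF: linear light (0..1) -> display code value (0..1)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(code):
    """Inverse transform: display code value (0..1) -> linear light (0..1)."""
    if code <= 0.04045:
        return code / 12.92
    return ((code + 0.055) / 1.055) ** 2.4

# A gamma-correct pipeline decodes textures to linear, does all lighting and
# blending there, and encodes once at the very end. If every game did this,
# one display calibration would serve them all.
```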
 
With gamma-correct renderers and workflows...
That's the problem: not everyone is using the same settings and renderers. Nor will they, I would imagine. Sure, it'd be nice, but I'm not going to hold my breath.

The setting makes a huge difference in Watch_Dogs. The game is, by default, at a much higher setting than it needs to be. Lowering the "brightness" to near-zero actually yields some pretty decent lighting, especially in the day. Gets rid of that "flatness" that everyone (including myself) was complaining about.
 
I have yet to come across a game where the default setting doesn't look perfectly fine on my TV, so it really doesn't bother me.
 
^Agreed (PS3/PS4).

I have calibrated my display to the HD video standard (Rec. 709), which is very similar to sRGB. I have never had to adjust the in-game brightness because the game looked off.
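
For anyone curious why Rec. 709 and sRGB get treated as near-interchangeable: they share the same primaries and white point, and differ mainly in the nominal transfer curve. A rough comparison of the two encoding functions (illustrative Python; these are the camera-side curves, not what a BT.1886 display actually does):

```python
def rec709_oetf(linear):
    """Rec. 709 OETF: scene-linear value (0..1) -> video code value (0..1)."""
    if linear < 0.018:
        return 4.5 * linear
    return 1.099 * linear ** 0.45 - 0.099

def srgb_oetf(linear):
    """sRGB encoding, for comparison."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

# Similar shape, same primaries and white point, but not identical curves --
# close enough that a Rec. 709 calibration usually looks right for sRGB
# game content too.
for lin in (0.01, 0.1, 0.5, 0.9):
    print(lin, round(rec709_oetf(lin), 3), round(srgb_oetf(lin), 3))
```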
 
Nothing else works this way; my DVD player doesn't pop up a slider at the beginning of every disc prompting me to set brightness in software. If your screen is calibrated wrong, you'll be correcting every game. A simple OS-level calibration image (I don't mean a universal slider), with some instructions on how to use it to set your screen's brightness control, would do the job. That way you'd set it up once, never have to touch anything again, and developers wouldn't need to include these calibration images.


^Agreed (PS3/PS4).
I have calibrated my display to the HD video standard (Rec. 709), which is very similar to sRGB. I have never had to adjust the in-game brightness because the game looked off.
You're right, it does seem developers are putting out consistent levels (it's pretty awful if they aren't). I also calibrated to Rec. 709 and haven't had to adjust a slider.


Messing with TV settings is the wrong way to go about it. This typically means you've spent more on your game console than you did on your TV set.
TV settings are set per input; you wouldn't mess up the levels for other devices.

The setting makes a huge difference in Watch_Dogs. The game is, by default, at a much higher setting than it needs to be. Lowering the "brightness" to near-zero actually yields some pretty decent lighting, especially in the day. Gets rid of that "flatness" that everyone (including myself) was complaining about.
I don't see the same thing. Dropping it was definitely causing blacks to be crushed at night. How did you set the brightness control on your screen? Do you still see the logo in WD when you drop the setting right down?
 
Nothing else works this way; my DVD player doesn't pop up a slider at the beginning of every disc prompting me to set brightness in software. If your screen is calibrated wrong, you'll be correcting every game.
Some films are stupidly dark and could do with brightening. Bear in mind that the artist's vision might not match the subjective preferences of the viewer. What the artist thinks is dark and moody on his calibrated display, the user might consider hard to see on his (possibly uncalibrated) display. And of course there are different viewing conditions too. On a bright day, the picture of a movie could appear too dark. You've no option but to mess about with screen settings to compensate, depending on the curtain situation.

There's no problem with settings and options. You also only have to set it once per game, and against the many hours of gameplay you'll have, a few seconds of adjusting a gamma slider doesn't seem a terrible inconvenience. And I expect the default setting to be normalised for calibrated displays anyway, so if your TV is calibrated, you should be getting the artist's vision from the default setting without having to change a thing. I tend to find the suggested gamma level, where one symbol is not visible and the other barely visible, produces a ghastly picture, so I just tweak it till the game looks good (as close to what I see in real life with my own eyes), just as I do with video (the THX setup on the Disney BRDs also produced a completely useless image, which I promptly ignored before going back to my own settings).
 
I feel that games are not made to take advantage of the vividness of colour or the depth of black (the contrast, in short) that modern TVs/panels can produce.
I can often set the in-game brightness to max and yet that's not enough to "barely see the symbol on the left/right", simply because my TV can produce very deep blacks; on an old TV that is only capable of producing grey blacks, I don't have to alter the brightness much.
 
I feel that games are not made to take advantage of the vividness of colour or the depth of black (the contrast, in short) that modern TVs/panels can produce.
That's an interesting way of looking at it. I've often wondered if modern video game visual design was very strongly influenced by the horribleness of modern TVs, hence all the blown-out lighting effects to emphasize contrast. CRTs have excellent static contrast, which I suspect is why so many sixth-gen and older games look rich and deep on their native displays, but washed-out and flat on modern LCD panels.

But I guess we are slowly making our way back to using half-decent display technology.

However...

I can often set the in-game brightness to max and yet that's not enough to "barely see the symbol on the left/right", simply because my TV can produce very deep blacks; on an old TV that is only capable of producing grey blacks, I don't have to alter the brightness much.
This sounds like something fishy is going on. Being able to produce deep blacks shouldn't murder contrast in the dark areas of your image. Bringing the black level up to grey doesn't make dark details easier to observe; if anything, it forces things to be crushed closer together, since you have less total range.
 
I feel that games are not made to take advantage of the vividness of colour or the depth of black (the contrast, in short) that modern TVs/panels can produce.
I can often set the in-game brightness to max and yet that's not enough to "barely see the symbol on the left/right", simply because my TV can produce very deep blacks; on an old TV that is only capable of producing grey blacks, I don't have to alter the brightness much.

Sounds like you're crushing black for some reason.
 
I don't have crushed blacks; it's simply that the black level of my "newer" TV is 0.009 cd/m², as opposed to 0.04 cd/m² on the "older" TV.
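
For a sense of scale, assuming a peak white of around 100 cd/m² (a common SDR calibration target; the post doesn't state one), those black levels work out to roughly:

```python
peak_white = 100.0  # cd/m^2, assumed SDR target -- not stated in the post

for label, black in (("newer TV", 0.009), ("older TV", 0.04)):
    print(f"{label}: ~{peak_white / black:,.0f}:1 contrast")

# newer TV: ~11,111:1 contrast
# older TV: ~2,500:1 contrast
# The deeper black stretches the same just-above-black codes over a darker,
# wider luminance range, which may be why near-black detail is harder to
# spot on the newer set even though nothing is being clipped.
```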
 
I don't have crushed blacks; it's simply that the black level of my "newer" TV is 0.009 cd/m², as opposed to 0.04 cd/m² on the "older" TV.

Those calibration screens are simply a black background with a just-above-black section, like a logo. If you can't see the logo then by definition you are crushing blacks, i.e. black and just-above-black are indistinguishable. Leave the software slider at its default and raise the brightness control on your TV until the logo is just visible.
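
A minimal sketch of that kind of test pattern, using Pillow (the exact pixel values and sizes here are illustrative, not any console's or game's actual pattern):

```python
from PIL import Image, ImageDraw

def near_black_pattern(width=1920, height=1080, background=0, patch=4):
    """Black background with a just-above-black rectangle in the centre.

    If the rectangle is invisible, black and just-above-black are being
    crushed together; raise the TV's brightness control until it is only
    just visible, then stop.
    """
    img = Image.new("L", (width, height), background)  # 8-bit greyscale, full black
    draw = ImageDraw.Draw(img)
    w, h = width // 4, height // 4
    draw.rectangle(
        [(width - w) // 2, (height - h) // 2, (width + w) // 2, (height + h) // 2],
        fill=patch,  # just above black, e.g. code 4 of 255
    )
    return img

near_black_pattern().save("near_black_pattern.png")
```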

Some films are stupidly dark and could do with brightening. Bear in mind that the artist's vision might not match the subjective preferences of the viewer. What the artist thinks is dark and moody on his calibrated display, the user might consider hard to see on his (possibly uncalibrated) display. And of course there are different viewing conditions too. On a bright day, the picture of a movie could appear too dark. You've no option but to mess about with screen settings to compensate, depending on the curtain situation.

There's no problem with settings and options. You also only have to set it once per game, and against the many hours of gameplay you'll have, a few seconds of adjusting a gamma slider doesn't seem a terrible inconvenience. And I expect the default setting to be normalised for calibrated displays anyway, so if your TV is calibrated, you should be getting the artist's vision from the default setting without having to change a thing. I tend to find the suggested gamma level, where one symbol is not visible and the other barely visible, produces a ghastly picture, so I just tweak it till the game looks good (as close to what I see in real life with my own eyes), just as I do with video (the THX setup on the Disney BRDs also produced a completely useless image, which I promptly ignored before going back to my own settings).
I don't mind the option to tweak brightness remaining for players (filthy deviants who want to stomp all over the artist's original vision ;)). The images developers put in their games are purely there to calibrate a screen, e.g. the typical black background and logo. If they were meant to tweak the overall presentation to the user's preference, a gameplay screen would be much more useful. I guess what irks me is that developers put so much effort into making aesthetically pleasing games, and Sony haven't even provided the absolute basic functionality to ensure what the user sees is somewhat similar to what the developer intended. If they do want to ensure this, they all have to implement their own calibration page in-game.

When I first got my PS4, none of the games I owned had any calibration screen whatsoever. The only option I had was to burn a calibration DVD and use that. Getting users to correctly set the brightness control on their TV with the help of a test image is possibly the cheapest visual upgrade Sony could've developed!
 
I guess what irks me is that developers put so much effort into making aesthetically pleasing games, and Sony haven't even provided the absolute basic functionality to ensure what the user sees is somewhat similar to what the developer intended.
That's true of audio as well. At the end of the day, the creators have no control over the experience of the users. They can't ensure, or trust, that users have a decently calibrated screen (and plenty have the wrong colours, ridiculous sharpness and so on when they don't set up their TV even remotely), nor that the end user's audio is equalised to match the authoring environment. The devs should just aim for a decent look across devices. If they want exactly that certain experience, they should be working on installation art. ;)

Heck, even movies can't be sure to offer exactly the right experience, because a cinema projection is going to provide a very different picture to a high-end OLED screen. Should the artists be aiming for the best quality on a projector with crappy blacks, or for OLED and mess up the cinema experience?

As long as the end user is happy, the developers' job is done.
 
Those calibration screens are simply a black background with a just-above-black section, like a logo. If you can't see the logo then by definition you are crushing blacks, i.e. black and just-above-black are indistinguishable. Leave the software slider at its default and raise the brightness control on your TV until the logo is just visible.

Why should I change the brightness settings on my already calibrated TV?
Also, I can see the "logo/symbol" on the in-game calibration screens of some games, but not in every game.
 