*spin-off* Calibration

It's a good question that can only be answered by a decent examination of the respective day/night cycles of the game. Syncing TODs in AC2 is fairly easy - but not in that clip which is preceded by a fair bit of gameplay.

I really wanted to say that I appreciate your recent efforts in doing comparisons using calibrated contrast/lighting settings. Wish everyone did that (looking at you, Lens of Truth ;) ).
 
To be fair, for the most part I don't calibrate and go for a WYSIWYG approach just like LoT, but in the case of MW2 it simply had to be done: I can't imagine that the default settings were representative of the framebuffer, to the point where people were going to get diminished IQ on their own screens if it wasn't pointed out.

I *think* I've figured out a way to ensure framebuffer/HDMI match-up on all titles going forward but really need some time to get the workflow right.
 
Really, that's the developer's job, not sites like Eurogamer and Lens of Truth.

No, it's not. These sites should a) pick the right RGB output settings for their capture device (e.g. Full RGB / Full Expanded, or whatever the third option is called on the 360) and b) compensate for the differences in the default brightness and contrast settings coming out of the 360/PS3, because they are different.

Think about it. If you hook up a console to your television, you calibrate your TV's settings to the output of your console. If you then hooked up both at the same time and compared them split-screen, you'd see them the way they should be seen when comparing two games.
 
If you are curious about what settings LoT uses for their captures you might want to take a look at their F.A.Q. page:

Lens of Truth: F.A.Q.
 
No, it's not. These sites should a) pick the right RGB output settings for their capture device (e.g. Full RGB / Full Expanded, or whatever the third option is called on the 360) and b) compensate for the differences in the default brightness and contrast settings coming out of the 360/PS3, because they are different.

Cool. And this is exactly what Lens of Truth do. Check their FAQ, as they are very, very clear on how they capture.

However, it's not up to them to fiddle around with any available in-game colour/brightness/contrast settings in an attempt to create a "match" across platforms. That's the developer's job, purely and simply.

That Mr Leadbetter did that with MW2 is admirable (though as far as I'm aware that's the only title to date he's done it for) and can possibly be justified by the fact that the first thing you see on starting the game is the brightness/contrast screen. However, that the two platforms were so far apart on the default settings is down to the developer alone, and they should be held to account for poor calibration.
 
Well, it would certainly help level the playing field for comparisons, which are the entire point of these articles, no?
 
Cool. And this is exactly what Lens of Truth do. Check their FAQ, as they are very, very clear on how they capture.

OK, what they do is not bad in principle, if you want to compare movie playback. But these are games. What the GPUs put up there as the result of rendering a scene is a different kind of thing, as far as I know. But I admit that the situation is pretty complicated.

However, it's not up to them to fiddle around with any available in-game colour/brightness/contrast settings in an attempt to create a "match" across platforms. That's the developer's job, purely and simply.

I've never been talking about in-game settings, just the basic display settings. It's not as if you can view these combined comparison shots on different displays when you look at them through a browser, after all. The best I've been able to do is a histogram analysis of screenshots, showing whether or not one picture or the other has lost color information due to too much contrast. That's the kind of information I'm interested in, because once color information is lost from the default output signal, you can't bring it back no matter how you calibrate your TV.
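To give an idea of that kind of check, here's a minimal Python sketch (using Pillow and NumPy; the file names are just placeholders, and this isn't necessarily the exact analysis used here). It simply counts how many pixels in a capture sit hard against 0 or 255 in each channel, which is a reasonable hint that shadow or highlight detail was clipped before the screenshot was ever saved:

[code]
# Rough sketch: flag a screenshot that looks like it has lost colour information
# to crushed blacks or clipped whites. File names below are just placeholders.
import numpy as np
from PIL import Image

def clipping_report(path, threshold=0.01):
    img = np.asarray(Image.open(path).convert("RGB"))
    total = img.shape[0] * img.shape[1]
    for ch, name in enumerate("RGB"):
        channel = img[..., ch]
        crushed = np.count_nonzero(channel == 0) / total    # pixels stuck at black
        clipped = np.count_nonzero(channel == 255) / total  # pixels stuck at white
        flag = "suspect" if max(crushed, clipped) > threshold else "ok"
        print(f"{path} {name}: {crushed:.1%} at 0, {clipped:.1%} at 255 ({flag})")

clipping_report("ps3_capture.png")
clipping_report("x360_capture.png")
[/code]

It won't tell you *why* the detail was lost, only that one capture has far more pixels piled up at the extremes than the other, which is usually enough to spot a levels problem.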

That Mr Leadbetter did that with MW2 is admirable (though as far as I'm aware that's the only title to date he's done it for) and can possibly be justified by the fact that the first thing you see on starting the game is the brightness/contrast screen. However, that the two platforms were so far apart on the default settings is down to the developer alone, and they should be held to account for poor calibration.

You can call it poor calibration, but if all games are like this, then you'll want your game to be similar in settings to all the other games, so that people who've properly calibrated their display once don't have to keep doing so again and again for different games. Even if some of those games offer in-game settings that can be saved per game, it's better if you never have to adjust it. I often check the calibration settings if a game offers them, but so far the settings have almost always been pretty much perfect.

I think it would really help comparisons in another area too. Sometimes a game on the 360 may be using FP10 HDR while the PS3 version uses none; sometimes the PS3 version may be using FP16, or either console could be using a custom form of HDR. These basic differences in contrast/brightness make that so much harder to spot instantly.
 
But I admit that the situation is pretty complicated.

I won't quote your whole post, but that is the key part anyway ;)

However, in an attempt to answer some of your points, I'll outline my situation. I have a 50" Samsung plasma with one HDMI port, so I use a switcher box (which works well, as I use a Logitech remote for everything except the PS3 controls, for obvious reasons). Into the switcher box go my SkyHD, PS3, 360 and DVD surround system. Of course, that means that the one HDMI input on my TV is set to levels that best suit my viewing requirements.

And it's pretty good. 360 games look great, with good colour, blacks and whites. PS3 Blu-rays have the right balance, with deep blacks and clean whites, and SkyHD looks good, as do DVDs. PS3 games, on the other hand, can often look washed out, as has been seen in many games since the console launched.

So what to do? Recalibrate the TV every time I play a game? Not really likely, is it?

So it's up to the developers. Or maybe Sony should have set the colour/contrast levels of the GPU to give a more pleasing image from day one. It's long been known (and I say this as a fully paid-up Nvidia fanboy) that Nvidia GPUs, especially from the GeForce 1-7 days, offered a slightly more washed-out image than competing ATI GPUs. The internal colour calibration from the 8800s onwards has been much better.

Which again brings me back to the "complicated" argument. Every time any comparison comes up, forums will be full of "Youve been pade by M$, my PS3 dont look liek that!!!!!" comments. However, with all of the processing that is done by modern HDTVs, it's not likely to. On top of users calibrating the colour and contrast, most sets have internal scalers along with an assortment of processing abilities such as sharpening, smoothing, etc.

But it's not the job of the pixel-counter to second-guess how the millions of PS3/360 owners out there have their TVs set up, and any attempt to do so just muddies the waters and would make the whole "like for like" comparison fraudulent.

Imho of course :)
 
My TV is calibrated and PS3 games rarely look like they do on a lot of these comparison sites. Not to mention a lot of games have in-game brightness/gamma sliders.

I don't know if the 360 is actually like this, but in a lot of these PS3/360 comparisons the 360 version generally looks better because the PS3 version usually looks washed out; at the same time, the 360 version often looks too dark, with a lot of black detail lost, which is another reason why calibration is important.

I also appreciate people putting both games on a level playing field by making the output look similar. And if there is a brightness/contrast difference in a comparison, I completely ignore it. With that said, it seems that the PS3 generally does output a higher black level, causing the washed-out look, but again, I've never seen a PS3 game look washed out, and my TV is properly calibrated so that both games and BR are output at the correct levels.
 
I calibrated my TV as close to D65 as possible using a Blu-ray (a friend of mine works in a TV shop and has access to the hardware).

While most games have perfect gamma settings from the get-go (Resident Evil, Killzone 2 et al.), some games are totally wrong.

I don't get why the developers, who have direct access to ALL data within the PS3, can't adhere to this standard though. I mean, it isn't all that hard to do, since they set "the mood" within their games. And this HAS to be done for all consoles, since they use different hardware.

Other things (lighting and whatnot) cannot be calibrated, obviously, but still, the overall colors of a game should always be calibrated to movie standards, since most gamers play on a TV, which is more likely to use movie settings than anything else.
 
When calibrating a TV, you don't calibrate to the game, but to a standard.

For movies, it's 6500K colour temperature/correct grayscale, no edge enhancement, no zoom, proper brightness and contrast, accurate colour decoding, etc.

Shouldn't the same thing apply to games too, since there isn't another standard to calibrate to?

So if there's a large difference in colour, contrast, etc. in two versions of the same game (after RGB Levels are set correctly), it should be the developer's fault.

Theoretically, there wouldn't really need to be a brightness setting in the game, just like with movies, if everything were calibrated correctly, but then again there are people who still go "vivid" with their settings...
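As an aside, if you can capture the console's output, the grayscale part of that standard is easy enough to sanity-check with a script. Here's a rough Python sketch (Pillow/NumPy, with a made-up file name) that averages R, G and B across a captured gray-ramp test pattern and flags steps where the channels drift apart, i.e. where there's a visible colour cast:

[code]
# Rough grayscale-neutrality check on a captured gray-ramp test pattern.
# Assumes the ramp runs left to right across the frame; the file name is a placeholder.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("gray_ramp_capture.png").convert("RGB"), dtype=float)
bands = np.array_split(img, 16, axis=1)   # split the ramp into 16 vertical bands

for i, band in enumerate(bands):
    r, g, b = band[..., 0].mean(), band[..., 1].mean(), band[..., 2].mean()
    spread = max(r, g, b) - min(r, g, b)  # 0 would be perfectly neutral grey
    note = "colour cast?" if spread > 3 else "ok"
    print(f"step {i:2d}: R={r:6.1f} G={g:6.1f} B={b:6.1f} spread={spread:4.1f} {note}")
[/code]

It's no substitute for a proper meter, but it will show at a glance whether the captured signal itself is neutral before you start blaming the TV.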
 
Yeah, I have to agree that it's more the developers' fault as well. The PS3 can output both levels (16-235/Video or 0-255/PC). I think games could be outputting 0-255 (which is typically a PC standard), but the PS3 should be able to scale that to 16-235 (or vice versa). Developers should easily be able to match the look of both versions just by playing them on a calibrated display and matching the display and consoles so that they're outputting/expecting the same color space.

Many PS3 owners think there is a 'better' setting when it comes to RGB, when in reality there is no 'better' setting -- only the right setting. If your display is expecting video levels (16-235) then you should set your PS3 to RGB Limited; if your display is expecting PC levels (0-255) then you should set your PS3 to RGB Full. TVs usually expect video levels and PC monitors obviously expect PC levels. If your TV can accept both (Samsung TVs, for example) then you need to ensure that your display and PS3 are on the same page. If you mismatch them, you'll either get clipped black detail (outputting 0-255 to a 16-235 display) or gray/washed-out blacks (outputting 16-235 to a 0-255 display).
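To put rough numbers on that, here's a short sketch of the standard video/PC level mapping (illustrative only, not any console's actual code), showing the correct scaling and what the two mismatch cases do:

[code]
# Standard full-range (0-255) <-> limited/video-range (16-235) mapping, plus what
# a mismatch does. Illustrative only -- not any console's actual implementation.

def full_to_limited(v):
    """Map PC levels 0-255 onto video levels 16-235."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map video levels 16-235 back onto PC levels 0-255, clipping anything outside."""
    return min(255, max(0, round((v - 16) * 255 / 219)))

# Correct round trip: black and white land exactly where the display expects them.
print(full_to_limited(0), full_to_limited(255))    # -> 16 235
print(limited_to_full(16), limited_to_full(235))   # -> 0 255

# Mismatch 1: a 16-235 display fed 0-255 treats everything at or below 16 as black
# and everything at or above 235 as white, so shadow and highlight detail is clipped.
# Mismatch 2: a 0-255 display fed 16-235 shows "black" as 16/255 grey, so the whole
# picture looks washed out.
[/code]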

I have my PS3 hooked up to my Samsung LCD that has the option for both, and this is typically the ideal settings for cases like mine (for videophiles like me who use their PS3 for BR as well as gaming):
PS3:
Blu-Ray output - Y Pb/Cb Pr/Cr (BR native color space is YCbCr 16-235 Video levels)
RGB Range - Limited (16-235 Video levels)
Y Pb/Cb Pr/Cr Super-white - On (this setting only affects YCbCr (so DVD/BR if your PS3 is setup so that BR/DVD outputs to YCbCr), meaning it has no effect on games/XMB which output RGB regardless.)

Samsung LCD:
HDMI Black Level - Low (16-235 Video levels)

After calibrating my TV via YCbCr 16-235, 99% of games look great without my going into the in-game calibration, and they rarely look the way they do in some of these comparisons. So with that said, in some cases I think it's these websites that are at fault.
 
I think games could be outputting 0-255 (which is typically a PC standard), but the PS3 should be able to scale that to 16-235 (or vice versa).

They're outputting 0-255 and the PS3 scales them to 16-235 just fine; everything depends on the "RGB Full Range" setting in the XMB . . . there was a thread about this topic.
 
. . . and when we're talking about display "calibration" we often hear about colour temperature. By the standard it should be 6500K (D65), but in my own experience the perception of this colour temperature depends hugely on the colour temperature of the ambient lighting in the room.
For example, with an overcast sky outside, the light coming through the window will have a colour temperature of around 9000K, and a screen calibrated to 6500K will look ridiculously yellow-tinted.
Is anyone else seeing the same problem with calibration to 6500K, or am I alone? I tried to get used to the D65 standard but gave up after a few months.
 
True, but that's why you generally calibrate in the lighting conditions you're usually viewing the TV in. My theater/gaming room is in the basement so lighting is controlled. When I calibrate, I do so in a completely dark room.
 
I wanted to create a new thread to discuss this issue, but seeing that it already exists I decided to just bump this one...
For movies, it's 6500K colour temperature/correct grayscale, no edge enhancement, no zoom, proper brightness and contrast, accurate colour decoding, etc. Shouldn't the same thing apply to games too, since there isn't another standard to calibrate to? So if there's a large difference in colour, contrast, etc. in two versions of the same game (after RGB Levels are set correctly), it should be the developer's fault.
Exactly. Although I wouldn't blame any specific developer for this, just the whole industry ;) I'm surprised that a medium as focused on the visual aspect as video games still doesn't have a common standard for something as basic as brightness/contrast and gamma values. It's very apparent with some titles, as the default brightness and contrast settings look completely off on a TV set calibrated using any "movie oriented" calibration disc/software.

It's especially annoying with console games that offer no way of changing any values in-game, which forces you to change the settings of your TV to get a proper picture (in terms of shadow detail, for example) for that one specific game. I think this industry is mature enough to officially come up with/adopt some standards that would make life easier for all of us.
 
I see quite a lot of THX-certified games. Does that mean anything as regards compatibility with THX-calibrated TVs?
 
I use the same settings (ie. Cinema mode) for games on my Bravia that I do for movies.
And IMO it looks much better than the vivid/vibrant modes on HDTVs that some seem to prefer for gaming.

Also turn your set's sharpness down to the minimum, makes quite a large difference to how noticeable aliasing is.
 
Generally true. Most TVs have a neutral sharpness setting, where anything higher adds edge enhancement/artificial sharpness and anything lower actually blurs the image. In my experience many TVs have the neutral point at 0, but not all. My old Hitachi 46" CRT rear-projection TV was around 40 and my current LG 55LH90 is around 50/50 (you can adjust horizontal and vertical sharpness). Many Panasonic plasmas also have their neutral point at around 40. Samsung and Sony generally have it at 0.
 