Console Display Calibration Issues

I'm a little confused about the limited range vs full range issue, because I still see a lot of review calibrations done in limited range for new high-end TVs.
 
The main reason why I leave everything on Limited is that when I did try to switch to Full, it was black crush galore on my TV - which accepts Full. A quick change in settings would fix this, and games were fine; however, every time I'd watch a Blu-ray movie the image wasn't right. So I'd end up having to change brightness settings back and forth every time I'd switch from playing games to watching movies.
Ain't nobody got time for that!
 
I'm a little confused about the limited range vs full range issue, because I still see a lot of review calibrations done in limited range for new high-end TVs.

Check out the AVS forums. The majority of recorded content is Limited Range. Some content and all games are Full Range. Most calibration articles focus on setting up the best playback quality for recorded media.

Depends what you do most with your TV: watch films or play games. Quite a lot of modern TVs can change their profiles to adapt accordingly. I always have my PS3/PS4 set to Full range as it changes to Limited automatically when required.
 
I didn't even realise such an option existed! So I've just checked in NVCP and indeed there is an option in there to switch between full and limited! And it turns out I've had everything set to limited. So I've just switched to full and hey presto, everything looks better on both my monitor and TV. Awesome!!
 
The main reason why I leave everything on Limited is that when I did try to switch to Full, it was black crush galore on my TV - which accepts Full. A quick change in settings would fix this, and games were fine; however, every time I'd watch a Blu-ray movie the image wasn't right. So I'd end up having to change brightness settings back and forth every time I'd switch from playing games to watching movies.
Ain't nobody got time for that!
All Blu-rays and pretty much all video content are encoded in the YCbCr 4:2:0 (16-235) color space, therefore in most cases you should output YCbCr for Blu-ray to avoid any unnecessary conversions. On some displays, outputting RGB may look better, but YCbCr is better 99% of the time.
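To see why extra conversions are worth avoiding, here's a minimal sketch (my own illustration, not anything the consoles expose) of the standard BT.709 expansion a display or console performs when turning limited-range YCbCr into full-range RGB; every extra pass through math like this is a chance for rounding error or a range mismatch:

```python
def ycbcr709_to_rgb_full(y, cb, cr):
    """Expand one 8-bit limited-range BT.709 YCbCr sample to full-range RGB.

    Video stores Y' in [16, 235] and Cb/Cr in [16, 240]; the display (or
    console) has to do this expansion before the pixel hits the panel.
    """
    y, cb, cr = y - 16, cb - 128, cr - 128
    r = 1.164 * y + 1.793 * cr
    g = 1.164 * y - 0.213 * cb - 0.533 * cr
    b = 1.164 * y + 2.112 * cb
    # Legal video can fall outside [0, 255] after expansion, so clamp.
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# Reference black and white land exactly on full-range black and white:
print(ycbcr709_to_rgb_full(16, 128, 128))   # (0, 0, 0)
print(ycbcr709_to_rgb_full(235, 128, 128))  # (255, 255, 255)
```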

On some displays, the RGB range setting affects YCbCr signals when it should only affect RGB signals, therefore if you want Blu-rays to look right, you need to use Limited, even if your display supports full range. On most displays, the RGB range setting is either grayed out or has no effect on YCbCr inputs, so it's safe to use RGB full range and YCbCr for Blu-ray. Some people also have other devices that share an HDMI port (through an AV receiver or switcher) and therefore some might be forced to use Limited as well, to avoid conflict with other devices.

On the PS3/PS4, if you want to use RGB Full range, you can set Blu-ray to output RGB. That way, both games and Blu-rays/disc-based video will all output RGB, and then you can set your PS3/PS4 and display to full range. Blu-rays will no longer output their native color space, being converted to RGB instead, but it's an option if you prioritize video games over Blu-ray.

I didn't even realise such an option existed! So I've just checked in NVCP and indeed there is an option in there to switch between full and limited! And it turns out I've had everything set to limited. So I've just switched to full and hey presto, everything looks better on both my monitor and TV. Awesome!!
It's been a while since I've owned an NV card, but when I did, it always output full range AFAIK, and the RGB range setting was for video playback only.
 
This thread was extremely helpful. Just switched to full RGB on my monitor. Looks so much better now. Going to check the TV to see if the option is available there.
 
It's been a while since I've owned an NV card, but when I did, it always output full range AFAIK, and the RGB range setting was for video playback only.

Lol, you're right. I was fooled because I was only looking at the change to my desktop when I changed the setting, and I use DreamScene on my desktop, which is basically a looped video. Doh!
 
How are you connecting your card to your monitor? Nvidia has a bug with HDMI connections where it will always send limited range, even for monitors; there's a program you can run to fix this, though.

http://blog.metaclassofnil.com/?p=83

I use dual-link DVI so no problems here. I can definitely see the difference in video, but as it turns out it has no effect on still images. I'm not sure about games as it's hard to do an "on/off" comparison.
 
Better looking and easier to control (insofar as higher framerates and more control peripheral options go), I'll give you. Not sure about the rest though.

That's a straw man argument. No-one mentioned "wimpy little 2GHz CPUs" (however you define one of those). We are talking about an AMD FX-6300: a CPU with a base clock speed of 3.5GHz and a turbo clock of 4.1GHz. With well over double the per-core performance of the Jaguars in the XB1 and the same number of cores available to the game, that CPU should quite easily be able to spare a full core or two for audio.

I'm not saying for sure that is or isn't enough to outperform SHAPE in game-related audio. I don't know enough about SHAPE's relative performance in that regard, and neither do you, but there's no doubt that the CPU would offer a lot more flexibility for developers to program custom audio solutions should they so wish.

And besides, why bring up SHAPE (or Kinect) in the context of Titanfall? Is Titanfall demonstrating some more impressive audio solution than that which is available on the PC? If not (and there is no evidence of that) then it's of no relevance to the DF article being discussed.
I brought up SHAPE because they briefly mention it. And yes, as you say, I think that a CPU core belonging to a very good CPU could certainly replicate most of the dedicated audio features, but to what extent is unknown, since we are talking about specialised hardware. Any input bkilian could give would be nice, because flops-wise they might be close, but flops alone aren't going to show how capable certain hardware is.

This is another straw man. No-one's saying that Durango isn't more powerful than some PC GPUs. For example, it's more capable than AMD GPUs from at least the Radeon 7770 downwards, and their Nvidia equivalents.

But we're not talking about some random PC GPU, we're (or rather DF is) talking specifically about the GTX 760. And that GPU has 72% more shader performance than Durango on paper, and 142% more pixel fill rate, geometry performance and texturing performance. And that's assuming the XB1 has access to 100% of its GPU's resources, which your 8% comment points out that it doesn't. So even when it gets that 8% back, the extra performance of the 760 detailed above (and more, in fact, since the XB1 will still reserve a small percentage of GPU time) still stands.
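On paper, the arithmetic behind those percentages works out as below (the clocks and unit counts are the commonly quoted figures, my assumption rather than anything from the DF article):

```python
# Paper-spec sanity check. Shader throughput uses the GTX 760's base
# clock, fill rate its boost clock; both clock figures are assumptions.
xb1_gflops = 768 * 2 * 0.853        # 768 shaders x 2 ops/clock x 853 MHz ~= 1310
gtx760_gflops = 1152 * 2 * 0.980    # 1152 shaders at 980 MHz base       ~= 2258
print(f"shader: +{gtx760_gflops / xb1_gflops - 1:.0%}")  # ~ +72%

xb1_fill = 16 * 0.853               # 16 ROPs              ~= 13.6 Gpixels/s
gtx760_fill = 32 * 1.033            # 32 ROPs at 1033 MHz  ~= 33.1 Gpixels/s
print(f"fill:   +{gtx760_fill / xb1_fill - 1:.0%}")      # ~ +142%
```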
As for the GPU you mention, that's a hell of a GPU. A PC doing it justice would be much more expensive than the Xbox One. An excellent choice in the long run too, anyway. Which brings me to the point of the lifespan of PCs.

The PC has infinite backwards compatibility –as joker454 has pointed out at times– :) and that's great, but the actual lifespan of a PC is shorter than a console's. Especially now, when the next generation of consoles blurs the line vs PC performance, and games can run at higher-than-HD resolutions and better framerates.

The only console which could handle 1080p in the PS2/GC/Xbox era was the original Xbox. I wonder about the framerate, but that's a fact. Now we have consoles which are several times more powerful than those, and you need a PC some 200% more powerful to really notice a difference.

And now that the line is blurred, surpassing the capabilities of a console is a tough task. Consoles have a longer lifespan than PCs, and the GTX 780, for instance, is a great GPU for a PC, and it's going to last for a few years. :smile2:

What can make PCs look as if they have a longer lifespan is that you can upgrade them, am I right? You can even keep the same case. This can give you the illusion that you are basically using the same rig...

But would it be the same rig you bought 4 years ago? Certainly not. You just don't realise how much money you have spent on upgrading the rig to get it to play newer games at 30fps. :smile2:

Decent CPUs (for gaming) are like $100+. Then there is the mobo. RAM (many PC gamers have the habit of filling all their DIMMs 'cos it's cool :p ).

Then there is the GFX card, and some GPUs are crazy expensive, like the Titan. A decent soundcard, and the 5.1 system lying around. SSDs… Display… PSU… UPS…

Not to mention the cash spent on modding your rig with the rounded cables, better fans and also those cold cathode lights.

And then the software! Your OS… you do pay for all this stuff, don't you?

So once you try to sum up all the expenses, you realise how much you paid for your 4-year-old rig (half the lifespan of the Xbox 360) to run Crysis 3 at 30fps today. I'd say the expenses are a lot more than a $500 console. :smile2:

At the pace at which hardware is improving these days, the PC Digital Foundry is using will become obsolete in no time, whereas the PS4, Wii U and Xbox One will improve.

And then the Xbox One is a console meant to be backwards compatible forever, which will only extend the life cycle of the console. : )

What I don't have, alas, is a fine rig to mess around with, but I am playing many games on my laptop with an i5 CPU and the Intel HD3000, and I managed to run Heroes of Might and Magic V at 2560x1600 :) even when I throttle down the laptop to keep it cool.

And Heroes 3 and 5, two of the top 3 games of my life, run butter smooth on my laptop, effectively using one of Shifty's favourite AA methods, SSAA, for free.

I know my PCs relatively well, just as developers get to know consoles better over time, and now I can mess around easily and don't feel too dumb, so I can play games on the PC without much hassle. I love the PC, its customisation and its broader uses, and I can understand you and Davros. :smile2:

But it's also true that a PC's lifespan is shorter. In 5 years a good PC will wipe the floor with a console.

That's the upgradability factor, but consoles will run fine. There's no need for astronomical, extremely-high-resolution (8096x4000) full-screen gaming on a PS4 or Xbox One, which can give you thousands of hours of fun in easily accessible online multiplayer environments.

Scalability is the advantage of the PC, though: in the future, PC games converted from their console counterparts could have the ability to scale the amount of detail.

I don't see why they are unfair. They quite clearly state the spec of the PC that they are comparing to, and they've gone into detail about both the spec and cost of that PC in the past. It's a £500 PC, so yes, it costs more and lacks some functionality of the XB1 (and vice versa), but they aren't moving the goalposts or comparing the console to a PC that costs 10x as much. It seems to me like a pretty fair comparison.



I'm not sure what you mean here? What are DF insisting on and how is it unfair?
By that I meant that I do enjoy sitting around with friends or siblings playing a game; it is a different type of multiplayer experience from being connected to a server hundreds of kilometres away, along with many people you've never laid eyes on.

Additionally, it is fun to sit in a comfortable seat and take turns at playing an SP game. That doesn't work the same with most PC games. That wasn't always the case on the PC though; with Heroes of Might and Magic you could play hot seat with a friend, your wife, siblings, etc. : )

Finally, it isn't fair 'cos they focus on comparisons, and I would love to read a DF article on Powerstar Golf where they find out the actual resolution of the game and talk about its great sound.
 
Did anyone notice a bug in the PS4 web browser when the PS4 is set to Full RGB output? If you open a new web page, the content in the browser window is displayed in the Limited RGB range, even though everything is set to Full RGB.
The best page to test this is http://www.lagom.nl/lcd-test/black.php
Interestingly, if you leave this page open, the next time you start the web browser everything is displayed correctly.
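If you'd rather not rely on the browser for the test (handy given the bug above), something like this generates a comparable black-level strip locally; a minimal sketch assuming Python with Pillow installed, my own stand-in rather than the Lagom original:

```python
# Black-level test strip: patches at RGB values 0, 2, ... 20 on a pure
# black background. On a correctly set up full-range chain the faintest
# non-zero patches should be just barely distinguishable from the
# background; if several of them vanish, something is crushing blacks.
from PIL import Image, ImageDraw

img = Image.new("RGB", (700, 120), (0, 0, 0))
draw = ImageDraw.Draw(img)
for i, level in enumerate(range(0, 21, 2)):
    x = 10 + i * 62
    draw.rectangle([x, 30, x + 52, 90], fill=(level, level, level))
img.save("black_level_test.png")
```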
 
RGB full range on the PS4 has always been broken for me; I notice absolutely no difference between the settings. But the PS3 and 360 worked perfectly.
 
Limited range is undoubtedly not the best choice in all situations. If your display supports full range, TV or monitor, full is the better choice in terms of PQ. In most cases, if everything in the chain is set appropriately and matches, there is little to no difference between RGB Full or Limited.


They recommend Standard range for compatibility purposes, not for optimal PQ. It is true that the video standard (YCbCr) is 16-235, but that has little to nothing to do with games and what these consoles output.

The keyword is video material, not video games.

And your statement is wrong: many TVs made in the last 3-5 years do support full range. Most of the major flat-panel makers do: Samsung, LG, Panasonic, Toshiba and Sony. Four of those brands are in the top 5 in flat-panel sales.


Yes, RGB full range sucks if, and only if, your display doesn't support it. Otherwise it is the optimal setting for PC/console games.

Blu-ray players/video players are a different story, since videos use a different standard. Fortunately for the PS3/PS4, and I'm sure the XB1, you can choose to output a different colorspace for Blu-ray, where YCbCr is in most cases optimal. On the PS3/PS4, games always output RGB, therefore Full range is optimal.

But like I said above, there's generally little to no difference between full/limited if set properly so it's not the end of the world if your display only supports limited range.
:smile2: Okay, thanks. I decided to follow the advice of tech-savvy people –you, Shifty, function, DSoup, etc.– and switched to Full Range on the console, and used Computer mode on the TV to re-calibrate the brightness and contrast (Computer mode is awesome for calibration as it displays the image without any post-processing, and imo image quality is better without it, which simplifies the calibration process). I thought that there was an issue with Full RGB support on my TV –a panel from 2013– but maybe there isn't, now that I've managed to pass the black-level test meant for PC displays, which is full range (using the image from the Lagom LCD test I shared in a post, displayed via DLNA on the Xbox One with the console set to Full range).

Not a huge difference to me; both look good in my eyes. I don't know how to describe it in words. Full range certainly looks better though. It looks like some black and white levels are accentuated. For instance, the white letters against the very dark background of the Xbox One interface shine a lot more than before, as if they were highlighted much more intensely. This is a pretty easy test to do because I performed it in the Display and Video configuration of the console, where you can quickly switch between Limited RGB and Full RGB. I had calibrated the TV almost to perfection before, so I'd say that Limited Range can also look fine. I am torn between both, but I switched to Full Range because the contrast is even more enhanced and, yes, it looks better! Contrast is something the TV I play games and watch movies on has always been good at.

That being said, be it using Limited or Full RGB, I am glad that games looking better on a console is not simply a byproduct of your television, like in the PS2/GC/Xbox days, when games looked better than they actually should have just because pre-HD-era TVs rendered them full screen in interlaced mode.

You made poorly reasoned arguments, he addressed them...you then respond with this. Really, do you have any idea what forum you're on?

Well, yes...and? What is the point of this comment? What's relevant is price/performance; the GPUs that the Xenos2 "kicks the pants" of are either ancient or integrated.

Indeed.

8% more.
My apologies to him if it sounded bad, but that wasn’t the intention –I believe pbjliverpool realised this.

I wouldn't feel offended if someone called me a PC snob or console snob.

I kiiiiiiiiiiinda love and respect the PC; my (many) PC games can be counted as an argument for that. It has nothing to do with my opinion of him. I always liked him in the forums, always will, and that's a fact; I think pbjliverpool knows this.

My opinion is that PCs are clearly the mack. Still… that doesn't paint the whole picture.

Despite my relatively deep understanding of the PCs I have, I am a bit biased towards the console as my preferred gaming platform because, especially nowadays, I treasure time-saving and ease of use.
 
RGB full range on the PS4 has always been broken for me; I notice absolutely no difference between the settings. But the PS3 and 360 worked perfectly.
Many people assume that there should be a huge, dramatic difference, but there should be little to no difference between full/limited, assuming you set your display accordingly. A good indication of improper setup would be crushed blacks and/or gray blacks. Perhaps it's your PS3/360 that is not set up properly.

Limited will map black to 16 and white to 235; if your display is set to full range and expects black at 0 and white at 255, then blacks will look gray. Full will map black to 0 and white to 255; if you have your display set to limited, blacks will be crushed. Now, in all cases it may be a matter of adjusting your Brightness (black level) and Contrast (white level) controls to get rid of black crush and/or gray blacks, but if you have it set up incorrectly, then you will be compensating for it. If set up correctly, Limited or Full, you shouldn't have to touch your Brightness or Contrast control in either situation.
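To put numbers on those two failure modes, here's a toy sketch of the two mappings (my own illustration of the arithmetic, not anything console-specific):

```python
def full_to_limited(v):
    """Full-range RGB (0-255) -> limited-range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Limited-range (16-235) -> full-range (0-255), clamped."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

# Matched chain: black and white survive the round trip.
print(limited_to_full(full_to_limited(0)))    # 0 -> 16 -> 0
print(limited_to_full(full_to_limited(255)))  # 255 -> 235 -> 255

# Mismatch 1: limited signal into a display expecting full. Black arrives
# as code 16 and is shown as-is, so it looks gray / washed out.
print(full_to_limited(0))                     # 16

# Mismatch 2: full signal into a display expecting limited. Everything
# below code 16 gets clamped to black when the display re-expands it.
print(limited_to_full(10))                    # 0 -- shadow detail is gone
```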
 
Not a huge difference to me; both look good in my eyes. I don't know how to describe it in words. Full range certainly looks better though. It looks like some black and white levels are accentuated. For instance, the white letters against the very dark background of the Xbox One interface shine a lot more than before, as if they were highlighted much more intensely. This is a pretty easy test to do because I performed it in the Display and Video configuration of the console, where you can quickly switch between Limited RGB and Full RGB. I had calibrated the TV almost to perfection before, so I'd say that Limited Range can also look fine. I am torn between both, but I switched to Full Range because the contrast is even more enhanced and, yes, it looks better! Contrast is something the TV I play games and watch movies on has always been good at.
If contrast looks boosted when you switch to full range, that sounds like you either just crushed your extreme values, or you were previously looking at a washed-out image on account of sending limited range to a TV expecting full.

Either way, one of your settings is probably incorrect.

That being said, be it using Limited or Full RGB, I am glad that games looking better on a console is not simply a byproduct of your television, like in the PS2/GC/Xbox days, when games looked better than they actually should have just because pre-HD-era TVs rendered them full screen in interlaced mode.
I'm not sure what you're talking about.

Most of what makes SD CRTs look good is just phosphors looking good.

The typical video signals that got used by most people (NTSC and such) are extremely low-quality and prone to artifacts.

Interlacing on SD CRTs sort of has the perk where 60fps games can sometimes look almost "480p60" good, though it's not quite there, and 30fps games have horrible combing artifacts (480p30 to a 60Hz VGA monitor looks much cleaner in motion than sending 30fps video to an SD CRT, even if you're using high-quality video signals for both displays).

SD CRTs were in many ways a hindrance to console IQ during the sixth gen.
 
Many people assume that there should be a huge, dramatic difference, but there should be little to no difference between full/limited, assuming you set your display accordingly. A good indication of improper setup would be crushed blacks and/or gray blacks. Perhaps it's your PS3/360 that is not set up properly.

Limited will map black to 16 and white to 235; if your display is set to full range and expects black at 0 and white at 255, then blacks will look gray. Full will map black to 0 and white to 255; if you have your display set to limited, blacks will be crushed. Now, in all cases it may be a matter of adjusting your Brightness (black level) and Contrast (white level) controls to get rid of black crush and/or gray blacks, but if you have it set up incorrectly, then you will be compensating for it. If set up correctly, Limited or Full, you shouldn't have to touch your Brightness or Contrast control in either situation.

Funny that you mention it; I just remembered that when I checked my JPEG captures with RGB full and limited, they did look different. I don't understand how something like the PS3 would just work while the PS4 doesn't, with the same settings and same input.
 
Funny that you mention it; I just remembered that when I checked my JPEG captures with RGB full and limited, they did look different. I don't understand how something like the PS3 would just work while the PS4 doesn't, with the same settings and same input.
What display do you have (brand, and if possible, model as well)? Displays, as well as the PS4, have automatic settings for RGB range. My display's auto mode seems to be able to correctly pick up the RGB output range from the PS4, but it cannot correctly pick up the PS3's RGB output range. It's probably just a matter of setting up your display appropriately, if your display does in fact support RGB full range.
 
The main reason why I leave everything on Limited is that when I did try to switch to Full, it was black crush galore on my TV - which accepts Full. A quick change in settings would fix this, and games were fine; however, every time I'd watch a Blu-ray movie the image wasn't right. So I'd end up having to change brightness settings back and forth every time I'd switch from playing games to watching movies.
Ain't nobody got time for that!
There's a setting on the PS3 for outputting videos in Full RGB. It 'upscales' the range for the RGB out, and the videos look great (not really much difference, but they won't be greyed or crushed).

If contrast looks boosted when you switch to full range, that sounds like you either just crushed your extreme values, or you were previously looking at a washed out image on account of sending limited range to a TV expecting full.

Either way, one of your settings is probably incorrect.
I'm not sure about that. My picture looked better on Full RGB than on 'gimped range' after fiddling about with both to try and get the best picture. Thing is, the TV is processing the image. You may get a different gamma curve on Limited RGB, say, and/or it may handle colours differently, applying some 'video tweaks' on limited RGB while using pure RGB on full-range signals.

If you have your set calibrated correctly for the colour space, worst case, Full RGB will look exactly the same, and best case, it'll look better. It'll also give content creators a little more room to play with for quality. There's no reason to use limited RGB any more. Hell, we're even looking at sets with 12 and 16 bits per channel colour, fercrissakes! Using limited range is similar to the audio industry's compressing of signals to lose the full dynamic range. It's an idiotic practice that assumes poor user setups and tries to compensate on the authoring end, instead of a proper authoring>presentation chain that preserves the audio as close to the original as possible.
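To put a number on how much of the signal limited range gives up (my arithmetic, assuming 8 bits per channel):

```python
# Limited range spends only 220 of the 256 available codes on the
# black-to-white ramp (16..235 inclusive), i.e. roughly 0.2 bits of
# precision per channel are thrown away before the signal even leaves
# the source device.
import math

full_codes = 256
limited_codes = 235 - 16 + 1                   # 220
print(limited_codes / full_codes)              # ~0.86 of the codes used
print(math.log2(full_codes / limited_codes))   # ~0.22 bits lost per channel
```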

We have the same with display gammas that I'm sure Laa-Yosh could rant about. In this age of digital signals, we should be preserving exactly (save compression schemes) the source material as captured throughout the presentation chain. The only thing preventing this is crazy legacy setups.
 
There's a setting on the PS3 for outputting videos in Full RGB. It 'upscales' the range for the RGB out, and the videos look great (not really much difference, but they won't be greyed or crushed).

I'm not sure about that. My picture looked better on Full RGB than on 'gimped range' after fiddling about with both to try and get the best picture. Thing is, the TV is processing the image. You may get a different gamma curve on Limited RGB, say, and/or it may handle colours differently, applying some 'video tweaks' on limited RGB while using pure RGB on full-range signals.

If you have your set calibrated correctly for the colour space, worst case, Full RGB will look exactly the same, and best case, it'll look better. It'll also give content creators a little more room to play with for quality. There's no reason to use limited RGB any more. Hell, we're even looking at sets with 12 and 16 bits per channel colour, fercrissakes! Using limited range is similar to the audio industry's compressing of signals to lose the full dynamic range. It's an idiotic practice that assumes poor user setups and tries to compensate on the authoring end, instead of a proper authoring>presentation chain that preserves the audio as close to the original as possible.

We have the same with display gammas that I'm sure Laa-Yosh could rant about. In this age of digital signals, we should be preserving exactly (save compression schemes) the source material as captured throughout the presentation chain. The only thing preventing this is crazy legacy setups.

Well... I spent the whole evening playing around with my TV (a top-of-the-range plasma released last year) and also researching the TV itself on the internet. 'Cause I obviously have nothing better to do in my spare time.

Turns out the EU version of this supposedly amazing TV - and in fairness, it is really amazing and was voted best TV last year or whatever - has been, for lack of a better word, mutilated, and doesn't even have an option for full/limited RGB input. The US version of course does have it, and people seem to be using Full on it in the US.

So, having tried all evening to switch back and forth between full and limited, seen all the options, and switched back and forth a million times to make sure... the PS3 will remain on Limited, as the TV definitely crushes blacks on Full, and since there is no option to set it to Full, there is very little I can do. I was kinda hoping the TV would display blacks correctly by somehow automatically detecting what signal was going in, but no. The TV is just gimped in Europe.

The initial 'pop' I could definitely see when switching to Full was really just a whole lot of black crush.

So really, it's not as simple as you say, especially seeing how manufacturers still inexplicably and unforgivably decide to release gimped versions of even their best TVs in Europe, while letting their US customers enjoy the full range of options they have created.

There is just no excuse for what I found out tonight and I am blaming YOU B3D geeks for ruining my night.

I hate you all.

Goodnight.
 