Console Display Calibration Issues

PS: concerning the screen's inability to display 252 to 255, some feel it isn't a big deal, but I don't agree! It messes up the gamma, and we tend to raise the white level more than is recommended, which also causes issues. Bottom line?
A 100% sRGB unit is the only safe solution! Yep, might have to sell that old unit, rofl!
 
In regards to the Limited Range vs. Full Range debate on the Xbox One (and consoles in general), HDTVs, and so on, here's what I found on the net.

Full Range might be better, but TVs don't seem to be made for them, in my humble opinion.

That's what Spears & Munsil say on the matter in their TV calibration tutorial:

Black Level or HDMI Black Level. This should be set to “Low”, “Video”, or “Standard”. It should not be set to “Normal”, “PC”, or “Extended”.


http://www.spearsandmunsil.com/portfolio/getting-started-with-the-high-definition-benchmark/

IGN on Xbox One and some games, when the console was launched.

http://www.ign.com/wikis/xbox-one/Xbox_One_RGB_Limited_(TV)_Vs._RGB_Full_(PC)

The choice of RGB Limited vs. RGB Full depends on what kind of set you have. The full RGB color range feature is primarily designed for PC monitors. If you're using it on a regular TV screen or home theater projector, you may notice that your display's black levels look "crushed", resulting in an overly dark picture that lacks some of the details. It's easier to see in-game than when looking at the menus. Boot up Ryse, for example. You'll notice that using RGB Full on a standard HDTV will result in detail in the shadows being lost -- e.g., Marius's shield may not show all the bumpy detail you'd usually see. In the end, it's all up to preference (even when using a PC monitor). Full may look more vibrant with better black levels -- or you may find that on your particular monitor, you're missing out on some detail (even after calibrating your settings).

As with a lot of settings -- bigger numbers, more filters, and more features enabled don't necessarily mean a better picture. Go with what looks best in your eyes. You can switch back and forth between the settings menu and a game to find the optimal setting for you.

djskribbles might approve. :) On AVS Forum, someone's reply explains why Full Range seemed to enhance the contrast on my TV (because it adds saturation). Very interesting reply overall; I only bolded part of it.

The person replying mentions that if you have an HDMI TV, you can rest assured it is made, configured, and calibrated for Limited RGB.

Whether you should use RGB Full or RGB Limited is not decided by the media you are playing (games, DVD, Blu-ray) but is instead related to the HDMI input on your display (and how it is calibrated).

RGB Full has black at 0 and peak white at 255. (This is sometimes known as PC levels)

RGB Limited has black at 16 and peak white at 235. (This is sometimes known as video levels, and is the broadcast and production video standard used in both the SD and HD video standards, 601 and 709.)

If you replay a Blu-ray or DVD (which will be encoded 16-235) in RGB Full, then your PS3 will remap the 16-235 video to 0-255, which has the side effect of clipping any <16 or >235 content that may be present on the Blu-ray or DVD (sometimes known as Blacker than Black - BTB - and WTW - Whiter than White) (*)

If your display is a regular HDMI TV, then it is almost certainly delivered configured and calibrated for RGB Limited - as this is what an HDMI DVD player, HD satellite receiver etc. will usually output by default if using RGB output. (As 16-235 is the video standard used for DVD, Blu-ray and TV broadcasts).

If you are feeding your device to a PC monitor - configured for 0-255 levels (which is what most DVI PC monitors are set-up for) then RGB Full would be the best choice.

If you feed a 0-255 signal (RGB Full) into a display configured for 16-235 (RGB Limited) then you will get crushed blacks and clipped whites (as the content below 16 and above 235 will not appear any different to content at 16 or 235) - and the image will also appear to be artificially more saturated (richer colours)

If you feed a 16-235 signal (RGB Limited) into a display configured for 0-255(RGB Full) then you will get grey-ish/milky blacks, dull whites - and the image will look washed out and de-saturated (less colourful) (As the 16 black level in the source will be displayed as grey as it is above the 0 black level of the display, and the 235 peak white of the source will not reach the 255 peak white of the display)

(*) BTB and WTW in 16-235 video shouldn't contain picture content on properly mastered material - but they should ideally be preserved to allow overshoot and undershoot on sharp transitions (particularly in analogue sourced content) to be preserved without clipping to avoid ringing.

http://www.avsforum.com/t/1265815/rgb-full-range-limited-or-full
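The numbers in the quoted post are easy to verify with a toy sketch. The first function applies the remap the post describes (limited 16-235 expanded to full 0-255, clipping BTB/WTW along the way); the second simulates what a display does when it's configured for one range but fed the other. This is an illustration of the arithmetic only, not any console's actual pipeline:

```python
def limited_to_full(v):
    """Expand a limited-range (16-235) code value to full range (0-255).
    BTB (<16) and WTW (>235) content is clipped first, which is the
    side effect the quoted post describes."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

def displayed_light(code, display_range):
    """Light output (0.0-1.0) for an 8-bit code value on a display
    configured for the given range; the display clips outside it."""
    black, white = (0, 255) if display_range == "full" else (16, 235)
    return min(max((code - black) / (white - black), 0.0), 1.0)

# Correct remap: reference black/white land exactly on 0/255.
print(limited_to_full(16), limited_to_full(235))      # 0 255

# Full-range signal on a limited-range display: blacks crush together.
print(displayed_light(0, "limited"), displayed_light(16, "limited"))  # 0.0 0.0

# Limited-range signal on a full-range display: grey blacks, dull whites.
print(round(displayed_light(16, "full"), 3))    # 0.063 (not true black)
print(round(displayed_light(235, "full"), 3))   # 0.922 (short of peak)
```

Note how the mismatch cases reproduce exactly what the post describes: crushed blacks in one direction, milky blacks and dull whites in the other.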

Random guy on Reddit talking about full RGB on PS4. Not as technical but interesting nonetheless.

There is no advantage to full over limited range. Actually, there is a disadvantage to full range when calibrating your screen: the whole point of limited range (16-235) is that you can have values that are blacker than black and whiter than white. 16 is BLACK. 235 is WHITE. If you can see, say, grey at 16 and actual black at, say, 13, then your TV is calibrated wrong, and you bring down the brightness until black shows at 16. Same with the white range (for contrast). Now, yes, full will make it look more vibrant, but A. it's wrong, and B. you will crush your blacks. If a game character is wearing, say, a grey shirt and the shadow of his arms falls across the shirt, there is a kind of gradual grey-to-black transition. If you put FULL on, it will cut that transition and just be black, losing all that detail. I've seen the argument that 0-255 has a larger range than 16-235. It doesn't. It's just math: 0=16 and 255=235. Make sure your TV setting is 16-235 limited, and I would MANUALLY change your PS4 setting to match.

http://www.reddit.com/r/PS4/comments/1r97bi/do_not_change_your_rgb_range_to_full_unless_you/
 
djskribbles might approve. :) On AVS Forum, someone's reply explains why Full Range seemed to enhance the contrast on my TV (because it adds saturation). Very interesting reply overall; I only bolded part of it.
It "adds saturation" because the difference in color channels is greater when you represent your video on the wider range. Who knows, maybe some screens do other processing, but the basic cause of increased saturation that will apply to all cases of sending a full-range signal to a limited-range display is the exact same mechanism that causes crushed blacks and clipped whites.

Imagine that you have R and G at middle grey level (128), and B at 200. When you shift to RGB full output, R and G are still at middle grey level, but blue gets shifted to somewhere in the ballpark of 210. Hence, more saturation if you're still interpreting the signal over the limited range.
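That arithmetic checks out. Using the standard expansion (v − 16) × 255/219, a mid-grey 128 moves only to about 130 while 200 lands around 214, so the gap between the channels, which is what reads as extra saturation when the display still interprets the signal as limited range, widens. A toy check of the numbers above, nothing more:

```python
def expand(v):
    """Limited (16-235) -> full (0-255) remap, clipping BTB/WTW."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

r, g, b = 128, 128, 200
print(expand(r), expand(g), expand(b))      # 130 130 214
print(b - r, "->", expand(b) - expand(r))   # channel gap widens: 72 -> 84
```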
 
If the RGB range setting on the XB1 is causing massive colour skewing, yeah, maybe DF's analysis would be more fair to the game if they used the "correct" setting.

Though it would also mean that Microsoft's console has a pretty badly broken video output system.


Viewing an image with a sketchy real-time 2.25x-per-axis upscale on a fixed-pixel display that probably has vastly worse static contrast than the display it was intended for "tells the truth about the graphics"?

That a modern LCD tends to horrifically butcher the image from a sixth-gen console does not mean that an SD CRT is somehow enhancing the image. The CRT is simply showing the image much less incorrectly.

If I go play Halo 3: ODST on my old Trinitron, luminances don't clamp the same way they do on my LCD, and the bold contrasts across the entire image (particularly on areas with alpha-blended layers) actually wind up looking pretty bad. Even independently of resolving fine detail and combing, the game looks better on my LCD.
That does not mean the LCD is "making the game look better than it should." It means my CRT is displaying the image according to color response properties (and resolution, etc) that the game wasn't designed around.

In any case, I maintain that your original point about console games artificially looking better than PC games due to the display technology is pretty hard to accept. There's not a lot of reason that an SD CRT would benefit a console game image to a greater degree than a desktop CRT would benefit a PC game image.
By "correct" I just mean that maybe Xbox One developers program their games with Limited Range in mind.

The console is meant for watching TV (generally YCbCr content) and playing games at the same time. And the xbox.com FAQ recommends using Limited Range... It sounds logical to me.

On the CRT issue, you might have a point there; I didn't test that enough. Across the evolution from the SD CRT era to the current HD LCD times, what I observed is that old games that seemed to look OK on a CRT were full of jaggies and shimmering on LCDs. That was a key difference for me.
 
By "correct" I just mean that maybe Xbox One developers program their games with Limited Range in mind.
If the video output system works correctly, there shouldn't be any systemic colour difference between the two ranges, as long as they're transmitted and received in a correctly-matched way.

If there's a difference between programming the game with limited range in mind, and programming the game with full range in mind, it means the video output system is broken.

On the CRT issue, you might have a point there; I didn't test that enough. Across the evolution from the SD CRT era to the current HD LCD times, what I observed is that old games that seemed to look OK on a CRT were full of jaggies and shimmering on LCDs. That was a key difference for me.
The jaggies and the shimmering still show up on a CRT (I should know, I still use an SD CRT for old stuff), it just doesn't look as bad when there's less scaling-ish stuff happening, when the colours are represented correctly, and when phosphors are being used as the image reconstruction filter.

I certainly agree that sixth-gen games tend to look better on low-res CRTs than on high-res LCDs.
 
It "adds saturation" because the difference in color channels is greater when you represent your video on the wider range. Who knows, maybe some screens do other processing, but the basic cause of increased saturation that will apply to all cases of sending a full-range signal to a limited-range display is the exact same mechanism that causes crushed blacks and clipped whites.

Imagine that you have R and G at middle grey level (128), and B at 200. When you shift to RGB full output, R and G are still at middle grey level, but blue gets shifted to somewhere in the ballpark of 210. Hence, more saturation if you're still interpreting the signal over the limited range.
I can't find the energy to give you a meaningful reply today, but I am thankful for the input. Especially the last paragraph, which explains quite clearly what I noticed.

The colours looked more intense and vivid (the contrast between black and white too, though I'm not 100% sure about that; it could just be a hunch of mine rather than something real) in games like Powerstar Golf, especially on the main screen, where you see a lens flare effect featuring rainbow-like colours.

I do have some experience with the effects of the RGB channels and how they affect colours, not only from messing around with the Tint levels on the TV, but also from using the built-in calibration tool in Windows 8 for my PC display. Just not with how Full Range and Limited Range RGB can have a different effect on that.
 
Stacey Spears and Don Munsil are very educated guys in the video industry (yes Stacey is a guy), but keep in mind that their goal is to first and foremost optimize video playback, not so much video games or PCs. They use different (albeit similar) standards. Plus it makes it easier because not all displays support RGB Full, and there's little to no benefit in using RGB full over Limited if your display supports both. The setting is mostly there on consoles for compatibility purposes. Technically speaking, Limited is perhaps better for video, but Full is better for PCs and consoles. But as I have said over and over, there is little to no difference either way. The most important thing is that you match your levels.

In regards to the AVS quote, they are simply describing issues with sending full range to a display that doesn't support full range. With every display I've owned that supported both, I have not had issues setting my PS3 or PS4 to Full range. The display issues with RGB Full range appear to be an XB1 issue. So for now anyway, just use Limited. Only people with PC monitors that only support Full range will be affected by the black crush issue with Full range on XB1.

As for the reddit quote, it doesn't really make any sense. There's almost no difference between Full or Limited assuming everything in the chain is on the same page and assuming black/white are correctly mapped, which the PS3/PS4 do, but XB1 seemingly doesn't. Maybe it's because Sony knows a thing or two about video and displays since they make Blu-Ray players and TVs. X360 had issues with proper output as well.

edit: This isn't really related to this discussion, but I just want to point something out in regards to this quote from the AVS post:
If you replay a Blu-ray or DVD (which will be encoded 16-235) in RGB Full, then your PS3 will remap the 16-235 video to 0-255, which has the side effect of clipping any <16 or >235 content that may be present on the Blu-ray or DVD (sometimes known as Blacker than Black - BTB - and WTW - Whiter than White) (*)

(*) BTB and WTW in 16-235 video shouldn't contain picture content on properly mastered material - but they should ideally be preserved to allow overshoot and undershoot on sharp transitions (particularly in analogue sourced content) to be preserved without clipping to avoid ringing.
On the PS3, whether you use Limited or Full, BTB/WTW is clipped regardless. The only way to pass BTB/WTW on the PS3 is to use YCbCr output for Blu-ray and enable super-white. This setting extends the range to 0-255 while keeping black at 16 and white at 235 so that you can see BTB/WTW. Blu-rays and most video content are mastered to 16-235. Detail below 16 (BTB) should not be visible anyway, and detail above 235 (WTW) is mostly not visible to the human eye. Technically speaking, you should try to preserve BTB/WTW detail if possible, but it's not the end of the world if you can't.

Again, this has little to do with consoles/PCs, but more in regards to Blu-Ray/video playback/display calibration.
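The super-white behaviour described above can be modelled in a few lines. This is a toy model of the description only, not Sony's actual implementation; the function name and numbers are illustrative:

```python
def ps3_bd_level(code, rgb_full=False, super_white=False):
    """Toy model: what an 8-bit Blu-ray luma code value becomes on output.
    code uses video levels: 16 = reference black, 235 = reference white,
    with possible BTB (<16) and WTW (>235) excursions on the disc."""
    if rgb_full:
        # RGB output remaps 16-235 to 0-255 and clips BTB/WTW
        code = min(max(code, 16), 235)
        return round((code - 16) * 255 / 219)
    if super_white:
        # YCbCr + super-white: range opens to 0-255 but black stays
        # at 16 and white at 235, so BTB/WTW pass through intact
        return code
    # YCbCr without super-white: clipped at the reference levels
    return min(max(code, 16), 235)

print(ps3_bd_level(240, super_white=True))   # 240 -- WTW preserved
print(ps3_bd_level(240))                     # 235 -- WTW clipped
print(ps3_bd_level(240, rgb_full=True))      # 255 -- clipped, then remapped
```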
 
What display do you have (brand and, if possible, model)? Displays as well as the PS4 have automatic settings for RGB range. My display's auto mode seems to be able to correctly pick up the RGB output range from the PS4, but it cannot correctly pick up the PS3's RGB output range. It's probably just a matter of setting up your display appropriately, if your display does in fact support RGB full range.

I have an LG LW5600; it would have to be a really old display to not support RGB Full range properly. I basically have to create two profiles, one with the correct black level for the PS3 and 360 in full range, and another for the PS4 and Wii U, because the Wii U doesn't even have the option for RGB full range and the PS4's just doesn't work. It's not good, because I can only have one game profile for low input lag.
 
... because the Wii U doesn't even have the option for RGB full range and the PS4's just doesn't work.

It does work, with two known limitations: the PS4 doesn't translate video levels for Blu-ray playback, and there's a web browser bug that makes calibrating via web pages (since the PS4 doesn't have a picture viewer like the PS3 does) somewhat difficult.
 
I have an LG LW5600; it would have to be a really old display to not support RGB Full range properly. I basically have to create two profiles, one with the correct black level for the PS3 and 360 in full range, and another for the PS4 and Wii U, because the Wii U doesn't even have the option for RGB full range and the PS4's just doesn't work. It's not good, because I can only have one game profile for low input lag.
I have LG displays older than yours that support RGB Range fine. The problem with LG displays is that their RGB range setting affects all signals/inputs when it should only affect RGB signals. This could cause conflicts with devices that share an HDMI input.

Not sure where you're located, but on North American LG displays, the RGB range setting is called 'HDMI Black Level' or simply 'Black Level', and the options are Low (16-235/Limited) and High (0-255/Full).
 
I don't know what devs optimize for on the Xbox One! I wish they would push limited RGB, though; it would make everything simpler! Everything uses limited but the PC!
 
Stacey Spears and Don Munsil are very educated guys in the video industry (yes Stacey is a guy), but keep in mind that their goal is to first and foremost optimize video playback, not so much video games or PCs. They use different (albeit similar) standards. Plus it makes it easier because not all displays support RGB Full, and there's little to no benefit in using RGB full over Limited if your display supports both. The setting is mostly there on consoles for compatibility purposes. Technically speaking, Limited is perhaps better for video, but Full is better for PCs and consoles. But as I have said over and over, there is little to no difference either way. The most important thing is that you match your levels.

In regards to the AVS quote, they are simply describing issues with sending full range to a display that doesn't support full range. With every display I've owned that supported both, I have not had issues setting my PS3 or PS4 to Full range. The display issues with RGB Full range appear to be an XB1 issue. So for now anyway, just use Limited. Only people with PC monitors that only support Full range will be affected by the black crush issue with Full range on XB1.

As for the reddit quote, it doesn't really make any sense. There's almost no difference between Full or Limited assuming everything in the chain is on the same page and assuming black/white are correctly mapped, which the PS3/PS4 do, but XB1 seemingly doesn't. Maybe it's because Sony knows a thing or two about video and displays since they make Blu-Ray players and TVs. X360 had issues with proper output as well.

edit: This isn't really related to this discussion, but I just want to point something out in regards to this quote from the AVS post:

On the PS3, whether you use Limited or Full, BTB/WTW is clipped regardless. The only way to pass BTB/WTW on the PS3 is to use YCbCr output for Blu-ray and enable super-white. This setting extends the range to 0-255 while keeping black at 16 and white at 235 so that you can see BTB/WTW. Blu-rays and most video content are mastered to 16-235. Detail below 16 (BTB) should not be visible anyway, and detail above 235 (WTW) is mostly not visible to the human eye. Technically speaking, you should try to preserve BTB/WTW detail if possible, but it's not the end of the world if you can't.

Again, this has little to do with consoles/PCs, but more in regards to Blu-Ray/video playback/display calibration.
I have heard this a couple of times over the years; can you explain more about this? I will post my findings for the PS4 later today.
 
BTB/WTW is more related to Blu-ray/video than games/PCs. Blu-rays are mastered using a range of 16-235, where 16 is reference black and 235 is reference white. Unlike RGB Limited, the range isn't so much compressed as cut off at those levels. Therefore, there is additional picture information below 16 (BTB) and above 235 (WTW). However, since Blu-rays are mastered to 16-235, levels below 16 shouldn't be visible, and levels above 235 are typically not seen; they're just specular highlights like chrome reflections or sparkles etc.; there's no actual 'detail' there.

Enabling the PS3's super-white setting essentially opens the full range so that you can see BTB/WTW information, while disabling it essentially clips the range at video reference black (16) and video reference white (235). Seeing the full range helps with setting the Brightness (black level) and Contrast (white level) on your display. When it comes to calibrating for video, typically you don't want to see anything at level 16 and below, but you want to avoid clipping white detail completely, or at the very least you want to avoid clipping up to 235.
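That calibration rule, nothing visible at or below 16 while everything up to 235 stays distinguishable, can be sketched as a check against a toy display response (the linear response and the brightness offset here are simplifications for illustration only):

```python
def light_out(code, brightness_offset=0):
    """Toy limited-range display: linear response, clipped at black."""
    return max((code - 16 + brightness_offset) / 219, 0.0)

# Brightness set correctly: 16 and every BTB level below it are equally
# black, and white detail just below 235 is still distinguishable.
print(all(light_out(v) == light_out(16) for v in range(0, 17)))   # True
print(light_out(234) < light_out(235))                            # True

# Brightness set too high: BTB bars become visible, so you'd turn it down.
print(all(light_out(v, 6) == light_out(16, 6) for v in range(0, 17)))  # False
```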

I'm not exactly sure what the PS3 does when it converts to RGB for Blu-Ray, but there's no way to view BTB/WTW information with RGB output. I'm guessing it either simply maps reference video black/white and ignores BTB/WTW information; or it converts YCbCr to RGB Full Range and BTB/WTW information is lost, then compresses to RGB Limited if selected. But again, because Blu-Ray is mastered to 16-235, it's not the end of the world if you clip BTB/WTW.

The PS4 is different from the PS3 in the way it handles Blu-ray -- it always outputs YCbCr, and it appears to always output BTB/WTW. I still don't know what the YCbCr range setting does.

When it comes to game consoles, assuming the mapping of the RGB range is done properly, and everything in the chain is matched, then nothing is being clipped. It's either outputting full range 0-255 RGB, or it will be compressed to limited range 16-235 RGB. The PS3/PS4 output proper levels, but the XB1 seems to have issues outputting RGB full range at the moment. The X360 also outputs an unusual gamma curve AFAIK.
 
BTB/WTW is more related to Blu-ray/video than games/PCs. Blu-rays are mastered using a range of 16-235, where 16 is reference black and 235 is reference white. Unlike RGB Limited, the range isn't so much compressed as cut off at those levels. Therefore, there is additional picture information below 16 (BTB) and above 235 (WTW). However, since Blu-rays are mastered to 16-235, levels below 16 shouldn't be visible, and levels above 235 are typically not seen; they're just specular highlights like chrome reflections or sparkles etc.; there's no actual 'detail' there.

Enabling the PS3's super-white setting essentially opens the full range so that you can see BTB/WTW information, while disabling it essentially clips the range at video reference black (16) and video reference white (235). Seeing the full range helps with setting the Brightness (black level) and Contrast (white level) on your display. When it comes to calibrating for video, typically you don't want to see anything at level 16 and below, but you want to avoid clipping white detail completely, or at the very least you want to avoid clipping up to 235.

I'm not exactly sure what the PS3 does when it converts to RGB for Blu-Ray, but there's no way to view BTB/WTW information with RGB output. I'm guessing it either simply maps reference video black/white and ignores BTB/WTW information; or it converts YCbCr to RGB Full Range and BTB/WTW information is lost, then compresses to RGB Limited if selected. But again, because Blu-Ray is mastered to 16-235, it's not the end of the world if you clip BTB/WTW.

The PS4 is different from the PS3 in the way it handles Blu-ray -- it always outputs YCbCr, and it appears to always output BTB/WTW. I still don't know what the YCbCr range setting does.
Yeah, strange. I just checked, and the YCbCr range setting doesn't do anything; I am getting white values higher than 235 with either limited or full. Shame you can't switch to RGB when viewing movies. There is a way to watch RGB limited/full videos, though: through the internet browser, using a DLNA program called Plex Media Server. I downloaded the MP4 AVS 709 black/contrast test pattern videos on my laptop and managed to stream them through the PS4 browser.

The result is that the contrast clips at 235 on both limited and full. So it's just like the PS3. Does the PS4 actually output true 0-255? I heard that the PS3 did not actually output full RGB even when selected, and that it just brightened 16-235 to simulate 0-255. I'll have to search for it again.
 
I don't know why there is so much fuss about BTB/WTW.
It's not like you are supposed to see anything in that range, and if 16 is black on your display, then you will not see anything from BTB anyway.

Does the ps4 actually output true 0-255? I heard that the ps3 did not actually output full rgb even when selected, and that it just brightened 16-235 to simulate 0-255? I'll have to search it again.

It doesn't make sense; why limit the range at the game rendering level when almost every display can take advantage of the full RGB range?
I'm aware that limited and full look almost the same on a properly set-up display, but from a mathematical point of view, limited has roughly six million fewer colors than full, so banding is more likely to be visible in the picture.
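The exact count is easy to recompute: full range has 256 levels per channel, limited has 220 (16 through 235 inclusive), so:

```python
full = 256 ** 3                  # 16,777,216 representable colours
limited = (235 - 16 + 1) ** 3    # 220 levels per channel = 10,648,000
print(full - limited)            # 6129216 fewer colours in limited range
```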
 
I don't know why there is so much fuss about BTB/WTW.
It's not like you are supposed to see anything in that range, and if 16 is black on your display, then you will not see anything from BTB anyway.



It doesn't make sense; why limit the range at the game rendering level when almost every display can take advantage of the full RGB range?
I'm aware that limited and full look almost the same on a properly set-up display, but from a mathematical point of view, limited has roughly six million fewer colors than full, so banding is more likely to be visible in the picture.
Maybe they limit consoles to RGB limited to save processing power? Not sure, but here is the source for my other comment:
They are to do with support for legacy RGB DVI displays which support only 0-255 RGB levels and not the broadcast/studio standard of 16-235 RGB levels which almost all HDMI displays should support, and is the levels standard used for DVDs and BluRays.

Many displays will cope with either setting and deliver near identical results - though will need re-calibration depending on which one you chose. If you do not recalibrate, then FULL will appear to have deeper, crushed blacks and thus more saturated colours, and brighter whites. However with correct calibration on a studio level capable display there should be no difference. Additionally Limited levels will allow blacker-than-black and whiter-than-white to be passed through, whereas Full levels can't allow this as anything below 16 or above 235 in studio levels terms is mapped to 0 or 255 in Full (aka PC level terms)

In FULL mode the source - which is 16-235 - is remapped to 0-255 - so blacks which are encoded on the BluRay source at level 16, are not output at 16 and instead reduced to 0. Whites which are encoded on the BluRay source at level 235 are scaled to hit 255. Anything below 16 or above 235 (so called Blacker than Black and Whiter than White information) is clipped in FULL mode - it is NOT passed.

The key thing to understand is that broadcast TV, DVDs, HD-DVDs and BluRay are mastered with black at 16 and White at 235 (whether RGB or YCrCb representation are used - and Cr Cb are 16-240 centred around 127) These are known as studio or broadcast levels - and have a narrower black-to-white range to allow for below-black and above-white excursions to be carried without clipping - which is an important issue when you are mixing analogue and digital video sources (Transitions can cause spikes in analogue circuitry that will go past black and white levels, if these are clipped, they will cause ringing - i.e. artificial black/white edge distortion - when converted back to analogue.)

The fact that FULL and LIMITED are not simply different ways of displaying a signal with the same range - as you suggest - is clearly visible when you flip between modes - as in FULL mode the black level drops and white level increases. This is NOT what would happen if the switch was simply between passing <16 and >235 or not and keeping black at 16 and white at 235 - you would get no black or white level shift. But you do.

Super White is the option that allows whiter-than-white to be passed - not FULL.

FULL is simply an option Sony added to remap 16-235 studio levels to the older DVI RGB standard (previously uncatered for in PS3) using PC levels of 0-255. It is NOT to do with passing blacker than black or whiter than white - as it clips <16 and >235 levels in the remap process. This is important for projectors and owners of older HDTVs with DVI inputs added for use with PCs rather than video sources.

It is a pity Sony chose FULL and LIMITED as descriptions - as it implies FULL offers better results and LIMITED is somehow inferior. (I'd have thought STUDIO and COMPUTER levels might have been better)
http://www.neowin.net/forum/topic/622671-rgb-fulllimited-discovering-truths-and-calibrating-your-tv/
Yes, I totally agree, full is better than limited, but is the PS4 displaying it correctly? I tested the Lagom black level pattern via the PS4 browser, quickly switching between RGB limited and full:
[image: Lagom black level test pattern]


I do notice a slight difference on square 255. When in Limited, the white is a little muted. When in full, the white is a little brighter.

Before anyone asks, yes, my TV supports RGB limited/full, so I don't get black crush on either of them.

Also what about this?

-Limited-
[image: in-game screenshot, RGB Limited]


-Full-
[image: in-game screenshot, RGB Full]


You shouldn't be able to tell the difference between limited and full, yet the bottom image is visually darker. Is the PS4 outputting RGB full correctly for games?
 
Maybe they limit consoles to rgb limited to save processing power?
Nope. Computer graphics work in 8 bits per channel for output framebuffers. Converting to limited range actually adds a little effort to rendering.
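The "little extra effort" is just one more scale-and-offset per output value, e.g. compressing the 0-255 framebuffer into 16-235 on the way out. A sketch of that conversion (illustrative only, not any console's actual scanout code):

```python
def full_to_limited(v):
    """Compress a full-range (0-255) framebuffer value into 16-235."""
    return round(16 + v * 219 / 255)

print([full_to_limited(v) for v in (0, 64, 128, 255)])   # [16, 71, 126, 235]
```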


Also what about this?

You shouldn't be able to tell the difference between limited and full, yet the bottom image is visually darker. Is the ps4 outputting rgb full correctly for games?
When viewed on the same display, you will see a difference. If the display you are viewing them on is set to limited, the bottom pic will have black crush. If viewing on a full RGB display (like this PC), the top pic will lack contrast. They will only look the same when viewed side-by-side on two different displays with two different outputs.
 
Yeah, strange. I just checked, and the YCbCr range setting doesn't do anything; I am getting white values higher than 235 with either limited or full. Shame you can't switch to RGB when viewing movies. There is a way to watch RGB limited/full videos, though: through the internet browser, using a DLNA program called Plex Media Server. I downloaded the MP4 AVS 709 black/contrast test pattern videos on my laptop and managed to stream them through the PS4 browser.
The only time you really need to switch to RGB for movies is if you're using a monitor that only supports full range. YCbCr is ideal for Blu-Ray 99% of the time.

The result is that the contrast clips at 235 on both limited and full. So it's just like the PS3. Does the PS4 actually output true 0-255? I heard that the PS3 did not actually output full RGB even when selected, and that it just brightened 16-235 to simulate 0-255. I'll have to search for it again.
It outputs true full range AFAIK.
 
djskribbles I actually love you.

What you explained pretty much confirms a general 'feel' I had: on my TV at least, I get a punchier picture when playing Blu-rays than when playing games. It's undeniable that the way my display shows an RGB signal and a YCbCr one is very different.
 
The only time you really need to switch to RGB for movies is if you're using a monitor that only supports full range. YCbCr is ideal for Blu-Ray 99% of the time.

It is ideal only because that is how Blu-rays are encoded, so with YCbCr you get the original without any conversion.

I quite like how the PS3 does the conversion to Full RGB with chroma upsampling and actually respects what the user set as the output format.
 