Digital Foundry Article Technical Discussion Archive [2014]

Status
Not open for further replies.
Doesn't limited range crush information to begin with, if the range is 16-235? We had a detailed discussion about this a while ago, but I can't remember exactly.
 
Not again with this full/limited discussion!
Limited doesn't crush anything; it's just the way the system and the TV map the range. If they match, no crush will occur. It's when they don't match that issues start appearing.
 
Not again with this full/limited discussion!
Limited doesn't crush anything; it's just the way the system and the TV map the range. If they match, no crush will occur. It's when they don't match that issues start appearing.

It doesn't change the full "contrast" of colours that can be displayed by the display device, but it sure reduces the precision somewhat, as you have fewer integers to represent your colour.
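A quick way to see that precision point is to count codes. This is an illustrative Python sketch (not any console's actual pipeline) that maps full-range 8-bit values into the 16-235 limited range and back:

```python
# Illustrative sketch: mapping full-range (0-255) values into the
# limited (16-235) range and back. Many distinct full-range values
# collapse onto the same limited-range code, which is the precision
# loss described above.

def full_to_limited(v):
    # Scale a full-range 8-bit value into the 16-235 "video" range.
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    # Inverse mapping back to full range.
    return round((v - 16) * 255 / 219)

# Count how many values survive a round trip unchanged.
round_trip = [limited_to_full(full_to_limited(v)) for v in range(256)]
lost = sum(1 for v in range(256) if round_trip[v] != v)
print(f"values altered by the round trip: {lost} of 256")
```

Since 256 input codes get squeezed into 220 output codes, some collisions (and hence some round-trip errors) are unavoidable.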
 
Scott, what kind of display do you have? This has never been easy enough for consumers, because the terminology between manufacturers and what settings do what varies so wildly. I assume your Xbox is set to "TV Color Space" (i.e. limited), but how do you know the TV is also set correctly - is there a limited/full RGB setting? On my TV the option which controls that is called "black level", which seems retarded to me. The other factor at play here is gamma, so even if you do have the range matched, your gamma setting could still be off. If you have those right, and then calibrate brightness/contrast/etc., you should be in good shape. I don't have any problems with the titles you mentioned.

I have a Samsung display. From what I've understood, "HDMI black level: Low" should be limited range. How do you match the gamma? Is that a calibration thing, or is there some other setting I haven't discovered? I've turned off all the dynamic contrast stuff, or post processing stuff as recommended. My tv is in game mode.

Edit: This whole thing is annoying because it triggers some kind of OCD in me that wants to make sure everything is set correctly. It's like this little nagging doubt.
 
Can't the TV, console, and capture card simply talk to each other using HDMI CEC?

Console: Hey TV, I'm running on Limited RGB.
TV: Roger!
This doesn't even require CEC (which is mostly remote control pass-through); negotiating optimum audio and video settings between content provider and content display/capture device is basic HDMI. I've always set my devices to 'auto' and let the hardware sort it out. That way games and Blu-ray Discs automatically look right, including discs that support YCbCr.
That is not correct. There are no components of HDMI signaling responsible for negotiating the gamma curve or the RGB range to use. It only specifies the color space and bit depth to use.

We were discussing limited/full RGB. But just to clarify, by "basic HDMI" I mean this is something the device providing the content and the display/capture device should resolve through negotiation, which is where the display (and capture) device declares what formats it can a) support and b) accurately display. The device outputting the content should be the driver (i.e. the decider) based on what it knows the display can handle.
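The negotiation logic described above can be sketched roughly as follows. This is a hypothetical illustration only - in real HDMI the sink's capabilities come from its EDID, and the field names here are made up:

```python
# Hypothetical sketch of source-driven negotiation: the source device
# inspects the sink's declared capabilities (in real HDMI these come
# from the EDID) and picks an output format the sink supports.
# The dictionary keys are invented for illustration.

def pick_output(sink_caps):
    # Prefer full-range RGB if the sink declares support for it,
    # otherwise fall back to the limited range every sink must handle.
    if sink_caps.get("full_range_rgb"):
        return {"colorspace": "RGB", "range": "full"}
    return {"colorspace": "RGB", "range": "limited"}

tv = {"full_range_rgb": True, "ycbcr": True}
print(pick_output(tv))  # the source is the decider, as described above
```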

Lots of TVs can accept a BT.601/xvYCC colorspace but will convert to the BT.709 equivalent for display, because the panels don't have the range. These extended gamuts are incorporated in HDMI 1.3 and above, and most Sony devices support them because Sony proposed this HDMI extension standard back in 2005.

orangpelupa's point/plea was that capture cards should know when to capture limited/full RGB so as not to introduce things like black crush when it wouldn't appear on a TV - at least I think that was his sentiment. I assume capture cards operate like an HDMI display in terms of negotiation/setup, but they may not be that sophisticated.

Like others I also don't want to revisit the horror of the limited vs full RGB "discussion", and I use that word in its loosest sense :yep2:
 
It doesn't change the full "contrast" of colours that can be displayed by the display device, but it sure reduces the precision somewhat, as you have fewer integers to represent your colour.

On any halfway decently calibrated display the difference should be barely, if at all, perceptible.
 
I have a Samsung display. From what I've understood, "HDMI black level: Low" should be limited range. How do you match the gamma? Is that a calibration thing, or is there some other setting I haven't discovered? I've turned off all the dynamic contrast stuff, or post processing stuff as recommended. My tv is in game mode.

Edit: This whole thing is annoying because it triggers some kind of OCD in me that wants to make sure everything is set correctly. It's like this little nagging doubt.

I don't think you can adjust the gamma curve without tinkering with the tv's service mode settings (which isn't something you should be doing unless you have in-depth knowledge of color calibration, the internal workings of your tv, and the proper tools). No idea what kind of Samsung tv you have, but my brother and I both have Samsung Plasmas which were calibrated damn near perfectly out of the box.
 
What display do you have? Samsung TVs support full RGB and wide color gamut since 2006 or so.
 
No idea what the name of the thing is precisely. I have Samsung's top of the line Plasma from 2009. My brother has a lower end model from 2011, which is at least as good in terms of basic IQ, plus it has 3D which beats the pants off of any LCD TV's 3D I've seen so far. The only things my TV has on his are the anti-reflective coating and a more pleasant form factor, I think.
 
What display do you have? Samsung TVs support full RGB and wide color gamut since 2006 or so.

Mine supports full RGB and wide colour gamut, I just left it as limited range which is the default when you turn it on. The Xbox One calibration tool recommends limited range for tvs.
 
Don't believe that. Always use full when possible!

If I remember correctly, imagine it like this. A game's graphics eventually end up in GPU memory in something like RGB, where each of the three values has a range of 0-255. If your TV only accepts 16-235, these values need to be compressed into the 16-235 encoding range. So here, information is compressed: the number of combinations, 220^3, is lower than 256^3. Then the TV receives it as three 16-235 values, but can display 0-255, or as is often the case these days, more than that (Wide Color Gamut etc.). So the information is translated back to 0-255, and although less information is lost in this step, some still is, because the ranges aren't an exact multiple of each other (it's not, say, a 2x translation).

Then there is Deep Color support, which allows color information to be output at higher precision than 8 bits per channel, like 12 bits per color, and TVs can accept that too. But if you choose limited range, you not only disable that, you drop below even 8-bit precision.

I'm sure I'm not 100% correct, but close enough I hope. I tested it on mine and already with my older TV I could see more color banding and less deep blacks.
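The combinatorics in the post above can be checked directly. A small sketch comparing the representable colour counts for full-range 8-bit, limited-range 8-bit, and full-range 12-bit ("deep colour") RGB:

```python
# Representable colour combinations per mode. Limited range uses
# 220 codes per channel (16 through 235 inclusive); 12-bit deep
# colour uses 4096 codes per channel.
full_8 = 256 ** 3
limited_8 = 220 ** 3
full_12 = 4096 ** 3

print(f"full 8-bit:    {full_8:,}")
print(f"limited 8-bit: {limited_8:,} ({100 * limited_8 / full_8:.0f}% of full)")
print(f"full 12-bit:   {full_12:,}")
```

Note the figure is 220^3 rather than 219^3, since 16-235 inclusive spans 220 integer codes.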
 
Mine supports full RGB and wide colour gamut, I just left it as limited range which is the default when you turn it on. The Xbox One calibration tool recommends limited range for tvs.

By support, do you mean it 'understands' those colorspaces or do you mean it both understands those colorspaces and the display is actually capable of reproducing them?

I would definitely expect non-budget Samsung TVs, going back quite a few years, to be able to reproduce extended gamut colorspaces on the panel, but there are plenty of TVs that can accept the signal yet do a conversion because the panel isn't actually able to reproduce the wider gamut.
 
By support, do you mean it 'understands' those colorspaces or do you mean it both understands those colorspaces and the display is actually capable of reproducing them?

I would definitely expect non-budget Samsung TVs, going back quite a few years, to be able to reproduce extended gamut colorspaces on the panel, but there are plenty of TVs that can accept the signal yet do a conversion because the panel isn't actually able to reproduce the wider gamut.

No idea how I'd be able to tell. It's a fairly new Smart tv. I think it's 2 years old.
 
No idea how I'd be able to tell. It's a fairly new Smart tv. I think it's 2 years old.
You're probably good then :yes:

But I make this point to illustrate that just because a TV "supports" a particular colorspace or video format doesn't mean the panel can display it. It's like 1366x768 panels accepting a 1080p input but the panel not being able to reproduce it.

I started to pay a lot more attention to display tech shortly after Apple were exposed for shipping 6-bit TN panels on certain Mac models and claiming they could display 16.7m colours. Technically yes, but not in any objective technical sense :nope:
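The arithmetic behind that 6-bit panel claim is simple to lay out. A sketch of the colour counts, assuming the usual explanation that such panels use temporal dithering (FRC) to simulate the missing shades:

```python
# A true 8-bit-per-channel panel natively addresses 2^24 colours;
# a 6-bit-per-channel panel natively addresses only 2^18. The gap
# is bridged by dithering/FRC, which simulates rather than displays
# the intermediate shades - hence "16.7m colours" is only nominally true.
native_8bit = (2 ** 8) ** 3   # 16,777,216 colours
native_6bit = (2 ** 6) ** 3   # 262,144 colours

print(f"8-bit panel: {native_8bit:,} native colours")
print(f"6-bit panel: {native_6bit:,} native colours "
      f"({native_8bit // native_6bit}x fewer)")
```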
 
Don't believe that. Always use full when possible!

If I remember correctly, imagine it like this. A game's graphics eventually end up in GPU memory in something like RGB, where each of the three values has a range of 0-255. If your TV only accepts 16-235, these values need to be compressed into the 16-235 encoding range. So here, information is compressed: the number of combinations, 220^3, is lower than 256^3. Then the TV receives it as three 16-235 values, but can display 0-255, or as is often the case these days, more than that (Wide Color Gamut etc.). So the information is translated back to 0-255, and although less information is lost in this step, some still is, because the ranges aren't an exact multiple of each other (it's not, say, a 2x translation).

Then there is Deep Color support, which allows color information to be output at higher precision than 8 bits per channel, like 12 bits per color, and TVs can accept that too. But if you choose limited range, you not only disable that, you drop below even 8-bit precision.

I'm sure I'm not 100% correct, but close enough I hope. I tested it on mine and already with my older TV I could see more color banding and less deep blacks.

Nope - because of how inconsistent settings are full will cause more problems.
Nope - the in-memory representation gets converted to YUV either way. So 16-235 or 0-255 will contain the same breadth, since values below 16 and above 235 are possible even in "limited" space.
Nope - choosing "limited" or TV space does not disable or affect deep 10 or 12 bit color.
Nope - if you see banding it's for some other reason.
 
I have a Samsung display. From what I've understood, "HDMI black level: Low" should be limited range. How do you match the gamma? Is that a calibration thing, or is there some other setting I haven't discovered? I've turned off all the dynamic contrast stuff, or post processing stuff as recommended. My tv is in game mode.

Edit: This whole thing is annoying because it triggers some kind of OCD in me that wants to make sure everything is set correctly. It's like this little nagging doubt.

Understandable. Black level low should indeed be limited. Your TV should have a gamma setting menu option, in advanced or details or somewhere. If you think it's crushing blacks then you should see if you can lower the setting. I would bring up the greyscale ramp on the Xbox calibration screen and make sure there isn't a short range or defined line where things go from grey to black. It should be a smooth gradient between both white ticks. If you post your model number, we can Google for gamma settings.

The only way to guarantee the output is correct is to have it measured but usually you can get it pretty close on your own. Good luck!
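A greyscale ramp like the one on the calibration screen is easy to generate yourself. A minimal sketch (values only; rendering them as patches is left to whatever image tool you prefer):

```python
# Generate 32 greyscale patch values stepping evenly from black (0)
# to white (255). Displayed on a correctly set-up TV, every adjacent
# pair of patches should be distinguishable; if the darkest few merge
# into one black block, blacks are being crushed.
patches = 32
ramp = [round(i * 255 / (patches - 1)) for i in range(patches)]
print(ramp)
```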
 
On most Samsung TVs: Menu -> Picture -> Advanced Settings to access gamma settings.
 
I don't think you can adjust the gamma curve without tinkering with the tv's service mode settings (which isn't something you should be doing unless you have in-depth knowledge of color calibration, the internal workings of your tv, and the proper tools). No idea what kind of Samsung tv you have, but my brother and I both have Samsung Plasmas which were calibrated damn near perfectly out of the box.

There really is no such thing as perfect out of the box calibration, because calibration implies matching the display's settings to the input signal, and since the whole problem with this video crap is that those input settings vary wildly from device to device, the display can't be pre-tuned. Only best guesses can be made.

iPads, tablets, laptops, all-in-ones, etc. that you see reviewed with excellent factory calibration are not good comparisons, since the video signal source is defined ahead of time. This is completely different from a TV or other display that is going to get hooked up to an unknown signal source.
 
Understandable. Black level low should indeed be limited. Your TV should have a gamma setting menu option, in advanced or details or somewhere. If you think it's crushing blacks then you should see if you can lower the setting. I would bring up the greyscale ramp on the Xbox calibration screen and make sure there isn't a short range or defined line where things go from grey to black. It should be a smooth gradient between both white ticks. If you post your model number, we can Google for gamma settings.

The only way to guarantee the output is correct is to have it measured but usually you can get it pretty close on your own. Good luck!

Thanks for the help. Sent you a pm, because we're in derail territory.

If you own an xbox one, and you haven't done some other form of calibration, use the built in tool in the display settings and follow this advice I got from Rockster:

As you "next" through the calibration tool, you'll get to a screen called brightness with a couple of eyeballs and a grayscale ramp down the right hand side of the screen, with a couple of white ticks marking either end. That gradient should be smooth throughout between the ticks. The ramp has a smooth gradient side on the right and an averaged-by-region side on the left. Pay attention to the blacks up top: the brightness should change by the same amount between each region. If you see a sharp divide in the smooth gradient, where it goes from grey to black very quickly, then your gamma is too low. You also don't want it to be too dark all the way down the screen - that's gamma too high. Light intensity should vary by equal amounts in each region. After you have that correct, adjust your brightness control to the notch which just causes the top eyeball to disappear. You might need to readjust after setting contrast and find what you like best, but the most important thing is getting gamma to ramp correctly.

Made a huge difference for me. You just need to find out if your tv allows you to adjust gamma.
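The gamma behaviour described above can be shown numerically. A small sketch of how the same signal steps come out at different display gamma values (illustrative power-law model, not any particular TV's curve):

```python
# Display light output is roughly signal ** gamma (signal in 0.0-1.0).
# At higher gamma the dark steps stay dark longer (the ramp looks
# crushed up top); at lower gamma shadows wash out and the grey-to-black
# transition is compressed into a sharp divide.

def display_luminance(signal, gamma):
    # signal in 0.0-1.0; returns relative light output in 0.0-1.0
    return signal ** gamma

for gamma in (1.8, 2.2, 2.6):
    steps = [round(display_luminance(s / 10, gamma), 3) for s in range(11)]
    print(f"gamma {gamma}: {steps}")
```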
 
Face-Off: Call of Duty: Advanced Warfare

Call of Duty: Advanced Warfare - the Digital Foundry verdict

Given more time to closely scrutinise Advanced Warfare across a breadth of stages, the tussle between PS4 and Xbox One is an easy one to summarise. Owners of Sony's latest kit enjoy a pure, true, unadulterated 1080p image throughout the game, planting its flag closest to the PC's pristine standard of presentation. The one downside for this version is its propensity for frame-rate drops in campaign mode - wavering between 50-60fps under load, and stuttering very occasionally lower. The Xbox One, meanwhile, handles this solo mode with far fewer dips, albeit with the introduction of tearing at stress points.

This has no bearing on the multiplayer mode, where both PS4 and Xbox One are optimised to hold 60fps at a consistency we'd expect of the series. Odd single frames are skipped every now and then while jet-boosting around taxing stages like Instinct - but these are largely imperceptible blips on an otherwise straight 60fps line. If you only have eyes for multiplayer, either platform comfortably satisfies in the frame-rate stakes.

As a detracting point for Microsoft's platform, the resolution is often at the 1360x1080 point in campaign, only really rising to a full 1920x1080 outside of battle. This dynamic framebuffer doesn't translate as we'd expect to multiplayer either, where it's fixed to the lower number in perpetuity - resulting in a cut-off in image clarity the further into the distance you look. For us, this would be one of the bigger points of consideration, but it's not so much of an issue in the single-player campaign, owing to its more heavily post-processed image.

However, the PS4 and Xbox One each share the highest quality textures, effects and geometry of the maxed-out PC version. They only fall noticeably short in three areas; the low-grade anisotropic filtering on textures, less accurate specular mapping for reflections, and opting for SSAO, rather than the PC's subtler HBAO+ shading around objects. Otherwise, you get the full deal, complete with subsurface scattering.

Overall, the PS4's superior image quality compared with Xbox One makes it the choice pick on the multiplayer front, with both holding up here at 60fps. As for the campaign mode's playability, it's an apples and oranges contest between the Xbox One's performance lead and the PS4's resolution advantages, with little else in-between. But for those equipped to do so, the PC version is a tantalising alternative that deserves respect for putting the series back on track.
 