1080p HDR image better than 4K non-HDR?

Having bought a 'clever' WRGB-display Samsung Note Pro and seen how shitty the yellows are, I'm mistrustful of clever solutions that skirt around delivering real colour. All the claims for WRGB were quite frankly bollocks, because they had to sacrifice the screen's ability to display content properly. Mention 'additional white' OLEDs and I immediately think of saturation so low the colour becomes useless.

Basically, all the tech-speak in the world means nothing. I need to see a screen in the flesh to know whether it's good or not, not some amazing science-speak.

The other point, which would be surprising if you weren't worldly-wise, is that OLED seemed like an ideal solution - individual light sources for each primary - but it's never that easy. Things like LED half-life mean the simple ideal is never realised and panel makers have to jump through hoops. So whenever you come across a Scientific American or New Scientist article extolling a fabulous new tech that's just around the corner and will be better and cheaper and faster, scrunch it up and chuck it in the bin with the methanol fuel cells that power your mobile devices and the SED TVs. The only thing these dream techs are really good for is fleecing investors.

Whoa, now we're going into different territory! Your 'clever' is not my 'clever'. What you own is actually an 'RGBW' OLED, not 'WRGB'. Despite the confusing similarity, they are completely different!

PenTile is a completely different monster. It is much, much more problematic because it actually tries to skirt around reduced resolution. The LG OLED TVs do not: they use 4 subpixels for every pixel while still having 8 million pixels for "full" 2160p resolution. PenTile, on the other hand, uses the extra subpixel to try to create extra resolution.

http://www.displaydaily.com/display...-out-for-ultra-hd-tvs-with-reduced-resolution

There's a benchmark done with LG's RGBW PenTile IPS LCDs, and you can see they failed every test, whereas LG's WRGB OLEDs passed every test with flying colors!

http://www.displaydaily.com/display-daily/32947-resolved-no-more-dot-counting

Then the inventor of PenTile comes in and criticizes him.

Can't say I disagree with everything she says, though. I also own a Samsung plasma TV that uses PenTile. While it's no good for text work, and compatibility with games was hit and miss, for general TV/movie content it was pretty decent. Since today's HD broadcasts and streams have poor bitrates, PenTile actually helps mask bitrate deficiencies. While my PenTile plasma has about the same subpixel count as a 720p TV, it still looked much closer to 1080p than 720p. (And yes, I also own a 720p Samsung plasma. I just can't get enough of my plasmas!)
 
BTW, MicroLED will not be the ultimate solution either. Since it uses LEDs, it will have the same restrictions as current LED-backlit TVs: it needs a color filter in order to reach a larger color gamut.

https://www.osapublishing.org/oe/viewmedia.cfm?uri=oe-23-25-32504&html=true

This paper describes a method of implementing quantum dots as a color filter to increase the color gamut by 1.5x. It's also one of the methods Samsung is contemplating for their QLEDs. The paper also mentions that even inkjet printing would not be ideal and that much harder aerosol jet printing would be needed. Jumping through all those hoops only to still not be as good as RGB OLED PQ-wise?

http://www.asahi-optics.com/news/36.html

Another strike against MicroLED: larger TVs will also be extremely difficult to make. Making a color MicroLED display is also a challenge due to the differences between the individual R, G, and B LEDs. This is why Apple is only trying to mass-produce it for their Apple Watch line, which is both very small and lacks color. There's a reason Sony's CLEDIS costs six figures.

So, currently, we're in a tough situation.

Samsung's RGB OLED = Provides the best PQ (no color filters, natively obtained color gamut is very high, very little ABL: it can provide over 400 cd/m2 at 100% full white), but lifetime remains a perpetual issue and it can't be made into bigger TVs because the LTPS FMM bends too much when trying to make a larger panel. (Samsung learned this lesson the hard way with their 55-inch OLED TV: production costs were three times that of LG's WOLED.)

MicroLED = Solves the lifetime issues, and is very bright yet extremely power efficient. But there are a lot of hurdles to mass production: bigger TVs are difficult to make, getting it to produce color is another complication, and there's very little headroom on yields. It also has an extremely low fill factor, meaning the RGB subpixels are very small, so the screen-door effect will be most severe on it; Sony's CLEDIS was no exception.

Soluble OLED = Same RGB structure as Samsung's current mobile OLEDs, so PQ will also be excellent. Inkjet printing costs much less than current vapor-deposition-based OLED, so it has the potential to be cheaper than LCD. Unfortunately, it will be even worse for burn-in because blue lifetime drops from 20,000 hours to 8,000 hours.

LG/Kodak's WOLED = Relatively free from burn-in. Larger TV production is possible by using IGZO, a type of oxide backplane. Production cost is the cheapest save for inkjet printing. Inefficient due to the use of color filters (draws 250 W yet only nets 150 cd/m2 at 100% APL). Screen-door effect is present due to the use of 4 subpixels (less space for the RGB subpixels), but it's still much better than MicroLED in this regard.

Out of the four, one can see why LG's WOLED is the best interim solution.
 
I just caught this bit from the Eurogamer article ( http://www.eurogamer.net/articles/d...-us-remastered-patch-108-for-ps4-pro-analysed ) and wondered if maybe it's the way to go for those without 4K displays. However, the price of the device ($199) is a good chunk of what a new display would cost, though it might be cheap enough for some to give it a go.


HD Linker: accessing 4K-exclusive Pro features on 1080p screens

Some players with 1080p displays have asked if there is any way to access PS4 Pro game modes only available on 4K screens - specifically down-sampling modes like those stripped from TLOUR. Yes there is, we've tested it and it works - but it is somewhat expensive. You'll need a device known as the HD Linker. It's designed to enable UHD functionality on 1080p screens but that's just one of its many functions.

Users connect the HD Linker between the Pro and a 1080p display and it's powered via USB. The Pro sees that it's attached to an HDCP 2.2-compliant display and the Linker effectively downscales the Pro's 4K output to full HD. We've tested it on a Pro attached to a Panasonic VT20 plasma display and it works just fine - 4K functionality is accessible from the video output settings menu as per normal, even though a full HD screen is attached.

As the Pro believes it is attached to an HDCP 2.2-compliant screen, you can also access non-gaming features too - such as the ultra HD encodes on Netflix, giving an almost Blu-ray-like presentation. At $199, it's effectively half the cost of the console itself so we mean it when we say that it's expensive - but it does do the job.
 
That's an excellent solution! If your goal is supersampled 4K output from the PS4 Pro no matter what, that will certainly make it possible. Ditto for UHD streaming services such as Netflix, M-Go, and UltraFlix. Until now, 1080p owners were locked out of UHD services simply because their displays weren't 4K. This is ridiculous, as I've watched tons of UHD demos downsampled to my 1080p plasma and they were still vastly superior to any 1080p Blu-ray movie.

Did you know Breaking Bad's UHD Netflix version is not true 4K? They actually took the HD Blu-ray version, upscaled it to 4K, and called it that. Resolution is NOT the most important factor when it comes to video quality; it's actually bitrate! Hollywood owns tons of 2K masters that are a terabyte each, and those bad boys will cream any UHD stream from Netflix.

YouTube is doing the most sensible thing: they offer different bitrate quality selections, but they label them by resolution, which confuses people. Don't worry and just choose the highest resolution regardless of the resolution of the display you own. I've tried playing YouTube 4K files on my 480i Sony Wega CRT TV, and it STILL showed an improvement over the next best (1080p) option. This is also the reason audio clarity improves when you select a higher resolution on YouTube.

However, that solution comes with one serious limitation: converting the PQ EOTF into gamma, and converting Rec.2020 into Rec.709. Unfortunately, 100% of current UHD Blu-rays are designed to run in HDR by default. What should be done to accommodate non-HDR owners? A conversion has to be performed, and it's not pretty. Tons of Samsung UHD Blu-ray player owners have complained that the colors simply do not look right on their non-HDR UHD TVs. Because the PQ EOTF is a complete departure from gamma, backwards compatibility remains a problem. Right now there are extremely few HDR offerings from Netflix, but Netflix has promised to increase them, and when that day comes the color problem will become more common. Not completely related, but it's the same thing with games too: HDR games have settled on Rec.709 for now because that ensures backwards compatibility with non-HDR TVs. Once HDR games move to the DCI-P3 color gamut, that compatibility will be lost.
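
To put numbers on the "complete departure from gamma" point, here's a rough Python sketch (just my own illustration of the standard SMPTE ST 2084 formula, not anything a player or TV actually runs) comparing what the same signal level means under the PQ EOTF versus plain 2.4 gamma on a 100 cd/m2 SDR display:

Code:
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """PQ signal (0..1) -> absolute luminance in cd/m2 (0..10000)."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def gamma_eotf(signal, peak=100.0, gamma=2.4):
    """Plain power-law gamma, relative to a 100 cd/m2 SDR display."""
    return peak * signal ** gamma

for s in (0.25, 0.5, 0.75):
    print(f"signal {s:.2f}:  PQ -> {pq_eotf(s):8.2f} nits   "
          f"gamma 2.4 -> {gamma_eotf(s):6.2f} nits")

The same code value lands at wildly different light levels (signal 0.5 is roughly 92 nits under PQ but about 19 nits under 2.4 gamma), which is why a naive pass-through looks wrong and a proper conversion/tone-mapping step is unavoidable.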

Solution? Use 3D gamut mapping along with dynamic metadata. Together, they allow real-time contraction of the wide DCI-P3 gamut into the narrower Rec.709 gamut. But as mentioned previously, 3D gamut mapping has not been introduced yet save for a few Dolby Vision TVs, so we have to wait for either HDMI 2.0C or HDMI 2.1 devices that will officially support 3D gamut mapping and dynamic metadata. It's kind of weird that HDMI 1.4 display owners need an HDMI 2.0C/HDMI 2.1 ecosystem in order to get improved backwards compatibility, but it's true.
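
For the gamut half of the conversion, the brute-force version looks something like this sketch (using the commonly published linear-light BT.2020-to-BT.709 3x3 matrix, with a simple hard clip standing in for real gamut mapping):

Code:
import numpy as np

# Commonly published linear-light BT.2020 RGB -> BT.709 RGB matrix
M_2020_TO_709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

# A fully saturated BT.2020 green lands far outside the BT.709 cube:
raw = M_2020_TO_709 @ np.array([0.0, 1.0, 0.0])
print(raw)                     # negative red/blue components -> out of gamut
print(np.clip(raw, 0.0, 1.0))  # hard clip: hue and saturation detail are lost

That hard clip is exactly the "not pretty" part: everything outside Rec.709 gets crushed onto the gamut boundary, which is why proper 3D gamut mapping plus dynamic metadata is the nicer answer.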
 
What's the cheapest 4K TV available in Europe with a real 10-bit panel? The smaller the better (well, let's not go under 40", but 40" exactly would be optimal).
 
That depends on the definition of "real 10-bit". If you're simply after a native 10-bit panel, here's the list.

Panasonic: Only the top-end UHD Premium DX902 has a native 10-bit panel. The DX750 and below use 8-bit panels.

"Unlike the top-tier DX902 however, the DX750 is not Ultra HD Premium-certified due to lack of native 10-bit panel and inability to hit 1000 nits peak brightness" (HDTVTest.co.uk)


LG: The entire UHD lineup, UH6100 and above.

"The color gradient is pretty good for an 8 bit panel. There are not any banding issues that we can see from our gradient pattern. The only little imperfections that you can see when looking at the picture are more related to the gray uniformity.

Update 09/30/2016: Our original test was showing an incorrect color depth of 8 bit due to some incorrect drivers on our system, but after some correction to our test apparatus, we tested again the color depth and we can confirm that the LG UH6100 does in fact have an 10 bit panel." (RTings)


Sony: The entire UHD lineup, X700D and above.

"The Sony X700D has a good 10 bit panel and the color reproduction is pretty even. There is not really any obvious banding problem that can be seen on our test picture that could cause a problem when watching movies." (RTings)

Samsung: The entire UHD lineup, KU6300 series and above.

"The Samsung KU6300 can display our gradient test image fairly well. On our test picture, the gradation is smooth overall in the light shades with some small anomalies in the darker shades, especially in the green color. But it should not be an issue in regular content.

Update 10/26/2016: Our original test was showing 8 bit gradations due to incorrect drivers on our system. After some correction to our test apparatus, we have retested the color depth and found that it is able to display a 10 bit gradient smoothly." (RTings)


However, I would take RTings' interpretation with a huge grain of salt. They say it was due to a driver error, but their Samsung KS series reviews from that same period were still able to resolve 10 bits (so the drivers could clearly pass a 10-bit signal). The PQ EOTF is efficient enough that an 8-bit + FRC display can still cover up to 400 nits without banding. Here are HDTVTest's results.

"Although greyscale was flattened after calibration, minor banding and discolouration remained visible on a grey ramp pattern, suggesting that the underlying panel wasn’t true 10-bit." (HDTVTest LG UH7700 review)

So I'm more inclined to believe only Samsung's SUHD lineups are true 10-bit, and for LG, only the OLED TVs. I'm more inclined to believe Sony's results, though, as for last year's models they were the only ones, along with Panasonic, able to resolve a true 10-bit signal when fed one from a signal generator. The Samsung JS SUHD series and the 2015 LG OLED TVs, despite having true 10-bit panels, were only able to resolve an 8-bit signal.

That doesn't mean the Samsung KS series is home free either. Because they decided to track all the way to 4,000 nits in their tone-mapping system, the compression was so severe that posterization appeared in the form of banding.

"Unfortunately in an effort to correctly track PQ (perceptual quantizer) EOTF (electro-optical transfer function) whilst resolving detail up to 4000+ nits, Samsung’s tone-mapping algorithm introduced visible posterisation in what should be smooth gradients (e.g. the skies in The Revenant and The Martian). These banding artefacts could be reduced but not completely eradicated by switching to the [Standard] picture preset or engaging [HDR+ Mode], but once we measured and calibrated these modes to track PQ curve and Rec.2020 colours as close as possible (yes, we leave no stone unturned), the posterization reared its ugly head again. Of course, [HDR+ Mode] involved other compromises too (as we’ll describe in the next section), making it a no go." (HDTVTest Samsung KS9500/KS9800 review)

I wouldn't worry too much about a true 10-bit panel for entry-level/mid-range TVs, though. Most of them have brightness of 400 cd/m2 or less, so even with an 8-bit panel plus dithering, banding shouldn't be much of an issue, as RTings has shown.
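
As a back-of-the-envelope check on that, here's a small Python sketch (my own estimate, assuming a full-range 10-bit signal on the ST 2084 PQ curve) counting how much of the code range is spent below 400 cd/m2:

Code:
# SMPTE ST 2084 (PQ) inverse EOTF: luminance in nits -> signal (0..1)
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_inverse_eotf(nits):
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

codes = int(pq_inverse_eotf(400) * 1023)
print(f"about {codes} of 1023 10-bit steps cover 0-400 nits ({codes / 1023:.0%})")

Roughly two thirds of the PQ signal range sits below 400 nits, i.e. the curve already concentrates its precision where these TVs actually operate, so an 8-bit panel with decent temporal dithering can approximate those steps closely enough that banding stays hard to spot.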
 

I have the DX800 and can attest that banding is not really an issue on gradients. I have calibrated many 10-bit panels with poor image processing that show more banding artifacts. If the display has good FRC it really isn't an issue.
 
What's the cheapest 4K TV available in Europe with a real 10-bit panel? The smaller the better (well, let's not go under 40", but 40" exactly would be optimal).
This Sony TV (the Sony XBR43X800D) costs €590... it has 90% P3 colour gamut coverage plus HDR10, and the reviews aren't bad.

That price is for the 43" model.

http://www.rtings.com/tv/reviews/sony/x800d

edit: not sure it is available in Europe without importing tbh
 
The LG UH6100, despite having a 10-bit panel and a 10-bit option (with a weird name) in the menu... that option can only be used together with HDR.

It should be able to work outside HDR, especially given the way they separated the menu option from HDR... but I have not found a way to use 10-bit outside of HDR.
 
That's because most mainstream content only uses 10-bit in conjunction with the PQ EOTF. Same thing with AMD's Polaris consumer cards: they support HDR games and UHD BDs, but you will never be able to run Photoshop in 10-bit with gamma and AdobeRGB. FirePro cards are offered for this very reason. UHD BDs are criticized for this too: they are 10-bit, and PQ-based HDR is on by default; if you turn off HDR, you lose 10-bit and the DCI gamut as well.
 
Wait what? Radeons have been able to run Photoshop etc. in 10- and 12-bit modes since forever (I know NVIDIA did limit it to Quadro etc., but IIRC they've lifted the limit too; AMD never had it in the first place).

((this is all of course assuming you actually have a 10- or 12-bit monitor))
 

https://community.amd.com/thread/200047

Are you absolutely sure of that? Radeon supports 10/12-bit output fine, but Adobe has blocked Radeons from accessing 10-bit, so just being able to click the 10-bit/12-bit option in Catalyst doesn't mean it's actually working. There are so many conflicting opinions on this issue. My friend with a 10-bit professional monitor and a Radeon 460 still says it's not possible (the 10-bit option is offered fine, but gradients remain 8-bit).

If there is a way for a Radeon to output 10-bit gradients with the latest version of Photoshop (I already know older versions like CS4 work with hacked drivers), I'd really love to know so I can pass the information along to my friend. That would mean Polaris Radeons finally support 10-bit buffers under OpenGL, not just DX.
 

They don't. Adobe could implement a DirectX path to allow >8-bit output on consumer cards, or Nvidia/AMD could enable 10-bit OpenGL buffers on their consumer cards, but none of them feels it is in their interest to do so.

At least for media playback we have the "madVR" video renderer and "Media Player Dot Net" media player that can achieve >8-bit output by using DX11 Fullscreen Exclusive mode.
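
If anyone wants to check whether a >8-bit path actually survives all the way to the screen, the usual test is a very shallow grayscale ramp: smooth on a true 10-bit (or better) chain, visibly stepped on an 8-bit chain without dithering. A minimal Python sketch to generate one as a 16-bit PNG (assuming NumPy and imageio are installed; the file name is arbitrary):

Code:
import numpy as np
import imageio

WIDTH, HEIGHT = 1920, 270

# A shallow mid-gray ramp: only ~5 distinct 8-bit steps across the width,
# but ~20 distinct 10-bit steps, so banding is easy to spot on an 8-bit chain.
LO, HI = 0.45, 0.47   # fraction of full scale

ramp = np.linspace(LO, HI, WIDTH)
img16 = np.tile(np.round(ramp * 65535).astype(np.uint16), (HEIGHT, 1))

imageio.imwrite("ramp_16bit_gray.png", img16)
print("wrote ramp_16bit_gray.png - view it 1:1 through a >8-bit capable renderer")

Display it through madVR (or any renderer that can pass more than 8 bits) and compare against a plain 8-bit viewer; if both look identical and stepped, the extra bits are being thrown away somewhere along the chain.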
 
All HDR TVs must support at least 8-bit + FRC. Pure 8-bit output does seem to be possible from the PC side (AMD, Pascal), but as expected, banding is severe.

I wouldn't worry too much about specs. What's truly more important is overall balance.

1) Buying on a budget and can't afford any form of local dimming? Then pick one that offers a good native contrast ratio to begin with. That would mean skipping the LGs.

2) Rank your priorities. If your priority is motion blur, quantum-dot color filter TVs like the Samsung KS series have less smearing than the phosphor-LED ones used in Sony LCDs (and Samsung's own non-KS series), but Sony TVs can engage strobing (Motionflow Impulse) in game mode, which is not possible with Samsung.

3) Planning to use your new TV for movies too? Then check out color gamut and go with the largest color gamut you can afford. Make sure it supports 100Hz/120Hz too: many of Samsung's 6000-series UHD TVs only have a 60Hz refresh rate and do not support 24Hz, though some Sony TVs can engage 24Hz within 60Hz.

4) Are you after color accuracy or just eye-pleasing color? If you only play games and want to calibrate your display, then DCI gamut support is not necessary, as none of the HDR games support DCI as of now. However, that doesn't stop Samsung from utilizing the DCI gamut on their KS series, which is one of the reasons it's so popular with the masses, and it was the same with last year's JS8500 as well.

5) Do you need HDR upscaling? It converts SDR content into HDR, and many Samsung KS series owners have been very satisfied with this feature, though games benefit the least from it. The Sony Z9D is the best in this regard, followed by Samsung, then the rest of the Sony TVs, and LG's are the bottom of the barrel.
 
How do you know that? The review says otherwise; I am confused.

http://www.avsforum.com/forum/166-lcd-flat-panel-displays/2277338-2016-sony-xbr-850d.html

Quote:
Originally Posted by Z1-B
Good list but :
FWIW my 55 X850C passes all the 10 bit panel gradient tests (smooth ) and my PC dGPU identifies it as a 10 bit panel .

I think the 55X850C does a good percentage or all of DCI P3 I don't know about the 85" panel


Its just odd that Sony keeps calling it an 8-bit panel when you contact them.


FYI Sony updated there website and a lot of the 2015 Sets got the new 4K HDR label.... except the X850C.

------------------------------------------------------------------------------------------------------------------------



It seems "Straight from the horse's mouth" is the most influential when it comes to this 8bit controversy. Same with Samsung. Someone called them and asked whether the KU6000 series are 10bit or not and they also answered "8bit". Corporation honesty? Or just people who don't even know their own products well?
 
The Hisense M7000 55 and 65" models (900 and 1100€ respectively) are supposedly using 10bit VA panels, at least according to Hisense's documentation and AVForums' review.

I own the 55" model. Is there any test I can do to at home to confirm if it's a 10bit panel or 8bit+FRC?



I just caught this bit from the Eurogamer article ( http://www.eurogamer.net/articles/d...-us-remastered-patch-108-for-ps4-pro-analysed ) and wondered if maybe it's the way to go for those without 4K displays. However, the price of the device ($199) is a good chunk of what a new display would cost, though it might be cheap enough for some to give it a go.


HD Linker: accessing 4K-exclusive Pro features on 1080p screens

Some players with 1080p displays have asked if there is any way to access PS4 Pro game modes only available on 4K screens - specifically down-sampling modes like those stripped from TLOUR. Yes there is, we've tested it and it works - but it is somewhat expensive. You'll need a device known as the HD Linker. It's designed to enable UHD functionality on 1080p screens but that's just one of its many functions.

Users connect the HD Linker between the Pro and a 1080p display and it's powered via USB. The Pro sees that it's attached to an HDCP 2.2-compliant display and the Linker effectively downscales the Pro's 4K output to full HD. We've tested it on a Pro attached to a Panasonic VT20 plasma display and it works just fine - 4K functionality is accessible from the video output settings menu as per normal, even though a full HD screen is attached.

As the Pro believes it is attached to an HDCP 2.2-compliant screen, you can also access non-gaming features too - such as the ultra HD encodes on Netflix, giving an almost Blu-ray-like presentation. At $199, it's effectively half the cost of the console itself so we mean it when we say that it's expensive - but it does do the job.


I just happened to do a little research on the subject because I own a pre-HDCP 2.2 AV receiver, so with the new TV and the PS4 Pro I ran into a standoff regarding sound output (HDMI ARC only does 20-year-old Dolby Digital, DTS, or stereo PCM).

I wouldn't trust anything from HDFury (the brand behind the HD Linker) at face value until there are proper reviews (and boy do those guys overcharge for a simple downscaler + HDCP 2.2 -> 1.4 converter; the second function can be had from Monoprice for $30, or you could just buy a $25 Chinese splitter that happens to strip away HDCP completely, though that may be a legally grey area).
Not only do you not know how much latency that thing adds, you also don't know how the downsampling is done. If it computes each output pixel as the average of a 2x2 block of source pixels, then I guess it's okay. But if it simply pulls a 1080p matrix out of the 4K matrix (e.g. takes only the lower-left pixel of every 2x2 block), then you're getting even worse results than if you had 1080p from the start, because the Pro's 4K output is usually upscaled from a 1800p/1440p framebuffer (a quick sketch of the difference is below).
And one has to wonder whether $200 is ever worth it just to enable supersampling on an old 1080p TV for the couple of games that don't do it natively, a feature the developers themselves might even enable later through an update.
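
For what it's worth, the difference between the two downscaling approaches is easy to demonstrate. A toy Python/NumPy sketch (purely illustrative; how the HD Linker actually scales is exactly the unknown here):

Code:
import numpy as np

def downscale_average(img):
    """2160p -> 1080p by averaging each 2x2 block (keeps the supersampling benefit)."""
    h, w, c = img.shape
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

def downscale_decimate(img):
    """2160p -> 1080p by keeping one pixel per 2x2 block (discards 3/4 of the samples)."""
    return img[::2, ::2]

# Tiny 4x4 'frame' with 3 channels, just to show the two results differ:
frame = np.arange(4 * 4 * 3, dtype=np.float64).reshape(4, 4, 3)
print(downscale_average(frame)[..., 0])   # each value is a 2x2 average
print(downscale_decimate(frame)[..., 0])  # each value is a single source pixel

Averaging folds all four samples of each block into the output pixel, which is what gives downsampled 4K its anti-aliased look; decimation just hands back a quarter of an image that was itself upscaled from 1440p/1800p, which can easily end up worse than a native 1080p mode.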
 
How do you know that? The review says otherwise; I am confused.
You can view the specifications here: http://www.displayspecifications.com/en/model/0d686a2
Most panels have 8-bit + FRC, which is promoted as 10-bit.

Even the X850D is 8-bit + FRC.
If you want a real 10-bit panel, you have to pay over €1,000 at the moment (I don't know about the American market). On the other hand, 8-bit + FRC is better than plain 8-bit or 6-bit + FRC, as the little simulation below shows.
Also, I don't know how accurate that page really is, but I trust it more than specifications from a retailer ;)
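
To make the FRC point concrete, here's a toy Python simulation (my own illustration; real panels use ordered spatio-temporal dither patterns rather than random flipping) of how toggling an 8-bit subpixel between two adjacent levels averages out to an in-between 10-bit level over time:

Code:
import numpy as np

rng = np.random.default_rng(0)

target_10bit = 513                                      # a 10-bit level with no exact 8-bit equivalent
low, high = target_10bit // 4, target_10bit // 4 + 1    # neighbouring 8-bit levels: 128 and 129
duty = target_10bit / 4 - low                           # fraction of frames on the higher level: 0.25

frames = 240                                            # e.g. four seconds at 60 Hz
shown = np.where(rng.random(frames) < duty, high, low)

print("temporal average on the 8-bit scale:", shown.mean())      # ~128.25
print("equivalent 10-bit level:            ", shown.mean() * 4)  # ~513

The eye integrates the flicker, so the panel appears to hit the intermediate level; that's why a well-dithered 8-bit panel can render a 10-bit gradient smoothly even though it only has 256 native levels per channel, while a 6-bit + FRC panel has much coarser levels to work from.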
 