Is a 1080p HDR image better than 4K non-HDR?

Sorry, my Android keyboard is broken.

I have watched HDR on my shitty TV: Pacific Rim, LG demos, FFXV.

The results are all the same as I described a few posts before this: too much contrast, too much color.

What the hell is wrong with my Android keyboard?
OH GOD ORANGPELUPA IS POSSESSED
 
Since there's talk about TVs here, I suppose I can throw my question in here (mods, please remove/move to another thread if needed).

I'm buying a budget TV, which means ~400€ in this case.
There are two UHD options available at that price point where I live, both with "pseudo-HDR", i.e. HDR processing but 8-bit panels and no FRC.

One option is the Samsung UE40KU6075, the other the LG 43UH603V.
The 40-incher would be easier to fit in place, but I'm not sure which one is really better if you disregard the size.
The shop's "technical details" say the Samsung backlight runs at 50 Hz while the LG's runs at 100 Hz; they also claim the LG has a direct LED backlight while the Samsung is edge-lit, but I found conflicting information about this online.

Anyway, is anyone aware of any reason why I should pick one over the other?

It's going to get most of its use from my PC, but I also ordered an Xbox One S for (4K) Blu-rays.
 

I suspect the LG uses IPS while the Samsung uses VA?

If so, just forget the LG.

Samsung's VA has somewhat more limited viewing angles but muuuuuuch better blacks than LG's IPS.
 
It's rare (more so with LG), but LG does sometimes use VA panels and Samsung sometimes uses IPS panels.

That said, you're probably right; the 40" size is typically VA and there's a high chance the LG is IPS. It's very difficult to find this information these days because the panel lottery is worse than it has ever been. Oftentimes, even within the same model line, you can get IPS in certain sizes and VA in others.

Given the choices, I would agree with orangpelupa and go with the Samsung.
 
Panasonic (I think) announced a new type of IPS panel with very high contrast ratios, thanks to low-level blacks and no light leakage...
Might turn out interesting...
But honestly, besides OLED, I don't find any screen good. (And OLEDs are damn expensive.)
 
Anyway, my shitty LG HDR 4K TV turns out to be not so shitty after all!

It makes my PSVR OLED screen look super shitty by comparison, with dull colors (is that the right word? O_O), a worse contrast ratio, and a much more pixelated image (easily fixable by wearing it without glasses, though).
 
Panasonic (I think) announced a new type of IPS panel with very high contrast ratios, thanks to low-level blacks and no light leakage...
True, it's a new Panasonic IPS panel manufactured on existing LCD assembly lines; the first prototypes will be shown in January, with more at the CES 2017 show. The first Sony OLED consumer televisions (panels from LG's factory) will also be presented at CES 2017. Panasonic is already using LG OLEDs.

(Disclaimer: this is all street rumours until CES.)
 

Unfortunately those aren't native 4K projectors, and I have to say it's quite a shitty article for not pointing that out. They are actually native 1080p projectors that use a pixel-shift technique. Essentially it works by running at a high refresh rate and displaying each frame twice, with the image optically shifted so that every pixel lands in two different positions per full frame. I've read that the end result is around the midpoint between 1080p and 4K.
The article is quite old BTW, from June. The real 4K projectors sure are taking their time to come down in price. I guess it's hard to squeeze those pixels in there :)
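
If you want a rough feel for where that "midpoint" comes from, here's a quick back-of-the-envelope sketch (it assumes the usual scheme of two diagonally shifted 1080p sub-frames per frame; the exact perceptual gain will differ per projector):

```python
# Back-of-the-envelope: pixel-shifted 1080p vs. native 4K addressable positions.
# Assumes two 1080p sub-frames per frame, offset diagonally by half a pixel
# (how JVC/Epson-style e-shift is usually described).
native_1080p = 1920 * 1080        # ~2.07 million pixels
native_4k    = 3840 * 2160        # ~8.29 million pixels
shifted      = 2 * native_1080p   # ~4.15 million distinct pixel positions per frame

print(f"native 1080p : {native_1080p / 1e6:.2f} MP")
print(f"shifted 1080p: {shifted / 1e6:.2f} MP")
print(f"native 4K    : {native_4k / 1e6:.2f} MP")
# The shifted image addresses half as many positions as true 4K, which is why it
# tends to be described as landing somewhere between 1080p and 4K.
```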

They are able to accept a 4K input though, and thus can play Ultra HD movies with HDR.
 
Panasonic (I think) announced a new type of IPS panel with very high contrast ratios, thanks to low-level blacks and no light leakage...
Might turn out interesting...
But honestly, besides OLED, I don't find any screen good. (And OLEDs are damn expensive.)
You clearly haven't tried the Z9D in person :) It trumps OLED any day with HDR content.
 
Right now OLED isn't a good match for HDR TV. I believe that OLED itself can produce enough brightness, but in a TV with lots of small pixels that need to be controlled individually, I don't know if that's feasible in the near future.
I think the cheap, good-enough HDR option would probably be something using that Panasonic tech (which looks to be a dual-layer LCD), since you can use simple backlighting instead of an LED array. This assumes that making that type of screen is cheaper than using an LED array (which right now looks really expensive).
 
I'd like to test the current OLED lineup in a darkish room. I have a feeling the picture would be pretty nice even with HDR. It's hard to get to test multiple TVs in a proper environment.

BTW, I'm 100% convinced that the store where I bought my TV is going to have a large discount on the 78" FALD Samsung model. The cheapest it's been so far is €7995, but they have been boosting their stock levels and currently have very large quantities in stock, and they aren't going to ship those out at €8000. Curious to see when it'll drop. It would be sweet, but I have to let this one go.
 
I've finally got the cash to purchase my (semi-)reference HDR display for HDR games, but now I'm torn between two options: the 2017 LG and Sony OLEDs. I'm not one to purchase an HDR TV with a half-baked spec, not when my current Panasonic plasma has set the bar very high for SDR games.

Despite the fact that OLEDs have twice the PCM (Perceptual Contrast Measurement) factor of LCDs (meaning an OLED can actually keep up with an LCD that has twice its peak brightness), OLEDs still need to bring up their brightness game, because the PQ EOTF is a fixed, absolute luminance curve, unlike gamma, which is relative. There are already UHD Blu-ray movies that will defeat a display's tone mapping if its peak output is not over 1000 cd/m2. The best solution is of course to not rely on tone mapping at all, which requires the display to resolve the source signal's entire dynamic range. Thankfully, today's games are graded at 1000 nits, which the 2017 OLEDs will have absolutely no problem handling, with calibration headroom to spare. This fixed luminance also hurts LCDs, because 1930 cd/m2 luminance monsters such as the Sony Z9D will only be able to get 1000 cd/m2 out of HDR games, not more.
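
To be clear about what "fixed luminance curve" means: ST 2084 (the PQ EOTF) decodes a given code value to the same absolute number of nits on any display. The constants below are from the spec; the rest is just a minimal sketch to illustrate the point:

```python
# SMPTE ST 2084 (PQ) EOTF: a normalized signal value E' in [0, 1] always decodes to
# the same absolute luminance in cd/m2 (the curve tops out at 10,000 nits by definition).
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(e):
    """Absolute luminance (cd/m2) for a normalized PQ signal value e in [0, 1]."""
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for e in (0.5, 0.75, 0.9, 1.0):
    print(f"signal {e:.2f} -> {pq_eotf(e):7.1f} cd/m2")
# signal 0.50 decodes to roughly 92 cd/m2, signal 1.00 to 10,000 cd/m2, on every display.
# A display that can't reach the decoded value has to tone map or clip it, which is why
# peak brightness matters so much more under PQ than it did under relative gamma.
```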

Unfortunately for the Z9D again, neither the PS4 Pro nor the Z9D offers dynamic metadata, which is what FALDs need to get closer to OLEDs in dark scenes. I still remember how much of a beast the Samsung KS9800 became after it received its dynamic metadata firmware update. As mentioned previously, a 1000 nit specular highlight will turn blacks into greys, which hurts LCDs the most. But the reverse is also true: dim that LCD under 100 cd/m2 and it will offer superior black performance, and this is one of FALD's biggest strengths.

I've done my calculations, and current SPVA LCDs worsen their black level by about 65% for every 100% increase in luminance. That's still a net gain in contrast ratio, but not ideal. Last year's FALD king, the Sony X940C, had a best usable measured black level of 0.00267 cd/m2 (0.00078 foot-lamberts), which was very close to the Pioneer Kuro 500M's 0.0017 cd/m2.

Assuming the best-case scenario for VA panels:

0.03 cd/m2 black at 120 cd/m2 white

it would require the backlight to operate at roughly 3.5 cd/m2 in order to reach such a black level. For a FALD, as long as no object within a zone requires more than ~3.5 cd/m2, that's ideal. Unfortunately that's a rarity, and as soon as any object within a zone requires more, the FALD has to choose between black level and peak luminance. This is why a starfield is one of the most difficult scenes for a FALD to handle, while it's a piece of cake for plasmas and OLEDs, where individual pixels can be illuminated.
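
For what it's worth, here's the rough math behind that figure as a sketch. The starting point (0.03 cd/m2 black at 120 cd/m2 white) and the 65%-per-doubling scaling are my own assumptions from above, so treat the result as a ballpark, not a measurement:

```python
import math

# Sketch: how far a VA zone's backlight has to be dimmed to hit X940C-class black,
# assuming black level scales with backlight as described above
# (+65% black per +100% luminance, i.e. an exponent of log2(1.65) ~ 0.72).
white_ref = 120.0       # cd/m2, reference white
black_ref = 0.03        # cd/m2, best-case VA black at that white level
target    = 0.00267     # cd/m2, measured X940C best usable black

k = math.log2(1.65)     # black ~ (backlight)^k under the assumption above

# Solve black_ref * (B / white_ref)**k = target for the zone luminance B:
zone_luminance = white_ref * (target / black_ref) ** (1 / k)
print(f"required zone luminance ~ {zone_luminance:.1f} cd/m2")
# Comes out around 4 cd/m2, the same ballpark as the ~3.5 figure above; the exact
# number depends on how far down you trust the 65% scaling to hold.
```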

Unfortunately, most HDR TVs today only offer static metadata, where the display is expected to be able to operate at maximum peak luminance at any moment. This is not ideal, because not every scene or frame uses the full dynamic range, so dynamic metadata offers a way for the display to operate at a lower, more appropriate dynamic range when the source material doesn't call for the full range. The Samsung KS9800, for example, only operated at a static 1000 cd/m2 range when it was first released without dynamic metadata support. With dynamic metadata support added, it can now operate at much lower luminance in dark scenes where 1000 nit support is completely unnecessary, improving black level and shadow detail.
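
To make that concrete, here's a toy sketch of the difference. This is not any vendor's actual tone-mapping algorithm, and the scene names and values are made up purely for illustration:

```python
# Toy sketch: static vs. dynamic metadata. With static HDR10 metadata the display has
# to assume the whole title can hit the mastered peak; with dynamic metadata it can
# re-target scene by scene. Scene names/values are illustrative only.
display_peak = 1000.0                       # cd/m2 the panel can actually produce
scene_peaks = {                             # brightest highlight in each scene (cd/m2)
    "night exterior":   80.0,
    "indoor dialogue": 250.0,
    "sunlit desert":  4000.0,
}
static_maxcll = max(scene_peaks.values())   # static metadata: one value for the title

def must_compress(content_peak, panel_peak):
    return content_peak > panel_peak

for name, peak in scene_peaks.items():
    static  = must_compress(static_maxcll, display_peak)   # always judged against 4000
    dynamic = must_compress(peak, display_peak)            # judged per scene
    print(f"{name:16s} static: compress={static}   dynamic: compress={dynamic}")
# With static metadata the dark scenes are dragged through the same compromise as the
# 4000-nit scene; with per-scene metadata they can be shown 1:1 with better blacks.
```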

Sadly for the FALDs, the XBO, PS4, PS4 Pro, and PC all lack dynamic metadata support as of now. Even Samsung SUHD owners will only see its benefits in movies (where each disc already has dynamic metadata hardcoded). This means I have absolutely no reason to purchase the Sony Z9D when it will have the same effective peak luminance as the 2017 OLEDs for HDR games, and will only lose out in shadow detail rendition due to the lack of dynamic metadata. The 2017 OLEDs, fortunately, will have no need for dynamic metadata for HDR games, as they already cover the full dynamic range from 0.0005 cd/m2 to 1000 cd/m2. The best tone mapping algorithm in the world is the one you don't have to use at all.

But this is only part of the problem. Another thing to consider is color volume. Specular highlights and shadow details aren't really the most important factors. I surely did not purchase my Panasonic plasma only to gawk at night skies and shadow detail in The Order or Dead Space. I bought it also to gawk at the superior water transparency in Uncharted 4, the vastly more transparent particle work in Infamous SS, the superb searchlight transparency in MGS5, the more 3D-looking bump mapping on the doors with the dragon insignia in Yakuza Kiwami, and the improved light propagation and reflection transparency in 100% of the games I've played. Every game is different, and this is only possible because my Panasonic plasma keeps its reference 0.01 cd/m2 black level at every APL, at every pixel, with zero compromises.

I've purchased 5 HDTVs and none of them were FALDs. The majority of game scenes and areas are not dark, and in such cases a FALD that isn't called on to perform local dimming has identical picture quality to an edge-lit set, which is quite a turn-off IMO. Same thing with LCDs in general: I tried playing the Uncharted Collection on the Sony X940C FALD, and the bump mapping and lighting transparency were noticeably lacking compared to the PS3 versions I played on the Sony LCD TV I bought 6 years ago. The X940C could only resolve 0.078 cd/m2 ANSI black (meaning at 120 cd/m2), while my old non-FALD CCFL Sony could do 0.03 cd/m2. More than twice the contrast ratio difference in favour of my old Sony. If ANSI contrast ratio were not an important factor in a FALD design, then the 60-inch IPS Vizio P FALD should have absolutely no problem destroying any VA edge-lit, since the difference between IPS and VA is much smaller than the difference between plasma and VA. (Hint: it can't.)

When it comes to color, there are two factors that compromise a plasma's otherwise ideal design: black level and ABL. A superior black level allows any color (not just black!) to retain its purity better. Then there's the saturation factor: higher available brightness lets a color reach more of the largest available color gamut, which for SDR content is Rec.709. Plasmas with poor ABL performance may still have the upper hand in color purity, but they cannot access as large a part of the Rec.709 gamut, so they lose saturation. I've tried playing Super Mario 64 in various window sizes on my plasmas and found that the smallest window gives the most saturated, kiddy colors that most Nintendo games really need. It's not just the brightness drop when playing full screen that's the problem; it's actually the loss of color gamut that makes plasmas lose their saturation.

And this is going to be even more important for HDR content. For SDR content, things were simple: the Rec.709 color gamut has always been mapped to 100 nits, identical to SDR's luminance dynamic range. For HDR, however, the color gamut does not always map 1:1 to the luminance dynamic range. Today's HDR games are exactly like that: up to 1000 nits of dynamic range, but no DCI-P3 gamut support, so Rec.709 has to be used instead and the color volume caps out at only 100 nits. Things aren't terribly different for movies either. The Sony Z9D with 1930 cd/m2, the Samsung KS9800 and Panasonic DX900 with 1450 nits: they all have a 100 cd/m2 color volume cap despite using DCI-P3. It means the DCI-P3 gamut has been compressed all the way down to 100 cd/m2. The solution? Use 3D gamut mapping to map things out 1:1 again. But 3D gamut mapping requires support from dynamic metadata, and most HDR TVs don't offer that, save for the very few Dolby Vision TVs that very few games will support. Even the KS SUHD series only offers dynamic metadata in luminance form; 3D gamut is not supported, so its color volume is just as compressed as LG's, Panasonic's, and Sony's. If Xbox Scorpio is released with 3D gamut support, and future HDR games finally get the DCI color gamut, that will leave the current PS4 Pro DOA when it comes to color volume.

According to Dolby, the DCI-P3 gamut can be mapped all the way to 10,000 nits, but current HDR movies cap out at only 4000 nits. (They settled on 4000 nits because it's the endpoint for HDR10 and the starting point for 12-bit Dolby Vision. Every Dolby Vision movie is graded at 4000 nits too. Mad Max, for example, offers 4000 nits in Dolby Vision but is reduced to 1000 nits for the HDR10 UHD BD.) Plans for a gamut container larger than DCI-P3 and smaller than Rec.2020 will start once Hollywood decides to increase brightness again. For the 10-bit PQ EOTF, 4000 nits is the maximum luminance it can reach without inducing banding. So until that day comes, 4000 nits and the DCI-P3 gamut will be plenty for many more years. The problem for OLEDs, again, is that since the DCI gamut is mapped to 4000 nits this time, they will lose access to the larger gamut if peak luminance remains poor. Worse, tone mapping will not be able to save this saturation loss. This is why OLEDs need to get to 4000 nits ASAP. For games, I think OLED owners will be OK, since the limiting factor will come from PC monitors with poor HDR performance. I'm guessing that even when DCI-P3 arrives in games, peak luminance will remain at 1000 nits for a while. Not every artist will have access to Dolby's super expensive Pulsar grading monitor that can go up to 4000 nits. So I think my 2017 OLED TV purchase will be very future-proof for games, with 1000 cd/m2 of peak brightness, 100% DCI-P3 gamut coverage, dynamic metadata, and 3D gamut support.
 
Another thing that's worrying is the current LG WOLED architecture, because according to Dolby, not all colors are created equal. When the BBC & NHK's HLG (Hybrid Log-Gamma) broadcast HDR standard was introduced, Dolby wasted no time attacking it.

https://www.dolby.com/in/en/technologies/dolby-vision/color-volume-limitations-with-HLG.pdf

Colour                    BT.2100 PQ Y (cd/m2)   BT.2100 HLG Y (cd/m2)
{1,1,1} // Peak white            1,000.0                1,000.0
{1,0,0} // Maximum red             262.7                  201.1
{0,1,0} // Maximum green           678.0                  627.3
{0,0,1} // Maximum blue             59.3                   33.7

In their experiment, Dolby found that despite both the PQ EOTF and the HLG OETF outputting the same 1000 cd/m2 white, HLG produced less luminance in the three primaries. Assuming both are graded in the DCI-P3 gamut, HLG will have access to less color volume than Dolby's PQ.
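
If you want to see where those numbers come from, here's a short sketch. The PQ column is just the BT.2100 luminance weights scaled to the 1000 cd/m2 peak white, and the HLG column additionally goes through HLG's nominal system gamma of 1.2 for a 1000-nit display, which is the mechanism Dolby is pointing at:

```python
# Reconstructing Dolby's table: PQ column = BT.2100 luminance weights x peak white
# (Y = 0.2627 R + 0.6780 G + 0.0593 B); HLG column additionally applies HLG's nominal
# system gamma of 1.2 for a 1,000-nit display, which pulls saturated primaries down.
peak_white   = 1000.0
system_gamma = 1.2
weights = {"red": 0.2627, "green": 0.6780, "blue": 0.0593}

for name, w in weights.items():
    pq  = peak_white * w                    # 262.7, 678.0, 59.3
    hlg = peak_white * w ** system_gamma    # ~201.1, ~627.3, ~33.7
    print(f"maximum {name:5s}   PQ: {pq:6.1f} cd/m2   HLG: {hlg:6.1f} cd/m2")
```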

Same thing with the LG OLED. Samsung's mobile RGB OLEDs have perfect colors because they fully use three self-emitting primaries, to their fullest extent, at any luminance. This is not possible on the LG. Samsung's mobile OLEDs are susceptible to burn-in, so LG decided to use Kodak's WOLED patent instead, which enables extremely burn-in-resistant OLEDs. Instead of three self-illuminating RGB subpixels, LG's White OLED method uses stacked OLED color materials that, when combined, produce white. This white light goes through a color filter layer located between the stack and the subpixels: for each pixel, the three subpixels have red, green, and blue color filters respectively. So the white light passing through these filters regains the RGB primaries, which can then be combined to make any color.

How does this design mitigate burn-in? Burn-in becomes visible when the half-lives of the R, G, and B subpixels are uneven, and in OLED's case (assuming phosphorescent OLED materials are used), the red material has a half-life of 250,000 hours, green 400,000 hours, but blue only 20,000 hours. This continues to be a serious problem for Samsung's OLED production. LG's method is much simpler: it only uses two colors in the stack, yellow and blue. The yellow material has a wavelength between 455nm and 560nm, meaning it's a yellow that's veered more towards green, so this one material can replace both the red and green materials. Together they produce white light, and the color filters pick out exactly the balanced amounts of red, green, and blue light, so uneven luminance balance between the three subpixels simply cannot exist on LG OLED TVs. Of course, LG's WOLED is not home free yet. Uneven half-life still exists at the stack level, so with enough usage, blue will start to emit less light than yellow, and white will start to look more yellow than blue. So LG has employed another method: real-time white balance correction. A sensor measures color temperature and white balance, and if it's drifting the wrong way, processing corrects the color balance. This technique was actually used on early Sony FALDs too, back when they used RGB LEDs instead of the white LEDs we use nowadays; FALDs that used RGB LEDs were also very susceptible to uneven dimming zones. Samsung's OLEDs cannot use it, because there's nowhere to apply the correction when the light is already emitted at the subpixel level. LG's WOLEDs and FALDs can use this processing because both the OLED stacks and the LEDs sit behind the subpixels, meaning more "behind the curtain" work can be done.
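
As a toy illustration of what that compensation has to fight over time (concept only; it just reuses the 20,000-hour blue half-life figure above, and the actual sensing and processing LG uses isn't public):

```python
# Toy sketch: the blue emitter ages faster than the yellow one, so the correction logic
# has to boost blue drive over time to keep the white point where it belongs.
# Concept only; reuses the 20,000-hour blue half-life figure quoted above.
BLUE_HALF_LIFE_HOURS = 20_000

def blue_drive_boost(hours_used, max_boost=2.0):
    """How much extra drive blue needs to keep its original output, capped by headroom."""
    efficiency = 0.5 ** (hours_used / BLUE_HALF_LIFE_HOURS)   # remaining output fraction
    return min(1.0 / efficiency, max_boost)

for hours in (0, 5_000, 10_000, 20_000):
    print(f"{hours:6d} h -> blue drive x{blue_drive_boost(hours):.2f}")
```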

Of course, those two techniques ensure balanced color output, but they still don't solve the problem of eventual luminance drop at the pixel level; the 20,000-hour blue half-life applies whether it's Samsung or LG. This is where the fourth subpixel comes in, hence the name White-Red-Green-Blue OLED. Because LG has to use color filters, and color filters simply cannot pass all the light the stack produces, there is waste. Samsung, on the other hand, has each RGB color emitted at the subpixel level, so all of the light can be utilized. So what should be done with the leftover white light? Use it as a white subpixel, of course! Since white is composed of RGB, whenever the content calls for white output, why waste precious RGB light when the white subpixel can do the same job much better? That's basically the idea: use an OLED color stack with color filters to mitigate burn-in, accept that the filters produce waste, and add a white subpixel to counterbalance that. A very ingenious solution indeed.
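
The "let white do white's job" idea is easiest to see as the textbook RGB-to-RGBW split below. To be clear, this is just the generic concept, not LG's actual subpixel rendering pipeline, which is proprietary and certainly more sophisticated:

```python
# Textbook RGB -> RGBW split (generic concept, not LG's actual algorithm): pull the
# achromatic component out of each pixel and send it to the W subpixel, so the
# color-filtered R/G/B subpixels only have to supply what's left.
def rgb_to_rgbw(r, g, b):
    """r, g, b in [0, 1]; returns (r', g', b', w)."""
    w = min(r, g, b)                 # the white part all three channels share
    return r - w, g - w, b - w, w

print(rgb_to_rgbw(1.0, 1.0, 1.0))    # (0, 0, 0, 1): pure white comes entirely from W
print(rgb_to_rgbw(1.0, 0.8, 0.6))    # warm highlight: W carries 0.6, filters do the rest
print(rgb_to_rgbw(1.0, 0.0, 0.0))    # saturated red gets no help from W at all,
                                     # which is exactly the color volume catch below
```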

Unfortunately, it still comes with a catch: it also means narrower color volume for the RGB colors. Just as Dolby's PQ vs. HLG comparison showed, Samsung's pure RGB method retains much more color volume when producing white. LG's WOLED is closer to HLG: more white, but less RGB, which means color saturation is sacrificed. LG started out using a yellow-blue stack. Once they decided to make it even more burn-in-proof, they went to a three-layer, but still two-color, solution: a blue-yellow-blue stack. Two times 20,000 hours = 40,000 hours of blue half-life to spare. A very elegant solution indeed. It became much better balanced against the yellow too.

But it has turned out that yellow alone does not produce enough color to truly replace red and green. A single color can only do so much, so LG is now deprived of both color gamut and luminance. Since FALDs with gobs of brightness have appeared, LG has had to become much more aggressive to combat them, so they've gone to a more traditional three-color (RGB) stack for next year's model.

Sony recently released the PVM-X550 OLED professional monitor. It also uses a yellow-blue stack, but for a professional monitor its brightness is rather low: 400 cd/m2.

Sony has explained the reason: "The panel is rated at 400 nits. Any brighter and the white subpixel starts to wash out the RGB pixels and shrinks the color gamut."

If Sony is to be believed, that means the current Y/B-stack WOLED LG TVs can only attain full color volume up to 400 cd/m2. Any brighter, and color volume starts to suffer.

So, going to an RGB stack will have a negative impact on LG's current burn-in-resistant design, but since it still uses a stack to produce white light, it's still going to be vastly more burn-in resistant than Samsung's RGB design. It should ensure pristine RGB color volume at least up to 1000 cd/m2. Going up to 4000 cd/m2, it will still be inadequate, and the brighter it gets, the less color volume the LG WOLEDs will retain. A compromised design, but at least that compromise has allowed me to purchase a big OLED TV at a dirt-cheap price.

Quantum dots, by the way, are a more effective solution than WOLED when it comes to retaining color volume at higher luminance. The same thing happened to LCDs too. When RGB LEDs proved both more burn-in-prone and more expensive to produce, Samsung led the charge on white LEDs. A blue LED by design has a very long lifetime, so it was coated with a yellow phosphor and together they produced white light, just like the yellow-blue WOLED stack, so only one color of LED could be used instead of three. However, like WOLED, the Y/B combination simply did not have color volume equal to the earlier RGB LEDs. Previous RGB LED owners were disappointed with how colors simply lacked saturation with white LEDs. So QD was introduced: used as a color filter, it nets more light for its color output, improving color saturation even at high luminance. Of course, traditional LED manufacturers did not sit idle either. Companies like Panasonic have made improvements to yellow phosphors to retain more color volume, so gamut-wise they're actually wider than QDs.
 
Having bought a 'clever' WRGB display (the Samsung Note Pro) and seen how shitty the yellows are, I'm mistrustful of clever solutions that skirt around having real colour solutions. All the claims for WRGB were quite frankly bollocks and lies, because they had to sacrifice the screen's ability to display content properly. Mention 'additional white' OLEDs and I'm immediately thinking low saturation, to the point of the colour being useless.

Basically, all the tech speak in the world means nothing. I need to see a screen in the flesh to know whether it's good or not, and not some amazing science-speak.

The other point, which would be surprising if you weren't worldly-wise, is that OLED seemed like it would be an ideal solution (individual light sources for each primary), but it's never that easy. Things like LED half-life mean the simple ideal is never realised and panel makers have to jump through hoops. So whenever you come across a Scientific American or New Scientist article extolling a fabulous new tech that's just around the corner and will be better and cheaper and faster, scrunch it up and chuck it in the bin with the methanol fuel cells that power your mobile devices and the SED TVs. The only thing these dream techs are really good for is fleecing investors.
 
I recently bought a 55" Hisense M7000.
It uses a 10-bit VA panel with edge-lit local dimming capable of "only" 400 nits, and for the cost it's a total blast.
So as far as "HDR performance" goes, specialists say it's very limited when compared to OLED or FALD solutions.


Regardless, two days ago I tried Uncharted 4 with my PS4 Pro (and yesterday I put ~6 hours into it). I'm now playing Uncharted 4 at 4K checkerboard + HDR on my new "mid-range HDR" TV.
Having played the game before on the regular PS4 without HDR on my old 1080p LG plasma, here's my opinion on image quality:

4K HDR >>>>>>> 4K SDR >> FHD SDR.

I can only imagine what a >2000€ FALD or OLED can look like in comparison.
 