Best 4K HDR TVs for One X, PS4 Pro [2017-2020]

The HiSense DLLCD is as thin as any other TV. The second LCD layer is thin and doesn't inherently add bulk to the TV design.


Yeah, I was under the impression that the picture had the DL-LCD next to a standard SL-LCD. I wouldn't take that picture to imply anything about the thickness of the television, however. The LG SK9000 I recently got, while the same thickness at the edges as my years-old 1080p LG LCD, is significantly thicker than my previous high-end LG LCD once you move away from the edges even a little bit. Like 3-4x thicker. I'm guessing this is due to the FALD backlight in the new set (and 4K with HDR) versus edge-lit (and 1080p) in the old set.

Point being, if you look at my sets from the same angle as in that picture, they appear to be the same thickness, but in reality one is much thicker than the other. That doesn't mean they can't be the same thickness in that picture, just that we can't assume they are.

Also, now that I've looked at it again: it's notable that Hisense declined to discuss the sets' power consumption with anyone who asked. That makes me wonder just how much power it will need to hit the HDR brightness they're aiming for.

Likely not as much per inch of display as the Sony reference monitor, but likely significantly more than current single-layer LCD sets. Perhaps plasma-TV levels of power consumption?

So little information is known, but I'm really curious. And I'm still holding out hope that someone makes IPS panel versions. Light bleed was always the main weakness of IPS panels, and dual-layer panels would seem to be the ideal solution to that problem. Then you could have both good black levels and wider viewing angles than single-layer VA panels.

IMO, DL-LCD + VA panels just makes their weakness even worse, further narrowing the already narrow viewing angles.

Regards,
SB
 
Rollable displays are irrelevant for home viewing.

Wrong. I'm the target group. I want an auto screen curve feature for my 88-inch OLED, which can't be had without a rollable. And your argument holds no water, as you're ignorant of the future plans for such flexible displays. And contrary to what you say, I've heard plenty from people whose family members hated an unsightly TV screen for various reasons.

doubly irrelevant in this thread about Gaming Displays.

Wrong. My ultimate dream gaming rig will be three 150-inch LG 8K OLED TVs in rollable orientation, and I will adjust the screen curve myself accordingly. Can't do that with non-flexible TVs, and LG will not sell 150-inch screens in an inflexible form factor, as that would drive up logistics and storage costs too much compared with rollable. And even consoles have just the right content to drive such a setup: Forza, Gran Turismo.

My brother games and uses productivity software on his 32-inch curved Dell ultrawide monitor. Plenty of PC gamers and developers use curved ultrawide monitors, and Samsung's latest Quantum Dot monitors are exactly that. Flexible displays are relevant for gaming.


put it out of the way over HDR, refresh rate, uniformity, contrast ratio, etc

There is no compromise, and that's the beauty of OLED over the competition. The first auto-curve prototypes were already rated for 60,000 roll-ups of durability. The current ones, easily double that.

A display has to be suitably sized.

A rollable is easier to carry than a flat panel. Have you ever tried to bring a 90-inch Sharp LCD TV to your apartment? In the elevator? There is an argument for portability. If portable projectors are on sale, why can't rollable OLEDs be?

A 10mm display OTOH is no less desirable than a 1mm display. They are all below the usefully-slim threshold.

I agree, but the number of consumers who flocked to continuously purchase Samsung's thinnest edge-lits says otherwise.

You're stating that because no LCD has won so far, LCD is inferior, but LCDs suffer from weak contrast ratios.

I can't state that decidedly, because "contrast ratio" can mean different things. This wouldn't be the first time LCDs had a better "contrast ratio" than self-emissive displays like plasmas, yet still ultimately lost. Take Samsung's best non-PSA SPVA panel, for example. It had an ANSI 50% pattern MLL (Minimum Luminance Level) of 0.03 cd/m2 while holding a 120 cd/m2 luminance level (SDR reference). Divide 120 cd/m2 by 0.03 cd/m2 and that gives you 4000:1. Next, take my plasma. My Panasonic S60 has a 4094:1 contrast ratio according to Rtings (https://www.rtings.com/tv/reviews/panasonic/s60); they measured a 0.013 cd/m2 black level and 56.22 cd/m2 peak brightness. So those two panels have very similar contrast ratios, and lower-end plasmas like the Panasonic UT50 and U50 have it even worse, with peaks around 36 cd/m2, so their contrast ratios come out even lower. With the plasma's mediocre top end, the Samsung should have excelled at presenting highlights, right? Nope. With such close "ANSI contrast ratios", the Samsung should at least have performed close to my plasma, correct? Not even close. And yes, I do have two TVs with such Samsung panels, one from Sony and one from Samsung (FH6400), and compared them side by side. The reason? Floating black. HDR is not something entirely new that favours peak brightness above all; it's just an extension of the SDR we have.
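To make the arithmetic explicit, here is a minimal sketch of the contrast-ratio division, using only the luminance figures quoted above. (Note that Rtings' published 4094:1 for the S60 implies a slightly lower white reading in their contrast measurement than the 56.22 cd/m2 peak figure.)

```python
# Contrast ratio is simply white luminance divided by black luminance,
# both in cd/m2 (nits). All figures below are the ones quoted in the post.

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Return the contrast ratio white:black as a single number."""
    return white_nits / black_nits

# Samsung non-PSA SPVA: 120 cd/m2 SDR reference white, 0.03 cd/m2 ANSI 50% MLL
print(contrast_ratio(120, 0.03))      # 4000.0 -> 4000:1

# Panasonic S60 plasma per Rtings: 56.22 cd/m2 peak, 0.013 cd/m2 black
print(contrast_ratio(56.22, 0.013))   # ~4324.6, close to Rtings' 4094:1
```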

If you have LCDs with 1000000:1 contrast ratio and higher peak luminosity than OLEDs, you have better HDR.

No, because of floating black, which Vincent Teoh has already demonstrated. Do the math yourself: does dividing 1000 cd/m2 by 0.004 cd/m2 give you a 1,000,000:1 contrast ratio? No, it's 250,000:1. How about dividing by 0.008 cd/m2? It drops further, to 125,000:1. That is only competitive against plasmas (and worse than a modded Pioneer Kuro), not OLEDs. OLEDs suffer no floating black. LCDs, because of their transmissive nature, are inherently flawed with floating black, and the higher the peak brightness goes, the worse the black level gets.
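The same division shows how quickly floating black erodes a nominal contrast spec; a quick sketch with the figures above:

```python
# If black level "floats" up with peak brightness, the achievable contrast
# is capped far below a nominal 1,000,000:1 spec. Figures from the post.

def contrast_ratio(white_nits, black_nits):
    return white_nits / black_nits

peak = 1000.0                     # cd/m2 HDR peak
for black in (0.004, 0.008):      # floating black levels cited above
    print(f"{peak:.0f} / {black} -> {contrast_ratio(peak, black):,.0f}:1")
# 1000 / 0.004 -> 250,000:1
# 1000 / 0.008 -> 125,000:1
```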

OLED can't be amped up to 1000nits AFAIK.

Panasonic GZ2000 says hi. It does, when uncalibrated.

Now you sound bitter and irrational. If you want to raise a point about power efficiency, which is a fair argument, by all means do so, but don't throw it out there suddenly as a moving goalpost for why OLED is the best thing ever and LCD is doomed. Stick to sensible, rational arguments.

No personal swipes, please. I was simply stating things as they are, no ill feeling. Regulations are simply a fact of life, and I do not blame Europe at all for nerfing console specs with lead-free solder, killing cadmium Quantum Dots, etc. Even without Europe's lead and energy restrictions, plasma would have closed shop, unable to scale up to 4K and beyond. But you also have to admit regulations and innovation can't go hand in hand sometimes, and we have to deal with it and move on once that happens.

For the insanely bright HDR, not because it has two layers of LCD. A TV with the same peak brightness as OLED won't run anything like as hot and won't need fans, will it?

No, because of the LCD cell's aperture ratio. That assumes LCD cells have a perfect aperture ratio, which they do not (only about 6% of the light makes it from the light guide plate to the viewer). In a typical backlit (non-FALD) LCD, light emitted by LEDs at the back passes through a light guide plate to be tightened into an array of beams, then goes through a diffuser sheet, a prism sheet, a bottom polarizer, a glass substrate, a TFT, a liquid crystal cell, and common electrodes, then through the color filters, and finally through the exterior glass substrate, top polarizer, and top chassis to output an image.

The biggest drops in light passthrough occur at the polarizers, the LCD cell, and the color filters. Polarizer loss is 50%, as only light of one polarization can pass through. Loss from the glass passthrough is 5%. Loss from passing through the liquid crystal cell is 5%, but 45% of the light that does pass is rendered worthless, so the actual aperture ratio of an LCD cell is only 50%. Color filters are the biggest brightness killers: while the loss from the filters themselves is only 20%, the remaining 80% has to be divided across three color filters, leaving around 26%.
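As a rough, hedged model of that light budget, multiplying the per-stage transmittance figures quoted above reproduces the roughly 6% end-to-end figure mentioned earlier; real panel stacks will of course differ:

```python
# Back-of-the-envelope light budget for a single-layer LCD, using only the
# per-stage loss figures quoted in the post. Each value is the fraction of
# light that survives that stage; the cumulative product is the throughput.

stages = {
    "bottom polarizer (unpolarized -> polarized)": 0.50,
    "glass substrates":                            0.95,
    "liquid-crystal cell (effective aperture)":    0.50,
    "RGB color filters (0.8 passed, split by 3)":  0.80 / 3,
}

throughput = 1.0
for name, t in stages.items():
    throughput *= t
    print(f"after {name:<45} {throughput:6.1%}")
# Ends at roughly 6%, matching the "only ~6% from light guide plate to
# viewer" figure quoted above.
```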

Now here's what happens with dual-layer LCDs. The light path is the same up to the first LCD cell. That cell doesn't need color filters, since it only displays greyscale, but here's the kicker: unlike a regular LCD cell, it HAS to shine at 1000 cd/m2 ALL. THE. TIME. Even if we want a 100% APL full-white screen to deliver only 10 cd/m2, it still shines at the full 1000 cd/m2. Why produce such excess, wasteful brightness? Because the rear cell has to be ready to shine at 1000 cd/m2 at any time, anywhere. This is the single biggest factor behind the out-of-this-world power consumption, as no consumer LCD TV can shine at 1000 cd/m2 on a 100% APL full-white screen, the Sony Z9D included. But wait, that's not all. I said going through each LCD cell incurs a 50% light penalty, right? So for 1000 cd/m2 of total light output to survive the second cell, the greyscale cell has to operate at 2000 cd/m2 on a 100% APL full-white screen just to compensate. Outputting 1000 cd/m2 is already bad enough; imagine being required to output 2000 cd/m2 or even 4000 cd/m2. This thing will not compete against OLEDs once OLEDs get over 2000 cd/m2 with TADF in the next 5 years. At least it will have outstanding ABL performance, just like Mini LEDs. Hisense is settling for a 1080p greyscale cell instead of a 4K cell because they needed to keep the aperture ratio in check while aiming at an even bigger 65-inch size.
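Here is a small sketch of that compounding-loss argument, assuming (as above) that each additional LC cell passes roughly 50% of the light; the helper name is mine, just for illustration:

```python
# If the front cell of a dual-layer LCD eats ~50% of the light, the rear
# greyscale cell (and the backlight behind it) must sustain a multiple of
# the target output. Assumes the 50%-per-cell figure quoted in the post.

def required_rear_luminance(target_nits: float,
                            cell_transmittance: float = 0.5) -> float:
    """Luminance the rear greyscale cell must sustain so that
    `target_nits` survives the front cell's loss."""
    return target_nits / cell_transmittance

for target in (1000, 2000):
    rear = required_rear_luminance(target)
    print(f"{target} nits out -> rear cell must run at {rear:.0f} nits")
# 1000 nits out -> rear cell must run at 2000 nits
# 2000 nits out -> rear cell must run at 4000 nits
```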


Have you seen the size of Sony's reference OLED?

Here are product manuals for both

https://pro.sony/s3/cms-static-content/uploadfile/55/1237494980655.pdf (Sony OLED BVM X300)

https://pro.sony/s3/2019/01/18022258/4748188111.pdf (Sony BVM HX310)

The OLED's front dimensions are 74.24 cm x 46.35 cm. The depth is 13.65 cm, not accounting for the stand. Maximum power consumption is 280 W, and typical power consumption is rated at 150 W. The product weighs 16.2 kg.

The LCD's front dimensions are 77.8 cm x 50.35 cm. The depth is 20.37 cm, not accounting for the stand. Maximum power consumption is 450 W. The product weighs 29 kg.
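For a rough comparison, here is the maximum power draw normalised by front-cabinet area, using only the manual figures above (cabinet area, not active screen area, so treat it as approximate):

```python
# Approximate max power per square metre of cabinet front for the two
# reference monitors, from the manual figures quoted above.

monitors = {
    # name: (width_cm, height_cm, max_power_w)
    "Sony BVM-X300 (OLED)":    (74.24, 46.35, 280),
    "Sony BVM-HX310 (DL-LCD)": (77.80, 50.35, 450),
}

for name, (w, h, p) in monitors.items():
    area_m2 = (w / 100) * (h / 100)
    print(f"{name}: {p / area_m2:.0f} W/m^2 max")
# OLED comes out around ~810 W/m^2; the dual-layer LCD around ~1150 W/m^2.
```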

Still trying to find pictures of the actual Hisense consumer set, not the show prototype.
 
It's notable that Hisense declined to discuss the sets' power consumption with anyone who asked. That makes me wonder just how much power it will need to hit the HDR brightness they're aiming for.

I just did some digging on Weibo and found some info. Still no luck with the power consumption figure, though...


Thickness comes in at 12.44 mm. That's still not great considering this is an edge-lit, but I guess Hisense did an admirable job trying to slim it down as much as possible.


And that slimming down did incur compromises: peak brightness and contrast ratio. Peak brightness comes in at only 515 cd/m2, but what's even more severe is the nasty increase in floating black, as I feared. Contrast ratio has plummeted to 50:1, and I don't think this reviewer is using a Klein K10 meter to properly measure black level (that's why the readings all say zero), but I do have a hunch what the number might be at 50% APL: 0.008 cd/m2, the same as the Sony HX310. How do I know? This model uses a BOE IPS panel, obviously for viewing-angle reasons. That puts the 50% APL contrast ratio at 59,800:1, a far cry from OLED in both contrast ratio and peak brightness. (This actually makes it the dimmest HDR TV supposed to be sold at a premium.)



It also suffers from double-polarization distortion, creating artifacts. Hisense cut a lot of corners to fit a consumer-acceptable thickness and power consumption, but it all falls short in the end. Chinese reviews have also mentioned severe uniformity problems, and I blame the edge-lit form factor for that.

Doubling up on panels can increase contrast ratio by quite a large factor. Unfortunately, the flaws also multiply. One is a further narrowing of the viewing angle. Another is uniformity. Since dual IPS layers require front and back cells to operate, the LED backlight needs to sit at the back to minimize uniformity issues on the first cell, so they don't get magnified further on the second cell. Using an edge-lit (Hisense placed LED strips at the bottom of the panel) and relying on a light guide alone produces big uniformity issues, which are magnified vastly by the time the light reaches the second cell. Hisense should have done like Sony and placed the backlight at the rear. That makes the set quite thick, but it's still the only way this works without severe uniformity problems. As usual, we always have to live with compromises when it comes to LCDs. My interest in this tech dies now.
 

Point being, if you look at my sets from the same angle as in that picture, they appear to be the same thickness, but in reality one is much thicker than the other. That doesn't mean they can't be the same thickness in that picture, just that we can't assume they are.
True, but I looked specifically at the protrusion from the wall as a measure of depth. That should give an idea of the total depth, unless the DL LCD is deliberately sunk into the wall to hide its big, fat arse.

It's notable that Hisense declined to discuss the sets' power consumption with anyone who asked. That makes me wonder just how much power it will need to hit the HDR brightness they're aiming for.
I imagine buckets.

Wrong. My ultimate dream gaming rig will be three 150-inch LG 8K OLED TVs in rollable orientation, and I will adjust the screen curve myself accordingly.
Can you please pay attention to the thread you're posting in? This is the console forum, and a thread where people are asking for TV buying suggestions for now or the near future: whether to hold off and get the best thing next year, and what that thing might be. Gamers who are going to have a PC capable of driving three 150-inch 8K OLEDs clearly aren't the target audience for this discussion. They're not really the target audience for any realistic discussion of gaming, only a minuscule niche.

Panasonic GZ2000 says hi. It does, when uncalibrated.
How does it fare with burn-in at those brightnesses?...

The OLED's front dimensions are 74.24 cm x 46.35 cm. The depth is 13.65 cm, not accounting for the stand. Maximum power consumption is 280 W, and typical power consumption is rated at 150 W. The product weighs 16.2 kg.

The LCD's front dimensions are 77.8 cm x 50.35 cm. The depth is 20.37 cm, not accounting for the stand. Maximum power consumption is 450 W. The product weighs 29 kg.

Still trying to find pictures of the actual Hisense consumer set, not the show prototype.
If the argument is going to be made that DL LCD is always going to be fat, it needs to be explained why, looking at the tech. A single additional LCD layer isn't going to do that. The electronics to drive that additional LCD layer aren't going to do that. If you were to double all the parts of the display (polarisers, LCD, colour filters), the screen would double in thickness from a few mm to a few more mm. Therefore, any increase in depth needs to come from elsewhere, which'll be the light driving the display. And as you found, the consumer set isn't a big, fat blob. Yes, being slim introduces issues, but the tech (a second LCD layer) isn't inherently fat. If LCD can solve the backlight some other way, it may be competitive. There are presently no ideal solutions and both techs offer pros and cons. What's been most interesting over the past 10 years is watching the techs vie with each other, and how few have realised their promises, prompting their rivals to challenge them in their areas of weakness. For the purposes of this discussion, DL LCD may be an emergent technology next year.
 
Can you please pay attention to the thread you're posting in? This is the console forum, and a thread where people are asking for TV buying suggestions for now or the near future: whether to hold off and get the best thing next year, and what that thing might be. Gamers who are going to have a PC capable of driving three 150-inch 8K OLEDs clearly aren't the target audience for this discussion. They're not really the target audience for any realistic discussion of gaming, only a minuscule niche.

Could you please stop forcing your ignorant and short-sighted definition of a console gamer onto me? I require a curved screen for my 88-, 98-, or 105-inch console gaming OLED TV, for FOV reasons, thank you. I've only stated my dream rig. A dream has to start with the first TV.

How does it fare with burn-in at those brightnesses?...

Just as well as LG's and Panasonic's. Panasonic has custom-tuned it to dissipate heat at the OLED cell level, so it's a bit thicker.

If the argument is going to be made that DL LCD is always going to be fat, it needs to be explained why, looking at the tech.

Because Sony's product uses PFP phosphors and a rear backlight while Hisense's uses Quantum Dot and a bottom-mounted LED strip. Hisense prioritized thinness so heavily that they made too many compromises on what makes dual-cell LCD displays work. Any edge-lit is going to be extremely thin, at the expense of uniformity. Hisense slimmed it down further by applying a Quantum Dot tube to their LEDs, allowing it to be thinner still than using Sony's PFP. Quantum Dots are known to be excellent for tight-space applications; OLED is another example. However, both of these moves seem to produce downsides. Looking at the measured color gamut chart of the Hisense U9e, the original color gamut produced by the Quantum Dot tube seems to get lost in translation through the second cell, which is not the case with the Sony at all (a full-fat 99% DCI-P3 color gamut). The second issue, placing the LEDs at the bottom of the panel, is far more serious, though. A serious uniformity problem develops, and it's compounded by the presence of another cell and more polarizers/electrodes. Dual-layer LCDs by design require one cell to be placed behind another, and the light source also has to be placed behind it for optimal uniformity. You can't compound the contrast ratio without also compounding the uniformity issues of an edge-lit. Sony's is thicker because they are doing it right. And let's not forget Hisense's cheat of using a 1080p greyscale cell instead of 4K, which also contributed to the slimming down thanks to its higher aperture ratio compared to a 4K cell, but that made the polarization artifacts show up even more glaringly. The Chinese reviewers seem none too happy with the overall image cleanliness.

A single additional LCD layer isn't going to do that. The electronics to drive that additional LCD layer isn't going to do that

I was curious how many repeated parts go into this design, and here is what I found. It's not just two liquid crystal cells next to each other, as I previously oversimplified: there's also front glass, PSA, and TFT glass. So the aperture ratio will be worse than my earlier assumption.

If LCD can solve the backlight some other way, it may be competitive.

You can choose thinness or light uniformity, not both. Asking an edge-lit light guide to match the uniformity of a backlight is like asking VA to have a wider viewing angle while keeping its contrast ratio. It's a fundamental trade-off.

There are presently no ideal solutions and both techs offer pros and cons.

Are you talking about OLEDs? Because if we are strictly talking about consumer models, the Hisense U9e has nothing but cons compared to LG OLEDs:

It's 67% more expensive in parts cost
It has lower peak brightness for HDR
It has a worse black level
It has a narrower color gamut
It has worse uniformity
It has a narrower viewing angle
It uses a lower-resolution greyscale cell, and is therefore not a true 4K TV
It's thicker
It's heavier
It has double-polarization artifacts that OLEDs do not suffer from
It consumes more power than OLEDs at equal HDR peak brightness
The brighter it gets, the greater the need for a fan, further increasing thickness and production cost
It has a gloomier roadmap for gaining more peak brightness, while OLED's roadmap promises 2000 cd/m2 at similar quantum efficiency



What's been most interesting over the past 10 years is watching the techs vie with each other, and how few have realised their promises, prompting their rivals to challenge them in their areas of weakness.

And who's going to fix up the waveguide? Not Samsung; they use VA exclusively. Not LG; they are downsizing IPS, their track record for fixing waveguides wasn't as successful as Samsung's, and there's no way in hell LG will give up their WOLEDs for something with a higher production cost. Not Panasonic; their massive Sakai plant was retooled for solar cell manufacturing, and they've spun off their Alpha IPS business, producing only enough for their professional dual-layer LCD panels at the much smaller Himeji plant, similar to how Sony retooled their 3.5G LCD plant to produce their professional OLED BVMs, which is anything but mass volume. Sony? No way. They will not waste money investing in something that's going to be DOA at 2000 cd/m2; Sony has become far more profit-oriented than before. That leaves only Chinese manufacturers like Hisense, which is a tall order considering their LCD TVs sold in the US have ho-hum waveguides. Even Hisense seems happier selling OLEDs instead, and they are releasing their OLED TVs in Europe, not this niche, flawed prototype.


For the purposes of this discussion, DL LCD may be an emergent technology next year.

Good luck trying to get price-sensitive gamers to actually buy something even more expensive than an OLED.
 
Could you please stop forcing your ignorant and short-sighted definition of a console gamer onto me?
In my capacity as a moderator trying to keep this thread on topic, no. What happens 5+ years from now is immaterial to this thread at the moment. By the time anyone is gaming on three 8K displays, this thread will probably have been long closed and a more recent one opened in its stead.
 
as a moderator trying to keep this thread on topic, no.

Uh huh. So a 65-inch rollable TV is off topic because it's a niche product, yet the Sony HX310, which is even more of a niche for gamers (£30,000 for 31 inches, and the product isn't even targeted at gamers, as it doesn't support tone mapping at all, cutting off 75% of the HDR highlights in Assassin's Creed: Origins), is on topic? Yeah, love your hypocrisy as a mod. From now on, I will be ignoring your provocative comments that add nothing to the discussion.
 
The £30,000 TV isn't a product for gamers and wasn't presented as such. It was presented as an example of an emergent tech that may appear in consumer-grade TVs for current console owners to consider.
 
Well, an emergent tech (ironic, considering this tech was developed in 2005) needs to have a strong business case for it to trickle down to consumer TVs, so that the investment (including R&D) doesn't just turn into sunk cost. But that's off topic for this thread, so I won't go on further.
 
Why are we to assume that DLLCD will be more expensive than OLED? Hisense plans on releasing that DL in China at around $2600 or about 25% cheaper than its OLED product at 65”.
 
Do they really have a choice? If they want to continue selling LCD panels, they have to be competitive with OLEDs.
 
Why are we to assume that DLLCD will be more expensive than OLED? Hisense plans on releasing that DL in China at around $2600 or about 25% cheaper than its OLED product at 65”.

Because Hisense HAS TO purchase their OLED panels from LG, while they own their own LCD panel manufacturing. If LG makes a DL using their own IPS panels, it will be at least 68% more expensive to produce than their OLEDs. If a fabless company like Sony makes both OLED and DL, the DL will be 68% more expensive to produce.

Do they really have a choice? If they want to continue selling LCD panels, they have to be competitive with OLEDs.

Dual cells are fundamentally not competitive against OLEDs because they cost more to produce, and OLEDs are quickly closing the cost gap, not the other way around. OLEDs have already conquered mobile, and LTPS LCDs offered nothing to counter.
 
Why are we to assume that DLLCD will be more expensive than OLED? Hisense plans on releasing that DL in China at around $2600 or about 25% cheaper than its OLED product at 65”.

LG's OLEDs are already cheaper than competing premium LCD TVs, and roughly equal to or cheaper than many midrange LCD TVs from the bigger manufacturers. While Chinese manufacturers are cheaper, they also have incredibly inconsistent quality. For example, I passed on the TCL 6 series (one of Rtings' recommended sets) due to the panel lottery involved in buying one, in addition to the fact that they use VA panels. But the panel lottery killed it even before I started to think about the compromises involved in trying to use a VA panel here.

You can see that coming through in what little "review" information has come out about the Hisense that KOF has referenced. There are a lot of problems with their sets that are not just a result of the extreme compromises made to achieve a lower price point, but also likely part of the inconsistent panel quality that seems to plague Chinese manufacturers when it comes to their own products versus products made for higher-tier non-Chinese contractors. I.e., QA is less strict for products made for their own lines, which in turn allows them to make those products cheaper and sell them more cheaply than the higher-tier manufacturers. This is even more true if those Chinese manufacturers don't manufacture products for higher-tier non-Chinese companies.

So you're adding not only the cost of a second layer, but the cost of bonding it to the original color layer. Then you'll likely also need more backlighting to achieve the same brightness as without the second layer. With the increased backlighting, they may have to get more creative about cooling the whole thing. Then add on the increased cost of the additional quality assurance necessary to maintain a consistent level of quality in a released product.

Basically, if Hisense is targeting 2,600 USD in China, the sets made by Samsung, Sony, and other similar-tier TV makers will likely cost significantly more for the rest of the world.

Of course, all of this is speculation until a DL-LCD set is released outside of China by one of the big non-Chinese manufacturers. It's notable that the reason Hisense isn't releasing the set outside of China might be that they don't want to suffer negative press from Western review sites.

Solving the issues the Hisense TV suffers from will likely increase the cost to manufacture. I'll be surprised if Hisense or others can solve those issues without increasing the cost, at least in the near future (the next 12 months).

Regards,
SB
 
Because Hisense HAS TO purchase their OLED panels from LG, while they own their own LCD panel manufacturing. If LG makes a DL using their own IPS panels, it will be at least 68% more expensive to produce than their OLEDs. If a fabless company like Sony makes both OLED and DL, the DL will be 68% more expensive to produce.

Dual cells are fundamentally not competitive against OLEDs because they cost more to produce, and OLEDs are quickly closing the cost gap, not the other way around. OLEDs have already conquered mobile, and LTPS LCDs offered nothing to counter.

Hisense manufactures their DC panels? The articles I’ve seen say Hisense sources their DC panels from China’s BOE.
 
Hisense manufactures their DC panels? The articles I’ve seen say Hisense sources their DC panels from China’s BOE.

Sorry, I got confused with TCL. Hisense is still a state-owned enterprise, and BOE has been receiving government subsidies for a long while, so their partnership is very tight-knit. It only proves Korean LCD manufacturers can no longer compete against the Chinese, just as Japanese LCD manufacturers couldn't compete against the Koreans back then, so LGD's decision to downsize their IPS business makes sense, as they have been shedding losses for quite a while due to their lack of competitiveness. LGD will never overtake BOE's market share again. (LGD is still ahead when monitor panels are taken into account, though.) Samsung is next. The Chinese market is so cut-throat; I'm always amazed whenever I see the prices their TVs sell for in China.
 
Sorry, I got confused with TCL. Hisense is still a state-owned enterprise, and BOE has been receiving government subsidies for a long while, so their partnership is very tight-knit. It only proves Korean LCD manufacturers can no longer compete against the Chinese, just as Japanese LCD manufacturers couldn't compete against the Koreans back then, so LGD's decision to downsize their IPS business makes sense, as they have been shedding losses for quite a while due to their lack of competitiveness. LGD will never overtake BOE's market share again. (LGD is still ahead when monitor panels are taken into account, though.) Samsung is next. The Chinese market is so cut-throat; I'm always amazed whenever I see the prices their TVs sell for in China.

That may not be the case, because Samsung essentially buys cheap LCDs, adds its QD layer, and sells those products as its premium TVs.
 
There are a lot of problems with their sets that are not just a result of the extreme compromises made to achieve a lower price point, but also likely part of the inconsistent panel quality that seems to plague Chinese manufacturers when it comes to their own products versus products made for higher-tier non-Chinese contractors. I.e., QA is less strict for products made for their own lines, which in turn allows them to make those products cheaper and sell them more cheaply than the higher-tier manufacturers. This is even more true if those Chinese manufacturers don't manufacture products for higher-tier non-Chinese companies.

Sad but true. Sharp has filed a lawsuit over similar complaints.

"In June 2017, Hisense was sued by Sharp under its new owner Foxconn, seeking to have the licence agreement halted. Sharp accused Hisense of damaging its brand equity by utilizing its trademarks on products it deemed to be "shoddily manufactured", including those that it believed to have violated U.S. safety standards for electromagnetic radiation, and deceptive advertising of their quality."

I consider the Hisense U9e a classic example of attempting to create a higher-tier, value-added product that ultimately fell short because compromises also had to be made to meet consumer expectations in a saturated market. The Chinese are very proud people and have always wanted to beat the Korean and Japanese display manufacturers; 10 years ago that was a bluff, but now it's more serious than ever. However, consumers (especially the Chinese) have been on cheap panels for quite a while, and there doesn't seem to be any way of reversing that. What Hisense should have done was make it backlit: thicker, but with much better uniformity. They should have gone all out for 1000 nits, with extreme power consumption, further increased thickness, and a fan while they're at it. They should have binned panels for the best uniformity. They should not have released this product until they fixed the parallax issue. They should not have cheaped out with a 1080p greyscale cell either. Only then should they have estimated the resulting price increase and contained it as much as possible. Then they should have sent review units to Western review sites for favorable reviews, and released it in China as a halo statement product.

However, they did not do that. By going edge-lit and chasing thinness instead, they cheapened the biggest reason to own this display: premium prestige. Nobody gives prestige to someone cutting corners. If Samsung failed to make people think edge-lits were premium, what makes Hisense think they can? So a good idea became a shambles in the end. They spread themselves too thin trying to match OLED's every pro, and in the end they only ended up with cons. They could not ignore Chinese consumers who think thin OLEDs are premium. They could not ignore Chinese consumers who think low energy cost is a value-added feature. They could not ignore Chinese consumers who refuse to pay big money for a TV at all. So, instead of properly trying to catch up to the Koreans, they did the dishonest thing. The consumer U9e was nothing like the ULED XD shown at the trade shows (the prototype could do a 2900 cd/m2 peak; it was definitely not meant to be a consumer product at all). They tried their best to show off only the tech's strengths, not its weaknesses. Instead of improving the product, they were forced to play marketing games: building a prototype with no consideration of the actual market, then shipping a half-assed product that no longer resembles what made this tech look good in the first place.
 
That may not be the case, because Samsung essentially buys cheap LCDs, adds its QD layer, and sells those products as its premium TVs.

Yeah, Samsung's panel partnership with CSOT, just like their previous S-LCD partnership with Sony. I do think that will buy Samsung some time, but LG was not as lucky. Just kidding, they have their OLED cartel, thankfully. They will be more than fine.

One of the people who saw the Hisense U9e said it only looks a little better than FALD. And since Samsung has already developed tri-domain SPVA that further improves black level, they are actually among the least likely to go with dual layer. VA handles parallax issues even worse than IPS panels do, and Samsung will not mass-produce something using panels from makers they are not friendly with (BOE and LG). Quantum Dots will also prove to be a deterrent, as shown by this Hisense design, which suffered color gamut loss even with QD. Seeing as Samsung is the world leader in wide color gamut, they are not going to be happy. If Samsung doesn't want to suffer color gamut loss, they should use PFP, but they already use PFP in their cheap TV models, further complicating attempts to value-add.
 