Nature review article about display technologies

So 3500 hours is roughly what a typical US family on the low side will accumulate after 2 years, or around 1 year for prosumer users. I'm interested in what the results will be at 10,000 hours.

I have 30740 hours on my slightly under 9 year old Panasonic 54" Plasma.
 
Especially if the effect is cumulative rather than based solely on time-on-screen for more static content. That's truly troubling for non-TV use.
 
How many LG OLEDs suffer from the same defective firmware? Are all the burn-in user reports just from LG OLED owners? Don't be so quick to dismiss the problem.

Not many, a small percentage of the original batch was affected and the problem has since been addressed.

For sure, people who regularly grade their panels using a colorimeter to assess color accuracy and uniformity are going to notice.

For just about anyone else, outside of looking at a screen showing a solid color, the panel is unlikely to show any perceivable permanent burn-in except in pathological worst cases.

And even in many of those pathological worst cases (like the ongoing Rtings burn-in testing), it's unnoticeable unless viewing a solid color.

Additionally, even when viewing a solid color displaying the burn-in, the panel uniformity is often still better than a brand new LCD panel.

If anything, the Rtings content that you've shown reinforces the notion that your average consumer (TV viewer or console game player) isn't going to perceive any noticeable degradation in the panel.

Granted, 3500 hours equates to 437.5 days at 8 hours a day, or 875 days at 4 hours a day, of showing the exact same content. Perhaps the burn-in accelerates over time, but if it's linear then it's safe to assume that under anything but the worst case, a modern LG OLED display should still have superior color uniformity compared to an LCD after 3-5 years. Those who watch CNN and nothing but CNN might have cause for concern, however. :p
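The arithmetic above can be sketched as a quick back-of-envelope check (pure arithmetic, using only the hour figures quoted in this thread):

```python
# Back-of-envelope check of the figures above: 3500 hours of identical
# content spread across different daily viewing habits.
total_hours = 3500
for hours_per_day in (8, 4):
    days = total_hours / hours_per_day
    print(f"{total_hours} h at {hours_per_day} h/day = {days} days")
```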

This isn't dismissing the effects of burn-in (I'm not going to get one to use as a PC display no matter how much I want to at this moment). This is putting it into the context of what most people are likely to see under either general TV/movie viewing or general console gameplay (most console gamers likely don't spend 3000+ hours playing the same game and even then the burn-in doesn't significantly impact the perceived quality of non-single color images).

The one major area of concern for a console gamer is going to be the main console UI if it's mostly a single color (no background image selected, so burn-in would be more noticeable). But how many console gamers haven't customized their UI with a background image that they like?

The situation with OLEDs can certainly still be improved, but it's not doom and gloom.

Regards,
SB
 
For just about anyone else, outside of looking at a screen showing a solid color, the panel is unlikely to show any perceivable permanent burn-in except in pathological worst cases.
The only tests that matter are viewing the TV and whether, while watching TV programmes, you can notice burn-in. Viewing solid colours isn't the right quantitative assessment. They should show what these TVs look like displaying normal TV content, to see how visible the artefacts are. But of course, that would interrupt their scientific method. They could at least display a few photos as well as flat colours in their assessment.

Truth is, if you watch CNN 24 hours a day, you'll get burn-in. But you'll never notice it because you're watching CNN all the time. Only if you swapped away from CNN might you notice; so far we only know the artefacts are discernible in certain flat-colour tests.
 
Additionally, even when viewing a solid color displaying the burn-in, the panel uniformity is often still better than a brand new LCD panel.

If you are trying to compare an emissive display's spot-noise phenomena to a diffuse backlit LCD's gamma shift, I'm not sure that comparison will support a useful conclusion.
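One hedged way to illustrate why the comparison is tricky: the two failure modes can be forced onto a common scale, e.g. the worst relative deviation of patch luminance over a 3x3 ANSI-style grid on a full-field gray, but that single number hides the very different spatial character of the two defects. All readings below are invented for illustration, not real panel data:

```python
# Hypothetical illustration: a single uniformity metric (worst patch deviation
# relative to the mean) applied to two invented sets of 3x3 grid readings.
def uniformity_deviation(luminances_nits):
    mean = sum(luminances_nits) / len(luminances_nits)
    worst = max(abs(l - mean) for l in luminances_nits)
    return worst / mean  # 0.0 = perfectly uniform

# Made-up example readings (nits), not real measurements:
oled_with_burn_in = [99, 100, 101, 100, 97, 100, 101, 100, 99]  # small local dip
lcd_backlight = [92, 100, 104, 95, 100, 103, 90, 99, 102]       # broad gradient
print(uniformity_deviation(oled_with_burn_in))
print(uniformity_deviation(lcd_backlight))
```

The metric says the burnt-in OLED is "more uniform" here, yet a localized logo-shaped dip and a smooth backlight gradient are perceived very differently, which is the point being argued.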
 
Temporary image retention and burn-in are noticeable during dark scenes, such as the end credits of movies, starry skies, night-time driving scenes, or walking through a dark house. You will notice the old text or logos still there like an after-image where the picture should be solid black. It's extremely noticeable to me, probably because I know what to look for: any colour non-uniformity.

At least if OLED behaves anything like Plasma.
 
Again, you're either just trolling or incapable of getting the point.

https://www.rtings.com/tv/learn/real-life-oled-burn-in-test

So they are showing the same type of content (not the same video) for 20 hours a day. They are up to week 24, so that is close to 3500 hours of screen-on time?

The only obvious burn in is on the TV showing CNN at Max brightness, and even then it's only visible when showing a single color, not when you're actually watching content. At least that's what I understand from it.

As the amount of users watching CNN at full brightness (or watch only one type of content for that matter) will be extremely small, couldn't you conclude that based on the test data so far, the chance of perceivable burn in will actually be very small/non existent for anybody using their tv in a normal way?
Burn-in was already present at week 2 of the test, so only a couple hundred hours, not 3000. Also, it was the dimmer TV displaying CNN that got the worst burn-in ;)
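The screen-on-time figures being traded above can be sketched quickly, assuming the quoted RTINGS schedule of 20 hours per day, 7 days a week:

```python
# Sketch of the screen-on-time arithmetic, assuming the 20 h/day schedule
# quoted in this thread for the RTINGS test.
def screen_hours(weeks, hours_per_day=20):
    return weeks * 7 * hours_per_day

print(screen_hours(24))  # 3360 -- "close to 3500 hours" at week 24
print(screen_hours(2))   # 280 -- "only a couple hundred hours" at week 2
```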

Not many, a small percentage of the original batch was affected and the problem has since been addressed.

For sure, people who regularly grade their panels using a colorimeter to assess color accuracy and uniformity are going to notice.

For just about anyone else, outside of looking at a screen showing a solid color, the panel is unlikely to show any perceivable permanent burn-in except in pathological worst cases.

And even in many of those pathological worst cases (like the ongoing Rtings burn-in testing), it's unnoticeable unless viewing a solid color.

Additionally, even when viewing a solid color displaying the burn-in, the panel uniformity is often still better than a brand new LCD panel.

If anything, the Rtings content that you've shown reinforces the notion that your average consumer (TV viewer or console game player) isn't going to perceive any noticeable degradation in the panel.

Granted, 3500 hours equates to 437.5 days at 8 hours a day, or 875 days at 4 hours a day, of showing the exact same content. Perhaps the burn-in accelerates over time, but if it's linear then it's safe to assume that under anything but the worst case, a modern LG OLED display should still have superior color uniformity compared to an LCD after 3-5 years. Those who watch CNN and nothing but CNN might have cause for concern, however. :p

This isn't dismissing the effects of burn-in (I'm not going to get one to use as a PC display no matter how much I want to at this moment). This is putting it into the context of what most people are likely to see under either general TV/movie viewing or general console gameplay (most console gamers likely don't spend 3000+ hours playing the same game and even then the burn-in doesn't significantly impact the perceived quality of non-single color images).

The one major area of concern for a console gamer is going to be the main console UI if it's mostly a single color (no background image selected, so burn-in would be more noticeable). But how many console gamers haven't customized their UI with a background image that they like?

The situation with OLEDs can certainly still be improved, but it's not doom and gloom.

Regards,
SB
The fix was unrelated to user burn in.
OLED targets the prosumer market, precisely the people who do calibrate their displays and care very much about PQ, and who therefore notice burn-in when it happens, as confirmed by user reports.
As I said above, burn in was present already at week 2 of the test.
Nobody claimed doom and gloom.
 
So why do you consider user reports valid when they complain about burn-in, but invalid when they don't? You claim to be the only one providing empirical evidence despite users of OLED on this very board saying they aren't seeing issues. These days, the magnitude of a product problem can by and large be determined by the amount of internet chatter. We knew there was an epic problem with early XB360s long before MS admitted it because of the sheer number of users reporting issues. We knew the PS3 also had significant problems given the sheer number of reports of YLoD. Unless lots of OLED users are experiencing burn-in but not talking about it ("yeah, I spent $5000 on a new TV and within a year I have a channel logo burnt into the corner, but I'm not complaining"), the empirical in-the-field data would suggest that burn-in isn't a problem for current users.
 
Never had problems with plasmas or OLEDs. People who don't pay any attention could theoretically get into trouble. I prefer that over dozens of inhomogeneous lighting errors that LCD technology automatically brings along and which I can't influence. Many also use OLEDs as monitors and have no problems.

These burn-in tests are unrealistic and the screens are certainly not professionally adjusted. They're probably set too bright, etc.


 
You think RTings didn't calibrate their sets before starting their tests? Or that they're not professionals?
 
(...)dozens of inhomogeneous lighting errors that LCD technology automatically brings along and which I can't influence.

If you have active stereo glasses nearby, perhaps it's helpful to check some time what kind of "inhomogeneity" they have: next to zero on axis, considerable contrast, and color shift only beyond 30°.

Therefore, with the diffuser placed after the LC layer, not before it, an LCD can be free of color/gamma shift, including in near-eye displays, where light outside a 20-30° cone is not even collected (bringing along roughly a 3x efficiency advantage compared to the usual emissive Lambertian emitter).
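The cone-collection claim above can be checked with a standard radiometry identity: a Lambertian emitter radiates with intensity proportional to cos(theta), so the fraction of its total flux falling inside a cone of half-angle theta works out to sin²(theta). For a 20-30° cone that gives a loss factor in the same ballpark as the quoted ~3x figure:

```python
import math

# Fraction of a Lambertian emitter's total flux inside a cone of the given
# half-angle: integrating cos(t)*sin(t) over the cone gives sin^2(theta).
def lambertian_fraction(half_angle_deg):
    return math.sin(math.radians(half_angle_deg)) ** 2

for deg in (20, 30):
    frac = lambertian_fraction(deg)
    print(f"{deg} deg cone: {frac:.3f} of flux collected, ~{1 / frac:.1f}x loss otherwise")
```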
 
So why do you consider user reports valid when they complain about burn-in, but invalid when they don't? You claim to be the only one providing empirical evidence despite users of OLED on this very board saying they aren't seeing issues. These days, the magnitude of a product problem can by and large be determined by the amount of internet chatter. We knew there was an epic problem with early XB360s long before MS admitted it because of the sheer number of users reporting issues. We knew the PS3 also had significant problems given the sheer number of reports of YLoD. Unless lots of OLED users are experiencing burn-in but not talking about it ("yeah, I spent $5000 on a new TV and within a year I have a channel logo burnt into the corner, but I'm not complaining"), the empirical in-the-field data would suggest that burn-in isn't a problem for current users.
The empirical evidence I posted comes from the RTINGS test.
The user reports of people who got burn-in are just as valid as those of the people who haven't.
As you know, even accounting for the variance in panel quality, burn in depends on user habits, which we don't know.
Also, it's very disingenuous to compare burn-in, which varies in severity on a case-by-case basis, with the XB360's RRoD, which is a catastrophic failure.
Finally, how likely are TV users to complain on forums compared to gamers? Another important factor we don't have data on.

Right, so you keep on insinuating burn-in is a big issue in real-world usage, yet you are unable to provide any actual statistics.

Maybe now you can finally stop repeating yourself and ignoring everything other people post.
Oh, so you're arguing against something you imagine I'm saying instead of the things I'm actually saying. That explains it all. The reality is that neither you nor I know the extent of the issue. The difference between us is that I actually acknowledge that fact.

Never had problems with plasmas or OLEDs. People who don't pay any attention could theoretically get into trouble. I prefer that over dozens of inhomogeneous lighting errors that LCD technology automatically brings along and which I can't influence. Many also use OLEDs as monitors and have no problems.

These burn-in tests are unrealistic and the screens are certainly not professionally adjusted. They're probably set too bright, etc.

At least for panel uniformity and contrast levels, you can use bias lighting to strongly mitigate the issues. It's recommended for reducing eye strain as well.

Do most OLED users have their TVs professionally adjusted? I doubt it. Also, the screen that got the most burn-in was at half brightness.

Don't you think you should actually watch the videos before you comment about them? Just a thought.
 
The user reports of people who got burn-in are just as valid as those of the people who haven't.
Of course they are. But to gain understanding of the problem, one needs to quantify these data. Otherwise you just have proof that burn-in happens, not the likelihood of it happening or its impact when it does.
Finally, how likely are TV users to complain on forums compared to gamers? Another important factor we don't have data on.
Users on AVSForums are as likely to post about their new TV suffering burn-in as gamers are to post on gaming forums. You can see comments on the Rtings tests where users have burn-in. We all know plasma had issues because people talked about it, and we all know plasma largely managed to get over those issues because people on forums such as these continue to rave about the picture quality of their beloved plasma displays.
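The quantification problem being argued here can be made concrete with a hedged sketch: a raw complaint count can't yield a failure rate without knowing the install base, and every number below is invented purely for illustration:

```python
# Hypothetical illustration: the same complaint count implies a 10x different
# failure rate depending on the (unknown) install base. All numbers invented.
reports_of_burn_in = 200          # invented count of forum complaints
units_sold_low = 100_000          # invented install-base guesses
units_sold_high = 1_000_000

for units in (units_sold_low, units_sold_high):
    print(f"{units} units -> naive rate {reports_of_burn_in / units:.2%}")
```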

Oh, so you're arguing against something you imagine I'm saying instead of the things I'm actually saying. That explains it all. The reality is that neither you nor I know the extent of the issue. The difference between us is that I actually acknowledge that fact.
Why do you keep saying that when it's been repeated that everyone acknowledges burn-in happens on OLEDs? Not one person has said it doesn't happen. The closest anyone got to saying it doesn't happen was one person calling it 'not a real problem' to mean 'not a problem that is going to impact users.' Not one person has denied that OLEDs can suffer localised pixel degradation. Stop trying to make that argument because it's already made. Again, we're trying to discuss what degree of impact this technological limitation has and whether it's outweighed by gains in other areas.
 
Of course they are. But to gain understanding of the problem, one needs to quantify these data. Otherwise you just have proof that burn-in happens, not the likelihood of it happening or its impact when it does.
Users on AVSForums are as likely to post about their new TV suffering burn-in as gamers are to post on gaming forums. You can see comments on the Rtings tests where users have burn-in. We all know plasma had issues because people talked about it, and we all know plasma largely managed to get over those issues because people on forums such as these continue to rave about the picture quality of their beloved plasma displays.

Why do you keep saying that when it's been repeated that everyone acknowledges burn-in happens on OLEDs? Not one person has said it doesn't happen. The closest anyone got to saying it doesn't happen was one person calling it 'not a real problem' to mean 'not a problem that is going to impact users.' Not one person has denied that OLEDs can suffer localised pixel degradation. Stop trying to make that argument because it's already made. Again, we're trying to discuss what degree of impact this technological limitation has and whether it's outweighed by gains in other areas.
1) Internet posts are useful to detect a problem and document it on a case by case basis but they can't be used for a statistical analysis since we have no idea how representative the sample is.

2) Yes, there are OLED owners who post on AVS. Is it the same percentage as gamers who complain on the internet? I don't know. Do you?

3) Everybody has acknowledged the fact that burn-in happens. I'm not disputing that. I think we also agree that we don't have enough data to assert how widespread the problem is. The difference is in how we're acting on that fact. I claim that since there's no hard data, people should be wary of the issue when evaluating the technology. You and others claim that since we don't have hard data, the problem must be minimal. That's a logical fallacy:

https://en.m.wikipedia.org/wiki/Argument_from_ignorance

"It asserts that a proposition is true because it has not yet been proven false or a proposition is false because it has not yet been proven true. This represents a type of false dichotomy in that it excludes a third option, which is that there may have been an insufficient investigation, and therefore there is insufficient information to prove the proposition be either true or false".

In other words, you're jumping to conclusions.
 
That post is far mellower than your position to date and everything everyone has argued about - you have categorically asserted that burn-in happens and is a threat, without moderation. When presented with people suggesting it might not be a problem in practical use, or a problem of less impact to viewing than the problems of LCD, you have repeated that burn-in happens. This post of yours is exactly true, and the position everyone else in this discussion is entering from, and what everyone else has been wanting to discuss, only to be side-tracked by you constantly returning to the 'does it happen' discussion instead of the 'how much does it happen' one.

Hopefully now we can discuss the technical limits of the different display technologies having reached consensus on what exactly is being discussed.

On that matter, I haven't read the whole article yet, but it has pointed out to me that mobiles use different OLED tech to TVs, so they don't offer a direct parallel (plus of course totally different usage patterns). OLED TVs themselves are too new to offer much field testing, and the technology is progressing rapidly, so even if an older set suffers burn-in, it's hard to predict how a newer set might fare. I think it's safe to say there will be image degradation over several years' use, even if it's just the colour shifting slightly across the picture. But of course this will probably be imperceptible in use, which is where the arguments come in favour of OLED: that the perfect blacks and clarity across the whole spectrum of content brightnesses are worth a display that'll lose a percentage of its prowess over time.

For gaming, motion blur is a big factor too. What good is a display capable of brilliant stills if everything is a smeary mess in motion? That's an aspect I haven't heard discussed about OLED versus LCD.
 
At least for panel uniformity and contrast levels, you can use bias lighting to strongly mitigate the issues. It's recommended for reducing eye strain as well.

Do most OLED users have their TVs professionally adjusted? I doubt it. Also, the screen that got the most burn-in was at half brightness.

Don't you think you should actually watch the videos before you comment about them? Just a thought.

I don't trust anyone who hypes the contrast performance of TVs when there were televisions with 10 times higher contrast 10 years ago.

If one watches the same channel 20 hours a day for 6 months, then he should buy an LCD.
 