HDR on PCs is a failure

snarfbot

HDR as it is currently implemented is terrible.

The whole point is to allow artists to master the media to a standard, and to have consumer displays that meet that standard!

As it is now, not only are content creators not mastering to the same standard, some are mastering to a 4000-nit display! That's absurd - is it intended to be used in direct sunlight, or with the sun as a bias light? It makes no sense; no wonder dynamic metadata is now necessary and included with Dolby Vision and HDR10+.

Furthermore, the end user is able to adjust the brightness and contrast as they see fit, because nobody really ever wanted to look at a 1500-nit specular highlight in the first place - again defeating the purpose of HDR, which was intended to set a standard that could be mastered against.

And that's not even considering the proliferation of these so-called fake-HDR displays.

AFAIK all a display has to do to qualify as HDR is be able to accept a 10-bit input, even if it's unable to display it, and that is unacceptable.
 
AFAIK all a display has to do to qualify as HDR is be able to accept a 10-bit input, even if it's unable to display it, and that is unacceptable.
Maybe you missed it, but VESA announced three levels of HDR certification just the other day, to bring order to the chaos we're seeing right now. Apparently testing is already ongoing, and new products launching early next year will meet these certifications.

Btw, why is the text of your post grey? You need to bump your contrast ratio, mate. You're hard to read. :p
 
What is this moderately dark sorcery?

This is so weird - it's changing my colors randomly, it seems.

If I start a new line and then delete it - say, if I type the words "if I" and then backspace 4 times to capitalize the "i" in "if" - it will change the color of the text; it alternates between green and yellow. I don't know how it became grey. Very strange.

Watch this:
It's defaulting to grey, I'm not sure why.
 
Editing the post might leave some code in the background that it's picking up. Also, if you copy/paste from somewhere else, you should clean it first in Notepad, otherwise it will retain some rich-text formatting/colors from the source. You can remove all formatting in a post using the eraser icon on the right side of the buttons.

It's best to leave the main text at the default, without selecting any colors, as the theme needs to color it itself to work with both the light and dark themes.
 
Chrome has this very handy feature where you can right-click and select "paste as plain text"; I use it all the time, as it's IMO really annoying when Windows picks up the formatting of a quote when it's copied.
Yeah, I have noticed that, and I'm trying to remember it so it becomes habit.
 
Maybe you missed it, but VESA announced three levels of HDR certification just the other day, to bring order to the chaos we're seeing right now. Apparently testing is already ongoing, and new products launching early next year will meet these certifications.

Btw, why is the text of your post grey? You need to bump your contrast ratio, mate. You're hard to read. :p
What baffles me is that this stuff is never sorted out from the get-go. I'm not even part of the tech industry, and the first thing I think when these new buzzwords come out is: "What is the standard going to be, and how will it be enforced and certified?" But somehow, the people who do that stuff for a living have not yet learned.
 
Well, in a capitalistic society, companies want to be the ones getting the licensing fees, so they each want theirs to become the standard, even if it isn't first. So we end up with a couple of licensed, branded versions of a standard - often alongside an open one as well - and then, through financial partnerships with the makers of the devices these standards go into, one of the companies eventually spends enough money to make theirs the standard. Often it's not the best one, either.
 
Who is to blame? People wanted to buy cheap HDR displays, and manufacturers wanted to sell displays. The reality, on the other hand, was that good HDR displays were still very expensive... Had there been a proper standard, nobody would have had cheap stuff to sell or buy.
 
Well, in a capitalistic society, companies want to be the ones getting the licensing fees, so they each want theirs to become the standard, even if it isn't first. So we end up with a couple of licensed, branded versions of a standard - often alongside an open one as well - and then, through financial partnerships with the makers of the devices these standards go into, one of the companies eventually spends enough money to make theirs the standard. Often it's not the best one, either.

The alternative is a non-capitalistic society where you never know if you are getting the best or not, because you only get what the government/society tells you is OK to have. :D Although I guess if there is no competition and no drive to make a better product, you are by default always getting the best of what is available. :p

Sort of like if AMD GPUs were the only GPUs available. They would be, by default, the best.

Regards,
SB
 
not only are content creators not mastering to the same standard, some are mastering to a 4000-nit display!
It makes no sense; no wonder dynamic metadata is now necessary and included with Dolby Vision and HDR10+.
the end user is able to adjust the brightness and contrast as they see fit, because nobody really ever wanted to look at a 1500-nit specular highlight in the first place

This is what brightness/contrast controls are designed for - to adjust a studio-quality signal, mastered in a dark room on a high-performance studio monitor, to fit the brighter viewing conditions and less-than-perfect consumer-grade display in your own room. (Just like DSP room equalization in a multichannel receiver adjusts studio-quality audio, mastered on high-end studio monitors in a large soundproofed room, to fit your smaller, non-ideal room and lower-quality speakers.)
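
As a rough sketch of what those two knobs do to the signal (assuming the simplest linear model - real TVs do this in gamma-encoded space with more elaborate, vendor-specific processing, so this is just the idea, not any actual pipeline):

[CODE]
# Minimal sketch of display brightness/contrast controls, assuming a
# simple linear model on the normalized signal. Real TVs use more
# elaborate, vendor-specific processing; this only shows the idea.

def adjust(code_value, brightness=0.0, contrast=1.0):
    """code_value: normalized signal in 0..1.
    brightness shifts the black level; contrast scales the range."""
    out = contrast * code_value + brightness
    return min(max(out, 0.0), 1.0)  # clip to the displayable range

# Taming a near-peak highlight for a brighter room and a dimmer panel:
print(adjust(0.95, brightness=0.02, contrast=0.9))  # -> 0.875
[/CODE]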


Also, the maximum brightness of >1000 nits is intended for specular highlights - that is, small areas of the picture that contain sun reflections or other high-intensity point light sources - or for very brief full-frame flashes. Average picture brightness and black levels stay the same as for an SDR signal.

https://www.lightillusion.com/uhdtv.html
http://www.studiodaily.com/2016/12/deluxe-talks-dolby-vision-daredevil-and-the-ins-and-outs-of-hdr/


BTW, most 2017 TV sets barely reach 400 nits in HDR mode.


all a display has to do to qualify as HDR is be able to accept a 10-bit input, even if it's unable to display it

It has to support HLG or PQ transfer functions (i.e. 'gamma curves').
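
For reference, here's the PQ curve (SMPTE ST 2084) as a quick Python sketch, using the constants from the standard - this is the EOTF that maps a code value to absolute luminance (HLG uses a different, relative hybrid log-gamma curve):

[CODE]
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value to absolute
# luminance in cd/m2 (nits). Constants are taken from the standard.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(n):
    """n: normalized 10/12-bit code value in 0..1 -> luminance in nits."""
    p = n ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y

print(round(pq_eotf(0.508)))  # ~100 nits, roughly SDR reference white
print(round(pq_eotf(0.751)))  # ~1000 nits
print(round(pq_eotf(1.0)))    # 10000 nits, the ceiling of the PQ signal
[/CODE]

Note how the curve spends about half of the code range below ~100 nits - that's why 10 bits are the bare minimum for PQ to avoid visible banding.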

As for the >1000 nits of maximum brightness, 10- to 12-bit processing, and BT.2020 color, even high-grade 4K studio displays like the $45,000 Sony BVM-X300 OLED monitor are currently unable to comply with these requirements, set in BT.2100 and EBU Tech 3320 section 2.3.1, so they have to use hard clipping and/or tone mapping.

https://tech.ebu.ch/publications/tech3320
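
To illustrate the difference between the two (a toy example only - not Sony's or Dolby's actual algorithm): hard clipping throws away everything above the panel's peak, while tone mapping rolls off the top of the range so some highlight detail survives:

[CODE]
# Toy comparison of hard clipping vs. tone mapping when displaying
# content mastered to 4000 nits on a panel that peaks at 1000 nits.
# Illustrative only - not any vendor's actual algorithm.

MASTER_PEAK = 4000.0  # nits the content was graded to
PANEL_PEAK = 1000.0   # nits the display can actually produce
KNEE = 750.0          # below this point, pass luminance through unchanged

def hard_clip(nits):
    return min(nits, PANEL_PEAK)

def tone_map(nits):
    """Keep everything below the knee intact, then smoothly compress
    the KNEE..MASTER_PEAK range into KNEE..PANEL_PEAK."""
    if nits <= KNEE:
        return nits
    t = (nits - KNEE) / (MASTER_PEAK - KNEE)  # 0..1 above the knee
    return KNEE + (PANEL_PEAK - KNEE) * (1.0 - (1.0 - t) ** 2)

for n in (500, 1500, 3000, 4000):
    print(n, "->", round(hard_clip(n)), "clipped /", round(tone_map(n)), "mapped")
# 1500, 3000 and 4000 nits all clip to a flat 1000, losing the detail;
# the tone-mapped values (852, 976, 1000) stay distinct.
[/CODE]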

The 4000-nit Dolby Pulsar and 2000-nit Dolby Maui displays that can master Dolby Vision HDR content are still at an early prototype stage.


Also, most LCD monitors and laptop displays have very mediocre image quality - they often use TN-based panels which are internally 6-bit with temporal dithering (so-called "FRC"), have bad viewing angles and color uniformity, and do not even cover the full sRGB/Rec.709 color gamut that was readily available on CRT displays.

(And see how they tout the color accuracy of the Dolby PRM-4220 reference monitor, which is finally able to match a 10-year-old Sony BVM-D series CRT display!)

This could improve soon, as the recent VESA DisplayHDR specification requires a panel driver physically capable of at least 8-bit with 2-bit temporal dithering ("FRC"). It also defines maximum brightness - currently 400, 600, and 1000 nits in a 10% window - and a wide color gamut with at least 90% DCI-P3 coverage in CIE 1976 u'v' space (in CIE 1931 xy space that's about 86% DCI-P3, 128% sRGB/BT.709, 88% Adobe RGB, and 61% BT.2020); this should be on par with top-range UltraHD Premium LCD TVs like the Samsung MU7000 and Q7F, or OLED TVs like the LG B7.

https://displayhdr.org/performance-criteria/
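
You can sanity-check those gamut figures yourself from the published CIE 1931 xy primaries with the shoelace formula - a back-of-the-envelope sketch, mind you, since it compares full triangle areas while certified 'coverage' measures the intersection with the panel's native gamut:

[CODE]
# Back-of-the-envelope gamut comparison: triangle areas of the standard
# color spaces in CIE 1931 xy, via the shoelace formula. Certified
# 'coverage' is the intersection area, so real test numbers differ a bit.

PRIMARIES = {  # (x, y) chromaticities of the R, G, B primaries
    "sRGB/BT.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":      [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def area(tri):
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

p3 = area(PRIMARIES["DCI-P3"])
for name, tri in PRIMARIES.items():
    print(f"{name}: {area(tri) / p3:.0%} of the DCI-P3 area")
# sRGB/BT.709: 74%, DCI-P3: 100%, BT.2020: 139% - so a panel covering
# ~90% of DCI-P3 is roughly 1.2x the sRGB triangle and still well
# short of BT.2020, in line with the percentages above.
[/CODE]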


PS. Please kindly remove gray color from your post - you have to turn on BBCode Editor mode (click the wrench at the top right of the edit window) and manually remove [COLOR=rgb(208, 208, 208)][/COLOR] tags.
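
(Or, if hand-editing BBCode is a pain, any regex tool can strip them - a quick sketch, assuming the plain [COLOR=...]...[/COLOR] syntax shown above:)

[CODE]
# Strip BBCode color tags from a post while keeping the text inside.
# Assumes the plain [COLOR=...]...[/COLOR] syntax quoted above.
import re

post = "Some [COLOR=rgb(208, 208, 208)]hard-to-read grey[/COLOR] text."
clean = re.sub(r"\[/?COLOR(=[^\]]*)?\]", "", post)
print(clean)  # -> Some hard-to-read grey text.
[/CODE]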

The alternative is a non-capitalistic society where ... you only get what the government/society tells you
Yes, I do miss my Rubin 714 color 24" cathode-ray-tube television! It well deserved its USSR Quality Mark - it only required a CRT replacement every three years, and the CRT backorder was a mere 7 months! No current HDR TV could match its weight, power consumption, and flammability!
 
What was the price for a replacement bulb for that CRT TV, do I even dare to ask? :p

Personally, I don't miss CRTs at all; the geometry issues they had alone more than outweigh all the quirks modern LED LCDs have.
 
What was the price for a replacement bulb for that CRT TV, do I even dare to ask?
I can't really recall the list price of the thermionic valves, as they were not user-serviceable. There were only 7 tubes on board, as this was a hybrid tube/semiconductor design and most circuits were transistor-based, except for some tetrode and pentode valves. They are currently priced at US $2-3 where they are still produced, and one special triode-pentode combination valve is $12-15.
The TV set was 700 Soviet rubles in 1979; the replacement CRT was about 400 rubles in 1984, as far as I remember. The average monthly wage was about 200 rubles at the time.

I don't miss CRTs at all; the geometry issues they had alone more than outweigh all the quirks modern LED LCDs have
Flat-screen aperture-grille tubes like the Sony Trinitron do not really have any visible distortions, but that particular Soviet-made 24" (61 cm) RGB-triad tube was of course far less than ideal, with large overscan, pronounced barrel distortion, and washed-out colors - a replacement 21" Sony TV with a Trinitron tube would instantly display very saturated colors from the same terrestrial signal, on standard settings.
I still have a 2001 Sony KV21-FX30K CRT in my room for the SDTV service, and its colors and blacks could only be matched by our Panasonic TX-PR55ST50 plasma until very recently, though the latest 4K 'LED' LCD TVs have become quite good as well. No computer LCD monitor I've seen can yet match the Trinitron in that regard, except maybe for the Apple Cinema 30" display...
 
Flat-screen aperture-grille tubes like the Sony Trinitron do not really have any visible distortions
Bleh. I had a 19" Trinitron Eizo monitor from 1998 with fancy-schmancy DSP beam control and shit like that, and it still showed distortion, especially when comparing hot and cold conditions. Every time it came out of power save, the image would be slightly different than after it had had time to warm up.

No nostalgia here. Fuck CRTs.

Plus, they irradiate you with X-rays and charge your face and upper body with static, allegedly causing skin irritation.
 
It wasn't all that critical for a television display... Anyway, I look forward to seeing these newer wide-gamut HDR LCDs finally surpass the color performance of CRTs - as of now, LCD monitors have quite poor uniformity and color reproduction, except for professional-grade monitors that cost in excess of $5,000.
 
Here come test reviews of the first two 'DisplayHDR 600' certified products - the Samsung CHG70, a 31.5" 2560x1440 (16:9 WQHD) curved gaming monitor, and the Samsung CHG90, a 49" 3840x1080 ultra-wide (32:9) curved gaming monitor. Both use a quantum-dot enhancement film (QDEF) layer to improve brightness and colors.

https://www.rtings.com/monitor/reviews/samsung/chg70-curved-gaming-monitor#comparison_1381
http://www.tftcentral.co.uk/reviews/samsung_c32hg70.htm#hdr

https://www.rtings.com/monitor/reviews/samsung/chg90-series-curved-gaming-monitor#comparison_1381

They do achieve a maximum brightness of 650 nits (10% window) and a wide color gamut with 99% sRGB coverage (125% sRGB area), 93% DCI-P3, and 65% BT.2020, just as expected. Color uniformity could be better, though.
 
They could also be a lot smaller... The less bulky model is over 30 inches diagonal and only WQHD resolution, while the 49" model is massive and basically totally unsuitable as a desktop monitor. It would look ludicrous, and it would probably stretch the screen into the periphery of your visual field. Having to turn my head to follow the mouse would get really fricken old really fricken fast.

A 27" model is absolutely needed, IMO. 30+ is too big for me. Also gets expensive with these big screens...
 