1080p HDR image better than 4K non-HDR?

Is 8-bit+FRC worth double the price over 8-bit with "HDR processing"?
Well, I don't know if it is worth it. But if the panel is not able to display 10-bit color, you don't get the color gamut enhancements, which are an important component of the HDR quality increase. Without them you can easily see color banding when the contrast increases.
In my opinion, if you invest in a new TV to get HDR capability, you should at least get a 10-bit panel and support for 10-bit color input. Otherwise keep your good "old" Full HD TV or get a cheaper one without HDR.
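As a rough sketch of why the banding shows up (illustrative numbers only, not tied to any particular TV): quantizing the same smooth gradient at 8 and 10 bits leaves far fewer distinct steps in the 8-bit case, and each step is wide enough to see as a band.

```python
import numpy as np

# Quantize a smooth luminance ramp at 8-bit and 10-bit depth and count the
# distinct steps that survive. Fewer steps over the same brightness range is
# what shows up on screen as visible banding.
ramp = np.linspace(0.0, 1.0, 3840)                    # one line of a 4K-wide gradient

steps_8bit = np.unique(np.round(ramp * 255)).size     # at most 256 levels
steps_10bit = np.unique(np.round(ramp * 1023)).size   # at most 1024 levels

print(f"8-bit:  {steps_8bit} distinct levels across the ramp")
print(f"10-bit: {steps_10bit} distinct levels across the ramp")
# At 8 bits each level spans roughly 15 pixels of the 3840-pixel-wide gradient,
# which is wide enough to read as a visible band on a big screen.
```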


Sorry for the double post, my 10-minute edit window closed :)
 
My old TV actually is old; it's some 6-7 years old already.
The 4K "8-bit 'HDR'" set costs 399 € (a 40" Samsung), and it's also the cheapest 4K TV available here, "HDR" or not.
 
Does anyone have experience with playing Rec.709 content on a new wide-gamut screen? VA panels have infamous colour shifts when viewed from an angle, and I can imagine the green primary in Rec.709 will have a different hue when viewed from an angle than when viewed straight on. It certainly was noticeable on my old TV, which was AdobeRGB/xvYCC capable.
 
Hmm, I usually watch and don't participate a lot in these forums, and one of the threads I have been following has been this one.
I work in the professional video world (my company builds SDI/HDMI hardware), so we need to be all over the UHD/HDR stuff, but honestly we are often as confused as everyone else.

But I thought I would jump in to say that I am happy to answer any questions IF I CAN...

e.g. I have a large amount of experience playing Rec.709 on an HDR screen: we have the Sony BVM-X300 (a pro-level mastering monitor) and we still put a lot of non-HDR content on it.
Non-HDR content does not look especially poor or washed out on it, but it's the best of the best, not a likely match for what you will get from your home TV.

As for me, I would take 1080p60 with full HDR over UHD/4K with no HDR any day of the week, but that's just me.
 
And I'm stuck with a 3K TV and shitty HDR that looks cartoony.

Meh. Good enough for movies, though. It's games that look awful.
 
I work in the professional video world (my company builds SDI/HDMI hardware), so we need to be all over the UHD/HDR stuff, but honestly we are often as confused as everyone else. But I thought I would jump in to say that I am happy to answer any questions IF I CAN...
Any opinions on the random incompatibility issues between 4K+HDCP+HDR screens and gaming consoles? Is it that the specs are so overly complex and abstract, with everyone trying to implement similar firmware code (mostly C/C++, I guess) of their own? Mistakes get repeated around the industry, and the valuable lessons learned from experience are not shared publicly? Manufacturers need to patch firmware case by case for various 4K playback devices, as we have already seen TV brands do.

PS: What was your choice of 4K+HDR television model at home?
 
e.g. I have a large amount of experience playing Rec.709 on an HDR screen: we have the Sony BVM-X300 (a pro-level mastering monitor) and we still put a lot of non-HDR content on it.
Non-HDR content does not look especially poor or washed out on it, but it's the best of the best, not a likely match for what you will get from your home TV.

The Sony BVM-X300 is an OLED, so sure, there will be no issues with viewing angle.
 
Any idea why 10-bit is always bundled with HDR?

Because it's the minimum spec to meet the UHD Premium standard. And going from 8 bits to 10 bits seems to affect both luminance and chrominance: you go from 16.77 million colors to 1.07 billion colors, and from 256 shades of gray to 1,024 shades of gray. I'll add the disclaimer that this is my interpretation from my limited attempt to understand HDR as a TV technology, so I may be wrong.
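A quick back-of-the-envelope check of those figures (the snippet just restates the arithmetic, nothing more):

```python
# Back-of-the-envelope check: total colors = levels^3 (one factor each for R, G, B).
for bits in (8, 10):
    levels = 2 ** bits          # shades per channel (also the shades of gray)
    colors = levels ** 3        # all RGB combinations
    print(f"{bits}-bit: {levels} shades of gray, {colors:,} colors")

# 8-bit:  256 shades of gray,  16,777,216 colors   (~16.77 million)
# 10-bit: 1024 shades of gray, 1,073,741,824 colors (~1.07 billion)
```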
 
Any idea why 10-bit is always bundled with HDR?
because "HDR" is a really "cheap" checklist feature, but 10bit panels are not cheap. So if you spent money for a ten bit panel, you should always have HDR onboard (at least the new TVs).
But as you may have read at least in this thread, HDR does not equal HDR.
btw, the UHD Premium spec does not ensure a 10 bit panel, just that the TV can handle 10 bit color (per channel) and nothing more. The panel can still just support 8 bit.
 
because "HDR" is a really "cheap" checklist feature, but 10bit panels are not cheap. So if you spent money for a ten bit panel, you should always have HDR onboard (at least the new TVs).
But as you may have read at least in this thread, HDR does not equal HDR.
btw, the UHD Premium spec does not ensure a 10 bit panel, just that the TV can handle 10 bit color (per channel) and nothing more. The panel can still just support 8 bit.

Ultra HD Premium does require the ability to show at least 90 percent of the P3 color gamut, though. And if it can do that, it's either 10-bit or its 8-bit+FRC is doing a good enough job that it makes no effective difference.
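To make the 8-bit+FRC point a bit more concrete, here is a rough sketch of temporal dithering; frc_frames is a hypothetical helper, not any panel's actual algorithm (real FRC typically dithers spatially as well):

```python
# Rough illustration of 8-bit+FRC (temporal dithering): a 10-bit target level is
# approximated on an 8-bit panel by alternating between the two nearest 8-bit
# levels so that their average over a few frames matches the target.
def frc_frames(level_10bit, num_frames=4):
    """Return the 8-bit level to show on each of `num_frames` consecutive frames."""
    target = level_10bit / 4.0                   # ideal (fractional) 8-bit value
    low = int(target)
    frac = target - low
    high_count = round(frac * num_frames)        # frames that show the upper level
    return [low + 1] * high_count + [low] * (num_frames - high_count)

frames = frc_frames(514)                         # 10-bit level 514 -> ideal 8-bit 128.5
print(frames, "average:", sum(frames) / len(frames))
# [129, 129, 128, 128] average: 128.5 -- the eye integrates this to roughly the 10-bit value
```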
 
Any opinions on the random incompatibility issues between 4K+HDCP+HDR screens and gaming consoles? Is it that the specs are so overly complex and abstract, with everyone trying to implement similar firmware code (mostly C/C++, I guess) of their own? Mistakes get repeated around the industry, and the valuable lessons learned from experience are not shared publicly? Manufacturers need to patch firmware case by case for various 4K playback devices, as we have already seen TV brands do.

PS: What was your choice of 4K+HDR television model at home?


IMHO, the incompatibility mostly has to do with the TVs being sold prior to the release of any major consumer device supporting HDR.
I would expect that next year's TVs will be much better compatibility-wise with the Xboxes and PS4.
Basically, I think the lack of devices that output ANY HDR standard has been holding back manufacturers' ability to implement HDR support in an easy-to-use manner.
Even on our Sony monitor it's a prick to set up (though we are still using SDI and don't have the HDMI module... yet).

At home... I'm still rocking a late-gen LG plasma.
Given I am in Australia, there is basically zero 4K and HDR content available for me to make use of.
Slow internet means no 4K Netflix or Amazon for me.

But most of my screen time is gaming, so I am interested in HDR and the recent consoles more so than 4K.

-----------------------

I think a lot of people gloss over the signal that the HDMI cable is actually carrying. Hint: it's not always RGB!
In order to reduce the bandwidth, the HDMI cable will often be carrying 10-bit 4:2:2 YUV data. Sure, you lose the pixel-perfectness of 4:4:4, BUT the 10 bits do give you more dynamic range. Ideally 12- or 16-bit RGB would be the best possible format for the HDMI link to be sending, but that is VERY unlikely.

We have an HDMI analyzer here, but I haven't hooked it up to my PS4 to see what it sends out when playing an HDR-enabled game; my guess is 10-bit 4:2:2.
Since HDR is so much about brightness and gamma, in some ways 4:2:2 YUV data is a better fit for HDR than normal RGB.
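As a rough sketch of the numbers involved (back-of-the-envelope only, assuming 2160p60 active pixels and ignoring blanking and protocol overhead), this is roughly why 10-bit 4:2:2 fits on an HDMI 2.0 link where 10-bit RGB would struggle:

```python
# Back-of-the-envelope payload rates for 2160p60 at different pixel formats.
# Ignores blanking intervals and protocol overhead, so real link rates are
# higher, but the ordering is the point: HDMI 2.0 carries roughly 14.4 Gbit/s
# of usable video data (18 Gbit/s raw), which is why 4K60 HDR usually travels
# as 10-bit 4:2:2 or 4:2:0 rather than full RGB.
PIXELS_PER_SECOND = 3840 * 2160 * 60

formats = {
    "RGB / 4:4:4": 3.0,   # three full-resolution components per pixel
    "YCbCr 4:2:2": 2.0,   # chroma halved horizontally
    "YCbCr 4:2:0": 1.5,   # chroma halved horizontally and vertically
}

for name, samples_per_pixel in formats.items():
    for bits in (8, 10):
        gbps = PIXELS_PER_SECOND * samples_per_pixel * bits / 1e9
        print(f"{name} {bits:>2}-bit: ~{gbps:4.1f} Gbit/s")
```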


Anyway, I hope some of you find that info interesting/helpful.
 
So deep color will automatically activate in HDR mode, depending on the content source and the TV settings (on my LG, deep color is tucked away in a weird place). So technically, HDR is not always deep color and vice versa, despite the reality that all HDR content has deep color?

But outside of HDR mode, what use is deep color?
 
vjPiedPiper, I have a question too. The Sony BVM-X300 you own has a motion enhancement feature. How effective is it for 30 FPS games, 60 FPS games, and 24 FPS movies? I don't know the exact operation method (backlight scanning, strobing à la LightBoost, black frame insertion), but I heard it's very effective with motion and currently holds the reference for motion performance among sample-and-hold displays. Do flicker or brightness drops occur? If so, how much? Thanks in advance.
 
Absolutely no idea, I'm sorry to say. :(
1. We don't have a console or gaming machine hooked up to it; currently it's SDI only.
2. I've never even looked at messing with those settings; mostly I am just selecting the colourspace and gamma settings.

However, I have never seen anything that I would consider flicker or brightness drops.
 
So, I've been using the Samsung HDR UHD TV for a couple of weeks now and I'm considering returning it. It's the 40KU6075, which should apparently be the same as the 40KU6300 in the US.
The biggest reason I'm considering the return is the fact that it supports 4:4:4 only via HDMI 1. I first thought it wouldn't be an issue, since 4K Blu-rays use 4:2:0, but it turns out that it only accepts HDR on HDMI 1, too. So now I either have to get an HDMI switch, which isn't convenient at all, or I can't enjoy an RGB/4:4:4 image from my PC :mad:

Also, it might be a 10-bit panel after all, I'm honestly not sure, but at least the test videos I found at http://www.avsforum.com/forum/139-display-calibration/2269338-10-bit-gradient-test-patterns.html show a clear difference between the 8-bit and 10-bit gradients (then again, I think the difference between them is clear on my Asus MG248Q too).
 
@Kaotik Or you'll have to get a UHD 4K audio/video receiver, and while you're at it, pick up a Dolby Atmos speaker setup too.
 
I still think 3D is worth having in these 4K sets. While 3D might not necessarily be as impressive as 4K HDR in most movies, it is still quite useful for PC gaming, where you can crank the 3D effect way up, and it is less resource-intensive than 4K gaming, if I'm not mistaken. I'd also like to know if TriDef or Nvidia 3D Vision can force 3D in ZBrush, as that would be quite interesting to try.
 
Can't speak to how well it works for games, but 3D for movies always looks like an animated "View-Master" to me, with objects in a scene being flat planes at different levels of depth. I've never found the look very appealing.
 