1080p HDR image better than 4K non-HDR?

Well, I now have a 4K HDR screen. Forza Horizon 3 looks fantastic, but I'm not sure if that is because of the HDR feature or because of the 10-bit color output.
There are many more details visible on the cars, tracks, etc.
Both help. The expanded contrast and color reveal tons more subtle, fine-grained texture detail and gradation, not to mention an overall more saturated and vibrant picture that matches reality that much more closely. Once you go HDR, you don't want to go back.
 

This is where I got confused with HDR. In reality, the world is dull most of the time, at least as seen through my eyes...
Those amazing nature pictures are never like that when I go there myself; reality is less vibrant, less saturated.

HDR Pacific Rim does look better than SDR, though, when seen on my shitty LG HDR 4K TV. For FF XV, HDR mode just makes the picture look too cartoony.
 
A few factors to consider. The lighting at certain times of a bright sunny day brings out the colors a lot more, such as at sunset and sunrise. The clarity of the ambient environment also contributes: absence of dust, fog, shadows, etc. Bear in mind sunlight is well over 10,000 nits, so the color volume it illuminates in the surroundings should be vibrant rather than dull. Professional photographers and cameramen go through a great deal of prep time to find the right amount of light, the right timing, and colorful subjects at the most ideal angle, creating the perception that nature is incredibly colorful. I bet an amateur with an iPhone would produce a completely different look of the same subject. But of course I understand: more often than not, scenery tends to look dull compared to photos, due to the aforementioned factors.
 

That's because you also need contrast ratio in conjunction with HDR. Heck, I would even go as far as to say contrast ratio is more important than dynamic range (HDR) for lifelike rendition. My Panasonic plasma TV may be SDR-only, but its picture still looks closer to real life than any HDR TV I've seen, and that includes the Sony Z9D. Usually dynamic range and contrast ratio go hand in hand, but not always. There are two ways to measure contrast ratio: sequential and simultaneous.

1) Sequential contrast ratio is obtained by dividing the brightest luminance the display can attain at one moment by the darkest luminance it can attain at another moment. In this scenario, the brightest white and the darkest black don't have to be on the screen at the same time. Dynamic range can also operate sequentially, so in this scenario contrast ratio and dynamic range are interchangeable.

2) Simultaneous contrast ratio is obtained by dividing the brightest luminance by the darkest luminance present at the same time. Another name for simultaneous contrast ratio is ANSI contrast ratio, because when measuring it you display an ANSI checkerboard pattern (alternating black and white) with varying APL; the correct way to measure it is at 50% APL (50% of the area covered with full white, 50% covered with complete black).
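To make the two measurements concrete, here's a rough sketch (my own toy numbers, not from any spec sheet) of how you'd compute each one from a set of meter readings:

```python
# Rough sketch, assuming luminance readings in cd/m2 from a light meter.

def sequential_contrast(peak_white: float, full_black: float) -> float:
    """Brightest full-screen white divided by darkest full-screen black,
    measured one after the other -- the two never share the screen."""
    return peak_white / full_black

def ansi_contrast(white_patches: list[float], black_patches: list[float]) -> float:
    """Simultaneous (ANSI) contrast: average luminance of the white checkerboard
    patches divided by the average of the black patches, all on one 50% APL frame."""
    return (sum(white_patches) / len(white_patches)) / (sum(black_patches) / len(black_patches))

# Example: a CRT-like display -- excellent sequential contrast, poor ANSI contrast.
print(sequential_contrast(160.0, 0.02))        # 8000:1
print(ansi_contrast([120.0] * 8, [1.5] * 8))   # 80:1
```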


I'll use an audio analogy to explain. Dynamic range is also very, very important for audio. Sequential contrast ratio would be like Haydn's Symphony No. 94 'Surprise', or any pop ballad that starts off quiet and then suddenly explodes. Simultaneous contrast ratio would be like playing a quiet flute and a loud gong at the same time. A very good pair of speakers will have no problem rendering the two instruments accurately at the same time.

CRTs are truly weird beasts. Their sequential contrast ratio can easily get over 8000:1, yet their ANSI contrast ratio is only 50-100:1. To make up for the ANSI contrast deficiency, CRTs are extremely fast at sequential transitions, so we see jumps in luminance in a very smooth, very animated manner. This is why we still fondly remember CRTs despite their shortcomings.

Simultaneous contrast ratio is further divided into two methods: a) native and b) zone-assisted.

a) Native: ANSI contrast ratio measured at the pixel level.

b) Zone-assisted: ANSI contrast ratio measured across two separately driven zones.


Plasmas and OLEDs are type (a); CRTs and full-array local dimming (FALD) LCDs are type (b). It's very important to distinguish between the two.

Most of today's displays mix red, green, and blue to create billions of colors. That's why it's desirable to output pure, undiluted primaries, but that is not possible on a display with a poor black level. Suppose you want to output the purest blue possible, and unwanted white light leaks into that blue pixel. What happens to it? It gets diluted and lighter; purity and saturation are lost. And what determines the amount of that unwanted white light? The display's minimum luminance, AKA its black level. This is why, when it comes to displays, it's important to think of black not as just another color but as an absence of light. Thinking of it as the former leads a lot of people to believe a good black level is only useful for blacks and shadow detail. No, it matters for every color, including bright daylight scenes. Think of banana, blueberry, and strawberry juice, and what happens to their colors when milk is added: they get diluted, and more so the more milk is added. Same thing here. So think of black level not as black, but as white dilution, as ironic as that might sound at first. The lower the black level, the lower the white dilution.
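A quick toy illustration of the "white dilution" idea (the leakage numbers are made up, just to show the effect): add the panel's residual black-level light to a pure blue and watch the saturation drop.

```python
import colorsys

def add_black_level_leak(rgb_linear, leak_fraction):
    """Add the panel's residual black-level light, expressed as a fraction of
    peak white, equally to every linear channel (the 'milk in the juice')."""
    return tuple(min(c + leak_fraction, 1.0) for c in rgb_linear)

pure_blue = (0.0, 0.0, 0.8)
for leak in (0.0, 0.0005, 0.01, 0.05):     # OLED-like ... poor-LCD-like leakage (made-up values)
    r, g, b = add_black_level_leak(pure_blue, leak)
    _, saturation, _ = colorsys.rgb_to_hsv(r, g, b)
    print(f"leak={leak:<7} saturation={saturation:.3f}")
# The higher the black level (leak), the lower the saturation of every color,
# not just the visibility of shadow detail.
```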

Zone-assisted contrast ratio, on the other hand, is not defined at the pixel level. It requires two physical pixels or areas placed adjacently, and it doesn't care whether those two adjacent areas come from the same panel or from two different micro-panels. To understand why FALD is used on LCDs, one first needs to understand the black level stability of displays. On an OLED display, the black level at a given pixel does not change at all with luminance changes in other pixels. LCDs, being transmissive displays, do change. For this reason, the 0.03 cd/m2 black level figure of Samsung/Sony S-LCD panels is not actually the lowest they can go; they can go even lower at very low light output. That 0.03 cd/m2 figure is obtained when the brightest pixel of the panel is at most 120 cd/m2. However, the reverse is also true: the brighter the panel goes, the higher the black level rises. That's why multiple dimming zones are used, so an area with low luminance can have its zone prioritize black level while an adjacent area with high luminance has its zone prioritize peak brightness. This approach allows an LCD TV to have a contrast ratio much greater than a single uniformly lit panel.

Suppose, for simplicity's sake, an LCD panel has a 3000:1 contrast ratio at any brightness level. Putting two such panels (zones) side by side allows the total cumulative contrast ratio to be doubled to 6000:1. However, native contrast ratio will never be calculated cumulatively; it is always treated as two separate 3000:1 panels. And when it comes to lifelike rendition of the real world, the native figure is always more meaningful than the zone-assisted one. The zone-assisted contrast ratio does, however, produce a more impactful picture, along with higher dynamic range. Unlike contrast ratio, dynamic range doesn't care whether it's native or zone-assisted.
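A toy model of that (my own simplification, reusing the post's 3000:1 figure): each zone keeps its native contrast, but dimming one zone's backlight lowers its black floor, so the screen-level number doubles.

```python
NATIVE_CONTRAST = 3000.0   # the post's per-panel (native) contrast ratio
FULL_WHITE = 300.0         # cd/m2 at 100% backlight (made-up figure)

def zone_black_level(backlight_fraction: float) -> float:
    """On a transmissive LCD the black floor scales with that zone's backlight."""
    return (FULL_WHITE / NATIVE_CONTRAST) * backlight_fraction

# Zone A shows a highlight at full backlight; zone B shows shadow at 50% backlight.
screen_contrast = FULL_WHITE / zone_black_level(0.5)
print(screen_contrast)     # 6000.0 -- the post's cumulative figure,
                           # while each zone is still natively 3000:1.
```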

So it's possible to have high dynamic range with low (native) contrast ratio, which is what FALDs do, but they will never be able to render real life convincingly. It's also possible to have high (native) contrast ratio with low dynamic range. Simply increasing the desired luminance output also compresses dynamic range, akin to the "loudness war". SDR content has its optimal dynamic range at 100 cd/m2. When viewing at 200 cd/m2 you're compressing 50% of the dynamic range; at 400 cd/m2 you're destroying 75% of it.
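That's just the post's own rule of thumb, but spelled out it looks like this:

```python
SDR_REFERENCE_NITS = 100.0   # the mastering level SDR content is graded for

def fraction_compressed(viewing_nits: float) -> float:
    # The post's rule of thumb: whatever exceeds the 100 cd/m2 reference is "lost".
    return 1.0 - SDR_REFERENCE_NITS / viewing_nits

for nits in (100, 200, 400):
    print(f"{nits} cd/m2 -> {fraction_compressed(nits):.0%} of the dynamic range compressed")
# 100 -> 0%, 200 -> 50%, 400 -> 75%, matching the figures above.
```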

So it's important to think of HDR not just as higher brightness, not just as deeper darks, but as the range between the two. Having a wider luminance range gives artists far more creative freedom than was possible with SDR. Some Samsung KS8000 owners mistakenly think that because their TVs are brighter than OLEDs with HDR content, they have superior HDR performance. They're dead wrong. The ability to put out the highest luminance is important, but so is the ability to map out the many variations of luminance in between, and there the KS8000 simply can't compare to an OLED TV. Now, a low-brightness OLED TV isn't ideal either. Color volume is mapped 1:1 to color gamut, so an OLED TV that can do 100% DCI-P3 and 1000 cd/m2 peak brightness will lose 75% of that color performance when playing 4000-nit material, but will lose nothing at all when playing 1000-nit content.
 
Yups. That's also why my older SDR HDTV looks muuuuuch better than my 4K HDR TV.

My older TV not only has a higher contrast ratio but also a wider color gamut.
 
the world is dull most of the time, at least as seen through my eyes...
those amazing nature pictures are never like that when I go there myself; reality is less vibrant, less saturated.
Correct
but its picture still looks closer to real life
Thus looks pretty crap.

The thing is, and this is why, orangpelupa, those HDR pictures look better than reality: our eyes can get overwhelmed, so a bright cloudy sky will make it impossible in real life to see the darker parts of the scene well. HDR done well balances the whole scene, enabling you to pick out more of the details. As I have said for many years on these forums, reality often looks pretty boring and dull; we need uber-reality. It's part of the reason there's major-ass lighting when professionals make a movie or take pictures.
 
Well, reality is not composed by artists the way movies or paintings are.

Pacific Rim is composed for dramatic effect, as is a Turner or a Sorolla, for instance.
 
Best solution. Legalize Shroooms and LSD!
 

HDR pictures look better than reality? Only in our dreams! Even with OLED, HDR still has a long way to go to match the incredible dynamic range we see with car headlights on a dark night, a rain-drenched road reflecting traffic lights, deep sea water with waves, or a searchlight. Those are not boring at all. Yeah, real life can look boring the majority of the time, but some scenes look incredible, and even in many scenes with great dynamic range we don't particularly notice it because we're used to it already. I started noticing them more once I started using my Panasonic plasma, and my plasma's picture is anything but 'pretty crap'. Given the opportunity, it can bring out exceedingly exciting lighting and particle effects in games. I'm living in someone else's house right now, so I'm stuck with a Samsung LCD, and despite its brightness advantage, LCDs look dull most of the time.

Being able to render such boring 'subdued' colors is not the easy part. It means the object's surface roughness is very high (many micro-surfaces at varying angles), so when incident light hits the surface it is reflected in many directions with differing strengths; only a little of it reflects toward our eyes, and our eyes read that loss of light as 'subdued color'.

If a surface becomes more uniform, toward a mirror-like reflection, then there's a chance we see more exciting colors. Shiny, glossy objects like glossy laptops have both mirror-like and rough surface components. There will still be light scattered in random directions by the rough component, but more of it will be directed toward our eyes, and combined with the mirror reflection our eyes sense stronger light because less light output is lost.

One reason Dolby set our current HDR standard at 10,000 cd/m2 is specular highlights. Even on a glossy object, some parts are more diffuse and some are more glossy. The Phong model accounts for a diffuse component and a glossy (also known as specular) component. Usually the specular component covers less surface area than the diffuse component, but it is always brighter. Because the specular area is small, its brightness can be increased considerably without causing discomfort; 1,000 to 10,000 cd/m2 sounds like overkill at first, but confined to such a limited area it is not, and the total APL increase is actually very small. The HDR10 version of Mad Max can go up to 1,000 cd/m2 in specular highlights, but the average luminance over the entire run is only about 120 cd/m2. That means it is only 20% brighter than the SDR Blu-ray.
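For reference, here's a minimal sketch of the textbook Phong split into diffuse and specular terms (nothing specific to Dolby or HDR10, just the standard model): the specular spike is far brighter than the diffuse term but collapses once you look a few degrees off the reflection direction.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(normal, light_dir, view_dir, k_diffuse=0.6, k_specular=0.9, shininess=64):
    """Return (diffuse, specular) intensity for one light, classic Phong."""
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diffuse = k_diffuse * max(dot(n, l), 0.0)
    # Reflect the light direction about the surface normal for the specular lobe.
    r = tuple(2.0 * dot(n, l) * nc - lc for nc, lc in zip(n, l))
    specular = k_specular * max(dot(r, v), 0.0) ** shininess
    return diffuse, specular

# Looking straight down the reflection direction vs. slightly off-axis:
print(phong((0, 0, 1), (0, 0, 1), (0, 0, 1)))      # (0.6, 0.9)   -- bright specular spike
print(phong((0, 0, 1), (0, 0, 1), (0.3, 0, 1)))    # (0.6, ~0.06) -- spike collapses, diffuse unchanged
```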

The reason people find HDR pictures too bright and cartoony is that some of those so-called HDR TVs do a very poor job with it. Yes, I'm talking about the Samsung KS8000, the most popular and also the most overrated TV of 2016. Because it's an edge-lit TV with only 10 dimming zones (two rows), it may have the luminance to render specular surfaces, but it can't dim itself enough for the diffuse ones, so the APL around a specular highlight is too high, making the image look overly saturated and cartoony. It also takes about 7 seconds to ramp from 520 cd/m2 to 1,400 cd/m2, meaning that quite a lot of the time a specular highlight sits on a high-APL frame. Sure, it will be exceedingly bright, with 520 cd/m2 covering almost 100% of the screen, but is that in any way 'dynamic'? Hardly, yet some KS8000 owners have the audacity to say their TVs have better HDR performance than OLEDs simply because they're brighter.

Having luminance higher than the content's designated luminance has negative consequences. It's the same with SDR. When you watch SDR content designed for 100 cd/m2 at 200 cd/m2, you compress its dynamic range by half, and by 75% when watching it at 400 cd/m2. Yes, I know, 100 cd/m2 sounds lousy at first. But I started with a Sony LCD that I watched at 200 to 300 cd/m2. At that time I was mesmerized by the impactful brightness, but it felt flat and dull at the same time. Then I moved to my Panasonic plasma, and despite having to watch it at only 70 to 90 cd/m2, the luminance movement was so much more animated, because most of the 100 cd/m2 dynamic range was kept. Same for games: set to 100 cd/m2, lighting animation feels much richer than at 200 cd/m2 and higher. Watching SDR content at 200 to 400 cd/m2 is like listening to badly compressed rock music; watching it at 100 cd/m2 is like listening to very quiet classical music, which on average has vastly superior dynamic range to rock. Loud alone does not mean dynamic range in audio, and bright alone does not mean dynamic range in video. The loudness war hurt our ears, and now our eyes are getting fatigued by the same thing.
 
HDR pictures look better than reality?
Just took a photo on my smartphone (no flash, just HDR).
With my eyes I could not see the shoe there, I just saw a black shape; in the photo I can not only see that it's a shoe, but that it's brown!
[attached photo: photo.jpg]
 

Yeah, HDR photography is the total opposite of HDR lighting in games and more in line with HDR TV.

It makes it easier for us to see more detail (not just the objects but also their colors).

Although some phones have pretty shitty HDR, with saturation boosted way too much...
 
Would love for them to do proper HDR photos.

Currently they take a couple of images at different shutter speeds and tonemap the result down to 8 bits.
Instead they could take those two images, use the information to extend the dynamic range, and store the result as a .DNG.

HDR photos are exactly the same as any other HDR, until an artist comes along and fucks it up by tonemapping the result to hell and back (take 20 photos, combine them into a huge dynamic range, then crush it into a mess).
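Something like this (a rough sketch of the merge idea, not anyone's actual camera pipeline): scale the short exposure to the long one's radiance units and use it where the long exposure clips, keeping the result as linear float data instead of tonemapping to 8 bits.

```python
import numpy as np

def merge_exposures(short_exp, long_exp, exposure_ratio=4.0, clip_threshold=0.98):
    """short_exp was shot at 1/exposure_ratio of long_exp's exposure time.
    Both inputs are assumed linear (already de-gamma'd), normalized 0..1."""
    short_scaled = short_exp * exposure_ratio          # bring to a common radiance scale
    clipped = long_exp >= clip_threshold               # where the long exposure blew out
    return np.where(clipped, short_scaled, long_exp).astype(np.float32)

# Toy frames: the lamp is clipped in the long exposure but intact in the short one.
long_exp  = np.array([[0.05, 0.40, 1.00]])
short_exp = np.array([[0.01, 0.10, 0.60]])
print(merge_exposures(short_exp, long_exp))            # [[0.05 0.4  2.4]] -- radiance above 1.0 survives
```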
 
These Dell 24" and 27" monitors are probably the closest thing you'll get to a 1080p HDR display:
Sony has announced that their entire 2017 TV line will support HDR, even the [few] Full HD models they offer. But nobody has seen those yet, so who knows how they will look.
 
Well, HDR support != HDR support ;)
Just because the TV can process the signal doesn't mean you see any visual advantage.
HDR is a checklist feature even for the cheapest TVs now. That's at least why some TV companies have branded "real" HDR processing "premium HDR" (LG calls it "Super HDR"), but even that does not mean your TV has a 10-bit panel. Well, mine has 8-bit + FRC, which really looks great. But maybe my older TV was just not that good.
But the new TV totally got rid of color banding in games, which is really, really nice.
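For anyone wondering what 8-bit + FRC means in practice, here's a toy illustration (my own, not how any particular panel actually implements it): the panel flips between adjacent 8-bit levels over a few frames so the time-averaged output lands on the in-between 10-bit level, which is why banding softens.

```python
def frc_frames(level_10bit: int, frames: int = 4):
    """Emit `frames` 8-bit values whose average equals level_10bit / 4."""
    base, remainder = divmod(level_10bit, 4)     # 10-bit level = 8-bit level * 4 + 0..3
    # Spend `remainder` of the frames one step above `base`, the rest at `base`.
    return [base + (1 if i < remainder else 0) for i in range(frames)]

seq = frc_frames(514)                   # a 10-bit level sitting between 8-bit 128 and 129
print(seq, sum(seq) / len(seq))         # [129, 129, 128, 128] 128.5 == 514 / 4
```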
 
Earlier in this thread there were comments on blacks in HDR, HLG versus Dolby Vision, and 10-bit PQ or more.

"The 4,000 nits figure comes from the Dolby Pulsar display."

http://www.lightillusion.com/uhdtv.html
Many points are addressed there, including problems with "current" HDR certification...

My additions, from reading the Dolby, BBC, and Philips papers, plus BT.2020, BT.2100...

Displays could have 12-bit banding performance if the blacker-than-black/super-white reserved code words weren't expanded by a factor of four. (The sensor-calibration reserve could use fewer codes.)

XYZ wants to milk money by hurting these HDR standards in exactly that way. It reminds me of the whole interlaced patent war / legacy support saga.

Game VRR was implemented and tested with firmware-only mods on HDMI 1.4; reserving widespread support for the 2.1 revision is another milking strategy.

 
Personally, I don't know how 4,000-10,000 nit displays are going to be acceptable. My ZD9 is the brightest TV out as of today, and in those moments where it pushes its maximum of 1,800 or so nits in HDR (flashlights, car headlights, any super-bright light you can think of that has been mastered at the appropriate level), it's already way too bright. It's literally like having a bright LED flashlight flashed in your face, and it's almost unpleasant.
I don't see how 4,000 to 10,000 nits can be desirable, judging from what I can see at around 1,800 nits.
 
From the links in my earlier post...
The average picture level will remain around 100 nits, so only the highlights benefit; that is how the PQ (ST 2084) curve and its 10-bit code words were chosen (see the sketch below).

My opinion is that content creators and the console industry have to push for a good standard; otherwise what we call 10-bit WCG HDR will change in less time than promised, increasing costs for creators and, worse, shrinking the "compatible" user market.
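Here's the sketch mentioned above: the standard ST 2084 (PQ) inverse EOTF, showing how the 10-bit code words are distributed. The constants are the published PQ constants; the example luminance values are my own.

```python
# SMPTE ST 2084 (PQ) constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_code(luminance_nits: float, bits: int = 10) -> int:
    """Map absolute luminance (cd/m2) to a PQ code value (inverse EOTF)."""
    y = (luminance_nits / 10000.0) ** m1
    e = ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2
    return round(e * (2 ** bits - 1))

for nits in (0.01, 1, 100, 1000, 4000, 10000):
    print(f"{nits:>8} nits -> code {pq_code(nits)}")
# 100 nits already lands around code 520 of 1023, so roughly the upper half of
# the code range is reserved for highlights between 100 and 10,000 nits.
```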


 