Do we know any specific or estimated date for HDR monitors?

xEx

Veteran
So I'm waiting for them to come out, but I really can't find any info about them. Maybe someone here has some info.
 
They're digital signage displays, not really consumer PC monitors.
 
Very interesting use of overpowered backlights and ambient light sensors.
I had no idea these existed, but the use case for individuals would be if you're well off (or decided to spend a few grand on such an atypical display) and want to watch TV or movies in your bright, sunny backyard in the middle of summer. Which is in fact the opposite of the conditions HDR content demands.
 
A 5k nit display is going to blast the hell out of people's eyes. Completely overgunned for any normal person. Also, what's the power draw at such a light output level? A kilowatt? :p Ugh, no.

High-powered quantum-dot LCD screens are already way too bright IMO; my Apple Thunderbolt is specced at something like 350-400 nits IIRC, and at max brightness you don't want to be looking straight at it for any length of time. Samsung has a quantum dot TV specced at 1000 nits or thereabouts, I believe; jesus. That'd be horrible to look at indoors in room lighting.
 

The point of HDR monitors with high maximum light output isn't to have them output that much light at all times like a standard monitor, but to have the capability to do so if the content calls for it.

Currently, you set a monitor to a certain brightness to make sure that typical desktop tasks (browsing the net, editing documents, etc.) are comfortable for extended periods of time. But that also limits the maximum brightness that can be displayed during a game or video playback. With an HDR display you still configure it for comfortable extended use, but it is allowed to exceed that level if the content calls for it.
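
To make that concrete, here's a purely illustrative C++ sketch (made-up nit values, not any real monitor's behaviour): the comfort setting only pins where ordinary "paper white" lands, while bright content is allowed to use the headroom above it.

#include <algorithm>
#include <cstdio>

// Map scene-referred luminance (1.0 = diffuse "paper white") to output nits.
// SDR clips everything at the comfort level; HDR keeps headroom up to the peak.
double toDisplayNits(double sceneLum, double paperWhiteNits, double peakNits, bool hdr)
{
    double nits  = sceneLum * paperWhiteNits;         // desktop/UI white lands at paperWhiteNits
    double limit = hdr ? peakNits : paperWhiteNits;
    return std::min(nits, limit);
}

int main()
{
    const double paperWhite = 250.0;  // comfortable level chosen by the user (placeholder)
    const double peak       = 1000.0; // what the panel can do for highlights (placeholder)

    // A specular highlight 8x brighter than paper white:
    std::printf("SDR: %.0f nits\n", toDisplayNits(8.0, paperWhite, peak, false)); // 250 (clipped)
    std::printf("HDR: %.0f nits\n", toDisplayNits(8.0, paperWhite, peak, true));  // 1000 (headroom used)
}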

Regards,
SB
 
A constant powerful glare is better for the eye than a sudden burst of enormous brightness, because your pupils are slow to react. So a 5k nit display would indeed blast the hell out of your eyes, especially if it suddenly zaps all of it into your retinas without prior warning! ;)
 
Currently, you set a monitor to a certain brightness to make sure that typical desktop tasks (browsing the net, editing documents, etc.) are comfortable for extended periods of time. But that also limits the maximum brightness that can be displayed during a game or video playback. With an HDR display you still configure it for comfortable extended use, but it is allowed to exceed that level if the content calls for it.
While I agree the main use case is games, videos and photos (i.e. trying to get closer to reproducing real world brightness/contrast), desktop tasks could also benefit somewhat from a larger range. For various historical reasons, UI designers (or those who had to fill that role without proper training) often end up using close to maximum brightness of LDR ("white") as a background colour filling most of the screen. Unfortunately that means all the actual content must be darker and you lose the ability to actually highlight something on it.
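
As a made-up illustration (not anyone's actual compositor), an HDR-aware desktop could anchor UI "white" at a fixed reference level and keep the range above it free for things that genuinely need to stand out:

#include <cmath>
#include <cstdio>

// sRGB electro-optical transfer function: encoded [0,1] -> linear [0,1].
double srgbToLinear(double v)
{
    return (v <= 0.04045) ? v / 12.92 : std::pow((v + 0.055) / 1.055, 2.4);
}

// UI pixel -> output nits, anchored at uiWhiteNits instead of the panel's peak.
double uiPixelToNits(double srgbValue, double uiWhiteNits)
{
    return srgbToLinear(srgbValue) * uiWhiteNits;
}

int main()
{
    const double uiWhite   = 200.0; // UI "white" background (placeholder value)
    const double alertNits = 600.0; // something we genuinely want to pop (placeholder value)

    std::printf("page background: %.0f nits\n", uiPixelToNits(1.0, uiWhite)); // 200
    std::printf("body text:       %.1f nits\n", uiPixelToNits(0.2, uiWhite)); // ~6.6
    std::printf("highlighted UI:  %.0f nits\n", alertNits);                   // 3x brighter than "white"
}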
 
A constant powerful glare is better for the eye than a sudden burst of enormous brightness, because your pupils are slow to react. So a 5k nit display would indeed blast the hell out of your eyes, especially if it suddenly zaps all of it into your retinas without prior warning! ;)
5k nits maximum brightness is not a problem. Bright highlight areas tend to be small, such as specular reflections on metal edges. Eyes handle this case just fine. If you accidentally look at the sun, or at a big mirror-like surface reflecting the sun, it should look uncomfortably bright. You automatically close your eyes in this case, just like you do with the real sun. Eyes adapt pretty fast to bright light (but much more slowly to darkness). However, 5k nits is nowhere close to the brightness of the real sun, or even of a white surface in direct daylight. A small sun disk on the screen at 5k nits wouldn't be bright enough to cause the same reaction as accidentally looking at the real sun.

Of course, game developers need to carefully design their HDR content in a way that doesn't cause sudden, very bright full-screen flashes. Games contain futuristic weapons with bright projectiles and explosions. For example, an electric discharge (or plasma explosion) seen at close proximity would likely blind a person in real life. HDR games need to limit the brightness of these effects. People are used to dealing with realistic scenarios, such as reflective surfaces and bright sunlight. A good physically based renderer simulates these things realistically, so there are no surprises.
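
A minimal sketch of what "limit the brightness of these effects" could look like (an illustration only, with arbitrary nit values): linear response up to a knee, then a soft shoulder so a huge flash never reaches the panel's full peak across the whole frame.

#include <cstdio>

// Below 'knee' the response is linear; above it, brightness asymptotically
// approaches 'limit' (a Reinhard-style shoulder).
double softLimit(double nits, double knee, double limit)
{
    if (nits <= knee)
        return nits;
    double excess = nits - knee;
    double range  = limit - knee;
    return knee + range * (excess / (excess + range));
}

int main()
{
    // A small highlight passes through untouched, a huge flash gets compressed.
    std::printf("%.0f\n", softLimit(  800.0, 1000.0, 4000.0)); // 800
    std::printf("%.0f\n", softLimit(20000.0, 1000.0, 4000.0)); // ~3591, kept below the 4000 nit limit
}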
 
What I'm concerned about is game designers overusing and abusing their new toy - as often tends to be the case whenever new checkbox features become available/feasible, like lens flares, DoF, chromatic aberration, and you can probably think of a few other examples that have cropped up over the years. :) How do you show off superbright capabilities of your game engine on HDR monitors? By pumping the brightness in a not terribly subtle manner, I'm thinking.
 
Lots of real life situations come to mind. The shiny glare off metal like Sebbbi mentioned, for instance. A flashlight/searchlight/spotlight to the face from a security guard/tower in a stealth game. Driving at night and getting the occasional oncoming headlights is another. Explosions. Flashbangs, carefully balanced, could be made slightly uncomfortable to look at, in addition to just turning your screen white. Games could make "sparklies" like those in "I Am Setsuna" more noticeable to the player without making them larger, which is a benefit since many people don't notice them. Sunlight glinting off the surface of water could definitely benefit hugely from this.

J.J. Abrams style lens flares are likely to get a bit more annoying, but they were already annoying in games anyway. :p

Basically this is something that's needed if you want lighting to take another large step forward in games. Yeah, some developers are going to get it wrong, but that doesn't mean you should stop progress. And since developers still have to support LDR monitors, there's likely to be an LDR/HDR toggle in the settings allowing you to disable it if you have an HDR monitor but don't like a game's HDR implementation.
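
That toggle could be as simple as picking the tone-mapping target at startup; a hypothetical sketch (not any particular engine's settings API):

#include <cstdio>

struct DisplaySettings
{
    bool displaySupportsHdr; // reported by the OS/driver
    bool userEnabledHdr;     // the in-game LDR/HDR toggle
};

double targetPeakNits(const DisplaySettings& s)
{
    if (s.displaySupportsHdr && s.userEnabledHdr)
        return 1000.0;  // tone map toward the panel's reported peak (placeholder value)
    return 100.0;       // classic SDR target
}

int main()
{
    DisplaySettings s{true, false}; // HDR panel, but the user switched HDR off
    std::printf("tone mapping target: %.0f nits\n", targetPeakNits(s)); // 100
}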

Hell, I'm currently watching the Olympics on TV and I can immediately notice that lights that should be bright and vibrant instead look dull and lifeless, and this is with a properly calibrated TV.

Regards,
SB
 
HDR is the biggest jump in video quality since FHD, and it will be really impressive in a few years when devs learn how to squeeze out all of its capabilities. But first we need HDR screens/monitors, and not only $5k TVs.

I still haven't found any news about when these displays will be available, and that makes me sad :(


How do you show off superbright capabilities of your game engine on HDR monitors? By pumping the brightness in a not terribly subtle manner, I'm thinking.

Fireworks is a perfect example.
 