Capturing screenshots in HDR games? *spawn*

A mate of mine told me that you can tone down HDR on some consoles. I mean, it's going to be overly bright if it's set to, say, 1000 nits on an HDR600, HDR500 or HDR400 screen.

Yes, Gran Turismo has exactly that, and a few months back Sony added a system-wide setting (not much supported yet; maybe new games will), but this setting only ensures the game doesn't send data the TV can't display. It doesn't darken the picture (at least in GTS).

If you look at this analysis of Forza Horizon 4 you can see the problem with HDR: huge parts of the screen are pushed beyond 100 nits, even HUD elements. Sure, it looks bright and nice, but it goes against the intention behind HDR, which is quite clear when half of that 10-bit precision is allocated to the 0 - 100 nit range.
( https://hometheaterhifi.com/technic...mic-range-hdr-explanation-dolby-vision-hdr10/ )

The other thing is how displays handle HDR. I mean, if half of the 10-bit HDR precision is below 100 nits, are displays even capable of delivering 512 steps below 100 nits when they max out at 1000 nits for LCD and 500 nits for OLED?
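To make the "half of 10-bit precision below 100 nits" point concrete, here is a small sketch (my own illustration, not from the linked article) of the SMPTE ST 2084 (PQ) inverse EOTF, which maps absolute nits to the normalized 10-bit signal:

```python
# Sketch: where does 100 nits land on the 10-bit PQ (SMPTE ST 2084) signal scale?
# The constants are the published ST 2084 values.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_inverse_eotf(nits: float) -> float:
    """Map absolute luminance (0-10000 nits) to a normalized PQ signal (0-1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2

signal = pq_inverse_eotf(100.0)           # ~0.51: roughly half the signal range
code_full = round(signal * 1023)          # ~520 in full-range 10-bit
code_limited = round(64 + signal * 876)   # ~509 in limited range (black = code 64)
print(signal, code_full, code_limited)
```

In limited-range 10-bit video (codes 64-940), that leaves roughly 509 - 64 ≈ 445 code values at or below 100 nits, close to the ~447 steps mentioned later in the thread.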
 
Do you mean that the Forza Horizon implementation is not that good? Btw, thanks for the video.

As for half of HDR being below 100 nits, afaik on a 1000-nit display full blue is something like 80 nits, which is extremely vibrant for that colour.
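That figure checks out roughly. Using the Rec. 709 luma coefficients (an assumption on my part; the exact number depends on the colour space in use), the blue primary contributes about 7% of total luminance, so on a 1000-nit display a full-blue patch lands around 72 nits:

```python
# Rough check of the "blue is ~80 nits" claim using Rec. 709 luma coefficients.
REC709 = {"R": 0.2126, "G": 0.7152, "B": 0.0722}
PEAK_NITS = 1000

for channel, weight in REC709.items():
    print(f"full {channel} on a {PEAK_NITS}-nit display: ~{weight * PEAK_NITS:.0f} nits")
# Full blue comes out at ~72 nits, in the same ballpark as the ~80 quoted above.
```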

On a different note, another reason not to get a 4K high-refresh-rate monitor, at least nowadays...

This guide helps you set your monitor to the 10-bit colour depth typical of HDR, and at 4K you can't have that with a high refresh rate.

https://www.ign.com/articles/2019/09/16/monitor-calibration-how-to-calibrate-monitor
 
Does the clipboard save images with HDR?
If you only need HDR screenshots under Win10, try the Windows Game Bar. It will capture HDR pictures, and HDR can be shown natively in the Win10 photo viewer (you may have to wait a few seconds while the app loads the HDR metadata).
 

I don't know. Is the sky in the game supposed to be in the 1000 - 2000 nit range, and is that good for your eyes when you play?


Not really what I meant. I was curious whether 10-bit panels can display the 0 - 100 nit range with enough precision when they are supposed to do 1000 nits at the same time. One thing to note is that it's not 512 steps in the 0 - 100 nit range, it's about 447 (black = code 64, not 0). The other thing is that I managed to download the datasheet for the LG Display IPS LCD panel LD650EQE-FJA1 (4K, 10-bit, 500 nits), and its greyscale specification is:

Gray Level - Luminance [%] (Typ)
L0 - 0.083
L63 - 0.27
L127 - 1.04
L191 - 2.49
L255 - 4.68
L319 - 7.66
L383 - 11.5
L447 - 16.1
L511 - 21.6
L575 - 28.1
L639 - 35.4
L703 - 43.7
L767 - 53.0
L831 - 63.2
L895 - 74.5
L959 - 86.7
L1023 - 100

100 nits is 20% of the maximum 500 nits, which puts some 500 steps in the 0 - 100 nit range; that's good enough. But if the same greyscale specification were used for a 1000-nit panel, where 100 nits is only 10% of peak, only about 370 steps would be available in the 0 - 100 nit range, which is not so good.
Anyone have an OLED panel or 1000-nit LCD datasheet? I would like to take a look.
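As a sanity check on those numbers, here is a small sketch (my own, built on the datasheet table quoted above) that linearly interpolates the grey level reached at a given fraction of peak luminance:

```python
# Estimate how many 10-bit grey levels fall at or below 100 nits, using the
# LD650EQE-FJA1 greyscale table quoted above (grey level vs. % of peak luminance).
levels = [0, 63, 127, 191, 255, 319, 383, 447, 511, 575,
          639, 703, 767, 831, 895, 959, 1023]
lum_pct = [0.083, 0.27, 1.04, 2.49, 4.68, 7.66, 11.5, 16.1, 21.6,
           28.1, 35.4, 43.7, 53.0, 63.2, 74.5, 86.7, 100.0]

def level_at(pct):
    """Linearly interpolate the grey level that produces `pct` % of peak luminance."""
    for i in range(1, len(lum_pct)):
        if lum_pct[i] >= pct:
            lo_p, hi_p = lum_pct[i - 1], lum_pct[i]
            lo_l, hi_l = levels[i - 1], levels[i]
            return lo_l + (pct - lo_p) / (hi_p - lo_p) * (hi_l - lo_l)
    return levels[-1]

# 500-nit panel: 100 nits = 20% of peak -> ~492 steps below 100 nits
print(round(level_at(20.0)))
# hypothetical 1000-nit panel with the same curve: 100 nits = 10% -> ~358 steps
print(round(level_at(10.0)))
```

Linear interpolation gives roughly 492 and 358 steps, in line with the ~500 and ~370 estimates in the post.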
 
Why does 20% of 500 nits make 500 steps in the 0 - 100 nit range? I am curious, since I don't know the maths used in the calculation.

On a different note, I took some pictures in HDR and SDR mode on my monitor. Maybe not the best pictures ever, but... Forza Horizon 4:

HDR (subtle; the shadowing is more noticeable, but this is only to show how easy it is to enable/disable HDR in the game)
K9vdkUm.png


SDR
jM1IpHN.png


---------------------------------------------------------------------------------------------------------------------

HDR
i66Vkgt.jpg


SDR
38gDPLT.jpg
 

Look at the greyscale specification I posted: L511 maps to 21.6% of peak luminance, so 20% (100 nits on a 500-nit panel) falls at approximately step 500.
 
Colour me surprised! I installed GoG Galaxy 2.0 today just to try it out, and, well, I decided to play some GoG games from my collection. It turns out that Fear 2 has HDR! :oops::oops: The Fear games have always been ahead of their time, imho. They still look fine today. I am playing the game at 165fps xDDDD.

However, in doing so, the aim sensitivity is sooooo high that I had to set it to the minimum value, and even then the aim is so fast it's only a little more playable. (165fps in this game is not so fun due to that; the sensitivity leads to motion sickness.)

That being said, kudos to the developers; I guess they are successfully working in other studios nowadays, they are very talented people. Such a pity the sound is so bad... Fear 1 has better sound imho, but no HDR.
 
Oh well, that explains a few things. It looks more convincing than turning it off, though, especially the neon lights in buildings that draw thin neon lines on the facade of a building. I didn't play more 'cos of motion sickness (gotta see if I can reduce aim sensitivity in the Xpadder game profile), but back then HDR was kind of new, used to embellish the lighting of a game.

It's incredible how computers back then could barely run the game at HD and decent framerates with 4x AA or more, HDR and high aniso, while nowadays, with everything maxed out on a GTX 1080, the game runs at almost 200fps and the GPU handles it like a breeze; you can't hear the fans running and it doesn't get hot. The particles and fire were top notch at the time on maximum settings, but having played Wolfenstein: Youngblood, they look like a joke nowadays. The fire is low res and the particles are nothing to write home about. Gotta admit I was impressed with both 4-5 years ago when I got the game on GoG, but nowadays... ugh.
 
The Series X could add automatic HDR to all backwards-compatible games with a new machine-learning technique.

https://www.techradar.com/news/the-xbox-series-x-will-be-the-go-to-console-for-hdr-gaming-heres-why

Old dog, new HDR tricks

During Digital Foundry’s visit to Microsoft’s HQ in Redmond, Washington, the team was shown the Xbox One X enhanced version of Halo 5: Guardians, running with a very convincing HDR implementation. Even though developer 343 Industries never shipped the game to include HDR support originally, Microsoft has found an ingenious way to add HDR into the game – developed from the state-of-the-art HDR implementation used in Gears 5.

Microsoft ATG principal software engineer Claude Marais revealed that, by using a machine learning algorithm, the team was able to generate a full HDR image from SDR content – on any backwards compatible title.

And when Microsoft says any backwards compatible title can receive the HDR treatment – it means any title. The Digital Foundry team was stunned to see Fusion Frenzy – an original Xbox game that was released almost 20 years ago – running with real HDR.

Microsoft’s new HDR-mapping tech will extend across the entire Xbox library on Xbox Series X, and apply to the hundreds of compatible games that don’t have their own bespoke HDR modes already.

Some TV displays will ship with baked-in HDR-effect or HDR boost options – which aren’t actually HDR proper. But Microsoft’s technology creates exact heatmaps (instructions for the brightness settings) for the auto HDR tech to work from, ensuring the picture looks just as it should.
 
I think where the confusion comes from is that while there are extensive standards for HDR video (HDR10, DV, HLG), there is still no standard for HDR pictures in the sense of what we have with those video standards.

For many years now, "HDR pictures" have meant those that combine a huge range of exposures (just like your phone can do). This is not what HDR video does, as we all know. On top of that, games have for many years been rendering in HDR internally, but only with the last gen have we been able to output real HDR10 to our TVs, in selected games.

So from now on I'll call them "HDR10 pictures" so that we don't get things confused; for full clarity, "HDR10 pictures" do not actually exist as a format.

You can't take a picture of "something HDR" and hope to get an HDR10 picture out of it, because the camera itself cannot capture the range, and therefore you cannot display the full range that we now expect from HDR10.

I think you can take a screengrab of an HDR game on the X, to be displayed on the X through an HDR TV. That probably works because it's a closed system.

But you simply can't take a picture off screen with your phone or camera and hope for that to be an HDR10 picture.

If that were the case, you would already be able to take "HDR10" pictures of reality, which is of course as "HDR" as it can get.

I think 'proper' cameras have only recently started to get sensors that can capture more range, and in fact I know that some Panasonic and Sony cameras can already shoot video in HDR10 and/or HLG (because their sensors and software work with 10-bit encoding or higher). But I'm pretty sure that, unless something has changed very recently, those same cameras cannot take "HDR10 pictures", as there simply isn't an HDR10 standard for stills.
 
Resident Evil 7 with HDR (top image). Image taken with the Xbox app for PC, which generates a JXR file with the HDR metadata. Then I snipped the image with the Snip & Sketch tool so I could export it to a PNG.

NlRzDk8.png


Resident Evil 7 without HDR (same scene as above; for this one I disabled the HDR of the monitor and the HDR setting of the game).

wRTMgHg.png
 
Btw, it doesn't matter which game you play. Just set the in-game HDR brightness to the max, as if your TV/monitor had 10,000 nits. It doesn't matter; your TV/monitor will perform the corresponding tone mapping, and the true brightness in nits of your screen is what will be shown.
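As an illustration of the display-side tone mapping described above, here is a generic Reinhard-style curve (my own sketch; real TVs use their own proprietary mappings) that compresses a signal mastered well beyond the panel's capabilities into its actual peak brightness:

```python
# Sketch: generic display tone mapping (Reinhard-style curve, not any vendor's actual one).
# A signal mastered for up to 10000 nits gets compressed into the panel's real peak.

DISPLAY_PEAK = 600.0  # hypothetical panel peak brightness, in nits

def tone_map(scene_nits: float, peak: float = DISPLAY_PEAK) -> float:
    """Compress arbitrary scene luminance into [0, peak) nits."""
    x = scene_nits / peak
    return peak * x / (1.0 + x)

for nits in (50, 100, 600, 2000, 10000):
    print(f"{nits:>6} nits in the signal -> {tone_map(nits):6.1f} nits on screen")
```

Note how the curve never exceeds the panel's peak, and how highlights far above it get squeezed together, which is why overdriving the mastering level mostly costs highlight detail rather than raising real brightness.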

 
A very, very interesting article on (imho) the best game-changer of the last 20 years when it comes to graphics: HDR. The real brightness of the sun is about 1,600,000,000 nits, and a dark night is around 0.001 nits.

They mention that Dolby made a study on how many nits real-life colours have, and here is a sample image from Dolby (the image looks a bit broken 'cos I captured it on the monitor with HDR on, which I always keep enabled, and the Snip app only captures SDR content, hence the result).

gDox67H.png


https://www.cnet.com/news/tvs-are-only-getting-brighter-but-how-much-light-is-enough/
 
Wowwww, the new HDR patch for the PC version of Ori. It was worth the wait.

Photos from my mediocre phone don't do it justice, but in person the difference is impressive. Like day and night! The brightness, the colour, the contrast!! The photos don't convey how it feels in person, but they give you a glimpse. The difference in person is much bigger.

HDR OFF
2bb200b.jpg



HDR ON
e16fMBU.jpg
 
Soooo, it turns out that Firefox does not support HDR: when playing an HDR video on YouTube, even if you have HDR enabled on the monitor, YouTube plays it as an SDR video.
This allowed me to make this real-time comparison between the two formats using the Xbox app, which does capture HDR.

The Xbox app creates a .jxr file and a .png file. That PNG is based on the HDR colour space and is the closest thing to reality. However, it loses the intensity of the HDR lighting (which is kept intact in the JXR format) and does not do it justice, buuuuuuut it serves to illustrate this example.

The differences are huge in some cases, but I insist that real-time HDR on a TV or monitor is even more pronounced, given that the light (black and white), contrast and colour are all more intense.

For an experience closer to reality, the images should be viewed on an HDR TV or monitor; on SDR, the HDR pictures become even duller than the PNG conversion already made them.

WITHOUT HDR (Firefox)

STTj0vt.png


WITH HDR (new Edge)

yMLCh1a.png


WITHOUT HDR (Firefox)

1jbziR6.png


WITH HDR (new Edge)

ysm5cJY.png


WITHOUT HDR (Firefox)

DnAf8Bj.png


WITH HDR (new Edge)

efRl8f6.png


WITHOUT HDR (Firefox)

Cu5igGL.png


WITH HDR (new Edge)

V2VRG4V.png


The video from which I got the screengrabs.

 
It's quite disappointing that Halo Reach has a line in the ini files to enable HDR but it doesn't work. If it were another title... But Halo Reach... It has HDR built in, even in the original version. It was primarily made with it in mind.

I still remember all the fuss about using the eDRAM to offer the best possible image with HDR in both Halo Reach and Halo 3; in fact, they say it was the main reason both games ran below HD.

The explosions looked super bright even on the TVs and monitors of the time, and the effect lingered for a while.

What do you mean it doesn’t work?

HDR was available for the 360 and PS3 under a different context.

My memory is horrible, but the pre-4K-TV HDR feature was related to floating-point render formats and bloom.
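Right; that era's "HDR" meant rendering into a floating-point buffer, spilling the clipped energy as bloom, then tone mapping down to an 8-bit SDR display. A very rough sketch of that pipeline (purely illustrative; not Bungie's actual implementation, and the threshold and bloom factor are made-up values):

```python
# Toy version of the pre-HDR10 pipeline: floating-point linear radiance values,
# a bright pass feeding bloom, then a tone-mapping curve into 8-bit output.
scene = [0.05, 0.5, 1.0, 4.0, 16.0]  # linear radiance, as an fp render target would hold

bright_pass = [max(v - 1.0, 0.0) for v in scene]   # keep only energy above "paper white"
bloom = [0.25 * v for v in bright_pass]            # stand-in for the blur/spread step
combined = [s + b for s, b in zip(scene, bloom)]
tone_mapped = [v / (1.0 + v) for v in combined]    # simple Reinhard curve into [0, 1)
ldr = [round(v * 255) for v in tone_mapped]        # quantized 8-bit frame for an SDR display
print(ldr)
```

The point is that values far above 1.0 survive through the pipeline (driving the bloom) even though the final output is still plain 8-bit SDR, which is why it worked on the TVs of the time.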
 
I meant on PC. The Halo MCC collection supposedly has HDR, but it is disabled and there isn't an HDR option available. For a game like Halo Reach, which had a great HDR solution taking advantage of the X360's eDRAM, it would be a perfect fit.

New video about the closest thing on the market to my dream display. It's QLED.

 
Anyway, relevant to this thread: Firefox and Chrome are adding AVIF support. AVIF is of course the AV1 still-image file format, and thus supports HDR; AV1 being the seemingly winning, de facto new codec for high-res/HDR video that everyone has already committed to switching to. And while I personally think JPEG XL is a better choice for a lot of technical reasons, with two of the three biggest browsers on the internet going with AVIF almost simultaneously, that's pretty much it as far as what the next standard will actually be.

Still, it means anyone looking to do HDR screenshots, well in the future when games/systems support AVIF screenshots, will find it easy, and it'll work on all OSes and browsers. RIP HEIF and its lack of browser support.
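For anyone poking at such files today: AVIF (like HEIF) is an ISO Base Media File Format container, so you can recognize one from its 'ftyp' box. A minimal sketch (the brand list here is a simplification; real files may declare other compatible brands):

```python
def looks_like_avif(data: bytes) -> bool:
    """Cheap signature check: ISO BMFF 'ftyp' box with an AVIF major brand."""
    # Bytes 0-3: box size; bytes 4-7: 'ftyp'; bytes 8-11: major brand.
    return len(data) >= 12 and data[4:8] == b"ftyp" and data[8:12] in (b"avif", b"avis")

# Synthetic headers for illustration:
print(looks_like_avif(b"\x00\x00\x00\x1cftypavif" + b"\x00" * 16))  # True
print(looks_like_avif(b"\x89PNG\r\n\x1a\n" + b"\x00" * 8))          # False
```

('avis' is the brand used by AVIF image sequences.)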
 