Are the rules and guidelines for grading film the same as for HDR in video games?
I think probably yes, since in both cases you are trying to create feeling and connection from what is on the screen.
There has been a huge amount of research into doing good HDR in film, so it would be smart to apply similar rules to any Auto HDR process.
And from what little I know of both, it seems like they are.
Perhaps the Auto HDR process is blowing out too many highlights and pushing them too bright.
Most films will have less than 1% of the screen at maximum brightness at any point in time.
I'd love to see a side-by-side comparison of the SDR and HDR gameplay, looking at the actual light values, not just a visual inspection.
But given how tricky it is to discuss and compare some of this stuff, I think the above video was pretty good, and that Evil Boris guy seems to know his stuff?
Although I think that checking whether a game uses ALL of the available range is not a good approach; instead I would look at a few different things (there's a rough sketch of how I'd measure them after the list):
- Peak light values (nits) that the game hits in any situation (not every frame should hit 100%)
- Take a couple of representative shots and compare how much of the image sits in the bottom 20% of the range, the middle 60%, and the top 20%
- How much, if at all, the contrast changes in a scene when comparing the SDR and HDR versions of the same image
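
Just to be concrete, here's a rough sketch (Python with numpy) of the kind of numbers I'd pull out. It assumes you've already captured matching frames as arrays, the HDR one as 10-bit ST 2084 / PQ code values and the SDR one as plain 8-bit values; the random frames below are just placeholders, and the capture step itself is left out.

```python
import numpy as np

def pq_to_nits(code_values_10bit):
    """Decode SMPTE ST 2084 (PQ) code values to absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = np.clip(code_values_10bit / 1023.0, 0.0, 1.0)
    p = np.power(e, 1.0 / m2)
    return 10000.0 * np.power(np.maximum(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1)

def range_split(normalised, low=0.2, high=0.8):
    """Fraction of pixels in the bottom 20%, middle 60% and top 20% of the signal range."""
    bottom = np.mean(normalised < low)
    top = np.mean(normalised > high)
    return bottom, 1.0 - bottom - top, top

# Placeholder frames -- swap in real captures of the same scene in both modes.
hdr_code = np.random.randint(0, 1024, (2160, 3840))  # 10-bit PQ-encoded luma
sdr_code = np.random.randint(0, 256, (1080, 1920))   # 8-bit luma

# Peak brightness and how much of the frame is actually up near that peak.
hdr_nits = pq_to_nits(hdr_code)
print(f"HDR peak: {hdr_nits.max():.0f} nits, "
      f"pixels above 90% of peak: {np.mean(hdr_nits > 0.9 * hdr_nits.max()):.2%}")

# Bottom/middle/top split plus a crude RMS-contrast proxy (std / mean of the
# normalised signal).
for name, frame in [("SDR", sdr_code / 255.0), ("HDR", hdr_code / 1023.0)]:
    b, m, t = range_split(frame)
    print(f"{name}: bottom 20% = {b:.1%}, middle 60% = {m:.1%}, top 20% = {t:.1%}, "
          f"RMS contrast = {np.std(frame) / np.mean(frame):.3f}")
```

The contrast number is only a crude proxy (really you'd linearise both versions the same way before comparing), but even that would show whether the Auto HDR pass is stretching the whole scene or just remapping the top end.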
But hell, I'm just a software engineer, albeit in the broadcast industry, and I used to work on a color grading application.
Also, while in film we can depend on most users having pro-level equipment, most console games end up being played on rather average TVs,
and accommodating for that probably brings a whole lot of other concerns.