HDR is back in a big way. Lots of new hardware and content is making a big deal out of it (including the consoles now). But what does it really mean?
Personally, I don't quite understand why a TV should need to know anything beyond how to take a 12-bit-per-channel video source and map it onto its full contrast range. Everything else that gives hints about the source material's 'HDR' surely just adds back information that went missing somewhere in the source material? So I could understand adding HDR metadata for a movie that was encoded with 8-bit precision, but otherwise I don't see the advantage. Meaning that if a GPU can render at 10-12 bits per colour and output that, games should be fine doing what they've always done?
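To make that concrete, here's a minimal sketch (in Python) of the mapping I have in mind, with made-up black/white luminance numbers purely for illustration: the display just stretches the 12-bit code range over whatever contrast it can actually produce, no extra metadata involved.

```python
# Minimal sketch of the naive model: treat a 12-bit code value as a fraction
# of the panel's contrast range and scale it to luminance.
# The black/white levels below are illustrative, not from any real spec.

def naive_map(code_value: int, black_nits: float = 0.05, white_nits: float = 1000.0) -> float:
    """Map a 12-bit code value (0..4095) linearly onto the panel's luminance range."""
    fraction = code_value / 4095.0          # normalize to [0, 1]
    return black_nits + fraction * (white_nits - black_nits)

# Example: a mid-range code value lands at roughly half the panel's peak brightness.
print(naive_map(2048))   # ~500 nits on a hypothetical 1000-nit panel
```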
Can anyone 'enlighten' me?