128-bit HDR vs 64-bit HDR

Well, as I said, though, a "real-value" display (no tonemapping) would require very specific lighting conditions to look good. So I don't think it'd be practical for gaming.
 
corysama said:
Well... If nothing else, you could look at the file sizes. I don't know if the EXR formats use any sort of lossless compression internally, but if they don't, then the number of bytes in the file is going to be either 6 or 12 times the number of pixels in the image.

If it is significantly more than 6 bytes per pixel then it is probably a 32 bit per channel image.
The compression method is stored as an ordinal in the file, so there's no real easy way to tell how the file is encoded. OpenEXR does support both lossless and lossy compression. Whoever posts to ILM's mailing list says they use 16bpc stored in the PIZ format internally; that's a lossless Huffman+wavelet+zip format. Obviously that requires a bit of processor time to encode/decode, but it does do a pretty good job of shrinking the file.
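For anyone who'd rather check than guess, here's a rough Python sketch that digs the compression ordinal out of the header. It assumes a classic single-part scanline EXR (multi-part files lay the header out differently), "image.exr" is just a placeholder path, and the ordinal-to-name table follows the OpenEXR file layout documentation (4 is PIZ).

    import struct

    # Compression ordinals from the OpenEXR file layout docs; 4 is PIZ.
    COMPRESSION_NAMES = {0: "NONE", 1: "RLE", 2: "ZIPS", 3: "ZIP",
                         4: "PIZ", 5: "PXR24", 6: "B44", 7: "B44A"}

    def read_cstring(f):
        # Read a null-terminated byte string from the file.
        out = b""
        while True:
            c = f.read(1)
            if c in (b"", b"\x00"):
                return out
            out += c

    def exr_compression(path):
        # Return the compression name stored in a single-part EXR header,
        # or None if the attribute isn't found.
        with open(path, "rb") as f:
            magic, = struct.unpack("<i", f.read(4))
            if magic != 20000630:            # EXR magic number
                raise ValueError("not an EXR file")
            f.read(4)                        # version/flags word
            while True:
                name = read_cstring(f)
                if name == b"":              # empty name marks the end of the header
                    return None
                read_cstring(f)              # attribute type name (unused here)
                size, = struct.unpack("<i", f.read(4))
                value = f.read(size)
                if name == b"compression":
                    return COMPRESSION_NAMES.get(value[0], "unknown")

    print(exr_compression("image.exr"))      # placeholder filename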
 
OpenEXR (and the half, fp16 format) was originally designed as an archive format for film sequences.

s10e5 was in use by graphics hardware companies as a computational format independently of ILM choosing it for film storage.
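Since s10e5 keeps coming up: it's 1 sign bit, 5 exponent bits (bias 15), and 10 mantissa bits. Here's a tiny Python sketch of the decode, just to make the layout concrete (the function name is mine, not from any library):

    def decode_s10e5(bits):
        # 1 sign bit, 5 exponent bits (bias 15), 10 mantissa bits.
        sign = -1.0 if (bits >> 15) & 0x1 else 1.0
        exponent = (bits >> 10) & 0x1F
        mantissa = bits & 0x3FF
        if exponent == 0:                    # denormals and +/-0
            return sign * (mantissa / 1024.0) * 2.0 ** -14
        if exponent == 0x1F:                 # infinities and NaNs
            return sign * float("inf") if mantissa == 0 else float("nan")
        return sign * (1.0 + mantissa / 1024.0) * 2.0 ** (exponent - 15)

    print(decode_s10e5(0x3C00))              # 1.0
    print(decode_s10e5(0x7BFF))              # 65504.0, the largest finite half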
 
corysama said:
Most monitors have a contrast ratio somewhere in the range of 400-1000 to 1.

Maybe someday we won't even have to tone map down to LDR at all!
http://www.brightsidetech.com/
I've seen these in action. I want. I want badly.

Most monitors claim around 400:1, and that number isn't achievable even under optimal conditions. They stick the monitor in a bright room, turn the contrast all the way up, turn the brightness all the way up, then measure it. Then they stick it in a dark room, turn the contrast all the way down, turn the brightness all the way down, and measure it again. The two numbers aren't achievable at once. Furthermore, ambient lighting, even in good viewing conditions, brings the usable contrast down to about 100:1.
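To put some purely illustrative numbers on that last point: reflected room light adds to both the white level and the black level, and since the black level is tiny, even a little ambient light crushes the ratio. The figures below are made up but plausible.

    def usable_contrast(white, black, reflected_ambient):
        # Reflected ambient light (all values in cd/m^2) raises both levels,
        # but it hurts the black level far more in relative terms.
        return (white + reflected_ambient) / (black + reflected_ambient)

    # Hypothetical panel: 200 cd/m^2 white, 0.5 cd/m^2 black.
    print(usable_contrast(200.0, 0.5, 0.0))   # ~400:1 dark-room spec
    print(usable_contrast(200.0, 0.5, 1.5))   # ~100:1 with a bit of room light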

Glad you like the BrightSide monitor. You saw it at Siggraph this year, I assume?
 
Chalnoth said:
Well, as I said, though, a "real-value" display (no tonemapping) would require very specific lighting conditions to look good. So I don't think it'd be practical for gaming.

Yeah. For games that'd be fine. My point was just that you could lose detail in extreme cases of change between frames. That'd be unacceptable for some anal scientific applications, but perfectly fine for games.
 
db said:
s10e5 was in use by graphics hardware companies as a computational format independently of ILM choosing it for film storage.
Actually, looking back at my notes, we're both right. At a session I attended with ILM and nVidia, they said that they had been working on variants of the same thing and opted to reconcile the differences.
 
squarewithin said:
Yeah. For games that'd be fine. My point was just that you could lose detail in extreme cases of change between frames. That'd be unacceptable for some anal scientific applications, but perfectly fine for games.
I'm not sure what you mean. Did you quote the right thing?
 
squarewithin said:
Glad you like the BrightSide monitor. You saw it at Siggraph this year, I assume?

I missed Siggraph this year, so to be honest I've technically never seen "BrightSide". I have seen each of SunnyBrook's presentations over the past several years.

How many more years am I going to have to wait to get one of those portable windows on my desk? I don't care that there is no software to support it; I'll write my own! I just want one to experiment on.
 