MfA said:It's not that bad, more like 2 microwaves or a vacuum cleaner.
Well, my vacuum's rated at 1900W, and when it's been going for less than 30 seconds I can clearly feel the room temperature rising. After a couple of minutes, it's several degrees hotter.
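For what it's worth, a back-of-the-envelope sketch of whether a couple of minutes at 1900W really can shift room temperature by a few degrees; the room size, air properties and the no-heat-loss assumption below are illustrative guesses, not anything from the post:

```python
# Back-of-the-envelope check on heating a room with a 1900W vacuum cleaner.
# Assumptions (mine): ~40 m^3 of air, all power ends up as heat in the air,
# nothing is lost to walls, floor or furniture.
power_w = 1900            # rating quoted in the post
room_volume_m3 = 40       # roughly a 4 m x 4 m x 2.5 m room
air_density = 1.2         # kg/m^3
air_heat_capacity = 1005  # J/(kg*K)
seconds = 120             # "a couple minutes"

air_mass_kg = room_volume_m3 * air_density
delta_t = power_w * seconds / (air_mass_kg * air_heat_capacity)
print(f"~{delta_t:.1f} K rise after {seconds} s")   # about 4-5 K under these idealised assumptions
```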
This means that future HDMI 1.3 displays will be able to show billions of colours, rather than millions.
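The jump from millions to billions is just per-component arithmetic; a quick sketch (the 10/12/16-bit entries correspond to the deeper colour modes HDMI 1.3 adds, the rest of the framing is mine):

```python
# Arithmetic behind "millions" vs "billions" of colours.
for bits_per_component in (8, 10, 12, 16):
    total_bits = 3 * bits_per_component          # three components: R, G, B
    colours = 2 ** total_bits
    print(f"{bits_per_component:>2} bits/component -> {total_bits}-bit colour, {colours:,} colours")
# 8  -> 24-bit colour, ~16.8 million
# 10 -> 30-bit colour, ~1.07 billion
# 12 -> 36-bit colour, ~68.7 billion
# 16 -> 48-bit colour, ~2.8e14
```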
MfA said:The op chose to focus on the HDR part of the standard, rather than the potential of using other color spaces ... the first 4 posts all had the term in them, so I wouldn't say it turned into anything.
Shifty Geezer said:I'd go with 24 bit 1080p (maybe SED or some other superior tech), 60 fps, very high quality compression first and foremost on massive storage if necessary, and only fiddle around with HDR once all that's been sorted out.
Ventresca said:Hi All,
I really don't know about games, but you will never see 24-bit color encoding with HD DVD or Blu-ray; all consumer movies are encoded at no more than 8-bit.
None of us shoot at more than 10-bit.
For that matter, only recently, with the introduction of the new HDCAM SR, was Sony able to give us a camera capable of 10-bit (all the previous HDCAM models are only 8-bit), and no one will scan at more than 10-bit log from film in the foreseeable future.
Bye,
Ventresca.
xbdestroya said:Isn't that per component though? So when you read 24-bit here, I think you may be taking it as 24 bits per component (that'd be insane) rather than the 8 bits per component it refers to.
Ventresca said:Hi xbdestroya, you are right and I apologize for the reading comprehension error; I thought he was saying he would wait for an SED display to enjoy 24-bit-per-component (72-bit!!!) color reproduction.
Bye,
Ventresca.
london-boy said:OT: Now that i have my new Nokia N80 with wi-fi, i can post even without being in front of a PC, which means i will post even more than i do now! Aren't u guys happy!!?
MrWibble said:Surely HDMI 1.3 simply offers higher precision over the same low dynamic range as everything else, rather than actually allowing HDR?
The dynamic range is set by the display; the precision is all that matters ... it allows 16 bits per component, which is enough for HDR with the right colorspace.
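A rough sketch of the precision-vs-range point: with a log-style encoding, 16 bits per component spans far more dynamic range than any display can show; the 1024-codes-per-stop figure below is an arbitrary illustration, not something taken from the HDMI spec:

```python
import math

# How many "stops" (doublings of brightness) different encodings can span.
linear_8bit_stops = math.log2(255 / 1)       # brightest vs darkest non-zero code, ~8 stops
linear_16bit_stops = math.log2(65535 / 1)    # ~16 stops

codes_per_stop = 1024                        # assumed precision budget per stop for a log curve
log_16bit_stops = 65536 / codes_per_stop     # 64 stops of range from 16-bit codes

print(f"8-bit linear : ~{linear_8bit_stops:.1f} stops")
print(f"16-bit linear: ~{linear_16bit_stops:.1f} stops")
print(f"16-bit log   : {log_16bit_stops:.0f} stops at {codes_per_stop} codes per stop")
```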
MfA said:Reflected sunlight from white surfaces has far higher brightness than these displays.
That isn't saying much - reflected sunlight from white surfaces can damage your eyes even without looking at it directly - try skiing a few hours with no sunglasses if you want to experiment.
RobertR1 said:woohoo! billions of colors that the human eye can't even see! Thank you HDMI 1.3
Fafalada said:That isn't saying much - reflected sunlight from white surfaces can damage your eyes even without looking at it directly - try skiing a few hours with no sunglasses if you want to experiment.
There is just a tad more energy in the UV spectrum there than what you get from a display.
Thanks for the info though - do you have actual numbers for brightness levels these displays can reach?
BrightSide has some specs on their displays on site (one of them also mentioned that X-ray light boxes are rated at something like 4000 cd/m2, which is more than these displays).
Guden Oden said:SACD is - AFAIK - simply standard stereo with no greater bit resolution than what ordinary SPDIF can support. I don't see why you'd need HDMI 1.3 specifically to pipe it over to your receiver/surround decoder; there should be enough room in the standard we already have to accomplish that.
Besides, the constant DRM references in regard to Sony are really tiring, and they smack of fannishness. This is a technical forum; try keeping it that way...
randycat99 said:Let us remember that the biggest bottleneck to getting the full benefit of "mere" 24-bit color is the high-compression codecs used on digital video (which essentially includes every widely used source of video delivered to the consumer). The banding you see on SD or even HD program material is not a result of the 24-bit color set, but an artifact of the compression applied to the video in order to get it to you in the first place. If anybody has ever wondered how digital cable/satellite/OTA/optical disc can look as bad as 16-bit color, it's probably because that is the effective bit depth once it has been through compression. Bummer, right? So worrying about something like 30-bit color is not really going to produce better results until we can figure out how to broadcast/transmit/store it at significantly higher bitrates (less compression).
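A toy illustration of the "effective bit depth" point; quantising a smooth ramp to fewer levels stands in for what heavy compression does to gradients, and the ramp width and bit depths below are arbitrary choices, not measurements of any real codec:

```python
import numpy as np

# Quantise an ideal gradient and count how many distinct shades survive;
# fewer distinct shades is exactly what shows up on screen as banding.
ramp = np.linspace(0.0, 1.0, 1920)           # a smooth horizontal gradient

def quantize(signal, bits):
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

for bits in (8, 6, 5):                       # nominal 8-bit source vs lower "effective" depths
    shades = len(np.unique(quantize(ramp, bits)))
    print(f"{bits}-bit effective depth: {shades} distinct shades across the ramp")
```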
expletive said:I'm not saying you're wrong, but I've had experience that suggests 24-bit vs 30-bit may at least be relevant. When calibrating my display using a DVD, there were some test patterns with gradients that showed a clear delineation between the different shades of gray when processed at 8 bits per channel. However, when I upgraded my video processor (via firmware) to 10 bits per channel, the gradients were MUCH smoother. I understand there's a lot in the chain here, but all other things being equal, 10-bit (30-bit) seemed to help.
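Rough numbers behind that observation, assuming a full black-to-white ramp stretched across a 1920-pixel-wide screen (my assumption, not something from the test disc):

```python
# Width of each visible band in a full-range gray ramp at 8 vs 10 bits per channel.
width_px = 1920
for bits in (8, 10):
    steps = 2 ** bits
    band_px = width_px / steps
    print(f"{bits}-bit: {steps} shades, each band ~{band_px:.1f} px wide")
# 8-bit : 256 shades, bands ~7.5 px wide -> visible stair-stepping
# 10-bit: 1024 shades, bands ~1.9 px wide -> far harder to see
```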