Sony PS3 will be the first device to utilise the newly approved HDMI 1.3

Sony's PlayStation 3 will be the first device to support HDMI 1.3, which offers more than twice the bandwidth of HDMI 1.1 (from 4.95 Gbps to 10.2 Gbps).

This means that future HDMI 1.3 displays will be able to show billions of colours, rather than millions.

Audio support has also been extended to lossless formats such as Dolby TrueHD and DTS-HD Master Audio, rather than the compressed Dolby Digital and DTS formats used today.

Consumers who have already invested in HD technology will be pleased to note that products that implement HDMI 1.3 will be backward compatible with HDMI 1.1.
http://www.pocket-lint.co.uk/news.php?newsId=3735
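To put numbers on the "billions rather than millions" claim, here's a quick back-of-the-envelope count of distinct colours at each per-channel bit depth (a minimal Python sketch; HDMI 1.3's Deep Color modes are 10, 12, or 16 bits per channel, versus the 8 used today):

```python
# Back-of-the-envelope: distinct colours at various per-channel bit depths.
# HDMI 1.1 tops out at 8 bits per RGB channel; HDMI 1.3 Deep Color
# allows 10, 12, or 16 bits per channel.
for bits_per_channel in (8, 10, 12, 16):
    colours = 2 ** (bits_per_channel * 3)
    print(f"{bits_per_channel:2d} bits/channel -> {colours:,} colours")
```

8 bits per channel gives about 16.8 million colours; 10 bits already pushes past a billion, which is where the headline figure comes from.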
HDR lets us capture a scene in all its color and luminance-range richness. Unfortunately, current low-cost display technologies (common LCDs, CRTs, etc.) can't properly display an HDR image, so we need to go through a process called 'tone mapping' to remap our image to an LDR (Low Dynamic Range) image in order to display it on a screen.
http://psinext.e-mpire.com/index.ph...99&PHPSESSID=e3ce3f9dbf670442f485ee7b87de6f88
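Here's a minimal sketch of what that tone-mapping step can look like, using the common global Reinhard operator L / (1 + L). The exposure parameter and the sample luminances are made-up illustrations, not anything mandated by HDMI:

```python
def reinhard_tonemap(luminance, exposure=1.0):
    """Map an HDR luminance (0..inf) to an LDR value in 0..1.

    Global Reinhard operator: values far above 1.0 compress
    smoothly toward white instead of clipping.
    """
    scaled = luminance * exposure
    return scaled / (1.0 + scaled)

# Hypothetical scene luminances: deep shadow, mid grey, a sunlit wall, the sun.
for L in (0.01, 0.18, 5.0, 1000.0):
    print(f"HDR {L:8.2f} -> LDR {reinhard_tonemap(L):.3f}")
```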

Great stuff. One thing that's bugged me for years is that on screen, white ain't white, plus the lack of colour definition.
Question: how available are these displays today? And to people who have seen them in action side by side with a normal HDTV, do they look much more vibrant?
 
This has puzzled me a while. Let's assume Sony makes a new screen this fall and sells it with an HDMI 1.3 input. Is it good enough to simply skip tone mapping in games like Heavenly Sword, and display HDR in its native format?
Maybe someone here could help clarify such ideas? :p
 
I've seen info for one HDR display a few months ago - and IIRC the energy consumption of a single device is comparable to an apartment block (or something stupidly high like that).
And when you think about it, a true HDR display also has a bunch of safety issues (brightness levels that can actually damage the viewer's eyes). It's a good question whether a consumer device with such tech will ever be possible.
 
Fafalada said:
I've seen info for one HDR display a few months ago - and IIRC the energy consumption of a single device is comparable to an apartment block (or something stupidly high like that).
And when you think about it, a true HDR display also has a bunch of safety issues (brightness levels that can actually damage the viewer's eyes). It's a good question whether a consumer device with such tech will ever be possible.

Almost certainly not. However, as with tone mapping, it's possible to use much lower ranges of brightness to fool the eye. HDR displays probably do need to be able to output very bright pixels, but not so bright as to be actually blinding. What is needed, though, is a display that can not only output those bright pixels, but also do a true black pixel (no grey, and no accidental bleeding from nearby bright pixels) and accurately do thousands of levels in between... And I don't want to see any of these nasty prototype HDR displays that work by sticking an LCD panel in front of a bunch of addressable white LEDs... not until the LEDs are pixel-sized, anyway...

I would also suggest that perhaps the tone-mapping stage would move to the TV rather than the output device, allowing the TV to map appropriately to the current environment, display capabilities, and/or user preference.
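On those LCD-over-addressable-LED prototypes mentioned above: they work by dual modulation, where the delivered luminance is roughly the backlight level multiplied by the panel's transmittance, so the two contrast ranges multiply. A toy model (all figures assumed purely for illustration) also shows why the bleeding complaint is real:

```python
# Toy dual-modulation model: delivered luminance = backlight level x LCD
# transmittance, so the two contrast ranges multiply.
lcd_contrast = 1000.0        # assumed contrast of the LCD panel alone
backlight_contrast = 1000.0  # assumed dimming range of the LED backlight

print(f"Combined contrast ~ {lcd_contrast * backlight_contrast:,.0f}:1")

# The complaint above: each LED covers many pixels, so an LED driven to full
# for a nearby highlight leaks through an adjacent "black" pixel.
backlight = 1.0                        # LED at full for the highlight
black_leak = backlight / lcd_contrast  # what the "black" pixel actually shows
true_black = 1.0 / (lcd_contrast * backlight_contrast)
print(f"'Black' next to a highlight: {black_leak:.6f} vs ideal {true_black:.6f}")
```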
 
What's even funnier is that the new Sony X-Series Bravia HDTVs, the highest of the high-end range, don't even support HDMI 1.3... they're fully capable of taking 1080p through their two HDMI ports, and they are seriously amazing sets, but still no HDMI 1.3...
 
I have to say, I have no problem with TVs as they are. They do a pretty good approximation. There aren't many scenes I can look at with my eye that can't be represented realistically in 24-bit RGB. The problem in reproducing them lies on the capture side, where scenes have more contrast than the format can hold.
HDR TVs would let the viewer stop at a contrasty scene and stare into the darkness to try to see some extra detail, but the overall experience won't be much different from LDR, other than the ability to blind the viewer. Moving images won't permit the eye a great deal of time to adapt either. You'd just shift control of the scene's exposure from the cameraman to the viewer, which also eliminates the artistic use of exposure. If they could produce an HD (1080p) set with a full spectrum of colours and brightness, so black is pure black, that'll look as good as anyone will notice, I expect. HDR cameras with tone mapping down to 24-bit space would probably be the ideal solution, allowing exposure to be set across a wider gamut than existing film, mapped onto normal TVs so everyone benefits, and keeping the artistic qualities with the photographer.
I also wonder if any PS3 will ever have its HDMI 1.3 HDR used for anything. It seems the most ill-thought-out feature of them all: the ability to output a signal virtually no one will be able to use, with virtually no sources of HDR content. 1080p was pushing it! Or are Sony expected to make 1080p HDR games now too, for the three people rich enough to own an HDR set?
 
The first time I read HDR being used in relation to HDMI 1.3 was in this thread... ;) The idea isn't so much about supporting HDR displays as about taking the colour range beyond what can be perceived by the human eye, to eliminate colour banding etc. (although maybe that's all the same thing!). Though I don't know what you get out of that with a regular TV; presumably there is some benefit... at least you know the machine outputting the signal is not the weak link as far as that is concerned.

Also, Shifty, HDMI 1.3 brings more than just extra colour precision, if you're pondering the point of including it. I guess there's a little bit of "why not?" there too - if the new standard is available, they may as well include it.
 
It depends if it costs more. What will HDMI 1.3 enable in PS3 that HDMI 1.1 (or whatever they're on) can't, and does it warrant the extra dollars that I presume it costs?
 
Well, fairly significantly, it now supports a number of new audio formats, and I believe stuff like 1080p/60 as well. I don't know how much more it'll cost, but including it ought to make the PS3 that bit more attractive still for AV types etc. As far as I see it, the price is set now for launch, so including the best available AV output isn't a bad thing.
 
Shifty Geezer said:
It depends if it costs more. What will HDMI 1.3 enable in PS3 that HDMI 1.1 (or whatever they're on) can't, and does it warrant the extra dollars that I presume it costs?

Well, for one thing, HDMI 1.3 should allow SACD (or, as Sony brands it, DSD) over the one cable. There is no technical reason why HDMI 1.1 cannot support SACD (or 1080p either); it's just Sony being Sony.

Beyond the technical specs of HDMI 1.3, methinks the real reason is the increased DRM capabilities.
 
SACD is - AFAIK - simply standard stereo with no greater bit resolution than what ordinary SPDIF can support. I don't see why you'd need HDMI 1.3 specifically to pipe it over to your receiver/surround decoder; there should be enough room in the standard we already have to accomplish that.

Besides, the constant DRM references in regards to Sony are really tiring, and smack of fannishness. This is a technical forum; try keeping it that way...
 
londonboy said:
What's even funnier is that the new Sony X-Series Bravia HDTVs, the highest of the high-end range, don't even support HDMI 1.3... they're fully capable of taking 1080p through their two HDMI ports, and they are seriously amazing sets, but still no HDMI 1.3...

Not really that big of a deal, unless for some reason you need to pump DTS-HD or Dolby TrueHD into your TV... Otherwise it's more critical for your output device (PS3, BD deck, HD-DVD deck, etc.) to support it, and for your audio decoder (which would likely be in your receiver) to support it.

sumdumyunguy said:
Well, for one thing, HDMI 1.3 should allow SACD (or, as Sony brands it, DSD) over the one cable. There is no technical reason why HDMI 1.1 cannot support SACD (or 1080p either); it's just Sony being Sony.

DSD is the method by which the audio is digitized (as opposed to PCM) for SACD; it's not a "Sony brand". Most receivers simply use FireWire to either receive pure DSD streams into the audio receiver, or receive the DSD stream converted to 88.2 kHz PCM...

Guden Oden said:
SACD is - AFAIK - simply standard stereo with no greater bit resolution than what ordinary SPDIF can support. I don't see why you'd need HDMI 1.3 specifically to pipe it over to your receiver/surround decoder; there should be enough room in the standard we already have to accomplish that.

SPDIF truncates too much of the signal. Most players that push SACD audio over SPDIF convert the DSD stream to 88.2 kHz PCM and then rate-convert down to 44.1 kHz (16- or 20-bit). And this is only for stereo. For multi-channel SACD you either need to use the multi-channel analog interconnects, or have a player and receiver that support FireWire w/DTCP...
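Rough bitrate arithmetic backs that up: DSD is a 1-bit stream at 2.8224 MHz per channel, so even stereo DSD outgrows the payload consumer SPDIF is typically specced for. (The SPDIF ceiling below is an assumed stereo 24-bit/96 kHz PCM figure; some gear manages more.)

```python
# Why SPDIF can't carry SACD: DSD is a 1-bit stream at 64 x 44.1 kHz.
DSD_RATE_HZ = 64 * 44_100        # = 2,822,400 one-bit samples/s per channel
spdif_payload = 2 * 24 * 96_000  # assumed ceiling: stereo 24-bit/96 kHz PCM

for channels in (2, 6):
    needed = DSD_RATE_HZ * channels  # bits per second of raw DSD
    print(f"{channels}ch DSD: {needed / 1e6:5.2f} Mbps "
          f"(SPDIF payload ~ {spdif_payload / 1e6:.2f} Mbps)")
```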

Also, supposedly the BDP-S1000 should launch with HDMI 1.3 before the PS3 does...
 
RobertR1 said:
woohoo! billions of colors that the human eye can't even see! Thank you HDMI 1.3

The point is to move the range beyond the human eye's detection so that you also take away certain undesirable side effects of the more limited range, like banding. When you extend the range beyond what the eye can discern, you remedy that.
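A quick illustration of the banding point, assuming a worst case of a full black-to-white gradient stretched across a 1920-pixel-wide screen:

```python
# Banding: a full black-to-white gradient across a 1920-pixel-wide screen.
# With 2**bits levels, neighbouring pixels quantise to the same value and
# form visible bands; more bits shrink the bands below a pixel.
WIDTH_PX = 1920
for bits in (8, 10, 12):
    levels = 2 ** bits
    band_px = WIDTH_PX / levels
    print(f"{bits:2d}-bit: {levels:5d} levels -> bands ~{band_px:.2f} px wide")
```

At 8 bits the bands are 7.5 pixels wide and plainly visible; at 10 bits and beyond they drop to around a pixel or less.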
 
Although thanks to compression artefacts, they won't be got rid of! In fact, 24-bit is very capable of displaying pictures of fantastic quality without banding etc., if your display is calibrated properly and your source is of high enough quality. I can't help but feel HDR display is tech for technology's sake. I'd go for 24-bit 1080p (maybe SED or some other superior tech), 60 fps, and very high quality compression first and foremost, on massive storage if necessary, and only fiddle around with HDR once all that's been sorted out.
 
Fafalada said:
I've seen info for one HDR display a few months ago - and IIRC the energy consumption of a single device is comparable to an apartment block (or something stupidly high like that).
It's not that bad, more like 2 microwaves or a vacuum cleaner.
And when you think about it, a true HDR display also has a bunch of safety issues (brightness levels that can actually damage the viewer's eyes)
Reflected sunlight from white surfaces has far higher brightness than these displays.
It's a good question whether a consumer device with such tech will ever be possible.
If you are willing to sit in an otherwise completely dark room, a CRT can act as an HDR display ;)
 
Remember, Sony had that 82-inch xvYCC-compliant set at CES this year, so I think they definitely have it in mind and will be pushing it come this winter. Folk on the high end of the A/V set looking to make a significant TV purchase in the near future may want to hold off a little (I imagine until this winter) for compatible TVs...
 
That's not an HDR display.

Patents will make sure HDR displays will not take off anytime soon IMO.
 
No, it's not an HDR display... but it is the relevant 'real-world' beneficiary of HDMI 1.3.

How did this thread turn into a discussion of HDR displays in the first place?
 
The OP chose to focus on the HDR part of the standard, rather than the potential of using other color spaces... the first four posts all had the term in them, so I wouldn't say it turned into anything.
 