Sony PS3 will be the first device to utilise newly-approved HDMI 1.3

Ok, well, that's all good... as long as there's an understanding that I wasn't referring to HDR-capable displays when I was advising people to wait for xvYCC-compliant sets, should anyone be considering a $10,000 TV expenditure in the next couple of months. :cool:
 
MfA said:
It's not that bad, more like 2 microwaves or a vacuum cleaner.
Well, my vacuum's rated at 1900W, and when it's been going for less than 30 seconds I can clearly feel the room temperature rising. After a couple of minutes, it's several degrees hotter.

I can't imagine what it must be like after a 3.5-hour Lord of the Rings viewing (using the volcanic 360 as the playback device, naturally :)), nor can I imagine the electricity bill. :p
 
This means that future HDMI 1.3 displays will be able to show billions of colours, rather than millions.

Weeee! I was getting bored of the same old greens, yellows and purples. I wonder what the new ones will be called? Any guesses? Limson, blium, prillo?
 
MfA said:
The OP chose to focus on the HDR part of the standard, rather than the potential of using other color spaces ... the first 4 posts all had the term in them, so I wouldn't say it turned into anything.

Surely HDMI 1.3 simply offers higher precision over the same low dynamic range as everything else, rather than actually allowing HDR?

And in response to Shifty - while in many cases 24-bit colour is "fine", it's certainly not banding-free, even on a good analog display. Maybe this is another one of those things like 30/60 fps that many people either can't see, or think they can't see. The human eye is actually a lot better, and has a lot more tricks up its sleeve, than most people give it credit for.

And if you bear in mind that many modern displays are purely digital, and also that they manipulate the image before display, then the banding is going to be exaggerated.

The other things you want improved mostly apply to recorded/broadcast content, where bandwidth or capacity is the main problem. For recorded material, Blu-ray would actually fix most of them; HD-DVD probably not. Broadcast is going to take a while to catch up. Realtime content, on the other hand, doesn't get compressed and is probably already rendered at better than 8-bit in many cases, so a better display could benefit games right now.
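To make the banding point concrete, here's a quick Python sketch. The 0.8 gamma tweak is just an assumed stand-in for whatever a digital display might do to the picture, not any real panel's pipeline:

```python
import numpy as np

# A smooth horizontal ramp, quantised to 8 bits as it arrives in the signal.
ramp = np.linspace(0.0, 1.0, 1920)
signal_8bit = np.round(ramp * 255).astype(np.uint8)

# Hypothetical display-side adjustment (a mild gamma tweak), applied to the
# already-quantised values and re-quantised to the panel's 8-bit drive.
adjusted = (signal_8bit / 255.0) ** 0.8
panel_8bit = np.round(adjusted * 255).astype(np.uint8)

print("distinct levels in the source ramp:", len(np.unique(signal_8bit)))  # 256
print("distinct levels after processing  :", len(np.unique(panel_8bit)))   # fewer
# Some neighbouring input codes now collapse onto the same output code while
# others jump by two, so the steps get coarser and uneven - visible banding.
```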
 
Shifty Geezer said:
I'd go with 24-bit 1080p (maybe SED or some other superior tech), 60 fps, and very high-quality compression first and foremost, on massive storage if necessary, and only fiddle around with HDR once all that's been sorted out.

Hi All,

I really don't know about games, but you will never see 24-bit color encoding with HD-DVD or Blu-ray; all consumer movies are encoded at no more than 8 bits.

None of us shoots at more than 10-bit.

For what it's worth, only recently, with the introduction of the new HDCAM SR, was Sony able to give us a camera capable of 10-bit (all the previous HDCAM models are only 8-bit), and no one will scan film at more than 10-bit log in the foreseeable future.


Bye,
Ventresca.
 
Ventresca said:
Hi All,

I really don't know about games, but you will never see 24-bit color encoding with HD-DVD or Blu-ray; all consumer movies are encoded at no more than 8 bits.

None of us shoots at more than 10-bit.

For what it's worth, only recently, with the introduction of the new HDCAM SR, was Sony able to give us a camera capable of 10-bit (all the previous HDCAM models are only 8-bit), and no one will scan film at more than 10-bit log in the foreseeable future.


Bye,
Ventresca.

Isn't that per component, though? So when you read 24-bit here, I think you may be taking it as per component (that'd be insane) rather than the 8 bits per component it refers to.
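Just to pin the terminology down, a trivial sketch of the arithmetic involved:

```python
# "24-bit colour" on consumer gear means 8 bits per R/G/B component,
# i.e. 24 bits for the whole pixel.
bits_per_component = 8
components = 3

total_bits = bits_per_component * components                       # 24
print(f"{total_bits}-bit pixel -> {2 ** total_bits:,} colours")     # ~16.7 million

# The misreading would be 24 bits *per component*, i.e. a 72-bit pixel:
print(f"72-bit pixel -> {2 ** 72:,} colours (hence the '!!!')")
```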
 
xbdestroya said:
Isn't that per component, though? So when you read 24-bit here, I think you may be taking it as per component (that'd be insane) rather than the 8 bits per component it refers to.

Hi xbdestroya, you are right, and I apologize for the reading comprehension error; I thought he was saying he would wait for an SED display to enjoy 24-bit-per-component (72-bit!!!) color reproduction.


Bye,
Ventresca.
 
Nope. That'd be kinda hypocritical of me, when for the rest of the thread I've been moaning about extended colour ranges not providing much benefit and saying I think we'd be better off with other image-quality improvements in our displays ;)
 
Who said SED will display 72-bit colour? If anything, I would think it can display as much colour as a CRT can, seeing as the technology is based on the same principles...
Is there an article somewhere? Because that's news to me...
 
Ventresca said:
Hi xbdestroya, you are right, and I apologize for the reading comprehension error; I thought he was saying he would wait for an SED display to enjoy 24-bit-per-component (72-bit!!!) color reproduction.


Bye,
Ventresca.

Hey no problem. :)

If you're in the industry - and it seems you are - then I can certainly understand why you'd usually think in 'per component' terms. No doubt your insights will be much welcomed during the occasional A/V debates that break out here.
 
OT: Now that I have my new Nokia N80 with Wi-Fi, I can post even without being in front of a PC, which means I will post even more than I do now! Aren't you guys happy!!?
 
london-boy said:
OT: Now that I have my new Nokia N80 with Wi-Fi, I can post even without being in front of a PC, which means I will post even more than I do now! Aren't you guys happy!!?

:oops:
 
MrWibble said:
Surely HDMI1.3 simply offers a higher precision over the same low-dynamic range as everything else, rather than actually allow HDR?
The dynamic range is set by the display; the precision is all that matters ... it allows 16 bits per component, which is enough for HDR with the right colorspace.
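A rough Python sketch of that point, with the luminance range and the log transfer curve purely assumed for illustration (not the actual HDMI 1.3 or xvYCC encoding):

```python
import math

# Assumed numbers for illustration only.
min_nits, max_nits = 0.001, 10_000.0   # ~7 decades of luminance
codes = 2 ** 16                        # 16 bits per component

log_min, log_max = math.log10(min_nits), math.log10(max_nits)

def encode(nits):
    """Log-encode a luminance value into a 16-bit code."""
    t = (math.log10(nits) - log_min) / (log_max - log_min)
    return round(t * (codes - 1))

def decode(code):
    """Map a 16-bit code back to luminance in nits."""
    t = code / (codes - 1)
    return 10 ** (log_min + t * (log_max - log_min))

print("brightest code:", encode(max_nits))          # 65535

# Relative size of one code step (constant everywhere on a log curve):
step = decode(1) / decode(0) - 1.0
print(f"one code step is a ~{step * 100:.3f}% change in luminance")  # ~0.025%
# Comfortably below the roughly 1% difference the eye can pick out, across
# the whole range - which is the sense in which 16 bits plus the right
# colourspace is "enough" for HDR.
```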
 
MfA said:
Reflected sunlight from white surfaces has far higher brightness than these displays.
That isn't saying much - reflected sunlight from white surfaces can damage your eyes even without looking at it directly - try skiing for a few hours with no sunglasses if you want to experiment.
Thanks for the info though - do you have actual numbers for the brightness levels these displays can reach?
 
RobertR1 said:
woohoo! billions of colors that the human eye can't even see! Thank you HDMI 1.3

My snide remark was in relation to A/V specs and standards changing a stupid amount. It's always "this will be the greatest thing ever!!!", followed shortly by the next greatest thing ever.
 
Let us remember that the biggest bottleneck to getting the full benefit of "mere" 24-bit color is the heavy compression used on digital video (which essentially includes every widely used source of video to the consumer). The banding you see on SD or even HD program material is not a result of the 24-bit color set, but an artifact of the compression applied to the video in order to get it to you in the first place. If anybody has ever wondered how digital cable/satellite/OTA/optical disc can look as bad as 16-bit color, it's probably because that is the effective bit depth once it has been through compression. Bummer, right? So worrying about something like 30-bit color is not really going to produce better results until we can figure out how to broadcast/transmit/store it at significantly higher bitrates (less compression).
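Here's a toy Python illustration of the "effective bit depth" idea. This is a crude stand-in for a lossy codec's quantiser, not how any real encoder works:

```python
import math
import numpy as np

ramp = np.arange(256, dtype=np.float64)        # a full 8-bit grey ramp

# Pretend the codec's quantiser throws away everything but every 4th level
# (a crude proxy for heavy compression).
quant_step = 4
decoded = np.round(ramp / quant_step) * quant_step

levels = len(np.unique(decoded))
print(f"levels surviving 'compression': {levels} (~{math.log2(levels):.1f} effective bits)")
# Only ~6 effective bits are left, so pushing the *source* to 30-bit colour
# buys nothing until the bitrate (i.e. the quantiser) improves.
```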
 
Fafalada said:
That isn't saying much - reflected sunlight from white surfaces can damage your eyes even without looking at it directly - try skiing for a few hours with no sunglasses if you want to experiment.
There is just a tad more energy in the UV spectrum there than what you get from a display.
Thanks for the info though - do you have actual numbers for the brightness levels these displays can reach?
BrightSide has some specs on their displays on their site (one of them also mentioned that X-ray light boxes are rated at something like 4,000 cd/m2, which is more than these displays).
 
london-boy said:
OT: Now that I have my new Nokia N80 with Wi-Fi, I can post even without being in front of a PC, which means I will post even more than I do now! Aren't you guys happy!!?

Weren't you worried about being able to afford the PS3 or keeping up with gaming in general?

And now you have a phone that's over $600 in the US, and you're looking at Bravias? :LOL:

BTW, how do you type anything of length without a QWERTY keyboard on that thing?
 
Guden Oden said:
SACD is - AFAIK - simply standard stereo with no greater bit resolution than what ordinary SPDIF can support. I don't see why you'd need HDMI 1.3 specifically to pipe it over to your receiver/surround decoder; there should be enough room in the standard we already have to accomplish that.

Besides, the constant DRM references in regard to Sony are really tiring, and smack of fannishness. This is a technical forum; try keeping it that way...

If you mean 'stereo' in the sense that SACD is only 2-channel, it's not. Most SACD discs have a 5.1 mix on them as well, and I believe SPDIF would not have the bandwidth for those. Bandwidth aside, our implementation of SPDIF doesn't provide the digital content protection that Sony has been requiring for DSD.

randycat99 said:
Let us remember that the biggest bottleneck to getting the full benefit of "mere" 24-bit color is the heavy compression used on digital video (which essentially includes every widely used source of video to the consumer). The banding you see on SD or even HD program material is not a result of the 24-bit color set, but an artifact of the compression applied to the video in order to get it to you in the first place. If anybody has ever wondered how digital cable/satellite/OTA/optical disc can look as bad as 16-bit color, it's probably because that is the effective bit depth once it has been through compression. Bummer, right? So worrying about something like 30-bit color is not really going to produce better results until we can figure out how to broadcast/transmit/store it at significantly higher bitrates (less compression).

I'm not saying you're wrong, but I've had experience that says 24 vs 30 may at least be relevant. When calibrating my display using a DVD, there have been some test patterns with gradients that show a clear delineation between the different shades of grey when processed at 8 bits per channel. However, when I upgraded my video processor (via firmware) to 10 bits per channel, the gradients were MUCH smoother. I understand there's a lot in the chain here, but all other things being equal, 10-bit (30-bit) seemed to help.
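For anyone curious, here's a rough Python sketch of that kind of ramp test (the numbers are illustrative, not taken from any particular test disc or video processor):

```python
import numpy as np

width = 1920                              # one grey ramp across an HD screen
ramp = np.linspace(0.0, 1.0, width)

for bits in (8, 10):
    levels = 2 ** bits
    q = np.round(ramp * (levels - 1)) / (levels - 1)
    bands = len(np.unique(q))
    print(f"{bits}-bit ramp: {bands} distinct bands, each ~{width / bands:.1f} px wide")

# 8-bit  -> 256 bands about 7.5 px wide (visible as stripes up close);
# 10-bit -> ~1024 bands about 1.9 px wide, so the transition looks far smoother.
```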
 
expletive said:
I'm not saying you're wrong, but I've had experience that says 24 vs 30 may at least be relevant. When calibrating my display using a DVD, there have been some test patterns with gradients that show a clear delineation between the different shades of grey when processed at 8 bits per channel. However, when I upgraded my video processor (via firmware) to 10 bits per channel, the gradients were MUCH smoother. I understand there's a lot in the chain here, but all other things being equal, 10-bit (30-bit) seemed to help.

8 bits per color component is certainly not enough to do high-quality *processing* of the image. This is because, to process the image, you want the values to be in a linear space so you can easily do things like take the average of two colors for blending and the like. With linear intensity, 8 bits is not enough: you get banding in dark areas, and you can get other errors as well.

8 bits is probably good enough to store and transmit images, though, assuming you use those bits in gamma space.

If DVI/HDMI specifies that RGB values are supposed to be linear, then 8 bits per component is not enough and more bits are needed. If they are in gamma space, it's probably overkill, as long as any processing of the image happens at better precision.
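A small Python sketch of both claims, assuming a simple power-law gamma of 2.2 (real transfer curves such as sRGB differ a little near black):

```python
GAMMA = 2.2

def to_linear(code):        # 8-bit gamma-encoded value -> linear intensity 0..1
    return (code / 255.0) ** GAMMA

def to_code(linear):        # linear intensity 0..1 -> 8-bit gamma-encoded value
    return round((linear ** (1.0 / GAMMA)) * 255)

# 1) Blend 50% black with 50% white. Averaging the 8-bit codes directly gives
#    a result that is physically too dark; averaging the *light* is correct.
naive   = (0 + 255) // 2
correct = to_code((to_linear(0) + to_linear(255)) / 2)
print("code-space average  :", naive)     # 127 -> only ~22% of full brightness
print("linear-light average:", correct)   # 186 -> the true 50% grey

# 2) If linear intensity were stored in 8 bits instead, the step from code 1
#    to code 2 would be a 100% jump in light, while 254 -> 255 is only ~0.4%,
#    which is exactly the banding-in-dark-areas problem mentioned above.
lin = lambda code: code / 255.0
print("linear step near black:", (lin(2) - lin(1)) / lin(1))        # 1.0
print("linear step near white:", (lin(255) - lin(254)) / lin(254))  # ~0.0039
```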
 