More RSX tidbits:

Uhm, nope. The physical screen in LCDs is 8-bit per channel, as in the physical liquid crystals are only capable of that much. Last time I checked, of course. It might have gone up a bit, I heard it would eventually, to 12-bit, but I'm not sure that has happened yet?
Numerous times, in order to get the response times lower (to avoid the motion blur LCDs have problems with), they use 6-bit panels, which is silly because the sets still have motion blur, only now with even fewer colours displayed.
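As a rough illustration of how a 6-bit panel fakes the missing shades, here is a minimal sketch of temporal dithering (FRC, frame rate control); the 4-frame cycle and the exact scheme are illustrative assumptions, not any vendor's actual algorithm:

```python
def frc_frames(value_8bit, num_frames=4):
    """Approximate an 8-bit level on a 6-bit panel by alternating
    between the two nearest 6-bit levels over several frames (FRC)."""
    lo = value_8bit >> 2        # nearest 6-bit level below (0..63)
    frac = value_8bit & 0b11    # remainder decides how often to show lo+1
    # Show the higher level in `frac` of the 4 frames, the lower in the rest;
    # the eye averages the flicker into an in-between shade.
    return [lo + (1 if i < frac else 0) for i in range(num_frames)]

# 8-bit level 130 sits between 6-bit levels 32 and 33:
print(frc_frames(130))   # [33, 33, 32, 32] -> averages to 32.5 (i.e. 130/4)
```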


So you are saying that these guys with a 360 that play a game with FP10 (aka 40bit HDR) are only getting 8 bits of that HDR? Confused I am.
 
Panels vary in bit depth. You'll probably find that many of the panels in laptops are as low as 6-bit per channel. However, there are very high-end panels that go up to 16 bits per channel; Samsung's higher-end LCD HDTVs are 10-bit.

Edit - When talking about this in relation to HDR, it's fairly immaterial as there is a pass that takes it down to 8-bit anyway. Technically, under Vista and on ATI's Avivo pipeline it would be possible to do a tone mapping pass directly in the video pipeline in order to get a 10-bit output for panels that can accept it.
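To make that tone-mapping pass concrete, here is a minimal sketch; the Reinhard-style operator, the gamma of 2.2 and the exposure parameter are assumptions for illustration, not what the Avivo pipeline actually implements:

```python
import numpy as np

def tonemap_to_8bit(hdr, exposure=1.0):
    """Compress [0, inf) HDR radiance into [0, 1) with a simple global
    Reinhard curve, apply display gamma, then quantise to 8 bits."""
    x = hdr.astype(np.float32) * exposure
    ldr = x / (1.0 + x)              # Reinhard: never quite reaches 1.0
    ldr = ldr ** (1.0 / 2.2)         # rough display gamma
    return np.clip(ldr * 255.0 + 0.5, 0, 255).astype(np.uint8)

hdr_pixel = np.array([0.05, 1.0, 16.0], dtype=np.float16)  # wide-range input
print(tonemap_to_8bit(hdr_pixel))    # -> [ 64 186 248 ]
```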
 
Panels vary in bit depth. You'll probably find that many of the panels in laptops are as low as 6-bit per channel. However, there are very high-end panels that go up to 16 bits per channel; Samsung's higher-end LCD HDTVs are 10-bit.


Yes, the man is back. :smile:

And yes again to me getting a new high-end Samsung LCD TV this month. I will get to see better HDR than you guys. Naaa na na na naa naaaaaaaa.:p
 
Also contrast ratio... most high-end displays show a MAX of 7000:1 contrast ratio (XBR3). Your eye can notice contrast WELL in excess of that level... in the hundreds of thousands...

LED-backlight displays will allow that to jump to 10000:1 by next year... still a far cry from a real HDR display, where you get contrast ratios of 100000:1
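Those ratios are easier to compare on a log scale: each doubling of contrast is one photographic stop, so the quoted figures work out as follows (a quick check, taking the numbers at face value):

```python
import math

def stops(contrast_ratio):
    """Dynamic range in stops (doublings of luminance)."""
    return math.log2(contrast_ratio)

for label, ratio in [("XBR3-class LCD", 7_000),
                     ("LED backlight (claimed)", 10_000),
                     ("BrightSide-class HDR display", 100_000)]:
    print(f"{label}: {ratio}:1 ~ {stops(ratio):.1f} stops")
# 7000:1 ~ 12.8 stops, 10000:1 ~ 13.3 stops, 100000:1 ~ 16.6 stops
```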
 
The key word here is tone mapping.

You won't see "better HDR" on your Samsung HDTV; you'll probably just see more accurate colours. That's as long as the TV's internal processing doesn't mess the signal up too much, which I won't comment on at the moment, but needless to say, judging by their previous efforts, it could be better than other brands.
 
From a PS3 or 360 you won't see more "shades of colour" than 8-bit regardless of the panel, as I doubt either of them has a greater than 8-bit video pipeline.
 
Also contrast ratio... most high-end displays show a MAX of 7000:1 contrast ratio (XBR3). Your eye can notice contrast WELL in excess of that level... in the hundreds of thousands...

LED-backlight displays will allow that to jump to 10000:1 by next year... still a far cry from a real HDR display, where you get contrast ratios of 100000:1

I should add to that: an LED backlight (like the new Samsungs) won't do that, as it's just like a normal LCD display, only with LEDs in place of the normal tube lights - the LEDs are always on, meaning the blacks still won't be pure black because some light will come through, and obviously the contrast ratio will suffer.

Only a dynamic LED backlight (or whatever the tech shown by those BrightSide guys is called, those HDR displays we have talked about on here) will achieve those amazing contrast ratios and black levels, because the LED array will turn off in areas of black, giving you a perfect black.

The new LED-backlight Samsungs don't have dynamic LED backlights, only a uniform backlight (like all other LCDs) which just happens to use white LEDs instead of the normal LCD backlight tubes we've been getting all these years.
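For what it's worth, here is a toy sketch of that "dynamic LED backlight" idea: the backlight in each zone follows the brightest pixel there (so all-black zones switch fully off), and the LCD layer compensates. The zone size and compensation scheme are made up for illustration, not BrightSide's actual method:

```python
import numpy as np

def local_dimming(image, zone=8):
    """image: 2D luminance array in [0, 1].
    Returns per-pixel backlight level and LCD transmittance such that
    backlight * lcd reproduces the image."""
    h, w = image.shape
    backlight = np.zeros_like(image)
    for y in range(0, h, zone):
        for x in range(0, w, zone):
            block = image[y:y+zone, x:x+zone]
            backlight[y:y+zone, x:x+zone] = block.max()  # zone LED level
    # The LCD opens just enough to recreate the original pixel values.
    lcd = np.where(backlight > 0, image / np.maximum(backlight, 1e-6), 0.0)
    return backlight, lcd

img = np.zeros((16, 16)); img[4:8, 4:8] = 1.0   # bright patch on black
bl, _ = local_dimming(img)
print(bl.min(), bl.max())   # 0.0 in the all-black zones -> true black
```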
 
From a PS3 or 360 you won't see more "shades of colour" than 8-bit regardless of the panel, as I doubt either of them has a greater than 8-bit video pipeline.

True, all these silly 128 bits of colour will always be converted to manageable RGB colour at much lower precision for output to TVs.

As I said, mckmas, all these very-high-precision figures only help with banding. In the end they provide a more precise approximation of the end result (which is the 8-bit RGB signal or whatever it is). The higher the precision of rendering, the more accurate the output colours are and the less banding you get. The problem is that after a certain point we just won't be seeing the difference on our TVs.
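A quick way to see the banding point: push a dim gradient through an 8-bit intermediate versus keeping it in float, then count the distinct levels in the final 8-bit output (the 8x brighten step is an arbitrary stand-in for whatever maths happens later in the pipeline):

```python
import numpy as np

gradient = np.linspace(0.0, 0.03, 1000)   # dim ramp, brightened 8x later

def output_levels(x):
    """Distinct levels after the final quantisation to 8 bits."""
    return len(np.unique(np.round(np.clip(x, 0, 1) * 255)))

# Path A: quantise to 8 bits mid-pipeline, then brighten -> coarse bands.
mid_8bit = np.round(gradient * 255) / 255
print(output_levels(mid_8bit * 8))   # 9 levels: visible banding

# Path B: stay in float until the end -> a much smoother ramp.
print(output_levels(gradient * 8))   # 62 levels
```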
 
So what's the point of HDMI 1.3 in the PS3?

Not much; more future-proofing, and yet another bullet point to brag about?

HDTVs capable of displaying what HDMI 1.3 can carry (is it 48-bit? can't remember) won't come out for a while, and they will be the highest end of HDTVs for a very long time.

Mainly I think it's for Blu-ray more than for games.

I mean, this isn't even at 1080p levels of acceptance. And in comparison, 1080p is very old news and has a huge installed userbase. That's how important HDMI 1.3 is.
 
40bit HDR = fp10 <- special case for Xenos
Incorrect. FP10 is 10+10+10+2 bits (2-bit alpha), i.e. 32 bits total, not 40.
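For reference, that 10:10:10:2 layout fits in one 32-bit word, which is why the "40-bit" figure doesn't add up. A sketch of the packing (the field order here is an assumption, and on Xenos each 10-bit field is actually a small float format rather than a plain integer):

```python
def pack_1010102(r, g, b, a):
    """Pack three 10-bit channel fields and a 2-bit alpha into 32 bits."""
    assert all(0 <= c < 1024 for c in (r, g, b)) and 0 <= a < 4
    return (a << 30) | (b << 20) | (g << 10) | r

def unpack_1010102(word):
    return (word & 0x3FF, (word >> 10) & 0x3FF,
            (word >> 20) & 0x3FF, (word >> 30) & 0x3)

w = pack_1010102(1023, 512, 0, 3)
print(hex(w), unpack_1010102(w))   # 0xc00803ff (1023, 512, 0, 3)
```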

Dave B said:
From a PS3 or 360 you won't see more "shades of colour" than 8-bit regardless of the panel, as I doubt either of them has a greater than 8-bit video pipeline.
Given Sony were trumpeting PS3's HDMI 1.3 high colour out, what's the possibility of 14-bit (or whatever it is) per channel output being possible, once sets become available that support it? Given the other high-display-tech pursuits, I wouldn't be surprised.
 
Given Sony were trumpeting PS3's HDMI 1.3 high colour out, what's the possibility of 14-bit (or whatever it is) per channel output being possible, once sets become available that support it? Given the other high-display-tech pursuits, I wouldn't be surprised.

I guess we'll start worrying about that when we actually get some TVs able to take that signal and display it properly... :D
 
I guess we'll start worrying about that when we actually get some TVs able to take that signal and display it properly... :D
The Japanese model of the Bravia X2500, which supports the xvYCC color space and (partial) HDMI 1.3, is already out, but it's over 40 inches @ $3,700.
http://www.cinenow.com/uk/news-2150.html
http://www.sony.jp/CorporateCruise/Press/200608/06-0830B/
http://www.watch.impress.co.jp/av/docs/20060830/sony2.htm

http://www.hdmi.org/resourcecenter/hdmi_1_3_faq.asp
Q: What is “xvYCC”?
HDMI 1.3 adopts use of the IEC 61966-2-4 color standard, commonly called xvYCC (shorthand for Extended YCC Colorimetry for Video Applications). This new standard can support 1.8 times as many colors as existing HDTV signals. xvYCC lets HDTVs display colors more accurately, enabling displays with more natural, vivid colors.

Q: What is the difference between “Deep Color” and “xvYCC?”
Deep Color increases the number of available colors within the boundaries defined by the RGB or YCbCr color space, while xvYCC expands the available range (limits) to allow the display of colors that meet and exceed what human eyes can recognize.

Q: When will products with HDMI 1.3 capabilities be available to the public?
Products using HDMI 1.3 capabilities are expected to become available this year starting with the PS3. Displays, DVDs and A/V Receivers are expected to ship early in 2007.

Let's not forget: how does this relate to the RSX?
It's RSX that has an HDMI interface.
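To put the FAQ's Deep Color / xvYCC distinction in concrete terms: xvYCC keeps the same YCbCr encoding but allows code values whose RGB equivalents fall outside the usual [0, 1] range, i.e. outside the existing gamut. A sketch using the standard BT.709 conversion (the sample triple is an arbitrary illustration):

```python
def ycbcr709_to_rgb(y, cb, cr):
    """BT.709 YCbCr -> RGB, with Y in [0, 1] and Cb/Cr in [-0.5, 0.5]."""
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return r, g, b

# A YCbCr triple whose RGB equivalent leaves [0, 1]:
print(ycbcr709_to_rgb(0.5, -0.4, 0.45))
# -> (1.21, 0.36, -0.24): R > 1 and B < 0, so the colour lies outside the
#    normal gamut; xvYCC can carry it, plain BT.709 RGB would clip it.
```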
 
Well, some of the work by Barry Minor of IBM utilizes 128-bit HDR.

Comment by Barry Minor — December 2, 2005 @ 12:21 pm
Juice,

I used a UP Cell bringup system.
3.2 GHz DD3.1 Cell processor with 8 good SPEs, 512 MB XDR Memory, 100 Mb network.
Pixels were 128 bit, 32 bit float per color channel.
All rendering parameters were the defaults set by the Cg program with the exception of the window size which was increased to 1024×1024.

http://gametomorrow.com/blog/index.php/2005/11/30/gpus-vs-cell/
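For context, "128-bit pixels, 32-bit float per color channel" simply means four float32 values (RGBA) per pixel; at the quoted 1024x1024 window that is a 16 MiB buffer:

```python
import numpy as np

framebuffer = np.zeros((1024, 1024, 4), dtype=np.float32)   # RGBA
print(framebuffer.itemsize * 4, "bytes per pixel")   # 16 bytes = 128 bits
print(framebuffer.nbytes / 2**20, "MiB total")       # 16.0 MiB
```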
 
Q: What is “xvYCC”?
HDMI 1.3 adopts use of the IEC 61966-2-4 color standard, commonly called xvYCC (shorthand for Extended YCC Colorimetry for Video Applications). This new standard can support 1.8 times as many colors as existing HDTV signals. xvYCC lets HDTVs display colors more accurately, enabling displays with more natural, vivid colors.

Q: What is the difference between “Deep Color” and “xvYCC”?
Deep Color increases the number of available colors within the boundaries defined by the RGB or YCbCr color space, while xvYCC expands the available range (limits) to allow the display of colors that meet and exceed what human eyes can recognize.
1.8 times as many colours means not even 9 bits per channel (quick arithmetic check below)! Any idea what they really mean by this? Are they extending the colour gamut to accommodate more hues? For example, oranges in the TV space are pretty rubbish, nothing like the intensity of real, natural orange colours (or even artificial pigments). Green and red combined just don't cut it. It seems to me TVs would benefit from a fourth or fifth colour component too, like photo printers or Fujifilm's fourth colour layer. I don't know if that's something intended for this xvYCC, but it would benefit from better colour representation, I think, if people want to go that way.

Having said that, HDTVs look pretty awesome in their renderings from what little I've seen, so it's hardly something the general populace would be clamouring for!
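On the "not even 9 bits per channel" point above, the naive arithmetic (reading the FAQ's 1.8x figure as a count of code values, which, given it describes a gamut extension, is probably not what they mean) works out like this:

```python
import math

base = 2 ** 24                     # 8 bits x 3 channels of code values
extended = 1.8 * base              # "1.8 times as many colors"
print(math.log2(extended))         # ~24.85 bits in total
print(math.log2(extended) / 3)     # ~8.28 bits per channel: under 9
```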
 
Having said that, HDTVs look pretty awesome in their renderings from what little I've seen, so it's hardly something the general populace would be clamouring for!
Well in this (somewhat exaggerated) demo they say Mach banding is reduced by it.
http://pc.watch.impress.co.jp/docs/2006/0714/hdmi.htm


Not convinced that it does, as my understanding is that the HDMI output is produced via a Silicon Image chip.
Oh I remember that discussion :smile: Then it's the SI chip, not RSX! But in an old demo at CES it said RSX does video processing, which I think suggests it does something to the data to be fed into that chip when playing movies.
 