PS3 Firmware 2.30

No bitstreaming due to the HDMI chip used. However, with TrueHD and now DTS-HD MA both internally decoded and sent out as PCM, who cares!? Good job Sony on making the PS3 even more of a kick-ass Blu-ray player!
 
Isn't PCM restricted to 16-bit? Or does it support 24-bit over HDMI?
 
According to Wikipedia, HDMI 1.3 supports up to 8-channel, 192 kHz, 24-bit-per-sample audio: http://en.wikipedia.org/wiki/HDMI

This Blu-ray thread claims that the PS3 can support the following:
2.0 16-bit/48 kHz LPCM = 1.5 Mbps (supported by optical)
5.1 16-bit/48 kHz LPCM = 4.6 Mbps (HDMI only)
5.1 24-bit/48 kHz LPCM = 6.9 Mbps (HDMI only)
7.1 24-bit/48 kHz LPCM = 9.2 Mbps (HDMI only)
7.1 24-bit/96 kHz LPCM = 18.4 Mbps (HDMI only)
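
For what it's worth, those figures line up with the plain bandwidth arithmetic (channels x bits per sample x sample rate). A quick sketch, purely illustrative, counting raw payload only and ignoring HDMI framing overhead:

```python
# Uncompressed LPCM bandwidth is just channels x bits per sample x sample rate.
# Raw payload only; HDMI framing/packet overhead is ignored.

def lpcm_mbps(channels, bits, sample_rate_hz):
    """Raw LPCM data rate in megabits per second."""
    return channels * bits * sample_rate_hz / 1_000_000

configs = [
    ("2.0 16/48",  2, 16, 48_000),    # ~1.5 Mbps
    ("5.1 16/48",  6, 16, 48_000),    # ~4.6 Mbps
    ("5.1 24/48",  6, 24, 48_000),    # ~6.9 Mbps
    ("7.1 24/48",  8, 24, 48_000),    # ~9.2 Mbps
    ("7.1 24/96",  8, 24, 96_000),    # ~18.4 Mbps
    ("7.1 24/192", 8, 24, 192_000),   # HDMI 1.3 ceiling per the spec above, ~36.9 Mbps
]

for label, ch, bits, rate in configs:
    print(f"{label}: {lpcm_mbps(ch, bits, rate):.1f} Mbps")
```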
 
This update will show what the SI 9132 / SI 9133 can really do with regard to DTS-HD output.

A year ago, most AV forums claimed the SI 9132 / SI 9133 didn't support DTS-HD output, so they concluded that the PS3 couldn't output DTS-HD (both HD Master Audio and HD High Resolution).

Now it should be clear that the PS3 supports all DTS-HD output.
 
The audio will be decoded internally on the PS3 - something we've known for a while it had the horsepower to do. It will be output as PCM though, just as all lossless audio over HDMI is on the PS3.
 
There were different opinions on whether those SI chips support DTS-HD output; from what I have read and learned, the only justified guess would be that bitstreaming of anything beyond standard DTS+AC3 isn't possible. Maybe someone can provide a Linux proof :)

But just getting DTS-MA removes much of the need for advanced codec bitstreaming, and it silences a lot of non-believers from the HD format war.
 
I'm not so sure; from what I remember, the informed posters were saying that those chips were not capable of outputting DTS-MA, and this has been all but confirmed by SI.

Hence any solution would have to rely on the PS3 decoding the DTS and passing it out as LPCM, which the chips can do, and that is the solution provided by this firmware.

I do think there were some people who were sure that LPCM wouldn't sound anywhere near as good as if a swanky AV amp had decoded it though.

Actually, thinking about it, there were probably hundreds of people from the HD-DVD forum saying the PS3 could never do DTS-MA :D But there were also hundreds of PS3 fans saying DTS-MA bitstream output would be coming. And in the middle there were a number of respected posters claiming what has transpired, and these were the people who had decent reasoning in their posts, and thus the ones I tended to take notice of.
 
These folk are confused though. The PCM audio being sent out of the PS3 is the same lossless stream as the DTS-MA itself; the only difference is which system decompresses it. Once it reaches the receiver, any benefits that would be realized by the receiver 'touching' the audio stream will still take place; it's simply that the decoding step will be skipped on that platform. Turning the DTS-MA into PCM is something where the result can have no inherent variance across any platform - think of it simply as WinZip or something. What's done with the audio feed after that occurs is all folk should care about.
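
To make the WinZip comparison concrete: with a lossless codec, whichever box does the decompressing, the output bytes are identical. The toy sketch below uses zlib as a stand-in for DTS-HD MA (which it obviously isn't; it just illustrates the lossless property):

```python
# Toy illustration of the "WinZip" point: lossless decompression is deterministic,
# so it doesn't matter which box performs it. zlib stands in for DTS-HD MA here;
# the real codec is different, but the lossless property is the same.
import hashlib
import zlib

original_pcm = bytes(range(256)) * 1000        # pretend this is the studio master PCM
on_disc = zlib.compress(original_pcm)          # the losslessly packed stream on the disc

decoded_by_player = zlib.decompress(on_disc)   # player decodes, sends LPCM over HDMI
decoded_by_receiver = zlib.decompress(on_disc) # receiver decodes the bitstream itself

# Both paths reconstruct bit-identical audio data.
assert decoded_by_player == decoded_by_receiver == original_pcm
print("decoded PCM is identical either way:",
      hashlib.sha256(decoded_by_player).hexdigest()[:16])
```

Any audible difference therefore has to come from what happens after that step (the receiver's post-processing, clocking, level handling), not from where the decode itself runs.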
 
Well, since I imagine you were originally responding to my own post before you wrote that, Gradthawn, I undeleted it to prompt whatever discussion.

Yeah, I was. :p

The variance comes in, based on reports from owners able to test it, with how receivers handle a PCM vs a bit-streamed signal. There are users who report better results with PCM input on their receivers, those who report the exact opposite, and those who say the only difference they can hear is a difference in audio levels (PCM vs bitstream). Reports seem pretty consistent across receivers, but there are some exceptions. This seems quite plausible to me. It reminds me of the 1080i vs 1080p debate: while a 1080p display receiving a 1080i signal should result in the same PQ as a 1080p signal, the reality is that not all TVs properly deinterlace 1080i signals, thus introducing various artifacts and other faults in the PQ of a 1080i signal. I could see the same situation happening with receivers.
 
Some people think that decoding HD audio to LPCM and streaming the LPCM introduces jitter, and that jitter makes the sound quality worse than bitstreaming. I don't take a stand on this as I don't understand the issue well enough, though I cannot comprehend where the end-user-hearable jitter would actually come from.
 
Umm, well even if it was decoded in the receiver, wouldn't you still have this jitter?

Anyways, there will be players soon enough that will send the bitstreams to receivers.
 
But to me that's a receiver issue entirely. Granted that's not saying that given a certain receiver, I wouldn't understand why its owner might prefer a bit-streamed signal, but at the same time I don't consider the inability to bit-stream out of the PS3 a 'failing' in any way, just as I don't consider the lack of IR a failing - these are simply use-case conveniences/perks. As the entirety of the decoding and streaming process for these lossless codecs on the PS3 remains within the digital realm, in my eyes the job is getting done and getting done as well as possible. And honestly, I'd probably prefer the PS3 to handle all decode operations given the choice, simply because as the 'center' of an arrangement, the PS3 is clearly the more dynamic/advanced component.
 
Here's a good explanation of jitter from Ron Jones over at AVS.

I agree that LPCM vs. bitstream sound quality has been over-discussed. It is technically possible to deliver equal performance, but some AVRs and preamp/processors don't. Passing LPCM via HDMI is known to introduce additional jitter (timing errors between bits), and many AVRs will pass these data timing errors on to the D-to-A converters, which can degrade sound quality. However, it is relatively easy to include a data buffer and reclocking of the LPCM data stream within the AVR, and if this is done then the level of jitter need be no more than in the case where an encoded bitstream is passed from the source device and the AVR is forced to provide the clocking for the decoded bit stream that it sends to the D-to-A converters. Thus there is no absolute answer as to which is better, since it depends on the specific hardware implementation in the AVR or preamp/processor. My own background is 30+ years as an electronics engineer dealing with digital communications systems and digital signal processing.
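
His buffering/reclocking point can be sketched in a few lines: if the AVR queues incoming samples in a FIFO and clocks them out to the D-to-A converters from its own stable oscillator, the arrival-timing errors never reach the DACs. The numbers below are made up and the model is deliberately simplistic; it only shows the mechanism:

```python
# Toy model of reclocking: LPCM samples arrive over HDMI with jittered timing,
# but a small FIFO plus the receiver's own stable clock drives the DAC, so the
# output interval stays constant as long as the buffer never runs dry.
import random
from collections import deque

SAMPLE_PERIOD_US = 20.8       # ~48 kHz sample period, in microseconds
ARRIVAL_JITTER_US = 5.0       # made-up link timing error
BUFFER_DEPTH = 64             # samples of headroom kept in the FIFO

random.seed(0)

# Samples arrive at the nominal rate plus a random timing error.
arrivals = []
t = 0.0
for _ in range(1000):
    t += SAMPLE_PERIOD_US + random.uniform(-ARRIVAL_JITTER_US, ARRIVAL_JITTER_US)
    arrivals.append(t)

fifo = deque()
dac_times = []
dac_clock = arrivals[BUFFER_DEPTH]     # start playback once the FIFO has headroom
for i in range(len(arrivals)):
    fifo.append(i)                     # sample i lands in the buffer as it arrives
    if i >= BUFFER_DEPTH:
        fifo.popleft()                 # DAC consumes one sample...
        dac_times.append(dac_clock)    # ...at a time set by the local clock,
        dac_clock += SAMPLE_PERIOD_US  # which is independent of arrival timing

def worst_interval_error(times):
    """Largest deviation of the sample-to-sample interval from nominal."""
    return max(abs((times[i] - times[i - 1]) - SAMPLE_PERIOD_US)
               for i in range(1, len(times)))

print(f"worst input interval error:  {worst_interval_error(arrivals):.2f} us")
print(f"worst output interval error: {worst_interval_error(dac_times):.2f} us")  # ~0
```

Whether a given AVR actually buffers and reclocks its LPCM input like this is exactly the implementation detail he says the answer depends on.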

Ah, I see your point now. I think I misunderstood your original post. What I was talking about is pretty much entirely receiver issues/variances, while it seems you were referring to the technical merits of bitstreaming vs. internal player decoding, independent of variables on the receiver's end.
 
I would really like to see an explanation of the user-hearable jitter... it would have to be in the tens-of-milliseconds range, and how does that happen in reality?

edit: Why doesn't that jitter occur with the decoded picture, which has much higher bandwidth requirements?

edit2: How would jitter happen if the client has a nice buffer to fight against lag? All jitter would be null & void before actually occurring.
 
People who want the ultimate fidelity will eventually opt for a high-end player which can pass the bitstream to some exotic hardware which will decode and convert the streams at a premium price.

PS3 is going to be good enough for the mass market.
 
Sarcasm is difficult, but I would assume blind tests would prove such systems to be only a waste of money.

There are threads on AVS Forum which try to prove that 640 kbps DD is as good as the original sound, and that it's not possible to tell DD apart from the original with a usual home theater setup. I hope they don't have jitter in that DD setup ;D lol.
 