No bitstreaming due to the HDMI chip used. However, with TrueHD and now DTS-HD MA both internally decoded and sent out via PCM, who cares!? Good job, Sony, on making the PS3 even more of a kick-ass Blu-ray player!
Isn't PCM restricted to 16-bit? Or does it support 24-bit over HDMI?
According to Wikipedia, HDMI 1.3 supports up to 8-channel, 192 kHz, 24-bits-per-sample audio: http://en.wikipedia.org/wiki/HDMI
This Blu-ray thread claims that the PS3 can support the following (the arithmetic is easy to verify; see the sketch after the list):
2.0 16-bit/48 kHz LPCM = 1.5 Mbps (supported by optical)
5.1 16-bit/48 kHz LPCM = 4.6 Mbps (HDMI only)
5.1 24-bit/48 kHz LPCM = 6.9 Mbps (HDMI only)
7.1 24-bit/48 kHz LPCM = 9.2 Mbps (HDMI only)
7.1 24-bit/96 kHz LPCM = 18.4 Mbps (HDMI only)
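For anyone who wants to double-check those figures: raw LPCM bandwidth is just channels × bit depth × sample rate. Here's a quick sanity-check sketch in Python (my own, not from the thread):

```python
# Raw (uncompressed) LPCM bitrate: channels x bit depth x sample rate.
def lpcm_bitrate_mbps(channels: int, bits: int, sample_rate_hz: int) -> float:
    """Return the raw LPCM bitrate in megabits per second."""
    return channels * bits * sample_rate_hz / 1_000_000

configs = [
    (2, 16, 48_000),   # 2.0 16/48  -> ~1.5 Mbps
    (6, 16, 48_000),   # 5.1 16/48  -> ~4.6 Mbps
    (6, 24, 48_000),   # 5.1 24/48  -> ~6.9 Mbps
    (8, 24, 48_000),   # 7.1 24/48  -> ~9.2 Mbps
    (8, 24, 96_000),   # 7.1 24/96  -> ~18.4 Mbps
]

for ch, bits, rate in configs:
    print(f"{ch}ch {bits}-bit {rate // 1000} kHz: "
          f"{lpcm_bitrate_mbps(ch, bits, rate):.1f} Mbps")
```

Each result rounds to the Mbps figure quoted above, so the list checks out.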
This update will prove what the SI 9132 / SI 9133 HDMI chips can actually do with DTS-HD output. A year ago, most AV forums claimed the SI 9132 / SI 9133 didn't support DTS-HD output, and so they concluded that the PS3 couldn't output DTS-HD (both Master Audio and High Resolution). Now it's clear the PS3 supports all DTS-HD output.
I do think there were some people who were sure that LPCM wouldn't sound anywhere near as good as it would if a swanky AV amp had decoded it, though.
Well, since I imagine you were originally responding to my own post before you wrote that, Gradthawn, I undeleted it to prompt further discussion.
These folk are confused, though. The PCM audio being sent out of the PS3 is the same lossless stream as the DTS-MA itself; the only difference is which system decompresses it. Once it reaches the receiver, any benefits that would be realized by the receiver 'touching' the audio stream still take place; it's simply that the decoding step is skipped on that platform. Turning DTS-MA into PCM is an operation whose result can have no inherent variance across platforms - think of it simply as WinZip or something. What's done with the audio feed after that occurs is all folk should care about.
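To make the WinZip analogy concrete, here's a minimal Python sketch using zlib as a stand-in for the lossless packing in DTS-MA (it's an analogy only, not the actual codec): decoding on "the player" and decoding on "the receiver" produce byte-identical output.

```python
# Lossless decode is deterministic, so *where* it runs cannot change
# the result. zlib stands in for DTS-HD MA's lossless compression.
import zlib

pcm_master = bytes(range(256)) * 1000          # pretend studio PCM
packed = zlib.compress(pcm_master)             # the "DTS-MA" on the disc

decoded_on_player = zlib.decompress(packed)    # PS3 decodes, sends LPCM
decoded_on_receiver = zlib.decompress(packed)  # AVR decodes the bitstream

assert decoded_on_player == decoded_on_receiver == pcm_master
print("bit-identical either way:", decoded_on_player == decoded_on_receiver)
```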
The variance comes in, according to reports, with how receivers handle a PCM vs. a bitstreamed signal. Some users report better results feeding their receivers PCM, some report the exact opposite, and some say the only difference they can hear is in audio levels (PCM vs. bitstream). Reports seem pretty consistent for a given receiver model, though there are exceptions. This seems quite plausible to me. It reminds me of the 1080i vs. 1080p debate: while a 1080p display receiving a 1080i signal should produce the same picture quality as a 1080p signal, the reality is that not all TVs properly deinterlace 1080i signals, introducing various artifacts and other faults into the picture. I could see the same situation happening with receivers.
Umm, well even if it was decoded in the receiver, wouldn't you still have this jitter?
Anyways, there will be players soon enough that will send the bitstreams to receivers.
I agree that LPCM vs. bitstream sound quality has been over-discussed. It is technically possible to deliver equal performance, but some AVRs and preamp/processors don't.

Passing LPCM via HDMI is known to introduce additional jitter (timing errors between bits), and many AVRs will pass these timing errors on to the D-to-A converters, which can degrade sound quality. However, it is relatively easy to include a data buffer and reclocking of the LPCM data stream within the AVR. If this is done, the level of jitter need be no more than in the case where an encoded bitstream is passed from the source device, since in that case the AVR is forced to provide the clocking for the decoded bit string it sends to the D-to-A converters.

Thus there is no absolute answer as to which is better; it depends on the specific hardware implementation in the AVR or preamp/processor. My own background is 30+ years as an electronics engineer dealing with digital communications systems and digital signal processing.
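To illustrate the buffer-and-reclock idea in that post, here's a toy Python sketch (purely illustrative; no real AVR firmware looks like this): samples arrive with jittered timestamps, go into a FIFO, and a local fixed-rate clock hands them to the "DAC" at exactly the nominal period.

```python
# Toy model of reclocking: link jitter is absorbed by a FIFO,
# and the DAC is fed on a clean local clock instead.
import random
from collections import deque

FS = 48_000                       # nominal LPCM sample rate (Hz)
PERIOD = 1.0 / FS                 # ideal spacing between samples
JITTER = 0.2 * PERIOD             # +/-20% arrival jitter on the HDMI link

# Samples arrive with timing errors, as recovered from the link clock
arrivals = [n * PERIOD + random.uniform(-JITTER, JITTER) for n in range(8)]

fifo = deque(arrivals)            # the data buffer mentioned above
t0 = fifo[0]                      # local low-jitter crystal starts here

# Reclocking: the DAC sees only the local clock's edges, so the link
# jitter never reaches the converters -- the buffer depth absorbs it.
n = 0
while fifo:
    arrived = fifo.popleft()
    clocked_out = t0 + n * PERIOD
    print(f"sample {n}: arrived {arrived:.8f}s -> DAC at {clocked_out:.8f}s")
    n += 1
```

The point of the sketch: once the buffer and local clock exist, the arrival jitter is irrelevant, which is exactly why there's no absolute LPCM-vs-bitstream answer independent of the hardware.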
But to me that's a receiver issue entirely. Granted that's not saying that given a certain receiver, I wouldn't understand why its owner might prefer a bit-streamed signal, but at the same time I don't consider the inability to bit-stream out of the PS3 a 'failing' in any way, just as I don't consider the lack of IR a failing - these are simply use-case conveniences/perks. As the entirety of the decoding and streaming process for these lossless codecs on the PS3 remains within the digital realm, in my eyes the job is getting done and getting done as well as possible. And honestly, I'd probably prefer the PS3 to handle all decode operations given the choice, simply because as the 'center' of an arrangement, the PS3 is clearly the more dynamic/advanced component.
People who want the ultimate fidelity will eventually opt for a high-end player which can pass the bitstream to some exotic hardware which will decode and convert the streams at a premium price.
PS3 is going to be good enough for the mass market.