PS3 Slim Hardware Confirmed

Or maybe they want the PS3 on top for everyone to see! This new design actually facilitates that. TBH I don't know why people want this console to be stackable when none of the others have been, save the Xbox. Who here stacked their PS2/Wii/GC/SNES/Megadrive/NeoGeo etc.?! ;)

I stacked my PS2. :p I put the Xbox on top of it, since it wasn't as easy to stack things on top of the Xbox.

With previous consoles, I only had one of each at any given moment, so stacking wasn't a problem. But yes, the PS1 was a PITA with its top loader, as I had to pull it out of the cabinet every time I wanted to insert a CD. But that was still better than not being able to have a console in the cabinet at all when playing some of the cartridge-based systems.

Now, however, I'm not sure what to do. I have limited space for home electronics in a vertical enclosure. I'm not about to buy a new home electronics enclosure costing several hundred dollars just to accommodate the non-stackable nature of the PS3, which presents a conundrum.

Both the X360 and the Wii are stackable, but the Wii is pretty small, so it only makes sense to stack it on top of something else, or to put a small DVD player on top of the Wii.

I suppose I could put the PS3 on top of the Wii, but that would just look silly. When I get one, I suppose I could also install everything to the HDD and then stick it behind the cabinet with the X360 power brick, but that would be a PITA.

As it is, with my current setup, there is no aesthetically pleasing way to place a non-stackable unit.

That still won't prevent me from buying one once it drops to $199 and I can budget it in, but it's a problem I would have loved not to deal with.

Regards,
SB
 
Ah ok, I always wondered about that. I did an A/B comparison between my PC (which does downsample PCM audio to 16/48) and the PS3 with The Police concert Blu-ray and could not tell the difference in audio. So I figured maybe there was some truth to the claim that the PS3 also downsampled when using PCM. I guess, though, that I'm just not all that sensitive to audio. Either way, the Slim has now rendered most standalone Blu-ray players irrelevant.

I was told your audio rig needs to support 24bit/96kHz LPCM first.

PS3 should be able to output 7.1 24bit/192kHz over HDMI. You can select it from your Sound Settings.
 
Here's an article on that:

http://www.engadgethd.com/2009/08/21/ps3-slim-bitsreams-dolby-truehd-and-dts-hd-ma-audio-at-last/

As I understand it, without bitstreaming the player must resort to sending decoded PCM streams, which are capped at 16bit/48kHz. So if you play, say, a concert Blu-ray that is 24bit/96kHz, it gets slightly downconverted before being sent out. Or so I've read; I was never sure if the PS3 had this limit. Bitstreaming on the Slim gets around all of that anyway.
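
Roughly, that cap amounts to two steps. Here's a toy sketch of what "downconverted to 16/48" means in principle (not what the PS3 actually does internally; a real player would low-pass filter before dropping samples and dither before dropping bits):

def downconvert_24_96_to_16_48(samples_24bit):
    # samples_24bit: signed ints in the 24-bit range, sampled at 96kHz
    decimated = samples_24bit[::2]       # keep every other sample -> 48kHz
    return [s >> 8 for s in decimated]   # drop the 8 least significant bits -> 16-bit

hi_res = [0, 1 << 20, 8388607, -8388608, -(1 << 20), 0]   # made-up 96kHz samples
print(downconvert_24_96_to_16_48(hi_res))                 # half the samples, 16-bit range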

This was the case with the Xbox, maybe that's where you heard about it. With the PS3 there are no such constraints.
 
I was told your audio rig needs to support 24bit/96kHz LPCM first.

It should, my amp is the Sony 5300es.


-tkf- said:
This was the case with the Xbox, maybe that's where you heard about it. With the PS3 there are no such constraints.

Nah, it was definitely about the PS3, on AVS Forum. But it might have just been people trying to justify the purchase of their crappy standalone Blu-ray players by skewing the facts to make the PS3 look worse.
 
The PS3 never downsampled HD audio. The reason bitstreaming is important to audiophiles is that they'd rather have their high-end AVRs decode the audio than have the PS3 do it, because they believe the DAC in their AVR is better than the one in the PS3. To most people (even some audiophiles) there isn't a noticeable difference.

The DAC in the PS3 has nothing to do with it (it's only 2-channel anyway); LPCM is digital. The reason is (most probably) jitter: HDMI is a lot more susceptible to jitter than S/PDIF.
 
The DAC in the PS3 has nothing to do with it (it's only 2-channel anyway); LPCM is digital. The reason is (most probably) jitter: HDMI is a lot more susceptible to jitter than S/PDIF.
Yeah sorry, I didn't mean DAC, I meant processor. I doubt jitter is the reason... the PS3 cannot pass multi-channel LPCM through S/PDIF (only 2-ch), only over HDMI. Most people with high-end AVRs prefer bitstreaming over LPCM because they believe their AVRs process the audio better than the PS3.
 
I don't know if the last post on this page (by HaroonMir) helps:
http://www.agoraquest.com/viewtopic.php?topic=32037&forum=51

My setup is significantly simpler than theirs. I use HDMI for every component, so it's a single HDMI cable from the PS3 to the amp and that's it. I only have a 5.1 setup as well. I recall the PS3 info bar at the top saying 96kHz during playback, but I don't know whether that meant the source or what it was sending out. 99% of Blu-ray movies are 48kHz anyway, so I wasn't really worried about it too much. I've been told that the difference between 48kHz and 96kHz is pretty hard to hear anyway, and I'm not a huge audiophile, so maybe I just can't tell.
 
Yeah sorry, I didn't mean DAC, I meant processor. I doubt jitter is the reason... the PS3 cannot pass LPCM through S/PDIF, only over HDMI. Most people with expensive high-end AVRs prefer bitstreaming because they believe their AVRs process the audio better.

Sorry, I should have written PCM.

HDMI as a method of transferring PCM is more susceptible to jitter than S/PDIF. If you read the audiophile mags, they measure the jitter on the digital outs, and IIRC even with 2-channel stereo HDMI is an order of magnitude worse; even the really high-end stuff suffers badly, although not to the same extent as the cheap Onkyos and Yamahas. I expect there is a wiki on HDMI jitter. There should certainly be no difference in the decoding.
 
But is the basic premise of how jitter can occur correct? Taken from AVS Forum:

Could not agree with you more, Michael Grant. There is so much misinformation online about how bitstreaming over HDMI eliminates jitter. I am glad to see that someone else has picked up on the fact that the audio is stored in the blanking interval of the video stream, and that if jitter is in fact occurring, it would be happening to the audio stream (the entire stream in general, including the video) before it even reached the receiver for processing.
 
Sorry, I should have written PCM.

HDMI as a method of transferring PCM is more susceptible to jitter than S/PDIF. If you read the audiophile mags, they measure the jitter on the digital outs, and IIRC even with 2-channel stereo HDMI is an order of magnitude worse; even the really high-end stuff suffers badly, although not to the same extent as the cheap Onkyos and Yamahas. I expect there is a wiki on HDMI jitter. There should certainly be no difference in the decoding.

Oh I see... you may be right. By SPDIF, I thought you meant TOSLINK for some reason, hehe.

Yeah, apparently if you bitstream over HDMI, you won't be affected by jitter which may affect sound quality if you send PCM over HDMI.
 
Oh I see... you may be right. By SPDIF, I thought you meant TOSLINK for some reason, hehe.
Well yeah, either the coaxial or the optical interface commonly used for stereo (CD transports to DACs, etc.) and for AC3 and DTS (DVD players to AVRs and processors).

Yeah, apparently if you bitstream over HDMI, you won't be affected by jitter which may affect sound quality if you send PCM over HDMI.
That, I believe, is the general consensus amongst the eggheads.
 
Is it true though? Does bitstreaming over HDMI have less jitter than PCM over HDMI? Has anyone done any measurements?
 
Is it true though? Does bitstreaming over HDMI have less jitter than PCM over HDMI? Has anyone done any measurements?
If you output the audio as LPCM, the audio is already decoded and processed by the player before being sent to your AVR, so jitter may affect the sound quality as it gets passed over HDMI. If you bitstream the audio, it is left untouched for your AVR to do the decoding/processing duties, so it's less susceptible to jitter. Jitter is present; the question is whether you can hear its effects or not.
 
So basically the AVR can deal with the jitter introduced by HDMI and the PS3 when the audio is passed as a bitstream, but not as LPCM?

I've read a paper that claimed humans can hear jitter down to 1 microsecond. So if the measured jitter is less than 0.4 microseconds, we shouldn't hear the difference.

But jitter is measurable. Most AVR and audio equipment review sites don't conduct any measurements and rely on their ears instead. The problem is that their ears are broken, since different reviews of the same product say different things.

So has any study actually provided measurements comparing the jitter of LPCM vs bitstream over HDMI?
 
I think it's less susceptible because TrueHD/DTS-HD MA are essentially compressed formats; because they're compressed, jitter doesn't affect the stream until it's uncompressed/decoded.

Not sure if there have been any tests. I doubt many people can notice the difference. Again, this is for the audiophiles. For the rest of us, it just means the indicator lights will come on.
 
Jitter is irrelevant. It either destroys an entire bit or it does nothing at all. Destroying a random bit in an audio signal isn't a graceful process. There'd be no preference for the lower-significance bits, but your high-significance bit (± half of your maximum amplitude) has the same chance of randomly inverting as any other: zero, typically. The whole point of digital transmission is that analog noise, distortion and interference can't change your signal. At all.

If jitter were strong enough to affect your audio signal, your ears would fall off and/or a cat would jump through your window straight onto your head and try to kill you.
If we pretend that jitter were a relevant concern, which it isn't, but let's say you operate your audio equipment inside an experimental fusion reactor, it would distort bitstream data more than PCM data, because in a compressed representation one randomly inverted bit before the decoding stage would lead to multiple wrong bits after expansion, while in an uncompressed stream there is no such amplification of error.

I call hogwash.
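
FWIW, here's a toy illustration of that amplification argument. Plain delta coding stands in for a real lossless codec (TrueHD/DTS-HD MA are obviously far more complex), and an XOR stands in for the flipped bit:

samples = [100, 102, 105, 103, 101, 99, 98, 100]              # pretend PCM samples

# "Compress" with delta coding: store the first sample, then the differences.
deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def decode(deltas):
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

pcm_bad = samples.copy();  pcm_bad[1] ^= 8     # one bad bit in raw PCM: one bad sample
delta_bad = deltas.copy(); delta_bad[1] ^= 8   # one bad bit before decoding...

print("PCM with bit error:    ", pcm_bad)            # [100, 110, 105, 103, ...]
print("Decoded from bad delta:", decode(delta_bad))  # every later sample is off by 8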
 
Jitter is irrelevant. It either destroys an entire bit or it does nothing at all. Destroying a random bit in an audio signal isn't a graceful process. There'd be no preference for the lower-significance bits, but your high-significance bit (± half of your maximum amplitude) has the same chance of randomly inverting as any other: zero, typically. The whole point of digital transmission is that analog noise, distortion and interference can't change your signal. At all.

If jitter were strong enough to affect your audio signal, your ears would fall off and/or a cat would jump through your window straight onto your head and try to kill you.
If we pretend that jitter were a relevant concern, which it isn't, but let's say you operate your audio equipment inside an experimental fusion reactor, it would distort bitstream data more than PCM data, because in a compressed representation one randomly inverted bit before the decoding stage would lead to multiple wrong bits after expansion, while in an uncompressed stream there is no such amplification of error.

I call hogwash.


Jitter isn't destroying bits. It's similar to network lag, where some packets arrive later and some arrive sooner, and hence the sound would be corrupted. It is trivial to remove jitter with a small buffer. The buffer could be, for example, 42ms, which would coincidentally be enough to store the audio for a single frame of a 24p stream. Once this buffer is in place it eliminates all jitter that doesn't cause a buffer underrun, and the user would be none the wiser, because it doesn't really matter if the image and audio arrive one frame late when watching a movie. In fact, modern TVs have much higher lag, in the range of 100ms+, when using their image-enhancing algorithms.
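
That's basically a playout buffer. Here's a toy model of the idea, with made-up packet timing and jitter figures (one packet per 24p frame, up to 5ms of arrival jitter): as long as the fixed playout delay exceeds the worst-case jitter, nothing ever misses its slot.

import random

FRAME_MS = 1000.0 / 24    # one packet per 24p frame, ~41.7ms apart
BUFFER_MS = 42.0          # fixed playout delay: hold audio this long before output
JITTER_MS = 5.0           # worst-case arrival jitter we simulate (made-up figure)

random.seed(0)
underruns = 0
for n in range(1000):
    sent = n * FRAME_MS                             # when the source emits the packet
    arrived = sent + random.uniform(0, JITTER_MS)   # transport adds a variable delay
    playout = sent + BUFFER_MS                      # fixed schedule: play BUFFER_MS later
    if arrived > playout:                           # missed its slot -> audible glitch
        underruns += 1

print(underruns, "underruns out of 1000 packets")   # 0 while JITTER_MS < BUFFER_MS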

In practice jitter is just an unfounded fear, similar to €100,000 cables and other superstitions some hi-fi people tend to believe in. Even if you believe there is jitter in the range of tens of microseconds, you might be surprised when you think about what kind of effect moving your head has... mainly, your natural head movement would cause bigger artifacts than the jitter in question.

If you output the audio as LPCM, the audio is already decoded and processed by the player before being sent to your AVR, so jitter may affect the sound quality as it gets passed over HDMI
At least the PS3 doesn't really process audio, it only decodes it. Processing should be done in the amplifier (i.e. Audyssey room correction, bass boost, delay, whatever). The only audio processing the PS3 can do is change the volume level, and that is inherently broken and should not be used.

Moving one's head 2cm would cause a time difference of 2cm / 34,000cm/s, which is roughly 58 microseconds between the sound heard in the original position and in the position the ear ended up in. Good luck eliminating jitter by building head restraints that remove all movement. I think this should show how absurd the discussion about jitter is on a practical level. P.S. Don't confuse this with the effect one gets with highly directional loudspeakers, where the sound actually does change drastically after a small head movement; that's an altogether different matter.
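
The arithmetic, for anyone who wants to check it:

speed_of_sound = 343.0   # m/s, roughly; ~340 m/s gives much the same answer
head_movement = 0.02     # 2cm, in metres
print(head_movement / speed_of_sound * 1e6)   # ~58 microseconds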
 
If you output the audio as LPCM, the audio is already decoded and processed by the player before being sent to your AVR, so jitter may affect the sound quality as it gets passed over HDMI. If you bitstream the audio, it is left untouched for your AVR to do the decoding/processing duties, so it's less susceptible to jitter. Jitter is present; the question is whether you can hear its effects or not.

No, it'll still be "touched", but since it still has processing to go through there's still a chance that any jitter could be filtered out. This is a huge stretch though, and the reality is more as you said: it's audiophiles wanting their big-$$$$ receivers processing it rather than their "game" console.
 