Black XB360 120 Gig HDD + HDMI = $479

Status
Not open for further replies.
Usually the claims are based on blind A-B tests of the same sound clip, which are stupid because you get a direct comparison that you'll never have in a real listening situation.

These tests should be done with a hundred different audio clips that are each either compressed or uncompressed, and the test subject should assess whether it sounds compressed or "wrong" or whatever. Then do statistics afterwards on how many were guessed right.


I disagree. The statement by Robert was that it was stupid to say uncompressed PCM sounded better than a lossless compressed codec, because by definition it has the same data. Whether that difference is substantial or not doesn't matter. If someone can tell the difference in a blind test (with the naked ear no less) even in contrived test situations, then Robert's claim is simply untrue. Obviously for the vast majority of people this wouldn't be an issue at all, as they'll never be able to notice a difference.
 
If someone can tell the difference in a blind test (with the naked ear no less) even in contrived test situations, then Robert's claim is simply untrue.
Like I already said, relying on blind tests in this situation isn't totally valid. In this case the differences are supposed to be extremely minute or non-existent (as Dolby TrueHD is supposed to be 100% lossless), and the person doing the blind test is already "jaded" by the fact that he/she is supposed to be comparing/contrasting two audio samples, so he/she may be "hearing" differences that aren't really there. We also don't know whether all the variables of the test are the same (sans the digital source, of course) and whether the equipment is functioning as it should to properly reproduce the source.

Professional equipment needs to be used to make sure the decoded Dolby TrueHD stream is bit-for-bit identical to the uncompressed PCM stream, as that is its claim to being 100% lossless. If it is, then we can clearly blame the blind test "results" on a flawed setup or the placebo effect (the placebo being the notion the person has that there could be a difference). If it isn't bit-for-bit identical, then the claims some people have that they can hear a difference might have some merit.
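FWIW, the bit-for-bit check itself doesn't need professional gear once the decoded output has been captured to files; comparing the raw PCM is trivial. A minimal sketch (the file names are hypothetical; any pair of WAV captures works):

```python
# Hypothetical sketch: check whether two captured audio files are
# bit-for-bit identical at the PCM level.
import wave

def pcm_frames(path):
    """Return the raw PCM sample bytes from a WAV file."""
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

def bit_identical(path_a, path_b):
    """True if both files carry exactly the same PCM data."""
    return pcm_frames(path_a) == pcm_frames(path_b)
```

If the decoded TrueHD capture and the uncompressed PCM capture pass this, any audible difference in a blind test has to come from the setup or the listener, not the data.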
 
Like I already said, relying on blind tests in this situation isn't totally valid. In this case the differences are supposed to be extremely minute or non-existent (as Dolby TrueHD is supposed to be 100% lossless), and the person doing the blind test is already "jaded" by the fact that he/she is supposed to be comparing/contrasting two audio samples, so he/she may be "hearing" differences that aren't really there.

Again I disagree. If he's hearing stuff that isn't there, as you say, then he's basically guessing, so he has a 50% chance of being wrong. So if the test is repeated an appropriate number of times, it's easy to see if his correct responses are significantly over 50%. If they are, then you can conclude, at the significance level of your choosing, that there's indeed a difference; otherwise the person wasn't able to tell one from the other.
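The arithmetic behind that is a one-sided binomial test. A quick sketch (not from the thread, just the standard formula, with chance level 0.5):

```python
# One-sided binomial test: probability of scoring at least k correct
# out of n trials by pure 50/50 guessing.
from math import comb

def p_value(k, n):
    """P(X >= k) for X ~ Binomial(n, 0.5)."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
```

For example, 60 correct out of 100 trials comes out well under the usual 0.05 threshold, while 55 out of 100 does not, so "right more often than wrong" alone isn't enough; it has to be right often enough for the sample size.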
 
Again I disagree. If he's hearing stuff that isn't there, as you say, then he's basically guessing, so he has a 50% chance of being wrong. So if the test is repeated an appropriate number of times, it's easy to see if his correct responses are significantly over 50%. If they are, then you can conclude, at the significance level of your choosing, that there's indeed a difference; otherwise the person wasn't able to tell one from the other.

I would agree if these people with "golden ears" could tell the difference with 100% certainty. If not, then it is guessing even if they happened to guess right more often than wrong.

I'd like to see a link to the test though before commenting further.
 
especially those fan**** (aka ***boys). They always say something like "PS3 is best lolz because it has HDMI 1.3", which convinces me that over 80% of people on game sites have HDMI output

when in real life they play games on a 14" 480i CRT TV :smile:

anyway, is HDMI 1.2 really that bad for casual gamers? I've always thought Microsoft is making a console for the majority, trying their best to stay cost-friendly for casual gamers; I even appreciate the fact that they're putting in digital video + digital sound.

Well check this, HDMI 1.3 is better than HDMI 1.2 :)

But that's not even the deal here. Considering that they added HDMI but didn't go all the way and at least support all the HDMI 1.2 functions, including uncompressed PCM sound, it's fair to say that this was not an update to support HD-DVD but an investment in a future where HDMI will be the only option on a lot of televisions. Compared to standalones, the 360 is a crippled HD-DVD player.
 
Your scenario about Joe Sixpack setting up his network applies equally well to 802.11g, except replace "no signal" with "everything I watch stutters" or "my connection drops every time the microwave turns on".

Yeah, Joe Sixpack's connection stutters momentarily while the microwave is on, assuming a) his microwave is a shitty one (I can run my notebook in the kitchen with practically no throughput issues and practically no frame loss), and b) his PS3 is installed in his kitchen, or his house has a shitty layout that puts his PS3 or AP within 8 feet of it.

On the other hand, the proposed 802.11a solution *always sucks*, not just momentarily when heating leftovers. Joe Sixpack goes out and buys an 802.11a AP, puts it in a different room, and drywall, furniture, and appliances plunge his signal-to-noise ratio to the depths.

I mean, talk about paying for stuff most people won't use. At least MS could have put in 802.11n if they wanted to make consumers pay a premium for Wifi.
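For what it's worth, the range complaint against 802.11a has a physical basis: free-space path loss grows with frequency. A back-of-envelope sketch using the standard FSPL formula (the distance and channel frequencies are just illustrative):

```python
# Free-space path loss in dB for distance d in metres, frequency f in MHz.
from math import log10

def fspl_db(d_m, f_mhz):
    return 20 * log10(d_m) + 20 * log10(f_mhz) - 27.55

# 802.11a (~5.2 GHz) vs 802.11g (~2.4 GHz) at the same 10 m distance:
loss_a = fspl_db(10, 5200)   # ~66.8 dB
loss_g = fspl_db(10, 2400)   # ~60.1 dB
# The ~6.7 dB gap holds at any distance -- and that's before drywall,
# which also tends to attenuate 5 GHz more than 2.4 GHz.
```

So even in ideal conditions, the 5 GHz signal starts out weaker than the 2.4 GHz one at the same distance; walls only widen the gap.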
 
You even have people claiming that uncompressed PCM sounds better than losslessly compressed codecs such as DTS-HD MA and TrueHD. That's the equivalent of saying that a zipped-up file is somehow worse than the original...

Or it's someone claiming that the decoding process / the chips doing the work on lossless formats sound worse than the bitstream coming from the disc (jitter, maybe?), or that the encoding hasn't been done on the exact same sources. In theory there shouldn't be any difference (except maybe to TrueHD's advantage, the source being available at a higher bitrate and depth), but you can't always compare stuff like this to .zip files :)

As with AC-3 vs DTS, it will take some time before we know the "truth" :)
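The .zip analogy from the quoted post is easy to demonstrate at the data level, which is the part nobody really disputes (the decoding/jitter question above is separate). A quick sketch with zlib standing in for any lossless codec:

```python
# Lossless compression must round-trip to the exact original bytes --
# the same guarantee TrueHD and DTS-HD MA make for the PCM inside them.
import zlib

original = bytes(range(256)) * 100    # stand-in for a run of PCM samples
compressed = zlib.compress(original, 9)
restored = zlib.decompress(compressed)

assert restored == original           # bit-for-bit identical
assert len(compressed) < len(original)
```

If the decoder's output really is bit-for-bit identical, any audible difference has to come from the playback chain, not the format.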
 
yes, it absolutely supports DD 5.1

the difference here is that 95% of users will neither have the equipment to notice nor be able to tell the difference between this 1.2 vs 1.3 nonsense :LOL: as it only applies to the audio of HD DVD movie playback, which is the least used function of the 360.


Xbox 360 Elite supports the HDMI 1.2 profile. For audio, you can select DD, DTS (at 1.5 Mbps), and WMA-Pro (Microsoft's high-fidelity multi-channel codec, supported in some AVRs such as Pioneer). Since it is not based on 1.3, it will not support output of DD+ or TrueHD.

http://forum.beyond3d.com/showpost.php?p=959881&postcount=510


Has it been confirmed that the spring HD-DVD update (which is separate and apart from the Spring Update) will apply to current 360's? I would think that it would, but in Amir's post he is ambiguous vis-à-vis the current consoles, since he referred to the Elite by name.
 
I mean, talk about paying for stuff most people won't use. At least MS could have put in 802.11n if they wanted to make consumers pay a premium for Wifi.
How? The thing is just now being approved (Wikipedia is showing that the proposal will be formally approved this month). The Wifi adapter had to be engineered, what, two years ago?

What's weird is that my 802.11a router sits in my bedroom and somehow makes it through the drywall to the rest of the house. And I disagree with downplaying 802.11g interference issues (shitty microwave? Try the millions of telephones operating at 2.4 GHz).

The simple solution is: if you don't want 802.11a, don't buy the Microsoft Wifi adapter.
 
Has it been confirmed that the spring HD-DVD update (which is separate and apart from the Spring Update) will apply to current 360's? I would think that it would, but in Amir's post he is ambiguous vis-à-vis the current consoles, since he referred to the Elite by name.

If you are referring to the update that allows users to select DTS instead of Dolby Digital for HD DVD audio output then yes that update applies to all 360's.
 
The inquirer: X360 Elite lacks PCM audio

And may have Blu-ray add-on

For starters, the 360 Elite will only support the HDMI 1.2 specification, as opposed to 1.3, which is the latest and greatest (and ships with the PS3). 1.3 adds support for the transmission of Dolby TrueHD, which is a lossless multi-channel audio (PCM) format being used on top-end HD-DVD discs.

Additionally, when using HDMI the resolution on the 360 will be forced to 1080p, regardless of the display resolution of the device it is actually being connected to, which could result in unnecessary image downscaling or, at worst, lack of picture.

However, that's not all the interesting Elite gossip. Whilst the decision not to include an HD-DVD drive in the unit has annoyed some, Microsoft seems to be suggesting that it's because a Blu-ray add-on unit could be made available for the console at a later date. In an interview with GamesIndustry, Europe boss Neil Thompson suggests that when it comes to movies, "I'm not sure the market has moved to high definition yet... And if and when it does, then the way that we've constructed the offering we've made means we'll be able to go whichever way we want."

Is that a hint at an add-on Blu-ray player, should Sony's proprietary format win the HD war?
 
No surprise there, it's the logical step and Peter Moore said as much in an interview previously.

But, they will wait as long as possible, in an effort to extend the format war which works in their favour.
 
Additionally, when using HDMI the resolution on the 360 will be forced to 1080p, regardless of the display resolution of the device it is actually being connected to, which could result in unnecessary image downscaling or, at worst, lack of picture.
I've been trying to track down where that particular one has come from. I couldn't see it in Amir's linked post, and others are saying that the HDMI output removes the user resolution selection and instead defaults to the "native" resolution of the display based on the EDID, which would make more sense.
 
I've been trying to track down where that particular one has come from. I couldn't see it in Amir's linked post, and others are saying that the HDMI output removes the user resolution selection and instead defaults to the "native" resolution of the display based on the EDID, which would make more sense.

Amir's response:
http://www.avsforum.com/avs-vb/showthread.php?p=10191280&&#post10191280
"Follow up on the 1080p mention in the Inquirer. Their data definitely seems incorrect. You can choose 480p, 720p, 1080i, or 1080p (assuming the EDID indicates support) or you can choose "optimal resolution" which will be set to the monitor's preferred timing (and assuming that that preferred timing matches a resolution we support (e.g. 640x480, 848x480, 1024x768, 1280x768, 1280x720, 1360x768, 1280x1024, 1920x1080))."
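A toy sketch of the "optimal resolution" logic Amir describes (the supported-mode list is copied from his post; the fallback behaviour is my guess, not anything documented):

```python
# Hypothetical sketch: use the display's EDID preferred timing only if
# it matches a mode the console supports; otherwise fall back.
SUPPORTED = {
    (640, 480), (848, 480), (1024, 768), (1280, 768),
    (1280, 720), (1360, 768), (1280, 1024), (1920, 1080),
}

def optimal_mode(edid_preferred, fallback=(1280, 720)):
    """Return the EDID preferred timing if supported, else a fallback."""
    return edid_preferred if edid_preferred in SUPPORTED else fallback
```

So a 1080p panel would get 1920x1080, while a display advertising an unsupported preferred timing (say 1680x1050) would get whatever the console falls back to, which is consistent with "defaults to the native resolution" rather than "forced to 1080p".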
 