Benefits(?) of HDMI vs Component in the high-def consoles

The answer is: it depends.

On what?

The source is digital (DVD, BD, games, etc.). Do you really think doing a D/A conversion for transmission, then an A/D conversion in the display, can match a pure digital signal from source to display? I think the correct statement is "component is close enough to digital in most cases, depending on implementation".

Do any HTPC users out there prefer component to DVI? The days of playing with Powerstrip and overscan are dead when you use DVI and a 1:1 pixel-mapping display. Long live digital, IMO.
 
What exactly about HDMI is better than Component?

You can find the spec everywhere on the Web, showing why HDMI is better than component (bandwidth, color space, etc.). But I am sure this is not what you want...

On video DVDs, HD DVDs, or Blu-ray discs, the signal is digitally encoded.
On the other side of the path, most HD displays are digital. When you use an analog connection (i.e. VGA or component), your source device has to perform a digital-to-analog conversion and put the signal on the cable. When it arrives at the display, the signal has to be converted back to digital...

When using DVI or HDMI, you never perform these conversions. The signal is simply "packed" à la TCP/IP from one device to the other. The fact is that component has the bandwidth to deal even with 1080p, but with a good cable, HDMI suffers less signal loss.
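
To make that concrete, here is a toy sketch (made-up noise figures, not measurements from any real device) of what one D/A -> cable -> A/D hop does to an 8-bit video level, versus HDMI delivering the same code bit-exact:

```python
# Toy model of the analog hop: quantize an 8-bit video level to a
# voltage in the source (D/A), add assumed cable/DAC noise, then
# re-digitize it in the display (A/D). All figures are illustrative.
import random

def analog_round_trip(level, noise_mv=3.0, full_scale_mv=700.0):
    """One 8-bit video sample through D/A -> cable -> A/D."""
    volts = level / 255.0 * full_scale_mv             # D/A in the source
    volts += random.gauss(0.0, noise_mv)              # noise along the way
    recovered = round(volts / full_scale_mv * 255.0)  # A/D in the display
    return max(0, min(255, recovered))

errors = [abs(s - analog_round_trip(s)) for s in range(256)]
print("max error (8-bit codes):", max(errors))
print("mean error:", sum(errors) / len(errors))
# Over DVI/HDMI the same 8-bit code arrives bit-exact: the error is 0.
```

Run it a few times: the analog hop lands a code or two off here and there, while the digital path is zero error by definition.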

If you have a "poor" quality source (Xbox 360, PS3, ...) and a limited-size display (<= 32''), it's difficult to see a difference.
But if you:
- Use a high-quality source (e.g. an HTPC or a high-end DVD player)
- Use a big screen (I don't know how to translate a 2.40 m base into inches)
Believe me, the difference is visible even to newbies.

Go look at the AVS forums and check the ratio of digital to analog signal users.

PS: It's a whole discussion in itself, and I am not quite sure this is the right place...
 
On the other side of the path, most HD displays are digital. When you use an analog connection (i.e. VGA or component), your source device has to perform a digital-to-analog conversion and put the signal on the cable. When it arrives at the display, the signal has to be converted back to digital...

When using DVI or HDMI, you never perform these conversions.

that's part of the problem... many displays still do convert on the other end.
 
PS: It's a whole discussion in itself, and I am not quite sure this is the right place...

It's not, so the options are: people get back on topic (well, more on topic), or we hack this piece of the conversation off into its own thread. Which would folks prefer?
 
It's not, so the options are: people get back on topic (well, more on topic), or we hack this piece of the conversation off into its own thread. Which would folks prefer?
I don't even remember what this thread is about! :LOL:
 
Each revision is a superset of the previous one. Equipment with support for earlier revisions will still work; it will just be limited to the functionality enabled by that revision. You're no worse off here than someone using a component/S/PDIF-connected device.

This is just wrong. HDMI 1.2+ has a very strong advantage in its ability to pass high-resolution multichannel PCM. Of lesser importance is the fact that instead of a digital > analog > digital path for the video signal, you maintain a digital signal throughout. Does this present a noticeable difference? That is debatable, which lessens this as an advantage. Most times you see a set of images showing a dramatic difference, the person making the demonstration didn't bother to re-calibrate the display after switching connections. The rest are likely showing a flaw in the display device's handling of analog signals. But it is still preferable to stay digital, given the option. The single-cable connection is also a big win for anyone who has spent any time trying to set up or modify a complex HT system. I have had to make some Rube Goldberg-ian connections at times to get desired functionality out of some systems. HDMI is a breath of fresh air.

My "standard" comment is aimed strictly at one who is looking for a standard, not a moving goal post. If you drop big money on what is believed to be the "it" standard on a TV, reciever, or processor that matches this "standard" only to find out the goal post is moving yet again not a month later, what do you call that? Add to this the issue of incompatable hdmi devices. How many incompatable Component devices have you heard of? hmm how about spdif? ... Svideo? composite video? This "standard" is rediculous.

Regarding the convenience: yeah, one cable is easier than two. Why this is a reason to switch to a tech, or even notable in this discussion, is beyond me.

As far as DVI being too open for the content providers: DVI supports the same copy protection as HDMI (HDCP). In fact, I believe it was carried over *from* DVI when it was "tweaked" to become HDMI. You do know that HDMI and DVI are essentially the same technology? HDMI mainly just adds the ability to carry audio, a smaller-profile connector and (as Dave alluded to in his post) the ability to carry higher-resolution signals without having to use a dual-link connection. All things that make it more suitable as a CE connection than DVI, which was designed around the types of connections required by PCs.

DVI: Yes, I'm aware DVI was built upon for HDMI. You are aware that not all DVI is copy-protected, right? You are also aware that all HDMI is HDCP-enabled, right?

As for S/PDIF being just as good as HDMI's audio support: can S/PDIF pass 7.1-channel 24-bit/96 kHz audio? And if you think running 8 analog RCA cables is a reasonable workaround, then I'm afraid I can't agree.

S/PDIF: You're telling me they couldn't develop an S/PDIF 2.0 standard which could use the exact same cables and have enough bandwidth for any audio standard for the foreseeable future?
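
For scale, here's the back-of-the-envelope arithmetic behind this exchange (a sketch only: it counts raw PCM payload and ignores S/PDIF's subframe and biphase-mark coding overhead):

```python
# Back-of-the-envelope bandwidth check. Counts raw PCM payload only;
# S/PDIF's 32-bit subframes and biphase-mark line coding are ignored.
def pcm_bitrate(channels, bits, sample_rate_hz):
    return channels * bits * sample_rate_hz

hd_audio = pcm_bitrate(8, 24, 96_000)       # 7.1ch 24-bit/96 kHz PCM
spdif_payload = pcm_bitrate(2, 24, 96_000)  # S/PDIF carries 2ch PCM

print(f"7.1/24/96 PCM needs : {hd_audio / 1e6:.2f} Mbit/s")      # 18.43
print(f"S/PDIF 2ch payload  : {spdif_payload / 1e6:.2f} Mbit/s")  # 4.61
print(f"shortfall           : {hd_audio / spdif_payload:.0f}x")   # 4x
```

So a hypothetical "S/PDIF 2.0" would need roughly four times the audio payload of today's 2-channel link; whether the existing cables and connectors could carry that is exactly the open question.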

HDMI is a very good thing for the consumer. HDCP, not so much, but that is a completely separate issue.

That's the point: they aren't separate issues. Component offers the same ability for consumers. DVI offers the same all-digital connection. Neither one is built around copy protection, though, unlike HDMI.

/offtopic
 
Show me a TV that does HDMI worse than component, then.

This is a widely known issue with HDMI. For you to be totally unaware of its existence tells me that you basically have no clue what you're talking about. So, let's agree to disagree, because this is pointless.
 
that's part of the problem... many displays still do convert on the other end.

Many? Maybe you are confusing this with a display that scales/deinterlaces a signal and does it in the analog domain (which still seems silly). A 1920x1080 digital signal from an HTPC or PS3 has zero reason to go through a D/A and A/D inside a 1080p digital display.
 
that's part of the problem... many displays still do convert on the other end.

That might be the only way HDMI could be brought down to component levels; something I hadn't thought of.

Still, that would at most imply that HDMI = component; it could never imply HDMI < component unless something is completely wrong.
 
On what?

The source is digital (DVD, BD, games, etc.). Do you really think doing a D/A conversion for transmission, then an A/D conversion in the display, can match a pure digital signal from source to display?
Damn, I did not see that you had already said that (my previous post).

I think the correct statement is "component is close enough to digital in most cases, depending on implementation".
Agreed.

Do any HTPC users out there prefer component to DVI? The days of playing with Powerstrip and overscan are dead when you use DVI and a 1:1 pixel-mapping display. Long live digital, IMO.
I totally agree with that (as an HTPC user).
But as the Xbox 360 and PS3 are seen as "high-end multimedia" devices, most people could have difficulty understanding that, for audio and video, they can only be considered low end (mid-range at best).
 
Still, that would at most imply that HDMI = component; it could never imply HDMI < component unless something is completely wrong.
That would depend on the DACs. If the panel is still natively analogue and its DAC is of lower quality than the DAC in the source device then, for that user, it's likely that component would give better quality.
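
A quick sketch of that trade-off (the 1 mV and 5 mV noise floors are purely illustrative assumptions, not measurements from any real set): with a natively analogue panel, the picture goes through some DAC either way, so whichever DAC is worse decides which input looks worse.

```python
# Toy comparison with hypothetical noise figures, not measurements.
# With a natively analogue panel the picture crosses some DAC either way:
# the source's DAC on the component path, or the panel's own DAC on the
# HDMI path. The worse DAC decides which input looks worse.
import random

FULL_SCALE_MV = 700.0  # assumed video voltage swing

def mean_dac_error_mv(noise_mv, trials=10_000):
    """Mean absolute voltage error for a DAC with a given noise floor."""
    total = 0.0
    for _ in range(trials):
        level = random.randrange(256)                 # random 8-bit code
        ideal = level / 255.0 * FULL_SCALE_MV
        actual = ideal + random.gauss(0.0, noise_mv)  # DAC output + noise
        total += abs(actual - ideal)
    return total / trials

# assumption: good DAC in the source, worse DAC in the panel
print(f"component path (source DAC, 1 mV noise): {mean_dac_error_mv(1.0):.2f} mV")
print(f"HDMI path (panel DAC, 5 mV noise):       {mean_dac_error_mv(5.0):.2f} mV")
```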
 
I dunno about you guys, but I like audio, and the sad truth is that normal low-bitrate AC-3 + DTS is what you get with the standard optical out. The future (oh, do I like that phrase?) is HDMI. Sadly I can't talk about how great that is, but I can say I look forward to uncompressed PCM from my Blu-ray titles, finally killing low-bitrate surround sound.

My component cables are professional first-grade broadcast stuff; they rock, and the picture looks amazing. It's also 3 cables that I could exchange for one cable costing less (weird but true) while sustaining the quality over 10 meters. HT freaks spend "millions", OK, lots of money, on component cables to wring every last ounce out of them, because when it's analog there is signal loss.

I'm actually defending the Microsoft Xbox 360 1337 here, so I should stop. Good night, and enjoy the GTA 4 trailer while I sleep and dream of a Microsoft broken up by the US government...
 
I dunno about you guys, but I like audio, and the sad truth is that normal low-bitrate AC-3 + DTS is what you get with the standard optical out. The future (oh, do I like that phrase?) is HDMI. Sadly I can't talk about how great that is, but I can say I look forward to uncompressed PCM from my Blu-ray titles, finally killing low-bitrate surround sound.

My component cables are professional first-grade broadcast stuff; they rock, and the picture looks amazing. It's also 3 cables that I could exchange for one cable costing less (weird but true) while sustaining the quality over 10 meters. HT freaks spend "millions", OK, lots of money, on component cables to wring every last ounce out of them, because when it's analog there is signal loss.

I'm actually defending the Microsoft Xbox 360 1337 here, so I should stop. Good night, and enjoy the GTA 4 trailer while I sleep and dream of a Microsoft broken up by the US government...

What audio equipment do you have that's making great use of this lossless audio, and can you actually hear a clear difference?

Is your surround calibrated also?
 
As for Component vs HDMI for GAMING, the difference should be next to none, unless your display does a horrible job of processing one or the other.

Why? Games, even in HD, fall well behind in clarity/sharpness/realism compared to a good HD feed of real-life material. So really, you're just convincing yourself of the PQ differences.
 
That would depend on the DACs. If the panel is still natively analogue and its DAC is of lower quality than the DAC in the source device then, for that user, it's likely that component would give better quality.

Agreed. (Sorry for the one-word response :smile: )
 
That would depend on the DACs. If the panel is still natively analogue and its DAC is of lower quality than the DAC in the source device then, for that user, it's likely that component would give better quality.

Possible, but maybe not so clear-cut. The DAC would need to be so bad that a native 1080p signal over HDMI would still be inferior to the 720p signal over component. It's unclear whether such a TV exists, and if it does, it would likely be either very old or really low-end. I asked for an example of a TV where this is the case and I haven't gotten one. Not to say that such a TV can't exist, but it is likely to be very rare at best.
 
Possible, but maybe not so clear-cut. The DAC would need to be so bad that a native 1080p signal over HDMI would still be inferior to the 720p signal over component. It's unclear whether such a TV exists, and if it does, it would likely be either very old or really low-end. I asked for an example of a TV where this is the case and I haven't gotten one. Not to say that such a TV can't exist, but it is likely to be very rare at best.
Why 720 from component and not 1080 (assuming the set accepted 1080 over component)?
 
Why 720 from component and not 1080 (assuming the set accepted 1080 over component)?

So few HDTVs accept 1080p over component that I can't imagine any one of them doing any sort of quality-degrading digital-to-analog conversion. They're usually high-end enough to pretty much guarantee proper handling of HDMI. Frankly, the whole thing is a red herring, since HDMI is supposed to explicitly prevent any sort of D-to-A conversion.
 
That would depend on the DACs. If the panel is still natively analogue and its DAC is of lower quality than the DAC in the source device then, for that user, it's likely that component would give better quality.

And to underscore the point... let's not forget the whole class of HDTV displays that are natively analogue by default: CRT HDTVs.
 