Benefits(?) of HDMI vs Component in the high-def consoles

And to underscore the point...let's not forget the whole class of HDTV displays that are natively analog by default...CRT HDTVs.

Which are nearly obsolete, have no native resolution, do not support 1:1 pixel mapping, and thus don't benefit as much from a digital connection.

Flat panel HDTV sales are going nuts right now, and all of them have and benefit from HDMI.
 
So few HDTVs accept 1080p from component that I can't imagine any one of them doing any sort of quality-degrading digital-to-analog conversion. They're usually high-end enough to pretty much guarantee proper handling of HDMI. Frankly, the whole thing is a red herring, since HDMI is supposed to explicitly prevent any sort of D-to-A conversion.
How's it a red herring again? I never suggested that a 1080p signal fed over HDMI would look worse than a 720p signal fed over component, and frankly that hasn't been the discussion at all. The question was whether or not one could argue categorically that HDMI was better than component. I've said no and served up at least one document supporting my case.

Also, are you suggesting that a device that supports HDMI is expressly forbidden from performing D-to-A conversion?
 
Which are nearly obsolete, have no native resolution, do not support 1:1 pixel mapping, and thus don't benefit as much from a digital connection.

Flat panel HDTV sales are going nuts right now, and all of them have and benefit from HDMI.
Weird. The plasma I just bought didn't have an HDMI in. :???:

(Ok, to be fair, that was the commercial model and it actually came with BNC connectors, though the company was nice enough to include RCA component adapters.)
 
HDMI is sharper and has less "noise" and fewer artifacts, but I think that in terms of colour reproduction Component beats HDMI hands down.
 
HDMI is sharper and has less "noise" and fewer artifacts, but I think that in terms of colour reproduction Component beats HDMI hands down.

Sounds like different calibration for the two inputs. The digital information gets to the display 100% intact and identical to the source, and yet somehow the approximation that component gives is better?
 
Sounds like different calibration for the two inputs. The digital information gets to the display 100% intact and identical to the source, and yet somehow the approximation that component gives is better?

I wouldn't say that. If you get interference and you lose some of the data being sent over HDMI, you CAN'T get it back, whereas with analog (component) there are a few ways to "retrieve" some of the lost information.
 
I wouldn't say that. If you get interference and you lose some of the data being sent over HDMI, you CAN'T get it back, whereas with analog (component) there are a few ways to "retrieve" some of the lost information.

I'm pretty sure the transmission includes error correction, and if enough of the bits are corrupt you get zero signal; in other words, it's a binary error (you have 100% or 0%), unlike analog, which has continuous error.

Edit: I think I'm wrong (it happens a lot).

http://www.jacobsen.no/anders/blog/...tion_do_your_cables_make_a_difference_v2.html

It looks like DVI/HDMI do not have error correction, and poor cables or long distances can corrupt the bit stream. I doubt this happens in the typical 6'-10' home install.
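
For what it's worth, here's a toy sketch of the difference being argued about (Python, entirely made up for illustration; it has nothing to do with the linked article or with the TMDS encoding HDMI actually uses). An uncorrected digital link either delivers a value exactly or corrupts it abruptly, while analog interference nudges every sample a little:

[code]
import random

# Toy model only: this is NOT real TMDS/HDMI encoding, just a hypothetical
# 8-bit pixel value sent over two imaginary links.

def send_digital(value, bit_error_rate):
    """Flip each of the 8 bits independently with the given probability."""
    received = value
    for bit in range(8):
        if random.random() < bit_error_rate:
            received ^= 1 << bit
    return received

def send_analog(value, noise_amplitude):
    """Add a small random offset, the way continuous interference would."""
    noisy = value + random.uniform(-noise_amplitude, noise_amplitude)
    return max(0.0, min(255.0, noisy))

random.seed(1)
pixel = 180  # some 8-bit luma/colour value

digital = [send_digital(pixel, 0.02) for _ in range(10)]
analog = [round(send_analog(pixel, 5), 1) for _ in range(10)]

print("digital:", digital)  # mostly exactly 180; a corrupted sample can jump far (e.g. a flipped high bit)
print("analog: ", analog)   # never exactly 180, but every sample stays close
[/code]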
 
I'm pretty sure the transmission includes error correction, and if enough of the bits are corrupt you get zero signal; in other words, it's a binary error (you have 100% or 0%), unlike analog, which has continuous error.
Agreed, though I again go back to the fact that signal noise is an environmental issue and solvable, and thus orthogonal to whether HDMI is better than component.
 
Thanks to this thread, I've purchased an HDMI board for my plasma (which is what my PS3 is hooked up to). It won't settle any debates, regardless of the results, but my curiosity was piqued enough to spend a stupid amount of money just to test it. :D
 
My "standard" comment is aimed strictly at one who is looking for a standard, not a moving goal post. If you drop big money on what is believed to be the "it" standard on a TV, reciever, or processor that matches this "standard" only to find out the goal post is moving yet again not a month later, what do you call that? Add to this the issue of incompatable hdmi devices. How many incompatable Component devices have you heard of? hmm how about spdif? ... Svideo? composite video? This "standard" is rediculous.

So I should have been mad when my 4x DVD burner was superseded by models and media that would burn at 8x and then 16x? Even though my burner could still burn at 4x with the newer media, the same as when I bought it? How about when my receiver, which could do component video switching but only of SD sources, was replaced by models able to handle higher-bandwidth connections and therefore switch HD sources as well? Standards very often evolve to accommodate the increased capabilities of the devices that utilize them. HDMI is hardly unique in this regard. As for the compatibility issues: even the best standards can suffer from poor implementations.

Regarding the convenience: yeah, one cable is easier than two. Why that's a reason to switch to a tech, or even notable in this discussion, is beyond me.

You intimated that HDMI was only developed as a means to force copy protection on the consumer. This is just one of its added features over DVI that shows that to be an incorrect statement.

DVI: Yes, I'm aware DVI was built upon for HDMI. You are aware that not all DVI is copy protected, right? You are also aware that all HDMI is HDCP-enabled, right?

You are aware that there are applications for DVI that don't require passing protected content, right? Can you think of any common usage scenarios for a device that would both benefit from the added features of HDMI over DVI and *not* also have the need to pass protected content?



S/PDIF: You're telling me they couldn't develop an S/PDIF 2.0 standard which could use the same exact cables and have enough bandwidth for any audio standard for the foreseeable future?

And the advantage of this approach over the added audio functions in HDMI would be? You would need to replace both the transport and the playback devices to support the new features either way. And there's very likely to be an S/PDIF port on any HDMI-equipped device for the foreseeable future, so legacy compatibility is not really an issue.


That's the point: they aren't separate issues. Component offers the same ability for consumers. DVI offers the same all-digital connection. Neither one is built around copy protection, though, unlike HDMI.

The fact that HDCP is mandatory on HDMI is a result of its intended application. It is most commonly going to be used to connect components that will have a need to pass protected content. In this scenario it is an advantage to the consumer to know that if he hooks up an HDMI playback device to an HDMI display/sound system, it will always be able to perform its intended function.

All digital connections that have a need to pass protected video content have some provision for copy protection. DVI/HDMI have HDCP and FireWire has DTCP. It's not a mandatory requirement for FireWire or DVI because there are applications for them that don't require passing protected content. But when it *is* a requirement they have it, too. HDMI didn't bring this to us, it's been here for years.
 
I never suggested that a 1080p signal fed over HDMI would look worse than a 720p signal fed over component, and frankly that hasn't been the discussion at all.
I'm curious as to where this line of reasoning has come from as well.

Frankly, the whole thing is a red herring, since HDMI is supposed to explicitly prevent any sort of D-to-A conversion.
No - analogue displays that have an HDMI connection will have to perform a digital-to-analogue conversion; there are even relatively new sets where this needs to be done.
 
The fact that HDCP is mandatory on HDMI is a result of its intended application.
Actually, I believe it's not mandatory at all - I don't believe there is anything in the HDMI specification that stipulates HDCP; it's just that the intended application makes it unlikely for the two not to be linked.
 
The fact that HDCP is mandatory on HDMI is a result of its intended application.

Agreed. I don't like the way the standard has been handled thus far, and I don't like the direction this standard is going. I don't like the mandatory HDCP. I don't like the incompatible devices, which did not exist in the AV world with any connection standard prior to this one.

I feel like that one hit wonder band "ugly kid joe". :LOL:

I see no advantage for the consumer in this tech over DVI other than the convenience of doing away with an optical cable. :???:

I also see little advantage in this tech over component for the majority of consumers on the market.

IMO

Component/VGA FTW! :LOL:

The last hurrah for analog. :cry:
 
Actually, I believe it's not mandatory at all - I don't believe there is anything in the HDMI specification that stipulates HDCP; it's just that the intended application makes it unlikely for the two not to be linked.

I will have to check the spec myself. I had never read anything definitive either way and was taking TheChefO's word for it.
 
I will have to check the spec myself. I had never read anything definitive either way and was taking TheChefO's word for it.

From what I understand, the devices which are on the market must complete the HDCP "handshake" before video will be transmitted. Anything which interrupts this handshake will cause the video stream to be cut off.

While the spec may not call for it, devices which do not have it are in the minority (if they exist at all). This HDCP issue is the biggest reason for the compatibility problems which I so despise in a standard.
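
To put the behaviour I'm describing into rough code (just a toy sketch; the real HDCP exchange involves key exchange and periodic link-integrity checks, none of which is modelled here): the source refuses to transmit video until the handshake succeeds, and anything that breaks the handshake drops the picture outright:

[code]
# Toy sketch of the gating behaviour described above.  This is NOT the real
# HDCP protocol; it only shows why a broken handshake means no picture at all.

class HdcpSource:
    def __init__(self):
        self.authenticated = False

    def handshake(self, sink_supports_hdcp):
        # A real source and sink exchange and verify keys here; we just
        # record whether the "handshake" succeeded.
        self.authenticated = sink_supports_hdcp

    def send_frame(self, frame):
        # Video is only transmitted while authentication holds.  Anything
        # that interrupts the handshake stops the stream outright rather
        # than degrading it gradually.
        if not self.authenticated:
            return None  # blank screen / "no signal"
        return frame


source = HdcpSource()
source.handshake(sink_supports_hdcp=True)
print(source.send_frame("frame 1"))   # frame 1

source.authenticated = False          # e.g. a switch breaks re-authentication
print(source.send_frame("frame 2"))   # None: the picture drops out
[/code]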
 
No - analogue displays that have an HDMI connection will have to perform a digital-to-analogue conversion; there are even relatively new sets where this needs to be done.

Are you sure this is even an HDTV? It says "HD Ready", which could mean "not HD." It also appears to be discontinued.

So I'm sure if you get the absolute crappiest TV you can find, you'll find something with an HDMI input shoehorned in that doesn't use it properly. However, any reasonable HDTV should not have this problem, especially if it's an LCD or plasma.
 
Are you sure this is even an HDTV? It says "HD Ready", which could mean "not HD." It also appears to be discontinued.

So I'm sure if you get the absolute crappiest TV you can find, you'll find something with an HDMI input shoehorned in that doesn't use it properly. However, any reasonable HDTV should not have this problem, especially if it's an LCD or plasma.

"HD Ready" just means it doesn't have a tuner - FYI
 
I'm pretty sure the transmission includes error correction, and if enough of the bits are corrupt you get zero signal; in other words, it's a binary error (you have 100% or 0%), unlike analog, which has continuous error.

Edit: I think I'm wrong (it happens a lot).

http://www.jacobsen.no/anders/blog/...tion_do_your_cables_make_a_difference_v2.html

It looks like DVI/HDMI do not have error correction, and poor cables or long distances can corrupt the bit stream. I doubt this happens in the typical 6'-10' home install.

It simply can't be possible that any modern digital cable has no error correction or error detection. It's a fundamental fact that all signals carried over wires can have transmission errors in them, especially at the high frequencies an HDMI cable runs at. Error detection is also very cheap to implement, and it makes no sense not to have it.

It seems to be the case with HDMI from what evidence I've got:

http://episteme.arstechnica.com/eve/forums/a/tpc/f/67909965/m/450005880831?r=363006290831
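
Just to back up the "error detection is cheap" point with something concrete, here's a generic illustration (Python; emphatically not HDMI's actual scheme, which is exactly what's in question above). An 8-bit checksum over a packet of pixel bytes is enough to catch any single flipped bit:

[code]
# Generic illustration of how cheap error *detection* is: an 8-bit checksum
# over a packet of made-up pixel bytes.  This is NOT what HDMI does; whether
# HDMI does anything like this is exactly what's being debated above.

def checksum(data):
    """Sum all bytes modulo 256; any single flipped bit changes the result."""
    return sum(data) % 256

packet = bytes([180, 64, 64, 181, 65, 63])      # a few made-up pixel bytes
sent = packet + bytes([checksum(packet)])       # append the checksum

# Simulate a single-bit error on the wire.
corrupted = bytearray(sent)
corrupted[2] ^= 0x10

data, received_sum = bytes(corrupted[:-1]), corrupted[-1]
print(checksum(data) == received_sum)  # False: the error is caught
# Multiple errors can cancel out in a simple sum, which is why links that do
# check their data tend to use CRCs instead -- still only a handful of gates.
[/code]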
 