DVI vs. Component - Image Quality?

scooby_dooby said:
I've read that DVI and HDMI were mainly created to enable greater copyright protection for publishers, and that there's not much difference in actual visual quality.

Is this true?

Where do you mention component in your post? You should work on your sentence contexts.
 
Karma Police said:
scooby_dooby said:
I've read that DVI and HDMI were mainly created to enable greater copyright protection for publishers, and that there's not much difference in actual visual quality.

Is this true?

Where do you mention component in your post? You should work on your sentence contexts.

oh I see what you're saying, without being taken in context with the title the message body is a little confusing, my bad
 
scooby_dooby said:
shaderguy said:
See http://www.audiolofftreport.com/dvi_hdmi.html for some of the problems of both connection technologies.

Nice article. Selected quotes:

"any device with DVI or HDMI outputs must carry a special digital anti-piracy code called HDCP (High-bandwidth Digital Content Protection) to thwart illegal copying. HDCP is mandated for both DVI and HDMI connections and uses an authentication protocol developed by computer chip-maker Intel. It involves the HD digital cable box or HD satellite receiver sending a kind of digital “hand-shakeâ€￾ to the HD receiving device (the TV display) to ensure that it's licensed to receive the HD content. These anti-copy codes are embedded in the digital video data stream and must be removed by the video processor in the HD display. Result? The display's internal video processor has to do extra digital “workâ€￾ with no benefits to picture quality, so possible degradation may take place. Using the analog component video outputs bypasses the anti-piracy codes, so the TV display doesn't have to deal with the extra processing. "

"when the digital video signal reaches the HD display in its native format, the digital TV still has to re-clock the digital video so that it exactly matches the TV's native resolution. And according to informed sources, re-clocking digital video may produce more picture degradation and video artifacts -- not less--than a good analog component-video connection"

This is a steaming load of rubbish. Especially the part about the "extra work" of copy protection degrading the signal.
 
The difference between DVI (HDMI) and an analog connection (component, S-Video, composite, VGA, etc.) is going to depend largely on a couple of factors. The first is the throughput of the cable, and the second is the interference that will cause the signal to degrade. Analog connections also depend on the quality of the digital-to-analog (and possibly analog-to-digital) converters in the devices at either end of the cable.

With a digital signal it is pretty much all or nothing. If there is too much interference the signal won't decode. In most cases, though, the signal arrives as a 1:1 representation of what was sent, meaning you have no loss of quality. This method does, however, impose a hard limit on how much data can be sent per second, so as you increase the resolution or color depth, the rate at which you can send frames decreases.
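
As a back-of-envelope sketch of that hard limit: single-link DVI tops out at a 165 MHz pixel clock, so you can estimate which modes fit. The ~20% blanking allowance below is an assumed round number for illustration, not a spec figure.

```python
# Back-of-envelope: how resolution and refresh rate eat into a digital link's
# hard bandwidth limit. Single-link DVI tops out at a 165 MHz pixel clock;
# the ~20% blanking allowance is an assumption, not a spec figure.

SINGLE_LINK_DVI_MHZ = 165.0
BLANKING_OVERHEAD = 1.20  # assumed rough allowance for horizontal/vertical blanking

def required_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock (MHz) needed to drive a given mode."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1280, 720, 60), (1920, 1080, 30), (1920, 1080, 60), (2560, 1600, 60)]:
    clk = required_pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= SINGLE_LINK_DVI_MHZ else "exceeds"
    print(f"{w}x{h}@{hz}Hz needs ~{clk:.0f} MHz -> {verdict} single-link DVI ({SINGLE_LINK_DVI_MHZ:.0f} MHz)")
```

With these rough numbers, 1080p60 just squeezes under the single-link ceiling, while something like 2560x1600 needs dual-link or a lower refresh rate.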

With analog cables, the quality of the picture depends on many factors, but it mostly comes down to shielding. First, the signal must be converted from digital to analog inside the computer/gaming device/etc. The DACs play a pretty large part in the quality of the image, especially when you are talking about pushing something like 1080p. Noise inside the computer can be a large source of signal degradation, so shielding inside the computer is very important. DACs also tend to be rated at a certain MHz, which basically says how much throughput they can handle (though this doesn't necessarily match what the video cable can do).

Next, the analog signal is sent over the cable to the display device, and again, a lot depends on shielding and how much interference there is. Lots of interference and poor shielding will result in a very noticeably bad image. Little interference and good shielding might result in a picture that is very similar to what you would see with DVI. Also, as you try to push more data over the wire (higher resolutions/color depths/refresh rates), the quality of the display will decrease. Analog connections have more of a "soft" limit on how much data can be sent: you can certainly try to send more data over an analog connection than the cable is rated for, but the signal will become more and more corrupted as the throughput rate is increased.

Finally, if any analog-to-digital conversion takes place (for LCDs/plasmas), the quality of that component will play a role as well.
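
To make that "more and more corrupted" behaviour concrete, here's a minimal toy model (not real video timing; the 700 mV full-scale level and Gaussian noise are just illustrative assumptions): pixel values go through a DAC, pick up interference on the cable, and get re-quantized at the display. The error grows smoothly with the noise level rather than falling off a cliff.

```python
# Toy model of the analog path: DAC -> noisy cable -> ADC at the display.
# The 700 mV full-scale level and Gaussian noise are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
line = rng.integers(0, 256, size=1920).astype(float)  # one scanline of 8-bit pixel values

def analog_roundtrip(pixels, noise_mv, full_scale_mv=700.0):
    """Convert pixels to voltages, add cable interference, re-quantize at the display."""
    volts = pixels / 255.0 * full_scale_mv                      # DAC in the source device
    volts += rng.normal(0.0, noise_mv, size=volts.shape)        # pickup from poor shielding
    return np.clip(np.round(volts / full_scale_mv * 255.0), 0, 255)  # ADC in an LCD/plasma

for noise_mv in (0.5, 2.0, 10.0, 50.0):
    received = analog_roundtrip(line, noise_mv)
    err = np.mean(np.abs(received - line))
    print(f"noise {noise_mv:5.1f} mV RMS -> mean pixel error {err:5.2f} out of 255")
```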

Just as a side note, some people on here were talking about DVI connections on CRTs. The primary reason to do this is that the CRT manufacturer can match the DAC to the CRT and remove many of the sources of interference. Essentially, everything that happens in the analog domain happens inside the CRT, where the manufacturer can control the environment. For a manufacturer that is trying to produce a very high quality picture, this is great because they will no longer be blamed if the picture is poor due to some low-quality DAC in the computer or a bad cable.

In practical terms, for 720p a VGA or perhaps a component connection is probably going to be good enough most of the time. For 1080p, I would start to be worried about VGA. BNC cables would be better, but as those are pretty much a dying breed, I'd definitely want something like DVI. DRM sucks though. There should be a rule that manufacturers can't encrypt data they don't own.

Nite_Hawk
 
This is a bit off-topic as well, but I'd like to mention that spending money and getting some good quality cables makes a difference in picture quality.

I know that some engineering-type people like to live in a theoretical la-la land and insist that cables make no difference, but I have seen it with my own eyes.

In 2003 I had a 2k Pioneer DVD player and was using the RGB SCART output connected to my Loewe TV.
I tried a few SCART cables, and the Chord brand in particular has a pure silver cable that gave a picture with sharper detail and deeper colour than the stock cord by a long shot.

Then with my Xbox there is a brand called IXOS who make a great gold-plated SCART cable that has a much better picture than the stock cords. I'm sure their component cables would be good as well.
 
Agisthos said:
This is a bit off-topic as well, but I'd like to mention that spending money and getting some good quality cables makes a difference in picture quality.
It only applies to analog interfaces (not applicable to DVI/HDMI)
 
one said:
Agisthos said:
This is a bit off-topic as well, but I'd like to mention that spending money and getting some good quality cables makes a difference in picture quality.
It only applies to analog interfaces (not applicable to DVI/HDMI)

A lot of companies have already started making specialty HDMI cables and claim they make a difference with picture quality.

Now that I would like to see. I know HDMI is purely digital, but if there is any truth to the matter it would be interesting.
 
The only potential issue with a digital signal is how it is clocked. If the link carries a clock signal that can be used to decode the bits, then all should be fine. If, however, the clock has to be derived from the data signal itself, it's possible to introduce errors such as jitter. DVI (from which HDMI is derived) shouldn't suffer from this, but SPDIF does.
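
As a toy sketch of that clock-recovery point (purely illustrative numbers, not a model of any real interface): if the receiver has to guess where each bit's centre is, nothing goes wrong until the timing error exceeds half a bit period, and then samples start landing in neighbouring bit cells.

```python
# Toy model: a receiver sampling a bitstream with a jittery recovered clock.
# Nothing fails until the peak timing error exceeds half a bit period (0.5 UI).
import random

random.seed(1)
bits = [random.randint(0, 1) for _ in range(10_000)]

def errors_with_jitter(jitter_ui):
    """Sample each bit at its centre +/- a uniform timing error (in unit intervals)."""
    errs = 0
    for i, b in enumerate(bits):
        sample_time = i + 0.5 + random.uniform(-jitter_ui, jitter_ui)
        idx = min(max(int(sample_time), 0), len(bits) - 1)  # which bit cell we actually sample
        errs += (bits[idx] != b)
    return errs

for jitter in (0.1, 0.3, 0.45, 0.6, 0.8):
    print(f"peak jitter {jitter:.2f} UI -> {errors_with_jitter(jitter)} bit errors out of {len(bits)}")
```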

I would still maintain that the cable won't play much of a part in that; it's the quality of the components at either end, which encode and decode the signal, that matters. Audiophiles tend to disagree - but they're a funny bunch...

Obviously with analog, "errors" are a pretty subjective thing, and everything you do to the signal (even placing other electrical items nearby!) will have some kind of effect on it; it's just a matter of opinion as to which improve a picture and which don't. With digital there simply won't be any effect until there is enough degradation to change the bits - at which point the failure will probably be exceedingly obvious, and in most cases catastrophic.
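
Here's a small sketch of that digital "cliff" (toy +/-1 V levels, not real TMDS signalling): noise on a binary signal has no effect at all until it's large enough to push symbols across the decision threshold, and then the error rate shoots up quickly.

```python
# Toy "digital cliff": noise on a binary signal does nothing visible until it
# starts pushing symbols across the decision threshold, then errors pile up fast.
# The +/-1 V levels are illustrative, not real TMDS signalling.
import numpy as np

rng = np.random.default_rng(2)
bits = rng.integers(0, 2, size=100_000)
tx = bits * 2.0 - 1.0  # send each bit as +1 V or -1 V

print("noise RMS (V) | bit error rate")
for sigma in (0.1, 0.3, 0.5, 0.7, 1.0):
    rx = tx + rng.normal(0.0, sigma, size=tx.shape)
    decoded = (rx > 0).astype(int)
    print(f"    {sigma:4.1f}      | {np.mean(decoded != bits):.5f}")
```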

I would only pay more for a cable that was going to be particularly long - a better quality cable will probably be capable of transmitting a signal accurately over a longer distance.
 
Guden Oden said:
It's still rubbish tho that DVI was invented mostly for the purpose of implementing CP... :D

DVI with HDCP was implemented on consumer electronics because of copy-protection concerns. There was an alternative in the form of Firewire with DTCP, and Mitsubishi sold a lot of sets with those.

But Hollywood was reportedly not content with Firewire/DTCP because, for one thing, DVI carries uncompressed signals, which makes recording them expensive.

Few HDTV sets have Firewire now, which is a shame because you can control other devices in a Firewire network.
 