DVI vs. Component - Image Quality?

I've read that DVI and HDMI were mainly created to enable greater copyright protection for publishers, and that there's not much difference in actual visual quality.

Is this true?
 
scooby_dooby said:
I've read that DVI and HDMI were mainly created to enable greater copyright protection for publishers, and that there's not much difference in actual visual quality.

Is this true?

Well, depending on who you talk to, the difference can be quite large, but it seems the vast majority of users can't really tell.

One thing to remember is that displays might have a "preference" for either input, so on one TV the HDMI might look much better than the component input, simply because the TV itself favours that one... (much like some old TVs being crap with S-Video)

A couple of Samsung displays here in the UK have noise on the HDMI input, for example, for some strange reason. Guess we get the duds here in the UK anyway, so I'm not surprised.

Theoretically speaking, the all-digital nature of HDMI/DVI should give the best results.

There should be no difference whatsoever between DVI and HDMI signals.

Hope this helps.
 
scooby_dooby said:
I've read that DVI and HDMI were mainly created to enable greater copyright protection for publishers, and that there's not much difference in actual visual quality.

Is this true?

Not quite... DVI and HDMI are mainly just digital interconnects for your TV. DVI itself has nothing to do with copy protection; however, most *TV* implementations incorporate HDCP as well. HDCP is built into the HDMI spec.

Theoretically speaking, the all-digital nature of HDMI/DVI should give the best results.

For DVD-type content the main benefit is a more stable signal. Since DVDs are basically YUV content, you only incur the YUV->RGB conversion once, either on the TV or on the DVD player. In the case of the latter it's usually because the DVD player is doing some processing on the video stream in RGB color space (e.g. many DVD players with DVI or HDMI do scaling only on the digital ports and are likely doing their processing in RGB).

For consoles, the content is largely RGB, so the digital interfaces are definitely an improvement: currently games are typically converted from RGB->YUV, sent over composite/S-Video/component, then converted back again to RGB for display on the TV...
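
Purely to illustrate that round trip, here's a minimal Python sketch, assuming BT.601 full-range coefficients and 8-bit rounding at each stage (real players and displays differ in the exact matrices, ranges and chroma subsampling they use):

[code]
# Rough sketch of the RGB -> YUV -> RGB round trip described above.
# Assumes BT.601 full-range coefficients and 8-bit rounding at each
# stage; real players and TVs differ in matrices, ranges and subsampling.

def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128
    return [round(max(0.0, min(255.0, c))) for c in (y, u, v)]

def yuv_to_rgb(y, u, v):
    r = y + 1.402 * (v - 128)
    g = y - 0.344 * (u - 128) - 0.714 * (v - 128)
    b = y + 1.772 * (u - 128)
    return [round(max(0.0, min(255.0, c))) for c in (r, g, b)]

original = (255, 0, 0)                    # a saturated red from a game's RGB framebuffer
round_trip = yuv_to_rgb(*rgb_to_yuv(*original))
print(original, "->", tuple(round_trip))  # (255, 0, 0) -> (254, 0, 0)
# Each extra conversion stage adds a little more rounding/clipping error.
[/code]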
 
scooby_dooby said:
I've read that DVI and HDMI were mainly created to enable greater copyright protection for publishers, and that there's not much difference in actual visual quality.

Is this true?
No, and I'd like to know where you read rubbish like that.

First off, DVI has no copyright protection measures whatsoever, and the reason it came into being was to create a standard for connecting PCs to high-resolution digital flat panels - something that had been sorely lacking before that. While HDMI does have copy protection crap, that's hardly the only reason it was invented. For starters, the cable carries audio as well, and it is generally more suitable for home theatre use: the connector is MUCH smaller, it has no thumb screws, and if it's pulled out accidentally it won't damage the hardware. Besides, no normal person is going to notice the copyright crap with HDMI anyway; it will only be a problem for those who want to make digital copies by connecting the video-out of their DVD player to some kind of recording device... As long as the video goes to a screen or projector of some sort (as in at least 99.999% of all cases), it might just as well not have been encrypted at all.

And yes, there is a quality difference too. With analog connections the signal always degrades more the higher the frequency, which leads to ugly stuff like ghosting, a fuzzy image, color bleed, etc. Digital signals are generally more robust to interference, and this is particularly true for twisted-pair low-voltage differential signals as used in DVI.

Consumer PC video cards are all capable of outputting very high pixel clocks, but the analog filters needed to support such high resolutions and refresh rates generally tend to be sorely lacking. Even at fairly moderate clocks (say up to around 1600*1200 @ 75Hz, which is no more than around 180MHz pixel clock; far from the 350MHz limit on today's hardware), fuzziness is easily visible in the output of many brands' products. Nvidia boards in particular have been infamous in this regard.
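
To put a rough number on that, here's a quick back-of-envelope Python sketch (the ~25% blanking overhead is just an assumed ballpark; actual VESA timings differ per mode):

[code]
# Back-of-envelope pixel clock estimate for 1600x1200 @ 75Hz.
# The ~25% blanking overhead is an assumed ballpark; real VESA timings
# (porches and sync widths) vary from mode to mode.

width, height, refresh_hz = 1600, 1200, 75
active_mhz = width * height * refresh_hz / 1e6   # visible pixels only
total_mhz = active_mhz * 1.25                    # add assumed blanking overhead

print(f"active: {active_mhz:.0f} MHz, with blanking: ~{total_mhz:.0f} MHz")
# active: 144 MHz, with blanking: ~180 MHz
[/code]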
 
One thing to remember is that displays might have a "preference" for either input, so on one TV the HDMI might look much better than the component input, simply because the TV itself favours that one... (much like some old TVs being crap with S-Video)

I know on my TV there is a huge difference in quality between component and S-Video (though the quality of the S-Video cable helps - I had one with good sharpness but too much color bleeding, then another with perfect colors but crap sharpness; maybe if I had bought a Monster cable I'd have both). Unfortunately, component on that set kind of sucks now; for some reason it has gained a green tint, and it's not the cables, because it happens with two sets of component cables I've tried.
 
It seems obvious there would be less room for error in sending a straight digital signal rather than converting it to analog at the source and then back to digital at the display. However, assuming the conversion is up to snuff on both ends, DVI wouldn't necessarily give you noticeably better image quality than component.
 
ninelven said:
Wrong. There is indeed DVI w/ HDCP.
It's not an announced feature on any video card I've ever heard of, so I guess you'll have to excuse my ignorance... :rolleyes:

It's still rubbish, though, that DVI was invented mostly for the purpose of implementing copy protection... :D
 
In practice, with good components and cables there is no difference in quality between component and DVI/HDMI signals.

See http://www.audiolofftreport.com/dvi_hdmi.html for some of the problems of both connection technologies.

Today all HDTV sets support component inputs. In the future all HDTV sets will probably support HDMI. However, it's not clear when (if ever) HDTV sets will drop support for component inputs. Just look at the PC monitor market - VGA and DVI inputs coexist on most PC LCD monitors.

One interesting edge case where analog technology gives you a better result than digital technology is when the sender and receiver of the signal can support slightly more data per second than the digital spec. So for example I could have 12-bit DACs on my video card and drive an analog CRT, and end up with a better picture using analog technology than I could with digital technology. (Because digital technology limits you to 8 bits per color channel. You could use a 10-bit-per-channel digital technology, but it doesn't currently exist.)
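
Just to illustrate the bit-depth point with plain arithmetic (not tied to any particular hardware):

[code]
# Distinct levels per color channel at a given bit depth; the point is
# simply that a 12-bit analog path can carry finer gradations than an
# 8-bit digital link.

for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} levels per channel")
# 8-bit: 256, 10-bit: 1024, 12-bit: 4096
[/code]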
 
There's not much difference at low resolutions, but as you start upping the resolution, the differences become more and more pronounced.
 
shaderguy said:
See http://www.audiolofftreport.com/dvi_hdmi.html for some of the problems of both connection technologies.

Nice article. Selected quotes:

"any device with DVI or HDMI outputs must carry a special digital anti-piracy code called HDCP (High-bandwidth Digital Content Protection) to thwart illegal copying. HDCP is mandated for both DVI and HDMI connections and uses an authentication protocol developed by computer chip-maker Intel. It involves the HD digital cable box or HD satellite receiver sending a kind of digital “hand-shakeâ€￾ to the HD receiving device (the TV display) to ensure that it's licensed to receive the HD content. These anti-copy codes are embedded in the digital video data stream and must be removed by the video processor in the HD display. Result? The display's internal video processor has to do extra digital “workâ€￾ with no benefits to picture quality, so possible degradation may take place. Using the analog component video outputs bypasses the anti-piracy codes, so the TV display doesn't have to deal with the extra processing. "

"when the digital video signal reaches the HD display in its native format, the digital TV still has to re-clock the digital video so that it exactly matches the TV's native resolution. And according to informed sources, re-clocking digital video may produce more picture degradation and video artifacts -- not less--than a good analog component-video connection"
 
scooby_dooby said:
shaderguy said:
See http://www.audiolofftreport.com/dvi_hdmi.html for some of the problems of both connection technologies.

Nice article. Selected quotes:

"any device with DVI or HDMI outputs must carry a special digital anti-piracy code called HDCP (High-bandwidth Digital Content Protection) to thwart illegal copying. HDCP is mandated for both DVI and HDMI connections and uses an authentication protocol developed by computer chip-maker Intel. It involves the HD digital cable box or HD satellite receiver sending a kind of digital “hand-shakeâ€￾ to the HD receiving device (the TV display) to ensure that it's licensed to receive the HD content. These anti-copy codes are embedded in the digital video data stream and must be removed by the video processor in the HD display. Result? The display's internal video processor has to do extra digital “workâ€￾ with no benefits to picture quality, so possible degradation may take place. Using the analog component video outputs bypasses the anti-piracy codes, so the TV display doesn't have to deal with the extra processing. "

"when the digital video signal reaches the HD display in its native format, the digital TV still has to re-clock the digital video so that it exactly matches the TV's native resolution. And according to informed sources, re-clocking digital video may produce more picture degradation and video artifacts -- not less--than a good analog component-video connection"

Reclocking? What if the signal is output at whatever resolution the TV uses natively? It sounds like they're talking about scaling the image (which I'd assume would still happen with an analog signal, unless they're talking about a CRT set, but for an LCD screen I'd imagine DVI would be superior).
 
When I was researching my new HDTV, I read a lot about the image quality difference between DVI and component. Apparently there really is little to no difference between the two visually.

However, as orfanotna mentioned, the more you increase the resolution, the more signal you lose. From what I've read, component is good up until you push the resolution past 720p. Once you get past that point you lose noticeable visual quality; before then no one can really tell.

I ended up getting a 34" HDTV widescreen CRT (great viewing angle, cheap, no ghosting). It's got component, DVI, & HDMI. Currently I'm using the component inputs and it looks great. Can't wait for a DVD player with HDMI/DVI though.
 
Half the people in this thread are talking about the difference between DVI and HDMI (there is no difference; it's essentially the same connection with a different pinout), and the other half about the difference between HDMI and component. There is a big difference with the latter.

HDMI is an all-digital connection from the laser in the player to the TV, while component relies on a digital-to-analog conversion (DAC) that degrades the quality of the signal.
The silly thing is that component converts the signal to analog, and the TV then converts the same signal back into digital again for display on plasmas and LCDs.
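
For what it's worth, here's a toy Python sketch of that digital -> analog -> digital round trip; the noise amplitude is invented purely for illustration, and real-world degradation depends on the cables, filters and converters involved:

[code]
import random

# Toy model of the component path into a fixed-pixel display: an 8-bit
# code goes through a DAC, picks up some interference on the cable, and
# is re-digitised by the TV's ADC. The noise amplitude is invented
# purely for illustration.

random.seed(1)

def dac_adc_round_trip(code, noise_lsb=1.5):
    analog = code / 255.0                                     # DAC: code -> normalised voltage
    analog += random.uniform(-noise_lsb, noise_lsb) / 255.0   # cable noise, in LSB units
    return max(0, min(255, round(analog * 255)))              # ADC in the TV: back to 8 bits

print([(c, dac_adc_round_trip(c)) for c in (10, 64, 128, 200, 250)])
# A pure digital link (DVI/HDMI) would hand the same codes through unchanged.
[/code]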

But then it gets more confusing in that some cheaper CRTs have a DVI/HDMI input, but that signal is actually converted to analog internally, thus negating the benefits of the all-digital connection (and thus you will not see much difference in the picture between HDMI and component on those sets).

Go read the reviews of all-digital televisions (plasma and LCD) and they will all say how much better HDMI looks than component.
 
I am just focusing on the title of the thread: DVI vs Component. Scooby doesn't help much by creating a thread title that is completely different from his initial question.

Back on topic: On HDTV CRTs & RPTVs, you will not be able to tell the difference between DVI/HDMI/Component, because the screen itself is an analog screen. (personal experience)

For LCD/Plasma, you might be able to tell, but not until you get past 720p (i.e. 1080p).
 
Karma Police said:
I am just focusing on the title of the thread: DVI vs Component. Scooby doesn't help much by creating a thread title that is completely different from his initial question.

I thought it was pretty clear.

Q: Is the actual image quality any better on DVI/HDMI vs. component?

I didn't put HDMI in the title because it wasn't necessary, since HDMI and DVI have the same quality. I was concerned with how that (digital) IQ compared to analogue IQ. I was not concerned at all with comparing HDMI to DVI.

From the sounds of it, I'm not worried about the lack of DVI or HDMI outputs on the X360. I'll be getting a 1080i CRT screen, so a good-quality component connection should be perfect.

Thx for all the great info guys.
 
On my 720p HDTVs, I can tell the difference. Component signals yield noise, especially noticeable around text or in monochromatic scenes. The longer the cable, the worse the effect. I bought horrendously expensive component cables to fix this, since I run cable over long distances. However, in the end I replaced everything with Cat6 cable and used a box which sends DVI data over Cat6. Cat6 is way cheaper than DVI cable, and you can distribute it through your whole house without much signal loss.
 