Component video and 1080p

Endymion

Newcomer
All right, hopefully this is the proper place in this forum for a thread such as this, but here goes.

Recently on another forum someone asked about the quality of 1080p over component video and I chimed in with my experience, which I can state very simply: it works, but there is a faint ghosting to it. I did caveat my statement as being my own experience, appropriately enough. However, I have two plasma TVs that I've seen this issue with, and I've seen similar issues with at least 5 other setups.

I would also like to make clear that I understand the following already:

1. That 1080p via component is a more recent addition in the past few years,
2. That some HDTVs still sold today do not support 1080p via component,
3. That many-or-most older HDTVs did not support it in years past,
4. That more and more sets are supporting 1080p via component now,
5. And that both the PS3 and XBox 360 have had very functional 1080p via component for quite some time now.

In other words, I get that situations can vary, and that you should make certain that your set in question actually supports 1080p via component from the manufacturer before making a blanket statement. Some consumers will have the opportunity to use 1080p this way, and some won't.

However, after seeing this issue with my own plasma sets, I went to HDMI for the latest consoles and never looked back, mostly due to the issue that I noticed as soon as I attempted 1080p via component. I tried higher quality cables, I tried Monster cables, I tried Monoprice cables, I tried brand-x cables, I tried cables that came with OEM equipment. I tried with switchboxes, without switchboxes, through HT equipment, and directly to the TV. I still had a very faint ghosting/blur to the image. This just grated on me, so I left component behind. At 1080i there was no issue, at 720p there was no issue, at 480i/p there was no issue. Switching a source to 1080p gave this issue. Having seen it on other people's TVs as well, it only stood to reason from my observation that 1080p has issues with component cables.

As I had solved my problem by ditching component cables for 1080p use, however, I didn't really waste a great deal of time pondering the matter afterwards. Until last week, of course, at which point the chorus of chants that perfectly functional component cables, equipment, and TVs should carry a 1080p signal perfectly just started to grate on me as well. So I started googling for some solid information on how it is that my experience happened to be as it was.

The best information that I have found so far is here:

1920x1080 at 60fps is about 124MHz. It is not the cable itself, but the RCA connectors. RCA connectors were designed for baseband video (~10MHz). HD component video is still within their realm, but 1080p is outside what the connectors are able to do. If you use BNC connectors then no problem. I don't know of any component video devices that use BNC, maybe the professional stuff. That is another thing Monster Cable does not tell you. You can shield that RG6 and sweep test it to 3GHz all you want. When you mate it to crappy RCA connectors then that is your weak link. I wish the video industry had gone to BNC a long time ago.

This is intriguing stuff to me, having always been interested in image quality at the screen, and as it turned out, the plasma screens that I have actually have BNC connections for component. The above poster went on:

In 1080i you get 1920x540 for the first set of lines then another 1920x540 for the second set of lines. Those two fields are drawn on the screen in alternating fashion to make one frame. 1080i is 30fps video. 1080p is 60fps. 1080p requires double the bandwidth of 1080i. Remember, the cable is not the limiting factor, the RCA connector is.

Refer to the calculations below:

SD - 640x525x30 = 10.1MHz (only 480 lines are viewable)
ED - 640x480x60 = 18.4MHz
HD 720p - 1280x720x60 = 55.3MHz
HD 1080i - 1920x1080x30 = 62.2MHz
HD 1080p - 1920x1080x60 = 124.4MHz
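
Just to check his arithmetic myself, those figures are plain width x lines x frames-per-second products (so they ignore blanking intervals, which real-world formats include). A quick sketch to recompute them:

```python
# Recomputing the quoted figures: each is just width x lines x frames
# per second, which ignores blanking intervals, so these are approximate.
def pixel_rate_mhz(width, lines, fps):
    return width * lines * fps / 1e6

modes = [
    ("SD",        640,  525, 30),
    ("ED",        640,  480, 60),
    ("HD 720p",  1280,  720, 60),
    ("HD 1080i", 1920, 1080, 30),
    ("HD 1080p", 1920, 1080, 60),
]
for name, w, l, fps in modes:
    print(f"{name}: {pixel_rate_mhz(w, l, fps):.1f} MHz")
```

They all come out matching the quoted list, so at least the multiplication is sound.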

This all certainly seems plausible. I've dealt with cables of varying quality for many broadcasts, composite, component, or even audio, and quite frequently, when I had issues with a particular cable, it wasn't a crimp in the cable itself but soldering work or frays and such nearer the connector, under its hood. But I wondered just where he got some of this information from, with respect to RCA's bandwidth limitation and so forth. Still, the big idea behind this information would seem to confirm what I had seen: that the cabling is fine but that the connector is behind the times. After all, it is pretty old.

However, I'm not sure if this final post was meant to invalidate this or what, as it is worded rather densely.

Just to keep it factual - since no one else is correcting you, I guess I'll have to jump back in - my post which you quoted explained the reason why the analog signal is half the pixel rate - and not 148.5 MHz as the #5 post stated (and I typo'd to 184 in mine) - but just to show I'm not the one confused, here's an online calculator http://www.myhometheater.homestead.c...alculator.html that also computes the bandwidth requirement (not the highest frequency) as 186.6 MHz (3x the highest frequency - see their definition below of bandwidth (a variably defined engineering term) vs highest frequency (a physical quantity) -

which means they compute the highest analog frequency used as 62.2 MHz (half your figure) - as I did - His opinion that the whole video "reproduction chain" must pass 3x this frequency at less than 3dB loss to not degrade the signal seems to be from marketing hype - video transmission engineering would IMO specify 100MHz (6dB per octave rolloff) as the bandwidth spec required for adequate PQ

So just when I've stumbled across what seems to be "the answer," I'm not so certain. Does this make sense to the more technically minded here? I can't honestly tell if the second poster is saying "no, because analogue signals are half the pixel rate, component is fine for 1080p," or if he is agreeing with the first and tacitly referring to the 1080i figure, and chiding the exotic cable [strike]rip-off artists[/strike] manufacturers for their high bandwidth claims. Somehow, just seeing these figures, it seems a little surprising that anything beyond 480p works properly without ghosting over component cables. I would sit down and try to work out the math on this myself but I am just not sure where they are getting this from.
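
For what it's worth, here's the arithmetic as I understand the second poster's reading. The assumption (mine, from reading him) is that the highest analog frequency is half the pixel rate, because the worst case, alternating light/dark pixels, completes one full cycle every two pixels:

```python
# My attempt to follow the second poster's math. Assumption: the highest
# analog frequency is half the pixel rate, since the worst case of
# alternating light/dark pixels completes one cycle per two pixels.
pixel_rate = 1920 * 1080 * 60              # pixels per second for 1080p
highest_freq_mhz = pixel_rate / 2 / 1e6    # ~62.2 MHz, his figure
calculator_bw_mhz = 3 * highest_freq_mhz   # ~186.6 MHz, the calculator's "bandwidth"
print(highest_freq_mhz, calculator_bw_mhz)
```

That would reproduce both his 62.2 MHz figure and the calculator's 186.6 MHz "bandwidth," which at least makes the two posts consistent with each other.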

Last minute edit: I also understand that AACS limits Blu-ray to 1080i over component as a matter of content protection, but that is beside this issue; I am just trying to isolate a specific reason why 1080p would ghost/blur at all via component.
 
Uh...why is an RCA connector unable to carry as high a frequency as a BNC connector?

It's not going to be the connector, but the coaxial cable that makes the difference here. BNC connects to coax with an inner copper wire (or bundle), a dielectric, and an outer shield.
 
Uh...why is an RCA connector unable to carry as high a frequency as a BNC connector?

It's not going to be the connector, but the coaxial cable that makes the difference here. BNC connects to coax with an inner copper wire (or bundle), a dielectric, and an outer shield.

Well, as it happens, that is the question that I am here to ask! Why is it that the cable is more important than the connector? I mean, I'm not being childish by turning your answer around, I'm asking a serious question about it.

At any rate, after seeing more than half a dozen setups, all of which had the same issue at 1080p+Component, I just don't think it is an issue with the source or the screens in question, particularly when the same sources and screens operate perfectly at either lower resolutions or with HDMI. It would have to be an issue with the cable in some way, be it the connector or the cable itself. If RG6 cabling can handle 3GHz, but 1080p requires over 100MHz, and you are seeing ghosting at 1080p, that would seem to indicate an issue with the cable, wouldn't you think?
 
Absolutely it could be the cable (but not the connector IMHO).

RCA wires have signal and ground side by side with no shielding and no quality dielectric. Coaxial cables have signal in the center, a large dielectric, and then ground on the outside acting as a shield. What's more, if your ground picks up any noise, the magnetic component completely cancels at the center of a coaxial wire. If you want to do an experiment, wrap aluminum foil around the RCA wires and connect it to ground (chassis or a sink or something), and if the ghosting lessens you have your answer!
 
The cable's length and core diameter (and shielding) will matter for signal transmission and noise. How long are the cables you are using?
 
If you want to do an experiment, wrap aluminum foil around the RCA wires and connect it to ground (chassis or a sink or something), and if the ghosting lessens you have your answer!

I had thought of ways to try and test for interference before, but what about seeing this only at 1080p? Wouldn't interference be a problem at any resolution? To just chalk it down to interference still seems counter to how it is manifesting itself. There is no static, no rolling lines (a la ground loop), just a faint, ever so faint blur, slightly offset from the image as it should be displayed, maybe only a pixel or two. I can totally understand how some, perhaps not as anal as I am about this, might not see it without being told to look for it first. It was night and day to me the first day that the 360's 1080p firmware update was available, however.

The cable's length and core diameter (and shielding) will matter for signal transmission and noise. How long are the cables you are using?

The cables I use with my own TVs are 3 and 6 feet, from Monoprice, coupled through my HT equipment to switch devices, so I doubt the length was at issue. I even tried connecting the consoles' cables directly, to eliminate either length or the HT, to the same effect. That's just my setups though; I know less about the others whose game rooms I've witnessed with the same issue.
 
I had thought of ways to try and test for interference before but what about seeing this only at 1080p? Wouldn't interference be a problem at any resolution?

The higher bandwidth required for the signal to have an acceptable S/N ratio would make it a fair bit more sensitive, I would think.
 
Noise is typically referenced in terms of nanovolts per root Hz. The larger your frequency band, the more noise.
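
A rough sketch of what that means for doubling the bandwidth from 1080i to 1080p (the noise density figure here is made up, purely illustrative):

```python
import math

# Total RMS noise scales with the square root of bandwidth, given a flat
# noise density in nV/sqrt(Hz). The density value below is hypothetical.
density_nv = 10.0     # made-up noise density, nV/sqrt(Hz)
bw_1080i = 62.2e6     # Hz, the highest-frequency figure for 1080i
bw_1080p = 124.4e6    # Hz, double that for 1080p

noise_i_uv = density_nv * math.sqrt(bw_1080i) / 1000   # microvolts RMS
noise_p_uv = density_nv * math.sqrt(bw_1080p) / 1000

print(f"ratio: {noise_p_uv / noise_i_uv:.3f}")  # sqrt(2), about 1.414
```

So doubling the bandwidth buys you roughly 41% more integrated noise, all else equal.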
 
You could try twisting your RCA cables (a la twisted-pair network cabling), as this reduces noise and interference as well...
 
Interesting, I haven't heard of this before. I'm using standard (Microsoft) component cables and don't have this issue. The image on my 46" set looks pretty much identical to using HDMI, whether it's 1080p or 1080i.

One of the reasons I'm not in a hurry or worried about replacing my launch X360 with a newer version with HDMI. Although the power savings would be nice.

Then again I don't have many sources for electrical interference near my set.

Regards,
SB
 
I can't tell the diff either on my 50" plasma...but then I only run component from the DVR to the receiver and then HDMI from there.
 
Interesting, I haven't heard of this before. I'm using standard (Microsoft) component cables and don't have this issue. The image on my 46" set looks pretty much identical to using HDMI, whether it's 1080p or 1080i.

Yes . . . I even said this as well. It looks pretty much identical. The "pretty much" part would indicate the ghosting.

Then again I don't have many sources for electrical interference near my set.

Nor do I. And mind you, I did mention I have seen this on other sets, right? A total of 7 sets, including my two plasmas, have all had this issue with 1080p+Component. I just don't think it's likely that interference or faulty cabling would, purely by chance, have occurred in every instance I've seen and manifested itself in the same way. Is that much of my reasoning at least making sense to everybody? I can't be the only person seeing this.
 
In the professional world BNC connectors are always used; if you have something with the classic RCA/phono connector, you don't solder an RCA/phono connector onto the cable, you use an adapter so you don't mess up the impedance.
 
Yes . . . I even said this as well. It looks pretty much identical. The "pretty much" part would indicate the ghosting.

Well, when I said "pretty much" the same, I meant that I could not tell the difference with my naked eye. I haven't seen the ghosting mentioned, but then I only know of 2 other people besides myself that use component cables for 1080p. Everyone else I know uses HDMI.

So I don't doubt that you all are seeing it, I'm just wondering what's causing it.

Regards,
SB
 