VGA bandwidth conversion?

kyleb

Veteran
I'm looking to understand how Ethernet-cable video extenders transfer high-resolution VGA over network cable, but I am at a loss to understand how to convert between the MHz of VGA and the Mbps of Ethernet. For instance, this extender claims 212 MHz of bandwidth supporting resolutions up to 1600x1200@85Hz; but does anyone know the math behind what that comes out to in Mbps over the Ethernet connection?
 
The data rate for Ethernet (100 Mbps) is simply a function of the quality rating of the cable--100 MHz over 100 meters at whatever signal-to-noise ratio (dB) the specification requires.

VGA is an analog signal, so a data rate (Mbps) doesn't really apply. I don't know for certain what "VGA bandwidth" is supposed to mean--it sounds like a marketing figure to me--but my guess would be that they've combined the bandwidth required for the signal (about 163 MHz for 1600x1200 at 85Hz) with the overhead needed to compensate for the reduced signal-to-noise ratio, since both the signal frequency (163 MHz) and the distance (130m) are outside normal Cat5 specs (100 MHz and 100m respectively). I'm also guessing that if you actually buy that, they will recommend using shielded (STP) Cat5, Cat5e, or possibly even Cat6 if you really want to come close to their advertised maximum capability.
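For what it's worth, here's a quick Python sketch of where that ~163 MHz figure comes from. It counts active pixels only, so the real dot clock (which includes blanking intervals) would be higher:

```python
# Active-pixel rate for 1600x1200 at 85 Hz (no blanking included).
h_active, v_active, refresh_hz = 1600, 1200, 85

active_pixel_rate = h_active * v_active * refresh_hz   # pixels per second
print(f"{active_pixel_rate / 1e6:.1f} MHz")            # ~163.2 MHz
```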
 
kyleb said:
does anyone know the math behind what that comes out to in Mbps over the Ethernet connection?
I don't think there is any Ethernet involved at all. I think the Cat5 cable and the RJ45 connector have misled you. Nowhere in the product's specifications is Ethernet mentioned, with the exception of "ethernet infrastructure", and that should probably be taken to mean the cabling only; you won't be hooking this thing up to an Ethernet switch or router.

Cat5 is 4 twisted pairs, so that should suit RGB + sync nicely. The longer supported distance (130 meters versus Ethernet's 100m) can probably be explained by higher signal tolerances. I don't see any mention of the signal being digitized before transmission between the transmitter and receiver, so I think it's safe to assume it is analog.
 
I'm not looking to buy one of these, just trying to understand the science behind it. In that respect, I suppose VGA wasn't the right format to discuss, but rather a DVI extender like this. So keeping it in the digital realm, can anyone here explain how they get 1080p over an Ethernet cable?
 
wireframe said:
I don't see any mention of the signal being digitized before transmission between the transmitter and receiver, so I think it's safe to assume it is analog.
When analog VGA turns into crap when sent through a 13-meter cable with all kinds of shielding (often metal foil around the cable core, aluminium braid over that, and the primary signal lines as individual coax), it's supposed to go 130 meters over standard unshielded TP...? No, I don't think so. :D It most likely has to be digital, or else it'll be analog circuitry from hell keeping signal degradation and interference out... I don't believe it to be possible.
 
Yeah, I was assuming that the VGA extender was converting to digital for the same reasons. But I'm still curious as to how much bandwidth such a device would use to send completely uncompressed video. I'd think it would be as simple as, say for 1080p60, 1920x1080x24x60, but that comes out to far more than what I'd expect to be able to send over any copper network cable, so what gives?
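Here's the arithmetic I had in mind, as a quick Python sketch (active pixels only, ignoring blanking and any link encoding overhead):

```python
# Uncompressed 1080p60 at 24 bits per pixel, active pixels only.
width, height, bits_per_pixel, fps = 1920, 1080, 24, 60

bits_per_second = width * height * bits_per_pixel * fps
print(f"{bits_per_second / 1e9:.2f} Gbps")   # ~2.99 Gbps
```

That's roughly 3 Gbps, which is why I'm confused about how it fits over a single Cat5 run.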
 
Guden Oden said:
When analog VGA turns into crap when sent through a 13-meter cable with all kinds of shielding (often metal foil around the cable core, aluminium braid over that, and the primary signal lines as individual coax), it's supposed to go 130 meters over standard unshielded TP...? No, I don't think so. :D It most likely has to be digital, or else it'll be analog circuitry from hell keeping signal degradation and interference out... I don't believe it to be possible.
Look at what you just wrote. What do you think the problem is, the cable or the transmitter/signal?
 
This looks like what is called a "balun." They convert an unbalanced signal (the analog VGA) into a balanced one (the signal on the Cat5). For us computer people, balanced is the same thing as a differential signal.

The "212mhz of VGA Bandwidth" is just a measure of the dot clock as the VGA signal is being scanned out. Remember the days of video cards boasting about "350mhz RAMDACs?" That's the same number there. I haven't run the numbers in a long time, not since I was using Xfree86 2.x to drive old fixed scan monitors, but you can't work out the numbers without knowing about how the VGA signal is being driven durning horizontal and vertical refresh. 212mhz isn't to much to ask from Cat5 anyways, we get 200mhz out of just half the pairs for 100mbit ethernet. Cat5 is capable of a lot more than most people expect, and for the price of those boxes, they probably have nice chunky amplifiers in them.

I'm running 50 meters of separate coax to a 42" LCD at 1366x768 with an amplifier, and it looks as good as running locally. It's very possible to make these kinds of runs look good if the signal is strong enough. It's just hard to include a lot of signal drive on the actual video card, and often the output stage on PCs is thrown together pretty poorly. I've always used Matrox cards or Quadros for stuff like this.
 