The difference between DVI (or HDMI) and an analog connection (component, S-Video, composite, VGA, etc.) largely comes down to a couple of factors. The first is the throughput of the cable, and the second is the interference that degrades the signal along the way. Analog connections also depend on the quality of the digital-to-analog (and possibly analog-to-digital) converters in the devices at either end of the cable.
With a digital signal it is pretty much all or nothing. If there is too much interference the signal won't decode, but in most cases the signal arrives as a 1:1 representation of what was sent, meaning you have no loss of quality. The trade-off is a hard limit on how much data can be sent per second, so as you raise the resolution or color depth, the maximum frame rate you can send drops.
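To get a feel for that hard limit, here's a rough back-of-the-envelope sketch in Python. It estimates the pixel clock a mode needs and compares it to single-link DVI's 165 MHz ceiling; the flat 25% blanking overhead is an illustrative assumption (real CVT/CEA timings vary by mode), so treat the numbers as ballpark only.

```python
# Rough check of how close a video mode comes to single-link DVI's
# 165 MHz pixel-clock ceiling. The 25% blanking overhead is an
# assumption for illustration; actual timings differ per mode.

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock (MHz) needed, including blanking intervals."""
    total_pixels_per_frame = width * height * (1 + blanking_overhead)
    return total_pixels_per_frame * refresh_hz / 1e6

DVI_SINGLE_LINK_MHZ = 165  # hard limit of the link

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    clk = required_pixel_clock_mhz(w, h, 60)
    verdict = "fits within" if clk <= DVI_SINGLE_LINK_MHZ else "exceeds"
    print(f"{name} @ 60 Hz needs ~{clk:.0f} MHz pixel clock ({verdict} single-link DVI)")
```

The point is simply that once the required pixel clock exceeds what the link can carry, you don't get a fuzzier picture; you have to drop the refresh rate, resolution, or color depth to fit.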
With analog cables, the quality of the picture depends on many factors, but it mostly comes down to shielding. First, the signal must be converted from digital to analog inside the computer/gaming device/etc. The DAC plays a pretty large part in the quality of the image, especially when you are pushing something like 1080p. Electrical noise inside the computer can be a large source of signal degradation, so shielding inside the case is very important. DACs also tend to carry a MHz rating that basically says how much throughput they can handle (though this doesn't necessarily match what the video cable can do).

Next, the analog signal is sent over the cable to the display device, and again a lot depends on shielding and how much interference there is. Lots of interference and poor shielding will produce a very noticeably bad image; little interference and good shielding might give you a picture very similar to what you would see with DVI. Also, as you try to push more data over the wire (higher resolutions, color depths, refresh rates), the quality of the display decreases. Analog connections have more of a "soft" limit on how much data can be sent: you can certainly try to send more than the cable is rated for, but the signal becomes more and more corrupted as the throughput goes up. Finally, if any analog-to-digital conversion takes place (for LCDs/plasmas), the quality of that component plays a role as well.
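Here's a sketch of that "soft limit" idea, again just illustrative. It uses the common rule of thumb that resolving alternating light/dark pixels takes roughly half the pixel clock in analog bandwidth per color channel, and compares that to assumed cable bandwidths; both the 25% blanking overhead and the cable figures are made-up examples, not measured specs.

```python
# Illustrative sketch of the analog "soft" limit: estimate the per-channel
# analog bandwidth a mode needs (~half the pixel clock, one cycle per
# light/dark pixel pair) and compare it to ASSUMED cable bandwidths.
# The blanking overhead and cable numbers are assumptions for illustration.

def needed_analog_bw_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    pixel_clock = width * height * (1 + blanking_overhead) * refresh_hz
    return pixel_clock / 2 / 1e6  # one full cycle covers two pixels

ASSUMED_CABLE_BW_MHZ = {
    "cheap/poorly shielded VGA run": 70,    # hypothetical figure
    "well-shielded VGA or BNC run": 200,    # hypothetical figure
}

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    need = needed_analog_bw_mhz(w, h, 60)
    for cable, bw in ASSUMED_CABLE_BW_MHZ.items():
        verdict = "should stay sharp" if bw >= need else "still displays, but visibly softer"
        print(f"{name} @ 60 Hz over {cable}: needs ~{need:.0f} MHz, cable ~{bw} MHz -> {verdict}")
```

Unlike the digital case, exceeding the bandwidth here doesn't make the picture fail outright; detail just smears out progressively, which is exactly the soft degradation described above.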
Just as a side note, some people here were talking about DVI connections on CRTs. The primary reason to do this is that the CRT manufacturer can match the DAC to the CRT and remove many of the sources of interference. Essentially, everything that happens in the analog domain happens inside the CRT, where the manufacturer can control the environment. For a manufacturer trying to produce a very high quality picture this is great, because they will no longer be blamed if the picture is poor due to a low quality DAC in the computer or a bad cable.
In practical terms, for 720p a VGA or perhaps a component connection is probably going to be good enough most of the time. For 1080p, I would start to be worried about VGA. BNC cables would be better, but as those are pretty much a dying breed, I'd definitely want something like DVI. DRM sucks, though. There should be a rule that manufacturers can't encrypt data they don't own.
Nite_Hawk