By the way, how does VGA input on a PC monitor compare to HDMI input? Is the difference really striking?
The only possible answer is really that "it depends".
VGA is in and of itself a fairly high-frequency analog signal and is therefore more susceptible to degradation and interference than a digital signal, especially for high-res video at high refresh rates across long cable lengths. Now, the video resolutions and rates used for consoles aren't that fast, and the cables aren't that long. Still, the analog filter components in the signal path might affect sharpness in the final image, just as is the case on many PC graphics cards where manufacturers cut corners to reduce costs, leading to a slight blurring of fine detail. However, this is usually only pronounced at higher resolutions and/or refresh rates than those used for 720P HDTV, so it shouldn't be a problem here. Also, the coarser dot pitch of a CRT TV compared to a PC monitor should completely hide any minor blurriness in the signal anyway...
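To put "fairly high-frequency" in perspective, here's a quick back-of-the-envelope sketch (Python). The 720p60 timings are the standard CEA-861 ones; the PC-monitor comparison figures are approximate, just to show the order of magnitude:

```python
# Rough pixel-clock arithmetic for 720p60, using the standard CEA-861 timings:
# 1650 total pixels per line, 750 total lines per frame, 60 frames per second.
h_total = 1650   # 1280 active + horizontal blanking
v_total = 750    # 720 active + vertical blanking
refresh = 60     # frames per second

pixel_clock_hz = h_total * v_total * refresh
print(f"720p60 pixel clock: {pixel_clock_hz / 1e6:.2f} MHz")  # ~74.25 MHz

# For comparison, a PC CRT running 1600x1200 at 85 Hz (roughly 2160x1250
# total with blanking) needs a far faster analog signal path:
print(f"1600x1200@85 pixel clock: {2160 * 1250 * 85 / 1e6:.1f} MHz")  # ~229.5 MHz
```

So 720P sits well within the range where even a cost-reduced analog filter stage should cope.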
A more serious problem, however, is if the analog VGA video is being shown on a discrete-pixel device (an LCD, DLP or plasma screen, for example) and the TV's analog-to-digital converter can't lock on precisely to the pixel clock of the incoming video. As VGA lacks a pixel clock lead in the cable (high-res color LCDs and the like were unheard of when VGA was invented, and CRT monitors don't need one), the ADC is forced to make a "best guess", trying to find the signal peaks so that each video pixel can be mapped precisely to a screen pixel. Sometimes this process fails, leading to the image "crawling around" or becoming blurred.
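To see why a bad pixel-clock lock blurs things, here's a deliberately simplified toy model (Python). The "analog" scanline is idealized as one level per pixel, and `sample_line` / `phase_error` are just names made up for this sketch; a real signal is also low-pass filtered, which only makes the blending worse:

```python
def sample_line(source, phase_error):
    """Resample one scanline, reading the waveform at position i + phase_error
    (in source-pixel units). Linear interpolation stands in for the way a
    mis-locked ADC effectively reads a mix of two neighbouring pixels."""
    out = []
    for i in range(len(source)):
        pos = (i + phase_error) % len(source)
        left = int(pos)
        right = (left + 1) % len(source)
        frac = pos - left
        out.append((1 - frac) * source[left] + frac * source[right])
    return out

# Alternating black/white pixels: the worst case for a bad sampling phase.
line = [0, 255] * 8

print(sample_line(line, 0.0))  # perfect lock: the crisp 0/255 pattern survives
print(sample_line(line, 0.5))  # half-pixel error: everything averages out to grey
```

With a half-pixel phase error the alternating pattern collapses into uniform grey, which is exactly the kind of fine-detail smearing (or, if the error drifts, "crawling") you can see on a badly locked set.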
If the display is a discrete-pixel device that does 1:1 pixel mapping of the DVI/HDMI input (such as a 720P flatscreen LCD or plasma TV being fed a 720P signal), then that will be the "purest" image you can get. Of course, not all LCDs or plasma screens are created equal; some will have better brightness, contrast and color reproduction than others.
So again, it depends.
If the TV rescales the image, the result might not be as ideal; it depends on the scaler chip used and how the image is scaled. For example, the screen might not be natively 720P but rather, say, 768 lines vertically, and maybe even 1366 pixels across instead of the standard 1280. Some TVs may be native 720P devices but actually UPscale the image slightly so that the edges of the video signal literally extend off the screen and won't be displayed... This is of course not good at all.
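As a rough illustration of why those odd panel widths matter, here's a toy nearest-neighbour scaler in Python (1366 is just the common WXGA panel width; `nearest_neighbor_row` is a made-up name for this sketch):

```python
# Mapping 1280 source pixels onto a 1366-pixel-wide panel gives a scale factor
# of ~1.067, so there is no clean 1:1 mapping: a nearest-neighbour scaler has
# to show some pixels twice, while a smoothing (bilinear) scaler blends
# neighbours across the whole line instead.
SRC_W, DST_W = 1280, 1366

def nearest_neighbor_row(row):
    """Scale one row of SRC_W pixels up to DST_W pixels by nearest-neighbour sampling."""
    return [row[int(x * SRC_W / DST_W)] for x in range(DST_W)]

# A row that simply encodes each source pixel's index, so duplicates stand out.
row = list(range(SRC_W))
scaled = nearest_neighbor_row(row)

duplicates = sum(1 for a, b in zip(scaled, scaled[1:]) if a == b)
print(f"scale factor: {DST_W / SRC_W:.3f}")         # ~1.067
print(f"source pixels shown twice: {duplicates}")   # 86 of the 1280 source pixels
```

Neither option is great: duplicated pixels make the image look unevenly stretched, and blending makes it soft, which is why feeding a panel its exact native resolution looks so much cleaner.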
So in the end, it comes down to this: all things being equal, VGA and HDMI (DVI) should produce comparable results. But it depends.
I would personally choose HDMI over VGA if I had a flatscreen monitor; if it's a CRT, it probably doesn't matter all that much. Pick whichever gives you the best result.