kyleb said:
Thing is, that is the case with many TVs; they only support a limited selection of digital signals, while analog signal support is generally not so limited.
I know this will sound weird to some, but I've always thought that the main difference between analog and digital (and maybe an advantage of analog in some cases) is that analog more or less always works, unless it's actually broken. You might get a crap signal sometimes, but at least it works, and you get all the in-betweens from best signal down to really crap signal. Much like analog broadcasts: sometimes you don't get the best reception, but you always get *a* signal, more or less watchable.
With digital, it's either on or off. There are no in-betweens. It's 1s and 0s: either you get a picture or you don't, with no grey shades in between.
So I keep that kind of idea/view in mind whenever I'm comparing analog to digital.
In this case, DVI has to comply with a set list of resolutions and refresh rates. Get one slightly wrong and you get no image at all.
With analog sources like component or VGA, you have more room to play around, so to speak. If something goes wrong you get overscan or a crap image, but you still get an image, which, depending on the case, might be acceptable or totally unwatchable. The main limit is the TV itself, which might simply refuse some signals even over analog, but that's down to its internal circuitry.
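Just to put the "graceful degradation vs cliff" idea above into a toy sketch (all numbers made up, purely illustrative, not how any real TV or decoder is implemented): analog picture quality falls off smoothly as the signal gets worse, while a digital decoder holds a perfect picture until the signal drops below some threshold, then gives you nothing.

```python
def analog_quality(snr_db: float) -> float:
    """Toy model: analog picture quality degrades smoothly with SNR.

    Returns a quality score between 0.0 (unwatchable) and 1.0 (perfect).
    The 30 dB "perfect picture" figure is an arbitrary assumption.
    """
    return max(0.0, min(1.0, snr_db / 30.0))


def digital_quality(snr_db: float, threshold_db: float = 15.0) -> float:
    """Toy model: digital is all-or-nothing (the "cliff effect").

    Above the decoder threshold you get a perfect picture; below it,
    nothing at all. The 15 dB threshold is likewise made up.
    """
    return 1.0 if snr_db >= threshold_db else 0.0


if __name__ == "__main__":
    for snr in (30, 20, 16, 14, 5):
        print(f"SNR {snr:2d} dB: analog={analog_quality(snr):.2f} "
              f"digital={digital_quality(snr):.2f}")
```

At 16 dB the digital picture is still flawless while the analog one is already noisy; at 14 dB the analog picture is still (barely) watchable while the digital one is gone entirely.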
Disclaimer:
I reserve the right to deny responsibility for all of the above. It's 5pm, I've been working all day and it's time to go home, so I might be just talking out of my ass... Don't even know what thread this is!! Oh dear...