Philips LCD 26PF5520D/10 HELP!!

leugeboy

Newcomer
Hi,

I have a Philips LCD 26" TV and have plugged the PC into the TV's DVI in using a VGA to SVGA lead (with a DVI adapter). The picture is superb when games are on etc. but not as sharp on the main desktop screen. I am using a reasonable 128 MB nVidia card that handles most games well and have the resolution set to 1024x768.

I don't have a DVI out on the PC graphics card. Don't get me wrong, the picture is good, and it may be a silly question, but are these settings correct, and is there a particular reason why the desktop screen is not quite as sharp?

I will eventually get a card with a DVI out so I can connect the PC directly. Is there a huge difference between what I have done and how it would be with a straight DVI-to-DVI connection?

Any advice would be very helpful.

Many thanks. Martin.
 
leugeboy said:
and have plugged the PC into the TV's DVI in using a VGA to SVGA lead (with a DVI adapter).
There's not really any such thing as "SVGA"; that's just an ancient marketing term from the days when resolutions generally topped out around 800*600 and such. You have an ordinary VGA cable attached to your DVI adapter. :)

The picture is superb when games are on etc. but not as sharp on the main desktop screen.
You should have the same sharpness of graphics both in-game and on the desktop, all other factors remaining the same. It's likely you just don't notice any fuzziness because 3D games use antialiasing and texture filtering, which inherently 'fuzz things up' a bit and tend to hide sharp contrasts.
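
To see what that 'fuzzing up' looks like in numbers, here's a small Python sketch of a hard edge with and without 4x supersampling (the sample count and values are made up purely for illustration):

```python
# A hard black/white edge, rendered with and without antialiasing.
# With AA, edge pixels average several sub-samples, so the razor-sharp
# contrast becomes an intermediate grey; there's less left to look sharp.

def supersample(edge_pos, px, samples=4):
    """Average `samples` sub-samples across pixel `px` against an edge."""
    hits = sum(1 for s in range(samples)
               if px + (s + 0.5) / samples >= edge_pos)
    return round(255 * hits / samples)

edge = 2.3  # the edge falls partway through pixel 2
print("no AA:", [255 if px >= edge else 0 for px in range(5)])
print("4x AA:", [supersample(edge, px) for px in range(5)])
```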

and have the resolution set to 1024x768.
Is this really the native resolution of your LCD panel, though? Most panels these days are 1280*1024, so if you run it at a lower resolution the image will be slightly scaled up to fit the available screen area. This leads to a certain amount of fuzziness.

Since LCD screens have discrete pixels etched into the screen (regular CRT monitors do not), your screen rez should ideally match the LCD panel's rez exactly. In-game it doesn't matter as much, since fine details like small-print text are relatively uncommon, so you could still use 1024 there, but on the Windows desktop I suggest bumping the rez so it matches the panel's. That's assuming this is your problem, mind you; some graphics cards simply produce a bad, fuzzy VGA output, and there's nothing to be done about it even if you do run the desktop at the correct rez.
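
To put rough numbers on the scaling issue, here's a quick Python sketch. The 1280x1024 panel size is just my generic example from above (not necessarily this particular TV); the point is that non-integer scale factors force the panel to interpolate:

```python
# Why a non-native resolution looks soft: the panel has to map each source
# pixel onto a fractional number of physical pixels and interpolate.
# NOTE: 1280x1024 is an assumed example panel, not this specific TV.

NATIVE_W, NATIVE_H = 1280, 1024

for src_w, src_h in [(640, 480), (800, 600), (1024, 768), (1280, 1024)]:
    sx, sy = NATIVE_W / src_w, NATIVE_H / src_h
    sharp = sx.is_integer() and sy.is_integer()
    print(f"{src_w}x{src_h}: scaled x{sx:.3f} / x{sy:.3f} -> "
          f"{'maps cleanly (sharp)' if sharp else 'interpolated (fuzzy)'}")
```

Only the exact native mode maps 1:1 here; everything else gets spread across fractional pixels, which is the fuzziness you're seeing.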

I will eventually get a card with a DVI out so I can connect the PC directly. Is there a huge difference between what I have done and how it would be with a straight DVI-to-DVI connection?
Typically the image difference should not be tremendously huge, unless you have a bad VGA output, a bad VGA cable, or just a really LONG VGA cable (long cables are fairly catastrophic for VGA image quality, particularly ones that have been lengthened with gender changers or such).

DVI is simply a more modern and generally better interface to use with digital flat panels; it removes the unnecessary digital-to-analog and analog-to-digital conversion steps that risk reducing quality, and cable lengths can be increased - within reason, though there are fully working DVI cables up to 15 meters in length.
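
For a feel of why dropping the conversions matters, here's a toy Python model of the two signal paths (the noise level is an arbitrary assumption; real cables and DACs vary a lot):

```python
import random

# Toy model of the VGA path described above: pixel value -> analog voltage
# (D/A in the card) -> cable noise -> sampled back to digital (A/D in the
# TV). DVI stays digital end to end, so values arrive bit-exact.
# NOTE: the noise figure below is an arbitrary assumption for illustration.

random.seed(1)

def vga_round_trip(pixel, noise_levels=1.5):
    volts = pixel / 255                           # D/A conversion
    volts += random.gauss(0, noise_levels / 255)  # cable noise / ringing
    return max(0, min(255, round(volts * 255)))   # A/D back in the TV

pixels = [0, 64, 128, 200, 255]
print("sent:   ", pixels)
print("via VGA:", [vga_round_trip(p) for p in pixels])
print("via DVI:", pixels)  # no conversions, arrives unchanged
```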
 
Okay... I have the same TV ;)

(Guden Oden: its native resolution is 1366x768, which you can actually get working with the right graphics card, PowerStrip, and way too much time on your hands.)

Leugeboy, 1024x768 is the correct resolution for that TV. The image isn't sharp for two main reasons:

A) You are using a VGA cable instead of DVI. The TV has to do an analog-to-digital conversion, because an LCD only displays a digital signal, while a VGA cable carries only an analog signal. This blurs the image quite a bit.

B) If you are using the "Full Screen" aspect ratio instead of "Normal", the image is stretched horizontally to fill the screen. This means the pixels aren't square anymore. If you want to play in widescreen, you have to use resolutions that are wider. If I recall right, the VGA connector supports a resolution of 1360x768, which is pretty close to the native resolution; for that you can use the "Full Screen" aspect ratio. For normal PC resolutions (640x480, 800x600, 1024x768, etc.) use the "Normal" aspect ratio. This ensures that images are displayed right.
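
Here's a quick Python sketch to put numbers on that stretch. 1366x768 is the panel grid mentioned in this thread; a pixel aspect ratio (PAR) of 1.0 means the pixels stay square:

```python
# Pixel aspect ratio (PAR) when a source mode is stretched to fill a
# 1366x768 panel. PAR 1.0 = square pixels; ~1.33 means the 4:3 modes get
# stretched a third wider than they should be.

PANEL_W, PANEL_H = 1366, 768

def pixel_aspect(src_w, src_h):
    """How much wider than tall each pixel becomes when stretched to fill."""
    return (PANEL_W / src_w) / (PANEL_H / src_h)

for w, h in [(640, 480), (800, 600), (1024, 768), (1360, 768)]:
    par = pixel_aspect(w, h)
    verdict = ("near-square, Full Screen is fine" if abs(par - 1) < 0.02
               else "stretched, use Normal instead")
    print(f"{w}x{h}: PAR {par:.3f} ({verdict})")
```

All the 4:3 PC modes come out around 1.33, which is exactly the horizontal stretch you see in Full Screen mode; 1360x768 is already wide, so it comes out essentially square.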

DVI will help you quite a bit; the sharpness is on a whole different level.

For a graphics card, I'd recommend nVidia-based cards, because they have been proven able to support the 1366x768 resolution (though it's seriously hard work to get it going with PowerStrip). For ATI it seems that 1366x768 is impossible, because ATI chips want resolutions to be a multiple of 8.
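
To make the multiple-of-8 point concrete, here's a tiny Python check (the restriction itself is just what's been reported for ATI chips of that era):

```python
# 1366 is not divisible by 8, so a chip that requires multiple-of-8 widths
# has to round to a neighbouring mode; that's also why 1360x768 shows up
# as the closest standard VGA mode.

native_w = 1366
below = native_w - (native_w % 8)  # nearest multiple of 8 below: 1360
above = below + 8                  # nearest above: 1368

print(f"{native_w} % 8 = {native_w % 8} -> not a usable width")
print(f"closest usable widths: {below} and {above}")
```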
 