LCD images delayed compared to CRT

I read about this phenomenon for the first time yesterday when I ordered a Dell 2407WFP. I plan to use it for gaming, so I hope it's not too noticeable.
 
They should run the same test on a set of HDTVs, and it would be much better to just have a counter on the screen; that way you can measure the exact latency.

I've measured some HDTVs at 9+ fields of latency, although most do better through the "Video Game" input.
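
For what it's worth, a counter like that is trivial to throw together. Here's a minimal sketch in Python/pygame (my own illustration, not anything from the article): run it with both displays in clone mode, photograph them side by side, and subtract the two readings.

```python
# Minimal on-screen millisecond counter for display-lag testing
# (an illustrative sketch, not from the article). Run with both
# displays in clone mode, photograph them side by side, and
# subtract the two readings to estimate the relative delay.
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 240))
font = pygame.font.Font(None, 160)  # default font, large size

running = True
while running:
    for event in pygame.event.get():
        # Quit on window close or any key press
        if event.type in (pygame.QUIT, pygame.KEYDOWN):
            running = False
    ms = pygame.time.get_ticks()  # milliseconds since pygame.init()
    screen.fill((0, 0, 0))
    label = font.render(f"{ms} ms", True, (255, 255, 255))
    screen.blit(label, (20, 40))
    pygame.display.flip()  # redraw as fast as the display allows

pygame.quit()
```

One caveat: the counter can only change once per refresh, so the resolution is limited to the refresh interval, and a camera shutter faster than the refresh helps when photographing both screens.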
 
Not to nitpick, but this could be more about clone mode (VGA vs. DVI also) than the monitors themselves... :)

To test that, use two CRTs instead and see if the same problem happens.
 
The only reason the image looks better on the Dell there is that the fool is testing it against a 19" el cheapo Mitsubishi screen (NOT one of their good models), with the colour temp clearly above 9000 (stupid, stupid, stupid) and the contrast clearly up by a margin.

Set the colour temp to 6500+ (7000+ max), turn the contrast down slightly, and the image would be pretty darn spectacular in comparison. Better yet, why not just use a bloody 22" Trinitron, do a factory reset, and set it to gaming mode in the OSD.

"Where does this delay come from? It’s hard to tell for now and we will have to investigate a little more. "

There's an image presentation delay on LCDs? No way! I've never heard that before...
 
Pretty pointless article, IMO. I thought such topics had been covered long ago.

As for the color temperature, I don't know about you, Sobek, but I personally find 9000+ to look better than 6500. 9000 looks much better, IMO.
 
Skrying said:
Pretty pointless article, IMO. I thought such topics had been covered long ago.

As for the color temperature, I don't know about you, Sobek, but I personally find 9000+ to look better than 6500. 9000 looks much better, IMO.
6500 is much more natural ;)
But I suppose not all prefer natural color.
http://en.wikipedia.org/wiki/Color_temperature
General computer-users should set their PC monitor color-temperature to "sRGB" or "6500K", as this is what digital cameras, web graphics, and DVDs etc are normally designed for. Indeed the sRGB standard stipulates (among other things) a 6500K display whitepoint.
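
As a rough illustration of why higher colour temperatures look bluer, you can compare a blackbody's spectral radiance at a blue and a red wavelength using Planck's law (a back-of-the-envelope sketch; a monitor whitepoint isn't a true blackbody spectrum):

```python
# Compare blackbody radiance at a blue vs. a red wavelength using
# Planck's law, to illustrate why higher colour temperatures look
# bluer. Rough sketch only: real monitor whitepoints are not true
# blackbody spectra.
import math

H = 6.626e-34   # Planck constant (J*s)
C = 3.0e8       # speed of light (m/s)
K = 1.381e-23   # Boltzmann constant (J/K)

def planck(wavelength_m, temp_k):
    """Spectral radiance of a blackbody at the given wavelength/temperature."""
    a = 2 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K * temp_k)) - 1
    return a / b

blue, red = 450e-9, 650e-9  # representative blue and red wavelengths
for t in (6500, 9300, 11000):
    ratio = planck(blue, t) / planck(red, t)
    print(f"{t} K: blue/red radiance ratio = {ratio:.2f}")
```

The blue-to-red ratio climbs as the temperature rises, which matches the common impression that 9300K looks blue next to 6500K.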
 
Skrying said:
Pretty pointless article, IMO. I thought such topics had been covered long ago.

As for the color temperature, I don't know about you, Sobek, but I personally find 9000+ to look better than 6500. 9000 looks much better, IMO.

I know where you're coming from...I've got a friend or two who swear that 6500+ looks too reddish and such, and that 9000+ is nice and 'neutral white'. I basically always stuck to the stock 9000+, and it was great, but one day when I was a lot younger I was fiddling with my old 17" Sony's OSD and found the Colour Temp control...tried the preset of 6500 and went "BLUERGH" (or something similar :p). But I eventually tried it again and found it to be an absolute godsend for many games like Vietcong 1, GRAW, and Battlefront 1/2. I do believe that some games just look naturally better on 9000+ or higher (such as Doom 3, Quake 4, and other decidedly dark games). I've always found 9000+ to look very blue...

I'm lucky enough to be able to select as high as 11000+ on my current Trinitron, and having played with 10000+ in many games like BF2 and Prey, I can safely say that I quite enjoy this setting. It's not blue like 9000+, it's not reddish like 6500+, it's just...white. Blacks are really black, and colours aren't as washed out as at 9000.

To each his own...but all I know is my monitor and a few others call 6500+ "Gaming Mode" :p

*edit*
radeonic2 said:
6500 is much more natural ;)
- Amen to that.
 
10000+ looks awesome; I've seen it a few times before, though not during games.

Frankly, the 9300 I have my monitor at now looks much more neutral than 6500. 6500 has a very red (more brown) color to it. I guess this is why I don't like the coloring on the majority of DVDs and TV shows. They just look off to me. I wonder if that also has a connection to me wanting to break every incandescent lightbulb I see.
 
Skrying said:
10000+ looks awesome; I've seen it a few times before, though not during games.

Frankly, the 9300 I have my monitor at now looks much more neutral than 6500. 6500 has a very red (more brown) color to it. I guess this is why I don't like the coloring on the majority of DVDs and TV shows. They just look off to me. I wonder if that also has a connection to me wanting to break every incandescent lightbulb I see.

Ahahaha, I know the feeling. My cabin always used to look so brown/orange, and I merely assumed it was because of my 60w bulbs. Switching to 100w's added plenty more light, but still that dull orange glow...so I switched to those ridiculously expensive power-savers (pure white) and my god, it was so...clinical :p I just loved the change. That would be one situation where neutral white really takes the cake over 'natural'.

As for 10000+, standard desktop use doesn't really strike me as any better than 6500/9200, but blacks really are quite black (they're a tad brown at 6500, but nothing too noticeable). In-game, though, it is pretty good...in BF2 for instance, where a corner of a building is normally cast in shadow and hard to see into, this setting gives it a very clear look...not invisi-black, not browned, but just...well...neutral :p I find myself liking this setting.

To me, things like this prove the versatility of CRTs over LCDs...you don't get this much control on an LCD, do you? I swear the only reason people seem to love LCDs is their size, and the praise they get seems more like mere justification for ludicrous amounts of spent money (most of the time)...

*edit* I installed a different set of drivers for my Trinitron, namely the G520-R drivers (I've never heard of a -R revision before, but it's there...), and my OSD radically changed, enabling me to set my colour temp as high as 15000. It definitely works too, but I do think it's a bit washed out at that setting; everything looks VERY white/blue and wishy-washy. I've also got some more controls in the Expert Colour Settings tab, and 3 extra voltage control options. Pretty cool.

What was this thread about again? :p
 
If you've been using 9300K and go to 6500K, things will look too red initially, but after a while it will start looking neutral. At that point, switching back to 9300K will look too blue. It's simply a matter of adjusting to the setting. Personally I prefer 9300K, though I can understand why 6500K would probably be a better choice considering the type of work I normally do.
 
With 6500 the colours are more vibrant.
I got used to 9300 because it feels more comfortable and calm.
 
The perception of color on computer screens depends on the ambient lighting conditions. While you can say a certain color temperature looks right for you if the monitor is the only source of light (which might be the way some people play games/watch movies), this can completely change as soon as you switch the light on or open the window blinds.
 