LCD - good enough for gaming?

DiGuru said:
And not only that, but the picture on an LCD isn't only shown on refreshes. It only changes when refreshed. It's there all the time. That's why you have no annoying flicker.

If you think a CRT at 100Hz flickers, then you would have to worry about flicker caused by the backlight on an LCD.
I don't think the issue is relevant in either case :)
 
Blazkowicz_ said:
If you think a CRT at 100Hz flickers, then you would have to worry about flicker caused by the backlight on an LCD.
I don't think the issue is relevant in either case :)
No, it's just that there is no visible reason to have a higher refresh rate on LCDs.
 
DiGuru said:
No, it's just that there is no visible reason to have a higher refresh rate on LCDs.
Yes there is: temporal aliasing. The lower the frame rate, the jerkier displayed movement is.
 
Bolloxoid said:
Yes there is: temporal aliasing. The lower the frame rate, the jerkier displayed movement is.
And you think at 60 fps there will be jerkiness?
 
Hubert said:
First: do I want a 25 ms 170/170 panel with good contrast and 8-bit color depth, or do I want a 4-8 ms panel with the usual 150/135 viewing angles and 6-bit color depth? Also there is the resolution question; I will use the monitor to work on, so a 1440x900 wide 19" LCD is tempting. But I also play games, and 1440x900 is just not a standard resolution.

unless you spend more time gaming than at your desktop, you definitely want the wider viewing angles and higher color fidelity.

a monitor rated for 170 degrees usually has a sweet spot (as in 'you see colors near the intended ones') within some angle that hardly exceeds 35-40 degrees, especially vertically. the rest of that advertised viewing angle is totally off the mark (as in 'yes, you can distinguish what's on screen, but that's as much as you get'). there simply does not exist lcd technology that produces correct color reproduction over a 170-degree viewing angle, and i doubt it ever will.

color fidelity: unless you never intend to watch video footage on your monitor, you'd want to get decent color fidelity. 6 bits just don't cut it for movies and other non-synthesized material. you'll be getting tons of false colors in the low luma. and that sucks.
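to get a feel for why 6 bits fall apart in the low luma, here's a tiny sketch (purely illustrative: it assumes an idealised gamma-2.2 panel and a 200 cd/m^2 white level, neither tied to any real monitor) comparing the step between the two darkest greys on a 6-bit vs an 8-bit panel:

# Illustrative only: step size near black for 6-bit vs 8-bit panels,
# assuming an idealised gamma-2.2 response and a 200 cd/m^2 white level.
WHITE_NITS = 200.0   # assumed white luminance
GAMMA = 2.2          # assumed display gamma

def luminance(level, bits):
    """Luminance of a digital level on an idealised gamma-2.2 panel."""
    return WHITE_NITS * (level / (2 ** bits - 1)) ** GAMMA

for bits in (6, 8):
    step = luminance(2, bits) - luminance(1, bits)
    print(f"{bits}-bit: {2 ** bits} levels per channel, "
          f"step between the two darkest greys = {step:.4f} cd/m^2")

the 6-bit step comes out roughly twenty times coarser, which is exactly where the false colors show up.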
 
Very true. If the age of CRTs had extended a bit longer to support a DVI connection (digital all the way to the last output stage), that would have revealed a final layer of image quality that we would probably have found to be quite comparable to LCD performance. Essentially, it would have removed the analog VGA video cable from the chain (plus some analog electronics inside the monitor).

Per my observations of in-store demo units, I am unconvinced that any of the high-speed LCD models really come close to eliminating ghosting to the degree that CRTs do. What it comes down to is that some models have less ghosting than others, but still noticeable ghosting nonetheless. If you find a 21 ms model somewhere, the ghosting is very pronounced. So if you are upgrading from one of those models, it is easy to fall into the perception that faster models are close to eliminating ghosting altogether. On an absolute level, though, they still ghost quite noticeably; it's just a few notches less than the slower models.
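To put a rough scale on the difference (an illustrative sketch only; the object speed and response times are assumed, and real ghosting depends heavily on the specific transition and any overdrive):

# Ballpark only: how far a moving object travels while a pixel is still
# transitioning, for a few advertised response times. Real ghosting varies
# a lot with the transition (black-white vs grey-grey) and overdrive.
object_speed = 960   # assumed speed in px/s (roughly a screen-width per 1.3 s)

for response_ms in (21, 12, 8, 4):
    trail_px = object_speed * response_ms / 1000
    print(f"{response_ms:>2} ms panel: trail on the order of {trail_px:.0f} px")

So a 4-8 ms panel smears a few pixels where a 21 ms one smears around twenty; less, but not gone.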
 
To be fair, isn't some of that effect the result of a visual aid that Windows provides when rendering the mouse pointer? I don't doubt that rendering frequency will have a bearing on "pointer trails", but it isn't exactly clear how much comes by design and how much comes as a consequence of refresh speed.
 
Bolloxoid said:
Plenty. Just wiggle your mouse a bit, and you can see the mouse cursor in multiple discrete positions.
Er, using that silly test you can see the same thing on a CRT.
Also, on CRTs there is such a thing as ghosting, well, kind of like ghosting.
Like I notice on my HP-branded 19" that a bright object moving around a dark area leaves a small but visible trail.
Like moving a mouse pointer over a dark background leaves a white trail.
 
radeonic2 said:
Er, using that silly test you can see the same thing on a CRT.
Also, on CRTs there is such a thing as ghosting, well, kind of like ghosting.
Like I notice on my HP-branded 19" that a bright object moving around a dark area leaves a small but visible trail.
Like moving a mouse pointer over a dark background leaves a white trail.
This is because of the way the eye works. The rods and cones do not respond immediately to changes, much like a phosphor.
 
ANova said:
This is because of the way the eye works. The rods and cones do not respond immediately to changes, much like a phosphor.
Well I can see trails, not motion blur.
 
Blazkowicz_ said:
if you think a CRT at 100Hz flickers.. then you would have to worry about flicker caused by the backlight on LCD.
I don't think the issue is relevant in both cases :)

believe it or not, it does

after about 3-4 years in front of a CRT for most of the day (5-6 hours at work + 3-4 at home on average), even 100Hz is not enough when tired. I even got twitching in my eye because of flicker :oops:, not to mention headaches. An LCD, however, helps tremendously, even if it is only for those 4 hours at home.
 
Druga Runda said:
believe it or not, it does

after about 3-4 years in front of a CRT for most of the day (5-6 hours at work + 3-4 at home on average), even 100Hz is not enough when tired. I even got twitching in my eye because of flicker :oops:, not to mention headaches. An LCD, however, helps tremendously, even if it is only for those 4 hours at home.
Can you see lights flicker :???:
 
radeonic2 said:
Er, using that silly test you can see the same thing on a CRT.
Of course you can. Temporal aliasing is exhibited on any device that draws discrete frames at a rate insufficient to represent fast movement smoothly. The point is that temporal aliasing is worse at 60 Hz than, say, 100 Hz. You are the one being silly here.
Like I notice on my HP-branded 19" that a bright object moving around a dark area leaves a small but visible trail.
Any bright object moving around in a dark place will cause a trail because of temporal summation in the visual system. Slow phosphor decay may be a factor too, but to a smaller degree.
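To illustrate the scale of the effect (numbers assumed for illustration only; the cursor speed is a guess at a quick wiggle, not a measurement):

# Illustrative only: per-frame jump of a fast-moving object at different
# refresh rates. Bigger jumps read as discrete positions (temporal
# aliasing) rather than a smooth sweep.
cursor_speed = 2000   # assumed speed of a quickly wiggled cursor, in px/s

for hz in (60, 85, 100):
    jump_px = cursor_speed / hz
    print(f"{hz:>3} Hz: about {jump_px:.0f} px between successive positions")

At 100 Hz the discrete positions are simply spaced more closely, so the movement looks less stroboscopic.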
 
Bolloxoid said:
How is a low refresh rate related to the ability to use vsync?
Because it means you are more likely to be able to enable vsync without halving your frame rate. If you have an 85Hz refresh rate but your card is only averaging 70 FPS, then you are going to spend most of the game at 42.5 FPS (half the refresh rate) if you enable vsync. However, with a refresh rate of 60Hz you'd get a constant 60 FPS.

IMO it's better to have a constant, steady frame rate without tearing than a massive average frame rate, with tearing, that is constantly jumping around.
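For what it's worth, here is a toy model of how classic double-buffered vsync snaps the frame rate to an integer fraction of the refresh rate (a sketch under the simplifying assumption of constant render times; triple buffering and variable frame times behave differently):

# Toy model: with double-buffered vsync and constant render time, a frame
# that misses one refresh waits for the next, so the rate snaps to
# refresh / n for some whole number n.
def vsync_fps(refresh_hz, render_fps):
    """Effective frame rate with vsync, assuming constant render time."""
    if render_fps >= refresh_hz:
        return refresh_hz
    intervals = -(-refresh_hz // render_fps)   # ceiling division: refreshes per frame
    return refresh_hz / intervals

print(vsync_fps(85, 70))   # -> 42.5 (each frame occupies two refreshes)
print(vsync_fps(60, 70))   # -> 60   (the card keeps up with the refresh)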
 
randycat99 said:
Very true. If the age of CRTs had extended a bit longer to support a DVI connection (digital all the way to the last output stage), that would have revealed a final layer of image quality that we would probably have found to be quite comparable to LCD performance. Essentially, it would have removed the analog VGA video cable from the chain (plus some analog electronics inside the monitor).
Why would it remove analogue electronics inside the monitor? Wouldn't you have to convert the digital signal to analogue in order to drive the CRT, and thus need the same components post D/A conversion as any analogue CRT? Are you saying a CRT electron gun can be driven digitally?
 
radeonic2 said:
Can you see lights flicker :???:

Dunno if this is how you describe flicker, but lower the refresh rate on a CRT to 60 Hz, or plug your PC into a TV @ 50 Hz, and even a "normal" person should be able to see it. I can see it straight away at those rates, no problem, but when I am tired I feel the effects of flicker even at high refresh rates, i.e. headaches and a twitching nerve in my eye... hard to believe, actually; the twitching really surprised and worried me at the same time, but less exposure to the CRT makes it go away (well, a good sign I really need to do something else). I assume this is only down to the far too much time I spend in front of monitors, and I guess I might be a wee bit sensitive :)...
 
Zod said:
Why would it remove analogue electronics inside the monitor? Wouldn't you have to convert the digital signal to analogue in order to drive the CRT, and thus need the same components post D/A conversion as any analogue CRT? Are you saying a CRT electron gun can be driven digitally?
It would remove some analogue electronics. Of course you need a DAC at the end, but the transmission losses can be virtually eliminated.
 
randycat99 said:
Very true. If the age of CRTs had extended a bit longer to support a DVI connection (digital all the way to the last output stage), that would have revealed a final layer of image quality that we would probably have found to be quite comparable to LCD performance. Essentially, it would have removed the analog VGA video cable from the chain (plus some analog electronics inside the monitor).

Per my observations of in-store demo units, I am unconvinced that any of the high-speed LCD models really come close to eliminating ghosting to the degree that CRTs do. What it comes down to is that some models have less ghosting than others, but still noticeable ghosting nonetheless. If you find a 21 ms model somewhere, the ghosting is very pronounced. So if you are upgrading from one of those models, it is easy to fall into the perception that faster models are close to eliminating ghosting altogether. On an absolute level, though, they still ghost quite noticeably; it's just a few notches less than the slower models.

Image quality is not just about ghosting. CRTs have their own set of issues like convergence, linearity, geometry, etc. In summary, the best available LCDs destroy the best available CRTs in almost all categories. As for CRTs with DVI connections, those have been around for a couple of years already, but they still don't solve the fundamental CRT issues I mention above.
 
Xmas said:
It would remove some analogue electronics. Of course you need a DAC at the end, but the transmission losses can be virtually eliminated.

And a major downside is that for a good CRT bigger than 17" you would need dual-link DVI.
 