Best gaming CRT out there

There is another important thing: with CRT monitors, the pixels are only "on" for a very short time during each refresh. Their luminance varies greatly during each frame, from much brighter than the average level right down to black. With LCD monitors, on the other hand, the pixels are always on, and their luminance stays pretty much constant throughout each frame.
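
A rough illustration of that difference: the sketch below models one frame of each display type. The refresh rate, phosphor decay constant and brightness levels are made-up illustrative numbers, not measurements of any real monitor.

```python
# Toy comparison of per-frame luminance: CRT (impulse-type) vs LCD (sample-and-hold).
# The decay constant and brightness values below are illustrative guesses only.
import math

REFRESH_HZ = 85                 # one frame lasts 1/85 s
FRAME_MS = 1000.0 / REFRESH_HZ  # ~11.8 ms
PHOSPHOR_DECAY_MS = 1.0         # assumed exponential decay constant for the phosphor

def crt_luminance(t_ms, peak=100.0):
    """Luminance t_ms after the beam hits the pixel: bright flash, rapid decay."""
    return peak * math.exp(-t_ms / PHOSPHOR_DECAY_MS)

def lcd_luminance(t_ms, level=30.0):
    """Sample-and-hold: the pixel sits at roughly the same level all frame."""
    return level

if __name__ == "__main__":
    for t in range(0, int(FRAME_MS) + 1, 2):
        print(f"t={t:2d} ms  CRT={crt_luminance(t):6.1f}  LCD={lcd_luminance(t):6.1f}")
```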
 
Typical? Maybe your typical, but I promise that across all CRT monitors in the last 15 years the average refresh rate was probably 60Hz, and that includes millions of gamers too. How do people bear it? I'm not sure myself, but I can count on one hand the number of times I've seen someone besides myself using a refresh rate over 75Hz, and maybe double that for anything besides 60Hz.

I hope I understand what you're saying correctly, but if 60hz was the average rr, then a lot of people you know were using interlaced "yucky" mode at 43hz or something like that.

The last monitor I used at 60hz was part of an original IBM PS/1. It had a glorious 12" VGA monitor that boasted a max resolution of 640x480 at 60hz. After that I had a 14", a 15" and I still have a 17", all from the cheapest price brackets available, and all of them could do 1024x768 at 75hz at least, with the last two working flicker-free at 85hz with no problems. I personally prefer to run the 17" at 1280x960@75hz for a bigger desktop, and it would be great for games with AA if I had a video card that could do it. :)

Believe me, 60hz would be useless for me. I used to live next to a power station, right under its cable towers; if you could see how the picture jumped around at anything under 85hz, you would understand. ;)
 
Eizo T966 :)

But I would love to see how an Eizo ColorGraphic CE210W would perform with the 14-bit tech (no banding).
 
Your brain interpolates frames anyway (well, actually, your retina splits each frame into 7 "information-distinct" frames, and vector motion is only one component, but I digress), so unless the redraw rate is severely low, you should not have much of a problem.
That's nonsense.
 
43Hz would be..... very much earlier than the last 10 years. Unless someone changes the monitor's default refresh rate it's going to be 60Hz, and I don't recall seeing a CRT in the last 10 years that did not default to 60Hz. 15 years? Maybe a few below that. Most people simply do not alter the monitor's settings and never go beyond changing the resolution, maybe for video settings. I've oftentimes seen people knock resolutions down from the default because they found the text too small.
 
My 19" trinitron does 85hz at 16x12, which is my preferred res.
I think it does 100hz at 12x10 too, if you like that.

Should be cheap to pick one up in good condition.

It weighs 4000 lbs though :p
 
To achieve 1280x1024 on one of my older monitors I had to use interlaced mode and a 43.5Hz refresh rate. Remember: we're talking interlaced here.

If you mean that the default RR was 60hz for most monitors I would agree [even though it's a responsibility of the OS], but you said it was the average, not the default. For 60hz to be the average while some of us run higher, someone would have to be running less, hence my 43hz reference.

Interlaced modes used to be a pretty common way to get to higher resolutions without buying a new monitor, but then again, they were really yucky. Here are some common video modes that were available when monitors weren't so powerful; note especially the ones marked as 'interlaced'.
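
For anyone curious how those refresh figures fall out of a mode's timing numbers, here's a minimal sketch. The interlaced 1024x768 timing used below is roughly the classic 8514-style mode; treat the exact figures as illustrative rather than authoritative.

```python
# Rough sketch of how a video mode's timing numbers turn into a refresh rate.
# The modeline figures below are approximate and for illustration only.

def refresh_rates(pixclock_mhz, htotal, vtotal, interlaced=False):
    """Return (frame_rate, field_rate) in Hz for one video mode."""
    frame_rate = pixclock_mhz * 1_000_000 / (htotal * vtotal)
    # An interlaced mode draws half the lines per pass, so the screen is
    # refreshed (with alternating half-frames) at twice the full-frame rate.
    field_rate = frame_rate * 2 if interlaced else frame_rate
    return frame_rate, field_rate

if __name__ == "__main__":
    # 1024x768 interlaced: ~44.9 MHz pixel clock, 1264 total pixels per line,
    # 817 total lines -> ~43.5 Hz full frames, ~87 Hz fields.
    frame, field = refresh_rates(44.9, 1264, 817, interlaced=True)
    print(f"1024x768i: {frame:.1f} Hz frames, {field:.1f} Hz fields")

    # 640x480 VGA: 25.175 MHz, 800 total pixels per line, 525 total lines -> ~60 Hz.
    frame, _ = refresh_rates(25.175, 800, 525)
    print(f"640x480: {frame:.1f} Hz")
```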
 
Average, as in the most used, as in people who buy Dells, HPs, etc. never change it.
 
I used 1024*768 at 85hz on a trinitron that came with a Gateway; the only reason I have a viewsonic vx922 now is because I thought I could get rid of the mains hum, but it's the same with the LCD if you get close enough. The trinitron could also do 1280*1024@75hz and 1600*1200@65hz.
 
Mine does those two latter resolutions as well, so your monitor probably supported what I am using too: 1024x768 at 100Hz, 1152x864 (or 1200x900 as a custom res) at 85Hz, 800x600 at 120Hz and 640x480 at 150Hz (which I now use for some stereoscopic gaming; will try to find some movies).
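
Those mode/refresh pairs all line up with a single horizontal scan limit, which is the main ceiling on a CRT. A small sanity-check sketch, assuming an ~82 kHz horizontal scan and ~5% extra lines for vertical blanking (both assumptions, not this monitor's actual spec):

```python
# Back-of-the-envelope check: a CRT's max refresh at each resolution is set
# mostly by its horizontal scan rate. The 82 kHz limit and the 5% blanking
# overhead below are assumptions for illustration only.

MAX_H_SCAN_KHZ = 82.0    # assumed horizontal scan limit of the monitor
BLANKING_FACTOR = 1.05   # assume total scan lines are ~5% more than visible lines

def max_vertical_refresh(visible_lines, h_scan_khz=MAX_H_SCAN_KHZ):
    """Highest full-frame refresh the horizontal scan budget allows."""
    total_lines = visible_lines * BLANKING_FACTOR
    return h_scan_khz * 1000 / total_lines

if __name__ == "__main__":
    for name, lines in [("640x480", 480), ("800x600", 600),
                        ("1024x768", 768), ("1280x1024", 1024),
                        ("1600x1200", 1200)]:
        print(f"{name}: up to ~{max_vertical_refresh(lines):.0f} Hz")
```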
 