CRTs: dot pitch vs resolution

Moloch

Is there a way to figure out how many pixels a crt is physically capable of producing depending on the dotpitch of it?
 
I think there are a large number of factors but, as a starting point, you could try reading Glassner's "Principles of Digital Image Synthesis" which has a good section on the behaviour of CRTs.

Certainly, you have to have a reasonable number of dots per pixel or you'll get aliasing in the reconstruction of the image!
 
How do you figure? My 21" Trinitron has a dot pitch of ~.24mm and a viewable area of ~430mm wide by ~323mm tall; that comes out to right around 1792x1344 dots, and running at that resolution doesn't cause aliasing issues. The display supports inputs up to 2048x1536 as well, and even though that is notably less than one dot per pixel it doesn't cause aliasing, it just looks a bit blurrier than lower resolutions.
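A quick back-of-the-envelope check of those figures (the ~430x323mm viewable area and ~0.24mm pitch are taken from the post above; note that on an aperture-grille tube like a Trinitron the pitch really only applies horizontally, a point that comes up later in the thread, so this is just a rough sketch of the arithmetic):

    # Rough dot-count estimate from viewable area and dot pitch.
    # Figures are from the post above; the exact viewable size varies per tube.
    viewable_w_mm = 430.0
    viewable_h_mm = 323.0
    dot_pitch_mm = 0.24

    cols = viewable_w_mm / dot_pitch_mm   # ~1792 columns
    rows = viewable_h_mm / dot_pitch_mm   # ~1346 rows (only meaningful for a dot-type mask)

    print(f"approx. {cols:.0f} x {rows:.0f}")
    print(f"dots per pixel at 2048x1536: {cols/2048:.2f} x {rows/1536:.2f}")  # ~0.88 each way
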
 

Ahh. I suspect that it's going through a low-pass filter first which would fix nasty problems with the aliasing in the X direction. Since it's a trinitron (Sony?) it very likely is using an "aperture grill" system and so there are no dots in the Y direction.
 
"Trinitron" is a trademark of Sony for their aperture grill CRTs, but I just used that example as I have one sitting in the other room to easy measure. I didn't intend for the design of the phosphor layer to cause confusion, so to aviod that; how about this this 19" shadow mask display I dug up as an example instead:

http://www.newegg.com/Product/Product.asp?Item=N82E16824002248

Being 19" total screen size and an aspect ratio of 4:3 making the tube ~360mm wide, and width at .25mm dot pitch giving ~1440 columns of dots. That comes out to just a bit over a dot for each pixel at the recommended resolution of 1280x1024. From what I have seen that is generally how CRT monitors' size/dot-pitch/resolution work out. I can't say I even know of an example of a tube that uses a "number of dots per pixel" to run at it's recommended resolution.
 
"Trinitron" is a trademark of Sony for their aperture grill CRTs, but I just used that example as I have one sitting in the other room to easy measure. I didn't intend for the design of the phosphor layer to cause confusion, so to aviod that; how about this this 19" shadow mask display I dug up as an example instead:

http://www.newegg.com/Product/Product.asp?Item=N82E16824002248

Being 19" total screen size and an aspect ratio of 4:3 making the tube ~360mm wide, and width at .25mm dot pitch giving ~1440 columns of dots. That comes out to just a bit over a dot for each pixel at the recommended resolution of 1280x1024. From what I have seen that is generally how CRT monitors' size/dot-pitch/resolution work out. I can't say I even know of an example of a tube that uses a "number of dots per pixel" to run at it's recommended resolution.
Busting out the trusty Windows calc and seeing how you got those numbers: does that mean that on a 17" total size monitor with a .27mm dot pitch the max res you should run is 1152x864, since it has about 1192 dots, or maybe push it a little to 1200x900?
I typically use 1200x900 on my ViewSonic E771 17", which has a .27mm diagonal dot pitch (is that the figure that should be used?), just because it's the highest res I can run at a 75Hz refresh rate.
Custom resolutions own :)
Oh, and why must monitor companies use 1280x1024 as a recommended res when it will distort the picture on a 4:3 monitor?
 
Radeonic, I'm just speaking from end-user experience and looking at the numbers, and don't have any qualifications here beyond that. I'd think 1280x960 would work well on your ViewSonic despite being a bit beyond the dot pitch; at least that resolution has always suited me fine on similar 17" displays. As for why a 5:4 resolution is so often recommended for 4:3 displays, best I can tell it's just because most people don't know any better, and 1280x960 would look notably less impressive than 1280x1024 in marketing.
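On the 5:4 vs 4:3 point, the distortion can be put in numbers: if a 1280x1024 mode fills a 4:3 tube face, the pixels cannot be square. A small sketch (the figures follow from the ratios alone):

    # Pixel aspect ratio when a 5:4 mode is shown full-screen on a 4:3 tube.
    screen_aspect = 4 / 3
    res_w, res_h = 1280, 1024

    pixel_aspect = screen_aspect / (res_w / res_h)
    print(f"1280x1024 pixel aspect: {pixel_aspect:.3f}")   # ~1.067, pixels ~7% wider than tall

    # By contrast, 1280x960 matches the 4:3 face exactly:
    print(f"1280x960 pixel aspect: {screen_aspect / (1280 / 960):.3f}")  # 1.000
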
 
Trinitrons and their stupid horizontal lines. I hate them. :mad:
 
Radeonic, I'm just speaking from end-user experience and looking at the numbers, and don't have any qualifications here beyond that. I'd think 1280x960 would work well on your ViewSonic despite being a bit beyond the dot pitch; at least that resolution has always suited me fine on similar 17" displays. As for why a 5:4 resolution is so often recommended for 4:3 displays, best I can tell it's just because most people don't know any better, and 1280x960 would look notably less impressive than 1280x1024 in marketing.
Well, I use 1200x900 because it's the highest res I can do 75Hz at; 1280x960/1024 = 66Hz.. flicker city.
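That trade-off is mostly down to the horizontal scan frequency limit: every extra line of vertical resolution costs one more horizontal sweep per frame. A rough sketch of the relationship; the ~71kHz figure is only inferred from the 75Hz/66Hz numbers in the post above, not a published spec, and the ~5% vertical blanking overhead is a typical guess:

    # Estimate the highest refresh rate at a given vertical resolution,
    # given a monitor's maximum horizontal scan frequency.
    def max_refresh_hz(h_scan_khz, active_lines, blanking_overhead=0.05):
        # Total lines per frame = active lines plus vertical blanking (assumed ~5%).
        total_lines = active_lines * (1 + blanking_overhead)
        return h_scan_khz * 1000 / total_lines

    # ~71kHz is inferred from "75Hz at 900 lines, 66Hz at 1024 lines" above.
    h_limit_khz = 71
    for lines in (768, 864, 900, 960, 1024):
        print(f"{lines} lines -> ~{max_refresh_hz(h_limit_khz, lines):.0f} Hz")
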
 
I'm a bit more lucky regarding frequency; my monitor (17" IBM G76) does 75Hz at 1280, 85Hz at 1152, 100Hz at 1024, etc.
Still, I figured that if you can do 75Hz at 1200x900 and 66Hz at 1280, then I'd be able to do 1200x900 at 85Hz (85Hz is my threshold).

I added it with the nvidia drivers. Nice resolution, thanks :)
Not a huge gap from 1152, but it's cool.. and it looks better than 1152. 1152 apparently has a weird, odd mapping with the dot pitch; 1024 and 1280 didn't have it, and 1200x900 doesn't have it either.

And it works in games, at 85Hz, not a terrible 60Hz! :)
(if you can set the custom resolution in the console like on the Quake 3 engine, or in a .ini/.cfg file, or the game might list all modes including the custom one)

1200x900 is not listed in the nvidia refresh rate override, but only the 1200x900 85Hz mode exists, so games use that I guess. (I added a 1200x900 16-bit 85Hz mode too, after trying Q3 at that resolution and getting a funky Windows desktop/Q3 mess.)

I'm also lucky to have such a good monitor with a "normal" tube (FST). No boring black lines.
 
My 19 inch .27mm dot pitch CRT looked best at 1280*1024.

At 1280*960 I got horrid moire problems.
I think it has a lot to do with the vertical dot pitch being different from the horizontal dot pitch.

Also, it was blurry above 1280*1024, so I played at that with better AA/AF instead of 1600*1200 with lower AA/AF.
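On the diagonal vs horizontal pitch question raised earlier: on a typical shadow-mask tube the phosphor triads are staggered, so the quoted diagonal pitch is larger than the horizontal spacing between same-colour dots. A rough conversion (the cos 30 degree factor assumes the usual staggered layout, and the 325mm viewable width for a 17" tube is only a guess):

    import math

    # Approximate horizontal pitch from the commonly quoted diagonal pitch,
    # assuming the usual staggered shadow-mask triad layout (hence cos 30 deg).
    def horizontal_pitch_mm(diagonal_pitch_mm):
        return diagonal_pitch_mm * math.cos(math.radians(30))  # ~0.866x

    diag_pitch = 0.27                       # figure from the ViewSonic E771 post above
    h_pitch = horizontal_pitch_mm(diag_pitch)
    print(f"~{h_pitch:.3f}mm horizontal")   # ~0.234mm

    viewable_w_mm = 325                     # rough guess for a 17" tube
    print(f"~{viewable_w_mm / h_pitch:.0f} dot columns")  # ~1390
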
 
Ahh. I suspect that it's going through a low-pass filter first which would fix nasty problems with the aliasing in the X direction. Since it's a trinitron (Sony?) it very likely is using an "aperture grill" system and so there are no dots in the Y direction.
Because the electron beam is never perfectly focused, the tube itself acts as a low-pass filter. So there should be no aliasing even when the intensity changes "mid-dot". No need for an additional low-pass filter.
 
I agree that the beam is likely to be a few dots in width (i.e. something like a Gaussian filter), but I suspect that the D-to-A conversion and other electronics will still be performing some additional low-pass filtering in the horizontal direction.
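To illustrate the low-pass behaviour being described, here is a toy 1-D sketch: convolving a full-contrast alternating pixel pattern with a Gaussian "beam spot" collapses the contrast, which is the kind of filtering a slightly defocused beam performs. The kernel width is arbitrary and purely illustrative:

    import math

    # Toy example: a Gaussian "beam spot" acting as a low-pass filter on a
    # 1-D alternating black/white pixel pattern. Widths are arbitrary.
    def gaussian_kernel(sigma, radius):
        k = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
        total = sum(k)
        return [v / total for v in k]

    def convolve(signal, kernel):
        r = len(kernel) // 2
        out = []
        for i in range(len(signal)):
            acc = 0.0
            for j, w in enumerate(kernel):
                idx = min(max(i + j - r, 0), len(signal) - 1)  # clamp at the edges
                acc += w * signal[idx]
            out.append(acc)
        return out

    pattern = [1.0, 0.0] * 8                # full-contrast alternating pixels
    blurred = convolve(pattern, gaussian_kernel(sigma=1.0, radius=3))
    print([round(v, 2) for v in blurred])   # contrast collapses toward ~0.5
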
 