CRTs don't really have a "resolution" since they're analog.
SD CRTs do more or less have a resolution, actually.
-Each line of video is an analogue signal, but you still have a discrete number of scanlines along the vertical axis ("480" for NTSC, though I guess there are different ways of counting them).
-As for the horizontal axis: even though the signal along a line is analogue, the phosphor mask has discrete areas for discrete colours. Even on an aperture grille display, where you can get away with scanning a variable number of lines, there's still a discrete number of colour triads along each line.
(I suppose it would be theoretically possible to build a sideways aperture grille, with horizontal colour stripes each scanned continuously by its own electron gun during a line scan. But you'd still have a fixed resolution along the vertical axis, and I'm sure there'd be various practical issues with doing that, such as physical instability in the mask. Not like it matters anymore, though.)
PS2 games that run in field-rendered interlace (rendering only every other line each frame, to save framebuffer RAM while staying at 60fps) will probably look extremely crap on an HDTV. Conventional interlaced video generally pairs two fields of the same frame, but in this mode the PS2 puts out a stream of discrete, unique fields, each sampled at a different moment in time. A CRT doesn't care, it just displays whatever fields arrive; but an HDTV or other fixed-pixel display will try to weave two fields together into a frame (as is the norm with analogue SDTV), and when the two fields don't match up the result will look very odd...
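To make the mismatch concrete, here's a toy Python sketch of what a "weave" deinterlacer does. The field data is entirely made up (a little block of '#' characters moving one column per field, the way a 60fps field-rendered game would sample motion); it's just to show why interleaving two temporally distinct fields produces combing.

```python
# Toy sketch of "weave" deinterlacing: the display interleaves the
# scanlines of an even field and an odd field into one frame.
# Hypothetical content: a '#' block that moves right one column per field.

WIDTH = 8

def make_field(block_x, rows):
    """Render `rows` scanlines of a field with a block at column block_x."""
    return ["".join("#" if x == block_x else "." for x in range(WIDTH))
            for _ in range(rows)]

def weave(even_field, odd_field):
    """Interleave two fields: even lines 0,2,4..., odd lines 1,3,5..."""
    frame = []
    for e, o in zip(even_field, odd_field):
        frame.append(e)
        frame.append(o)
    return frame

# Two successive fields sampled 1/60 s apart: the block has moved between them.
even = make_field(3, rows=4)  # sampled at time t
odd  = make_field(4, rows=4)  # sampled at time t + 1/60 s

for line in weave(even, odd):
    print(line)
# Alternating lines disagree about where the block is -- that zigzag
# edge is the "combing" artifact a fixed-pixel display shows.
```

If both fields had come from the same rendered frame (the conventional case), `even` and `odd` would agree on the block's position and the woven frame would look clean, which is exactly why the deinterlacer's assumption breaks down on field-rendered output.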
CRTs themselves aren't immune to the oddities of interlacing, even if they handle it better than fixed-pixel displays (presumably thanks to their field timing?). It feels like a greater proportion of big-budget games targeted 60fps in the sixth gen, and if that's true, I wouldn't be surprised: with only ~17ms between successive fields, combing is much less visible than with ~33ms between every other field. It's an issue that disappears entirely once you're working with progressive scan video output.