At this point I'm going to have to drop out of the conversation, because I don't understand what you're talking about; I'd have to see it in action for myself. All I know is that I played plenty of games on CRTs and none of them had anything particularly more or less jarring about them. 50/60 FPS was smoother in motion, and some games had inconsistent framerates, but I see nothing visually problematic about evenly spaced frame changes, and apparently neither can anyone else in this discussion, which isn't doing anything at all to further your original enquiry!
From what I understand, Squeak believes that how much time an image spends on the even lines versus the odd lines of an interlaced display affects, or enhances, what he thinks of as the combing effect.
In that respect, a 20 FPS game/movie/whatever would have each image alternating between odd/even/odd scan-line fields and even/odd/even scan-line fields in interlaced mode. That bothers him, even though in actual practice you wouldn't be able to distinguish it.
Conceptually he likes the idea of each image containing equal parts odd and even lines, believing that it would produce a superior and more coherent sequence of images. Again, in actual practice it doesn't matter.
In practice, however, if you were going to actually notice it, you'd only ever notice it on every 3rd field at 20 FPS and every 4th field at 15 FPS, since the image only changes once every 3 or 4 screen refreshes. And, interestingly enough, on every single field at 60 FPS, or every other field at 30 FPS.
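If it helps to make the field arithmetic concrete, here's a toy sketch (my own, nothing from the thread) of which field parities each game frame lands on, assuming a 60-field-per-second display, frame rates that divide 60 evenly, and that the odd field comes first:

```python
# Toy sketch: which interlaced fields (O = odd scan lines, E = even scan
# lines) each game frame lands on. Assumes a 60-field-per-second CRT,
# frame rates that divide 60 evenly, and odd-field-first order.

FIELD_RATE = 60  # NTSC-style fields per second

def frame_field_pattern(fps, num_frames=4):
    """For each of the first num_frames frames, list the parities of the
    fields it is displayed on. Fields alternate odd/even at the field rate."""
    parity = "OE"
    fields_per_frame = FIELD_RATE // fps
    patterns = []
    field_index = 0
    for _ in range(num_frames):
        pattern = "".join(parity[(field_index + i) % 2]
                          for i in range(fields_per_frame))
        patterns.append(pattern)
        field_index += fields_per_frame
    return patterns

for fps in (60, 30, 20, 15):
    print(f"{fps:2d} FPS: {frame_field_pattern(fps)}")
```

At 20 FPS each image covers three fields, so consecutive images flip between odd-heavy (OEO) and even-heavy (EOE) coverage, which is exactly the alternation described above.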
In reality people just don't notice that on a CRT, due to a variety of factors, not the least of which is phosphor decay: the previous frame has completely faded out before the new frame is drawn on screen. The glow from a phosphor "bleeding" from one line to the next also helps mask the effect of image persistence in a person's mind. That is, even with mental image persistence, phosphor glow makes it almost impossible for the memory of a frame to definitively assign any line of that image to an odd or even scan-line field.
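For a rough sense of why the previous field has faded, here's a back-of-the-envelope sketch; the 1 ms figure is an illustrative assumption rather than a measured value, and real phosphor decay isn't purely exponential:

```python
import math

# Rough illustration only: model phosphor persistence as a simple
# exponential decay, assuming a short-persistence TV phosphor falls to
# ~10% of peak brightness in about 1 ms (real curves vary by phosphor).
DECAY_TO_10_PERCENT_MS = 1.0
tau = DECAY_TO_10_PERCENT_MS / math.log(10)  # exponential time constant

for t_ms in (1.0, 8.3, 16.7):  # 1 ms, half a field period, one 60 Hz field
    remaining = math.exp(-t_ms / tau) * 100
    print(f"after {t_ms:4.1f} ms: {remaining:.1e}% of peak brightness left")
```

On this toy model, by the time the opposite field is drawn a full field period later, the previous field's glow is far below anything the eye could pick out.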
- Fun experiment to try if you want to see this phosphor glow effect: hook up a computer to a CRT TV (not a PC monitor) via S-Video, or better yet (for maximum bleed) composite video. Then set your desktop resolution to 480p. Text will be almost illegible as neighbouring scan lines bleed into each other; heck, each "pixel" of the PC image bleeds into the neighbouring "pixel" on the same scan line. It's even worse if you use plain RCA cables, as the vast majority of people did with their CRT TVs.
- CRT displays for PC use were manufactured to significantly tighter tolerances than CRT displays for TVs. On one of those you might actually have a chance of seeing that combing effect between frames of an interlaced video stream.
On an LCD, if you were to do this, the effect would be very evident: the lines from the previous field would still be displayed on screen when the lines of the new field are drawn, and each line of the display is very distinct from its neighbours. Hence good LCD displays automatically composite interlaced fields into full frames when they detect an interlaced signal; deinterlacing is part of the video-processing pipeline in most LCDs nowadays.
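As a sketch of the simplest form of that compositing, known as "weave" deinterlacing, here's a minimal illustration; the list-of-scan-lines data layout is made up for the example:

```python
# Minimal "weave" deinterlacing sketch: interleave two half-height fields
# into one full-height progressive frame. A field is just a list of scan
# lines here, purely for illustration.

def weave(even_field, odd_field):
    """even_field holds scan lines 0, 2, 4, ...; odd_field holds 1, 3, 5, ..."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

# Toy 4-line "image": two fields of two scan lines each.
even = ["line0", "line2"]
odd = ["line1", "line3"]
print(weave(even, odd))  # ['line0', 'line1', 'line2', 'line3']
```

The catch, and why real LCD processors do more than this, is that the two fields were captured at different instants, so weaving a moving image produces exactly the combing being discussed; motion-adaptive deinterlacers blend or interpolate instead.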
Simple encoders, however, will preserve the interlaced nature of a video stream when encoding it, so you can still see what interlaced video looks like on an LCD display. You can still find those wonderful early encodes of interlaced content, full of interlaced glory, on the internet.

Unfortunately, that also gives a completely distorted and incorrect impression of how interlaced content was actually perceived on an interlaced CRT display.
Regards,
SB