Can 20fps be smoothly produced on interlaced displays *spawn

Sure you will have combing when a new field is drawn, but that is regularly spaced, coinciding with change in the image (if any).

On a CRT, there will be zero combing, ZERO. The phosphors decay in much less time than it takes to draw a field (16 ms).

Cheers
 
On a CRT, there will be zero combing, ZERO. The phosphors decay in much less time than it takes to draw a field (16 ms).

Cheers
Persistence of vision and all that. Plus the decay of phosphors is a whole field (no pun) in and of itself.
 
Sorry squeak, but you seem to have proved to be grossly incorrect and ignorant about the topics at hand here, refusing to let go of your opinions when multiple different people who are agreeing with each other are trying to tell you you are wrong. I don't know how this has any future.
 
Sorry squeak, but you seem to have proved to be grossly incorrect and ignorant about the topics at hand here, refusing to let go of your opinions when multiple different people who are agreeing with each other are trying to tell you you are wrong. I don't know how this has any future.

I don’t see you guys agreeing at all.
I see you thinking you are, or wanting to think so.
If you try to actually make logical sense of what you write, there are holes.

It’s really very simple: 20 fps in NTSC will look choppy because the first in a set of odd and even fields will show a static continuous image.
The next odd field will be different (assuming change in the scene of course) and so will the next even field.
Then starting over.
 
On old CRT displays, the signal could tell the monitor to keep reusing the same lines every frame, effectively making the scanning progressive. That's what happened with most games from the N64 generation and before it.
So even though you are wrong in how you understand the way interlaced images are perceived by our visual system, we should at least agree it's irrelevant to how it affects N64 games such as OoT.
Can you acknowledge that?

Then to interlacing: what people are trying to tell you is that, even if you have two interlaced fields that match the same image, there will still be a mismatch between the previous one and the following one whenever a new frame is introduced. You are assuming our eyes selectively choose to blend only the fields from frame A with those of B, and the ones from C with D, without blending B with C. Which is ridiculously shortsighted. About as shortsighted as not seeing how 20fps is possible on 60Hz displays without uneven frame lengths and without requiring any frame-skipping, which you initially said would be the case and are now trying to say you didn't.

The kind of combing artifact you describe sure happens when watching interlaced data on progressive LCD monitors without a proper way to remove the interlacing in place. It is not what happened on actual CRTs.
 
What?
You mean on HD CRTs?
OoT is from before any meaningful penetration of progressive-scan-capable TV screens in the home.
You can mess with the signals to force SD CRTs to display all-even or all-odd fields.

This is what causes the "scanlines" effect in many old games; they're true progressive 240p60, and "half of the screen's lines are missing."

Fun fact about scanlines: if you very slowly tilt the camera in a 480i 30fps game while eye-tracking an object on screen, so that your vertical eye-tracking rate is synchronized to the offsetting of drawn lines from one field to the next, you can see scanlines. It's surprisingly easy to do in first-person shooters.
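
If it helps to picture it, here's a quick throwaway sketch (plain Python, a made-up line count, nothing tied to any real console's video hardware) of the difference: in 480i the two fields land on alternating sets of scanlines, while the 240p trick keeps lighting the same set every field, which is why the skipped lines show up as visible "scanlines".

Code:
# Rough illustration (not real video timing): which scanlines each field
# lights up in normal 480i vs. the "240p" trick where the console never
# offsets the second field.

LINES = 10  # pretend screen with 10 scanlines, just to keep the output short

def field_lines(field_index, progressive_trick):
    # 480i: successive fields hit alternating line sets.
    # 240p trick: every field hits the same set, so half the lines stay dark.
    offset = 0 if progressive_trick else field_index % 2
    return [line for line in range(LINES) if line % 2 == offset]

for mode, trick in (("480i", False), ("240p trick", True)):
    print(mode)
    for f in range(4):
        print("  field", f, "->", field_lines(f, trick))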

It is in the sense that it gives undue weight to some frames, while others are feathered out with interlacing.
No it doesn't. All frames on an interlaced CRT always comb on their entry and on their exit. If all frames have an equal number of even and odd fields, then it always happens on an even->odd transition or always happens on an odd->even transition. If frames alternate between having one extra even field or one extra odd field, then it alternates between happening on even->odd transitions and odd->even transitions.

The fact that the combing alternates polarity doesn't really matter. If you run a 60fps game with the camera forced to alternate rapidly between two positions every frame (i.e. a lot of camera glitches in games), the combing still pretty much just looks like combing.
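
A quick sketch of that parity point, if anyone wants to check it themselves (it assumes a perfectly locked 60 fields per second; parity 0 here just means an even field, 1 an odd field):

Code:
# Sketch of the parity argument: list every point where a new game frame
# enters the field sequence, and note whether that boundary is an
# even->odd or odd->even field transition. Assumes a locked 60 fields/s.

def frame_change_transitions(game_fps, n_fields=24):
    fields_per_frame = 60 // game_fps          # 2 at 30 fps, 3 at 20 fps
    transitions = []
    for f in range(1, n_fields):
        if f // fields_per_frame != (f - 1) // fields_per_frame:
            # field f starts a new game frame; what parity change is that?
            transitions.append("even->odd" if (f - 1) % 2 == 0 else "odd->even")
    return transitions

print("30 fps:", frame_change_transitions(30))   # always the same parity
print("20 fps:", frame_change_transitions(20))   # parity alternates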

With 30 fps you will never have interlacing artifacts if you are going with odd and even fields.
If the speed is 15 or 30 there will be no interlace combing patterns.
I still regularly play games on SD CRTs. 30fps games have very obvious combing artifacts.

The next odd field will be different (assuming change in the scene of course) and so will the next even field.
No it won't. After an odd field is different, the next even field will match that odd field to produce what you're calling a "static continuous image."

On a CRT, there will be zero combing, ZERO. The phosphors decay in much less time than it takes to draw a field (16 ms).
Combing isn't due to phosphor decay rate, it's due to persistence in the human visual system.

The kind of combing artifact you describe sure happens when watching interlaced data on progressive LCD monitors without a proper way to remove the interlacing in place. It is not what happened on actual CRTs.
Combing looks much worse when fields are combined trivially on an LCD than when displayed natively on a CRT, but it's still visible in the latter case.
 
Sure you will have combing when a new field is drawn, but that is regularly spaced, coinciding with change in the image (if any).
Not like with the 20 fps example, where there are two fields with no movement between them and then two with.

20 fps is no different than 15 or 30 fps in this regard. All have a certain number of fields drawn from a single frame. At 60 fps you have 1 field per frame. At 30 fps you have 2. At 20 you have 3. At 15 you have 4.

None of these are magic numbers. When the image the field is generated from changes, the human eye (if sensitive, if close enough) sees a "combing" like effect.

Pacing of frame changes, as represented by fields, is as constant and exact at 20 fps as at 30 or 15 fps.

The effect when the frame the fields are generated from changes is the same. Having an even number of fields pulled from each frame does not negate this. If it did, 60 fps would not work.
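
For anyone who wants to sanity-check the pacing claim, here's a tiny sketch (assuming a steady 60 fields/s and a perfectly locked game frame rate) that just measures the gap, in fields, between successive frame changes:

Code:
# Tiny sanity check of the pacing claim: measure the gap, in fields, between
# successive frame changes. Assumes a steady 60 fields/s and a locked rate.

for game_fps in (60, 30, 20, 15):
    fields_per_frame = 60 // game_fps
    changes = [f for f in range(1, 25)
               if f // fields_per_frame != (f - 1) // fields_per_frame]
    gaps = {b - a for a, b in zip(changes, changes[1:])}
    print(game_fps, "fps: a new frame enters every", gaps, "field(s)")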

At this point I'm going to have to drop out of the conversation because I don't understand what you're talking about. I'd have to see it in action for myself. All I know is I played plenty of games on CRTs and none of them had anything particularly more or less jarring. 50/60 fps was smoother in motion. Some games had inconsistent framerates. I see nothing about evenly spaced frame changes as being visually problematic, and apparently neither can anyone else in this discussion (;)), which isn't doing anything at all to further your original enquiry!

Don't worry, you've not missed anything. 20 fps is no better or worse with regards to this than any other frame rate, except with regards to changes in camera between frames being more jarring (in that way 15 fps is worse and with no benefits).

For people used to LCD, Plasma with its super sharp frame transitions used to cause *some* to see a "double image" that they found jarring. This was because their visual system could still perceive the previous frame as they were processing the next. It's the same effect causing "combing" on an interlaced display when two successive fields come from two successive frames. Your brain thinks you can 'see' information from two frames at the same time - on progressive scan this appears to be a double image, on an interlaced set it's like two interleaved, "combed" images.

The larger the set, the closer you are, the more you could see it on an interlaced CRT.

In practice, with the small crts of the time and the viewing distances, and kids growing up with these displays and learning to filter it out, lots of people never noticed.
 
With 15 fps you display one video game frame over four fields or two NTSC frames. No interlacing artifacts.

This is completely incorrect, and is at the root of your misunderstanding of this current issue (now we've moved on from "can an interlaced set display 20 fps").

Each field is interlaced with the previous field and the next, and the fields alternate. Because they swap each time, every frame change necessarily falls across an opposing (odd/even) field transition.

The eye (and image processing parts of the brain) don't and can't give a hoot about which NTSC frame a field is supposed to belong to.

When the image two subsequent fields are drawn from changes, you get the same effect regardless of the frame rate.
 
Combing looks much worse when fields are combined trivially on an LCD than when displayed natively on a CRT, but it's still visible in the latter case.

Yeah, combing is not exclusive to LCD, but what I meant is, this notion of "magic numbers" squeak was defending might have arisen from the way interlaced signals can be experienced on such displays. It's true that 15fps would look more solid than 20fps in that situation, but in that situation alone. It's not how it works on actual CRTs though.
 
At this point I'm going to have to drop out of the conversation because I don't understand what you're talking about. I'd have to see it in action for myself. All I know is I played plenty of games on CRTs and none of them had anything particularly more or less jarring. 50/60 fps was smoother in motion. Some games had inconsistent framerates. I see nothing about evenly spaced frame changes as being visually problematic, and apparently neither can anyone else in this discussion (;)), which isn't doing anything at all to further your original enquiry!

From what I understand Squeak believes that how much time an image spends on Even lines versus Odd lines of an interlaced display affects or enhances what he thinks of as the combing effect.

In that respect a 20 FPS game/movie/whatever would have each image alternating with odd/even/odd scan lines and even/odd/even scan lines in interlaced mode. That bothers him even if in actual practice you wouldn't be able to distinguish that.

Conceptually he likes the idea of each image containing equal parts odd and even lines, believing that it would produce a superior and more coherent sequence of images. Again, in actual practice it doesn't matter.

In practice, however, if you were going to actually notice it, you'd only ever notice it every 3rd field at 20 FPS and every 4th field at 15 FPS. And interestingly enough every single field at 60 FPS or every other field at 30 FPS.

In reality people just don't notice that on a CRT due to a variety of factors, not the least of which is phosphor decay of a previous frame completely fading out prior to a new frame being drawn on screen. The glow from a phosphor "bleeding" from one "line" to the next also helps mask the effect of image persistence in a person's mind. I.e. even with mental image persistence, phosphor glow makes it almost impossible for memory of a frame to definitively "assign" any "line" of that image as coming from an odd or even scan line field.
  • Fun experiment to try if you want to see this phosphor glow effect. Hook up a computer to a CRT TV (not monitor) via S-Video or better yet Composite video. Then set your desktop resolution to 480p. Text will be almost illegible as neighboring scan lines bleed into each other. Heck, each "pixel" of a PC image bleeds into the neighboring "pixel" of the same scanline. Even worse if you use RCA cables as the vast majority of people did with their CRT TVs.
    • CRT displays for PC use were manufactured with significantly tighter tolerances than CRT displays for TVs. Here you might have a chance to see that combing effect between frames of an interlaced video stream.
On an LCD if you were to do this, it would be very evident, as the lines from the previous frame would still be displayed on screen when the new lines for the new frame are displayed. Additionally, each line of the display is very distinct from neighboring lines. Hence, good LCD displays automatically composite interlaced frames when they detect an interlaced signal. It's part of the processing modules in most LCDs nowadays.

Simple encoders encoding an interlaced video stream however, will preserve the interlaced nature of a video stream and so you can still see what an interlaced video would look like on an LCD display. So, you can still find those wonderful early encodes of interlaced content full of interlaced glory on the internet. :D Unfortunately that also gives a completely distorted and very incorrect view/understanding of how interlaced content was perceived on interlaced CRT displays.

Regards,
SB
 
From what I understand Squeak believes that how much time an image spends on Even lines versus Odd lines of an interlaced display affects or enhances what he thinks of as the combing effect.

In that respect a 20 FPS game/movie/whatever would have each image alternating with odd/even/odd scan lines and even/odd/even scan lines in interlaced mode. That bothers him even if in actual practice you wouldn't be able to distinguish that.
Whereas I think he's saying that some of the fields will match the previous field for frame content, and then you'll get a new frame that interlaces with the previous frame, and it's the irregularity of the full-frame pacing that makes a problem. In this chart, each letter represents a game frame.

Code:
Frame   1  2  3  4  5  6  7  8  9 10 11 12 13
60Hz
Top     A  C  E  G  I  K  M  O  Q  S  U  W  Y
Bottom  B  D  F  H  J  L  N  P  R  T  V  X  Z
30Hz
Top     A  B  C  D  E  F  G  H  I  J  K  L  M
Bottom  A  B  C  D  E  F  G  H  I  J  K  L  M
20Hz
Top     A  A  B  C  C  D  E  E  F  G  G  H  I
Bottom  A  B  B  C  D  D  E  F  F  G  H  H  I
At 60 fps, every frame is paced at a new scan, so it all looks regular. At 30 fps, they are drawn in matching pairs. At 20 fps, two out of every three scanned frames are whole game frames, while every third one is a mix of two. Therein lies Squeak's problem (I think).

Which still doesn't impact vision because the human visual system doesn't work that way and all we see is either of...

Code:
ABCDEFGHIJKLMNOPQRSTUVWXYZ

AABBCCDDEEFFGGHHIIJJKKLLMMN

AAABBBCCCDDDEEEFFFGGGHHHI
There's a scanning dot that draws data from a frame. It'll draw the data from one frame until it swaps. If it swaps mid-frame, you get tearing, but tearing is only noticeable if the frames are different. If it swaps at the start of a new field, you don't perceive the change and just have continuous motion. The CRT scans three times for one frame, then three times for the next; doesn't make any difference to one's brain whether it starts those three scans with the first or second set of lines.
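
If anyone wants to play with the pattern, here's a throwaway generator for the same table (it assumes top field first and a perfectly regular 60 fields/s, which is obviously idealised):

Code:
# Throwaway generator for the table above: for each scanned (NTSC) frame,
# print which game frame supplies the top field and which the bottom field.
# Assumes top field first and a perfectly regular 60 fields/s.

import string

def chart(game_fps, ntsc_frames=13):
    fields_per_frame = 60 // game_fps
    letters = string.ascii_uppercase
    top = [letters[(2 * n) // fields_per_frame] for n in range(ntsc_frames)]
    bottom = [letters[(2 * n + 1) // fields_per_frame] for n in range(ntsc_frames)]
    print(f"{game_fps}Hz")
    print("Top    " + " ".join(top))
    print("Bottom " + " ".join(bottom))

for fps in (60, 30, 20):
    chart(fps)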

Dunno. All we can say so far is no-one is jumping in with support for Squeak's arguments. ;)
 
Fun experiment to try if you want to see this phosphor glow effect. Hook up a computer to a CRT TV (not monitor) via S-Video or better yet Composite video. Then set your desktop resolution to 480p. Text will be almost illegible as neighboring scan lines bleed into each other. Heck, each "pixel" of a PC image bleeds into the neighboring "pixel" of the same scanline. Even worse if you use RCA cables as the vast majority of people did with their CRT TVs.
The choice of cable actually has a fairly mild effect on the visibility of combing on an SD CRT. The signal type only affects horizontal clarity, not vertical, and even composite video can produce a horizontal resolution that's much finer than common frame-to-frame deltas.

By the way, the lack of focus in the electron beam that you're calling "phosphor glow" is to some degree intentional. It's just like many antialiasing resolve filters in modern games that include samples from neighboring pixels: trading a softer image for reduced reconstruction aliasing. Not great for fine text, but it can make for a perceptually cleaner image on the whole.
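
To make the filter analogy concrete, here's a minimal 1D sketch (made-up intensity numbers and a simple tent kernel, not anything a real TV actually computes) of how mixing a sample with its neighbours trades crispness for fewer hard edges:

Code:
# Loose analogy only: a 1-tap-each-side "tent" resolve, a bit like the beam
# spot blending a sample with its neighbours. The scanline values are made up.

scanline = [0, 0, 255, 0, 0, 255, 255, 255, 0, 0]   # harsh single-pixel detail

def tent_filter(samples, weights=(0.25, 0.5, 0.25)):
    padded = [samples[0]] + list(samples) + [samples[-1]]   # clamp at the edges
    return [round(weights[0] * padded[i] +
                  weights[1] * padded[i + 1] +
                  weights[2] * padded[i + 2])
            for i in range(len(samples))]

print(tent_filter(scanline))   # the same data, softened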
 
I did some research on 240p and while familiar with the concept, I found I never really understood it right (and neither, it seems, do some of you guys fully).
I know that the line change signal was encoded in the signal with TV broadcasts, but always assumed that the signal output by a console or home computer would have had to stick strictly within spec to have an acceptable image (for image brightness, timing and image consistency reasons).
I always assumed, and had it affirmed, that the talk of 240p was just a way of saying "two identical fields repeated".

Fascinating knowledge!

Ok, so now I see how OoT could run 20 fps.

But, we trailed off to generalities about interlacing and 20 fps.

I struggle to think of any interlaced games (that would have to be 480i, so PS2 and that generation), that might have run at 20fps or had reason to do so?

I still stick to my guns, that it would look less than smooth had anyone attempted it though.
I think I've gone as far as possible without turning to illustration or animation though. So that will have to wait.
 
I struggle to think of any interlaced games (that would have to be 480i, so PS2 and that generation), that might have run at 20fps or had reason to do so?
It wouldn't have to be sixth-gen or later, plenty of earlier consoles supported 480i modes.

I'm not sure if there were any games that explicitly targeted 480i20 and sustained it in a steady way. Certainly some games fluttered about there at times, i.e. Perfect Dark in high-resolution mode.
 
It wouldn't have to be sixth-gen or later, plenty of earlier consoles supported 480i modes.

I'm not sure if there were any games that explicitly targeted 480i20 and sustained it in a steady way. Certainly some games fluttered about there at times, i.e. Perfect Dark in high-resolution mode.
That's right, even Playstation had a game or two doing high res.
I'd be surprised if the Saturn, Jaguar and 3DO didn't have examples too.
Anyone know any examples of how framerate drops, or low steady frame rates, were handled in these few games?
 
That's right, even Playstation had a game or two doing high res.
I'd be surprised if the Saturn, Jaguar and 3DO didn't have examples too.
Anyone know any examples of how framerate drops, or low steady frame rates, were handled in these few games?
There should be quite a few on the Amiga and framerates are likely low, but they may be mostly sims and adventures. If you could find a 3D game in high-res mode, it'd be low framerate.
 
I struggle to think of any interlaced games (that would have to be 480i, so PS2 and that generation), that might have run at 20fps or had reason to do so?

OoT mostly ran at 20 fps, and Daytona USA on the Saturn (near launch title) was locked to 20 fps (and letterboxed) because Sega hadn't yet extracted enough performance from their fiddly and over complicated system. Panzer Dragoon was another 20 fps title (Zwei hit 30 while increasing detail on ... everything).

[Daytona and PD ran at 16.67 fps in PAL land, and yeah, you fucking noticed!]

There were actually lots of 32/64 bit games that could drop to 20 fps. Halo on Xbox could too, come to think of it, and that was a 480 title.

I still stick to my guns, that it would look less than smooth had anyone attempted it though.

Your guns are demonstrably misfiring. It's probably time to accept that you've carried notions with you for decades that are wrong. Kind of like me thinking that Star Wars was too cherished to be violated on multiple levels.

This is a hallelujah moment! .... :/
 
OoT mostly ran at 20 fps, and Daytona USA on the Saturn (near launch title) was locked to 20 fps (and letterboxed) because Sega hadn't yet extracted enough performance from their fiddly and over complicated system. Panzer Dragoon was another 20 fps title (Zwei hit 30 while increasing detail on ... everything).

[Daytona and PD ran at 16.67 fps in PAL land, and yeah, you fucking noticed!]

There were actually lots of 32/64 bit games that could drop to 20 fps. Halo on Xbox could too, come to think of it, and that was a 480 title.



Your guns are demonstrably misfiring. It's probably time to accept that you've carried notions with you for decades that are wrong. Kind of like me thinking that Star Wars was too cherished to be violated on multiple levels.

This is a hallelujah moment! .... :/
No I'm pretty damn sure. Until you guys show me an interlaced game that runs at 20 fps. And it runs smoothly.
 
No I'm pretty damn sure. Until you guys show me an interlaced game that runs at 20 fps. And it runs smoothly.

Your argument has gone from 20 fps is impossible for N64 (factually wrong), to 20 fps would lead to additional interlacing artefacts (factually wrong), to no-one has attempted 20 fps (factually wrong), to "show me 20 fps running smoothly on 480i".

For fucks sake.

No-one can show you 20 fps 480i running smoothly because 20 fps isn't smooth. Because it's 20 fucking fps. But it can have frame pacing as regular as any other update rate. And if frame pacing is regular, it will look a damn sight smoother than the 15 fps you quite outrageously claimed would be preferable.

How many more morphs is your argument going to go through before you just admit you made an outrageously false claim to begin with?
 
Your argument has gone from 20 fps is impossible for N64 (factually wrong), to 20 fps would lead to additional interlacing artefacts (factually wrong), to no-one has attempted 20 fps (factually wrong), to "show me 20 fps running smoothly on 480i".

For fucks sake.

No-one can show you 20 fps 480i running smoothly because 20 fps isn't smooth. Because it's 20 fucking fps. But it can have frame pacing as regular as any other update rate. And if frame pacing is regular, it will look a damn sight smoother than the 15 fps you quite outrageously claimed would be preferable.

How many more morphs is your argument going to go through before you just admit you made an outrageously false claim to begin with?
My claim was A) based on the erroneous assumption that interlacing took place, and B) never that it was impossible, but that it would look bad.
And FFS yourself!
Is this a dick measuring contest, or a place to learn new things out of genuine technical interest?
Of course I'm allowed to be wrong on certain points without the rest of my arguments falling apart. A truism, but apparently not here?
Don't respond to the thread if it's such a travesty and chore.

And by "smooth" I of course mean "appearing regular in frequency". But you seem to know that too in your reply?
Disney animation was 12 fps and looked smooth enough even on a large screen. But of course camera pans were not typically involved in those movies.
15 fps PAL OoT was very playable.
 