Progressive scan @ 30FPS?

MistaPi

This is not directly a console-related question, but film/games displayed interlaced on a TV give us the impression of 60FPS. But what about movies in progressive scan, which display 1 frame per refresh? How can it still be smooth at 30FPS? Does it "bump" the framerate to 60FPS?
 
MistaPi said:
This is not directly a console-related question, but film/games displayed interlaced on a TV give us the impression of 60FPS. But what about movies in progressive scan, which display 1 frame per refresh? How can it still be smooth at 30FPS? Does it "bump" the framerate to 60FPS?

Film is shot at 24 FPS. It can be shown on a 60Hz display using 3-2 pulldown: one frame is shown 3 times, the next twice, the next 3 times, etc. So, on a progressive scan display it would go 3-2-3-2-3-2 and so on.

For an interlaced display, it has to alternate between even and odd scanlines, so it goes something like (odd-even-odd)-(even-odd)-(even-odd-even)-(odd-even) and so on.

There is also 2-2 pulldown for video (which is shot at 30 fps), which shows each frame twice.
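
Just to make the cadence concrete, here is a rough sketch of how 3-2 pulldown lays 24fps film onto 60Hz interlaced fields (Python, purely illustrative; the function name and output format are made up for the example):

```python
# Illustrative sketch of 3-2 pulldown: mapping 24 fps film frames onto
# 60 Hz interlaced fields. All names here are made up for the example.

def three_two_pulldown(num_film_frames):
    """Return (film_frame, field) pairs for a 60 Hz interlaced display."""
    fields = []
    parity = 0  # 0 = odd field, 1 = even field
    for frame in range(num_film_frames):
        repeats = 3 if frame % 2 == 0 else 2   # alternate 3 fields, then 2
        for _ in range(repeats):
            fields.append((frame, "odd" if parity == 0 else "even"))
            parity ^= 1
    return fields

# 4 film frames -> 10 fields, i.e. 24 frames -> 60 fields per second.
for frame, field in three_two_pulldown(4):
    print(f"film frame {frame} -> {field} field")
```

Running it reproduces the (odd-even-odd)-(even-odd)-(even-odd-even)-(odd-even) cadence described above.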
 
There is another advantage of film: natural motion blur.

A single frame of film will capture motion, whereas a single frame in a video game will not. The eye/brain forgives the lack of updates when there is some perceived motion going on.

If you tried 3-2 pulldown (24fps source) on a video game, it would still look like a choppy mess, as motion would be terribly jumpy.
 
So displaying the same frame 2 or 3 times makes progressive scan smooth? Even if it is really only 30 individual frames?
 
MistaPi said:
So displaying the same frame 2 or 3 times makes progressive scan smooth? Even if it is really only 30 individual frames?

Like bobbler said, film (and video) have natural motion blur. Think of it like this. For one frame of film, the shutter was open for 1/24 of a second. So, that frame contains all the motion that went on during that 1/24 of a second. With 3D graphics rendering there is no motion blur. Each frame is actually just one instantaneous moment in time with no motion at all. That's why films look smooth as silk at 24 FPS, while a video game would look choppy.
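
A toy way to see the difference (Python; the speed, exposure, and sample count are just assumptions picked for the example): one "film" frame covers a whole range of positions during the exposure, while a rendered frame is a single instant.

```python
# Toy illustration: a film frame integrates motion over the exposure,
# a rendered game frame is one instantaneous position. All numbers are
# made up for the example.

EXPOSURE = 1.0 / 24.0   # assumed exposure time for one film frame
SPEED = 240.0           # object speed in pixels per second (assumed)

def film_frame_positions(t_start, samples=6):
    """Positions swept while the 'shutter' is open; film smears these
    together into one motion-blurred frame."""
    return [round(SPEED * (t_start + i * EXPOSURE / samples), 1)
            for i in range(samples)]

def game_frame_position(t_start):
    """A rendered frame: one instant in time, no motion at all."""
    return SPEED * t_start

print("film frame smears over:", film_frame_positions(0.0))
print("game frame shows only :", game_frame_position(0.0))
```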
 
Yes. Not to be confused with the motion blur you see in the current generation of games.
Sometimes they get it kinda right, but realistic motion blur is one hell of a beast to replicate in realtime. It makes offline renders even slower than they already are, and by a long shot too if you use the best quality 3D motion blur.
Much easier to just render at 60fps or more and leave it at that.
 
For one frame of film, the shutter was open for 1/24 of a second. So, that frame contains all the motion that went on during that 1/24 of a second.
It's 1/12th of a second for film. Each frame on a filmstrip is actually doubled up: the film reel runs at 24 fps, but the video itself is really a 12fps source. And there's apparently an actual measurement that 1/12th of a second is the human eye latency -- that is, a minimum of 12 fps is necessary for you to perceive something as motion -- which is not to say that your eyes don't collect information any faster.

The motion blur simulations in video games are more accurately "feedback" effects. The approach is that you take a faded-out version of the previously rendered frame and blend it in with the current one, and you keep accumulating this. It looks more trippy than motion-blurred, because true motion blur is not about the previous frames, but about what happens in between two instantaneous moments.

Essentially, an uncertainty principle applies. With discrete frames, you have information only about the position of something at some specific time. With a single motion-blurred frame, you have information only about the movement over a period of time. A movie is simply an effort to give you both pieces of information by giving you short-time motion blurs.
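
In rough pseudocode, the difference between the "feedback" trick and real shutter-style blur looks something like this (Python/NumPy sketch; the decay constant is an assumption, not from any real engine):

```python
import numpy as np

DECAY = 0.6  # assumed: how much of the previous accumulated image survives

def feedback_blur(frames):
    """The 'feedback' effect described above: blend a faded copy of the
    previous output into each new instantaneous render and keep accumulating.
    frames: iterable of HxW float arrays."""
    accum = None
    for frame in frames:
        if accum is None:
            accum = frame.astype(float)
        else:
            # the previous output fades, the current instant blends on top
            accum = DECAY * accum + (1.0 - DECAY) * frame
        yield accum

def shutter_blur(subframe_renders):
    """True motion blur for ONE frame: average several sub-frame renders
    taken between two instants -- what a camera shutter would have seen."""
    return np.mean(np.stack(subframe_renders), axis=0)
```

The first accumulates history (hence the trippy trails); the second only uses information from inside a single frame interval, which is what the shutter actually records.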
 
ShootMyMonkey said:
For one frame of film, the shutter was open for 1/24 of a second. So, that frame contains all the motion that went on during that 1/24 of a second.
It's 1/12th of a second for film. Each frame on a filmstrip is actually doubled up: the film reel runs at 24 fps, but the video itself is really a 12fps source. And there's apparently an actual measurement that 1/12th of a second is the human eye latency -- that is, a minimum of 12 fps is necessary for you to perceive something as motion -- which is not to say that your eyes don't collect information any faster.
I don't think that's correct. Film source is definitely 24 fps, and it's doubled or tripled up to 48 or 72 Hz to be shown on the big screen.
 
One shouldn't forget to mention that not everything runs silky smooth: objects that travel across the screen in a split second sometimes come across as "choppy", especially on a big display. It's barely noticeable for the most part on a small screen, though.
 
Phil said:
One shouldn't forget to mention that not everything runs silky smooth: objects that travel across the screen in a split second sometimes come across as "choppy", especially on a big display. It's barely noticeable for the most part on a small screen, though.

Definitely annoying.

I remember it most from when I was at the movies watching LOTR: when they do those long pans through landscapes, you could see it was jerky and choppy. Very, very annoying. Even motion blur couldn't hide that, and though it is only noticeable on those rare occasions (which are not very rare in the LOTR movies), when you see it, it really stands out.
 
I don't think that's correct. Film source is definitely 24 fps, and it's doubled or tripled up to 48 or 72 Hz to be shown on the big screen.
IMAX film is run that way -- IMAX uses larger film sizes and higher framerates, so it runs at 72, and if a film is shot with the intention of showing in IMAX or both types of theatres, it'll be filmed at 24 fps. Otherwise, budget rules apply, and if it proves too costly, it won't happen. The default is still 12 fps, and I'm sure it wouldn't be too hard to find the actual study that measured the mean human eye latency at around 83 ms. Animated films are still 12 fps for that reason (Disney's standard still applies).
 
ShootMyMonkey said:
I don't think that's correct. Film source is definitely 24 fps, and it's doubled or tripled up to 48 or 72 Hz to be shown on the big screen.
IMAX film is run that way -- IMAX uses larger film sizes and higher framerates, so it runs at 72, and if a film is shot with the intention of showing in IMAX or both types of theatres, it'll be filmed at 24 fps. Otherwise, budget rules apply, and if it proves too costly, it won't happen. The default is still 12 fps, and I'm sure it wouldn't be too hard to find the actual study that measured the mean human eye latency at around 83 ms. Animated films are still 12 fps for that reason (Disney's standard still applies).

I still don't think that's right. Film is 24 FPS and each of those frames is projected twice on the screen. Maybe animated films are 12 FPS, but I don't think that makes it "the default".
 
ShootMyMonkey said:
The default is still 12 fps
No, theatrical movie cameras shoot film at 24fps as standard. The theatrical movie projector shows 24 discrete frames per second but "blinks" each frame on and off rapidly two times (or maybe more in some cases) to trick the eyes into seeing more.

Animated films are still 12 fps for that reason (Disney's standard still applies).
No again, animated film is 12fps because that is cheaper to produce; only half the work is needed to produce the movie. Early Disney movies (shorts and features) from what you might call the "golden era" of cartoons most definitely ran at 24fps.
 
Yup, film is definitely 24fps. If it wasn't, you wouldn't need 3:2 pulldown for 60Hz playback; you could just show each frame 5 times. And film appears smooth (actually, if you really look I'll bet you could spot many places where it doesn't look smooth) not only because of motion blur, but because, when they make movies, they actually make an effort to avoid situations that will make the low framerate too obvious.

Also, the latency of the eye is irrelevant; human vision is continuous, so even with huge latency it's still possible to see the difference between 100fps and 200fps.

Finally, as already said, when film is played back each frame is displayed 2 or 3 times, for 48 or 72Hz. What's interesting is that while this reduces flickering, it also reduces image quality for movement. Example: if you have a white dot moving across a black screen at 60fps with a 60Hz refresh, it'll look like a white dot moving smoothly. But if you keep the fps at 60 and increase the refresh to 120Hz, then something interesting occurs. If you try to follow the dot with your eyes, you'll actually see two dots. That's because your eyes move smoothly while trying to track the dot, so the two times the dot is displayed in the same position will register at two different spots in the eye, causing a ghost dot to appear.
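
You can see the size of that ghost offset with some toy numbers (Python; the dot speed and rates are just assumptions chosen to keep the arithmetic simple):

```python
# Toy numbers for the double-image effect described above: a dot updated at
# 60fps but flashed twice per frame on a 120Hz display, while the eye tracks
# it smoothly. All values are assumptions for the example.

DOT_SPEED = 600.0   # dot speed on screen, pixels per second (assumed)
FRAME_RATE = 60.0   # content updates per second
REFRESH = 120.0     # display refreshes per second (each frame shown twice)

def dot_position(frame_index):
    return DOT_SPEED * frame_index / FRAME_RATE   # where the dot is drawn

def eye_position(t):
    return DOT_SPEED * t                          # smooth-pursuit eye tracking

for n in range(3):                                # first three content frames
    for flash in range(int(REFRESH / FRAME_RATE)):
        t = n / FRAME_RATE + flash / REFRESH
        offset = dot_position(n) - eye_position(t)  # flash relative to gaze
        print(f"frame {n}, flash {flash}: offset on the retina {offset:+.1f} px")
```

Every second flash lands 5 px behind where the eye is pointing, so the tracked dot splits into a pair: the ghost dot described above.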
 
Thowllly said:
Yup, film is definitely 24fps. If it wasn't, you wouldn't need 3:2 pulldown for 60Hz playback; you could just show each frame 5 times. And film appears smooth (actually, if you really look I'll bet you could spot many places where it doesn't look smooth) not only because of motion blur, but because, when they make movies, they actually make an effort to avoid situations that will make the low framerate too obvious.

COUGH*LordOfTheRings*COUGH.


I mean, at the cinema it was horrendous how the landscape pans jittered. Damn shame cause the movie is gorgeous.

60fps movies can't come soon enough.
 
ShootMyMonkey said:
I don't think that's correct. Film source is definitely 24 fps, and it's doubled or tripled up to 48 or 72 Hz to be shown on the big screen.
IMAX film is run that way -- IMAX uses larger film sizes and higher framerates, so it runs at 72, and if a film is shot with the intention of showing in IMAX or both types of theatres, it'll be filmed at 24 fps. Otherwise, budget rules apply, and if it proves too costly, it won't happen. The default is still 12 fps, and I'm sure it wouldn't be too hard to find the actual study that measured the mean human eye latency at around 83 ms. Animated films are still 12 fps for that reason (Disney's standard still applies).

I remember a test by the military where an image latency of a bit less than 5ms was still noticeable to trained soldiers.
 
Anyone trying to convince me I only need 12fps is gonna get a smack on the nose! If I 'relax into a game' on PC I can tolerate lower framerates (out of necessity) but they're very apparent. I want 60fps!

Quick, LB, let's form an international action group and petition for 60fps and get pop-stars to do free concerts in aid of 60fps. 8)
 