120hz technology in future consoles?

Statix, there is no need to SHOUT. This is not a cheap tabloid.

Wrong, wrong, wrong, WRONG. If the cost of additional film were really the issue (which it's NOT. You people really believe that blockbuster movies with budgets of near HALF A BILLION DOLLARS are worried about the material cost of film?)

In the filming process, probably not. Duplicating it, distributing it worldwide, and keeping it compatible with theatres' existing projection equipment is another matter entirely. That equipment would have to be replaced. If you simply try to run a standard projector at something significantly higher than 24fps you will destroy the film. (Search for IMAX and the gentler "rolling loop" technology it uses instead).

... if the cost of film reel required to shoot a film were such an issue, then how do you explain all the movies that were shot DIGITALLY, with digital video cameras, still being 24hz? How do you explain that the big-budget Superman Returns, shot digitally with NO PHYSICAL FILM, is still only 24hz?
I can only guess that it is some combination of (a) compatibility with conversion to film projection systems, (b) availability of powerful enough equipment, since higher rates do imply significantly higher bandwidth/processing power, or (c) storage costs of the end result, since 60Hz vs 24Hz is going to need ~2.5x the data <shrug>. Hopefully it's not (d) ignorance.

As well as MANY other big-budget movies that were shot with digital cameras. How also do you explain that the latest state-of-the-art digital camera, the RED ONE, designed to be the end-all, be-all of professional filmmaker cameras with a maximum resolution that dwarfs the HDTV standard of 1080p, was designed to shoot at 24hz, despite ALSO being capable of 60hz (60 fps)?
By "RED ONE" I guess you are referring to this.
A colleague has just said they are rather overhyped and there are better options but, FWIW, I see that it supports:
RED offers the Mysterium ™ Super 35mm cine sized (24.4×13.7mm) sensor, which provides 4K (up to 30 fps), 3K (up to 60 fps) and 2K (up to 120 fps) capture, and all this with wide dynamic range and color space in 12 bit native RAW

Does it include real-time lossless or virtually lossless compression? I doubt it, so let's assume it's going to store raw data. If we assume 3K@60Hz corresponds to approximately 3000x1700, and that they are down-filtering from the (likely to be Bayer) 4520x2540 pixel sensor to 36 bits/pixel RGB, then we're looking at 1.28 GB/s. That's going to need a very substantial drive array to cope with (a) the data rate and (b) the stored data for anything more than a few minutes of recording.
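A quick back-of-envelope check on that figure, using the same assumed numbers from the estimate above (3000x1700 resolution, 36 bits/pixel, 60 fps — all assumptions, not RED's published specs):

```python
# Back-of-envelope data rate for uncompressed capture.
# Assumed figures: 3000x1700 pixels, 36 bits/pixel RGB, 60 fps.
width, height = 3000, 1700
bits_per_pixel = 36          # 12 bits per channel, 3 channels
fps = 60

bytes_per_frame = width * height * bits_per_pixel // 8
bytes_per_second = bytes_per_frame * fps

print(f"{bytes_per_second / 2**30:.2f} GiB/s")  # -> 1.28 GiB/s
```

So the 1.28 GB/s figure checks out under those assumptions.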

(as are most other DV cameras)... YET some of the greatest, most esteemed filmmakers we have today (e.g., Peter Jackson, Steven Soderbergh, George Lucas) who have shot major, mainstream films in greater than 1080p using the state-of-the-art RED ONE digital camera have done so using the 24hz mode, when they could've easily used the 60hz mode if they so chose to. How do you explain that?
See above.

24hz or 24p is an ARTISTIC consideration and decision made by filmmakers. That's a FACT. Moviegoers and filmmakers alike are accustomed to the look and feel of the motion of a 24hz film;
Using that sort of logic one would still be driving around in cars with solid rubber tyres because "we enjoy the look and feel of the motion". (Apologies to other B3Ders for the blatant car analogy)

this is a long-held aesthetic preference that has led to 24 fps being the longstanding, unshakable standard for everything you see on TV and in theaters today.
It was also because it was a physical limitation of the technology. The temporal aliasing alone should be reason enough to move away from it. The IMAX people, OTOH, moved to >24Hz because they saw that it looks better.
I've found that Videogamers are SO oblivious and naive to this fact; they all think that "more is better" or "bigger numbers are better." This is just not the case for me.
Videogames need even greater numbers for two main reasons - (1) because the games are interactive and the response time needs to be higher and (2) the frames are (usually) sampled at one instant in time which leads to temporal aliasing. The higher the frame rate, the lower the temporal aliasing artefacts are.

Film also has temporal aliasing (due to the box filter and the fact that the exposure is only ~1/2 the time period of the frame) but don't take my word for it. Go and read "Computer Graphics. Principles and Practice. Second edition" page 1079.
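The classic illustration of temporal aliasing is the wagon-wheel effect: any motion frequency above half the sampling rate folds back as a lower (possibly negative, i.e. backwards-appearing) frequency. A minimal sketch of that fold-back arithmetic:

```python
def apparent_frequency(true_hz, sample_hz):
    """Frequency an observer perceives after sampling at sample_hz.
    Standard aliasing fold-back into the range [-sample_hz/2, +sample_hz/2)."""
    return ((true_hz + sample_hz / 2) % sample_hz) - sample_hz / 2

# A wheel's spoke pattern repeating 22 times per second:
print(apparent_frequency(22, 24))  # -> -2.0  (appears to spin slowly backwards at 24 fps)
print(apparent_frequency(22, 60))  # -> 22.0  (rendered faithfully at 60 fps)
```

The higher the sample rate, the higher the motion frequency that can be captured without this artefact.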

Let me tell you why: To support the occasional video material that IS 60hz or 50hz, such as some documentary and news programs, and sports programming. EVERYTHING else you see is either 30fps or 24fps. That's nearly EVERY primetime television program (except SNL) and EVERY movie/Blu-Ray/DVD. Period.
Sorry, you are 100% incorrect. DVDs often contain 50 (PAL) or 60 (NTSC) fields-per-second material.

I personally prefer the look and feel of 24p to 30p
That is your choice (some like grainy black and white for example) but some of us don't want to be limited by old technology. Now excuse me while I go off and listen to my 78s.
 
There's a huge investment in the industry infrastructure for 24p because that's the format that the industry, and audience, has chosen.

I don't think the audience has a say in it. As earlier posts have already said, 24p was chosen for the theater so they could reduce the number of times they needed to change the film roll during viewing...

There's also a huge, established infrastructure supporting framerates such as 30p and 60p, but those are reserved for news programs, game shows, sports, and other live programming, for a reason.

If 24p were such a wonder, we wouldn't have 50/60 i/p. But then again, 50/60 was chosen because of AC frequency.

Bottom line is that if you suggest to any aspiring filmmaker or film student out there to use 60p instead of 24p, they're apt to laugh at you outright. There's a reason for all the rage that was the Panasonic DVX digital camera, one of the first prosumer-level cameras amateurs could reasonably afford that offered 24p (24 fps) as a special feature.

Because a lot of existing infrastructure is 24p: movie theaters, video equipment, etc. However, one reason to remain at 24p is not having to go through conversion.
 
Statix said:
Bottom line is that if you suggest to any aspiring filmmaker or film student out there to use 60p instead of 24p, they're apt to laugh at you outright.

So why are camera makers moving to support 60p more aggressively at higher resolution? Because the market - i.e. filmmakers - is demanding superior temporal resolution, and not just for their slow-motion shots.

With regard to 30p/60p being supported in infrastructure... well, not in cinemas, they're not.
 
In the filming process, probably not. Duplicating it, distributing it worldwide, and keeping it compatible with theatres' existing projection equipment is another matter entirely. That equipment would have to be replaced. If you simply try to run a standard projector at something significantly higher than 24fps you will destroy the film. (Search for IMAX and the gentler "rolling loop" technology it uses instead).
That's why, like I already said, there's the option to downconvert 60p movies to 24p for copying onto consumer-level theater film reel copies for distribution.

Yes, there's that option. But filmmakers still chose and choose to go with 24p as the native framerate for their projects.

Does it include real-time lossless or virtually lossless compression? I doubt it, so let's assume it's going to store raw data. If we assume 3K@60Hz corresponds to approximately 3000x1700, and that they are down-filtering from the (likely to be Bayer) 4520x2540 pixel sensor to 36 bits/pixel RGB, then we're looking at 1.28 GB/s. That's going to need a very substantial drive array to cope with (a) the data rate and (b) the stored data for anything more than a few minutes of recording.
And the equivalent shot at 24 fps will still be 0.512 GB/s, which is STILL a lot.

Peter Jackson, Steven Soderbergh, George Lucas, Bryan Singer (Superman), and others have all the funds they need. When you have the budget they have, and you're operating on a budgetary level that they are operating in, the difference of a factor of 2.5x for digital disk space isn't going to mean that much to them.

Using that sort of logic one would still be driving around in cars with solid rubber tyres because "we enjoy the look at feel of the motion". (Apologies to other B3Ders for the blatant car analogy)
What can I say? I'm not here to speak for others about the objective, technical superiority of 24p over higher framerates, although I did briefly state my personal preference for it earlier in this thread.

I feel good about my position because people like Steven Spielberg and Peter Jackson feel the same way as I do.

It's not always about "technical superiority of higher numbers." It's about a certain intangible something evoking certain emotions and responses in one's visual senses... things that can't be quantified into numbers or black-and-white technical superiority.

It was also because it was a physical limitation of the technology. The temporal aliasing alone should be reason enough to move away from it. The IMAX people, OTOH, moved to >24Hz because they saw that it looks better.
IMAX also does a lot of gimmicky stuff that is outside of the film industry standard, like 3D and huge, massive screens. They also show a lot of documentaries and nature footage and things of that sort. I don't know enough about IMAX to comment on what they do and the rationale they had for going with higher framerates for certain material (although they've shown things like Dark Knight, Polar Express, and The Matrix on IMAX screens, which are all 24p). And I also know that every Blu-Ray, DVD, and movie released in theaters today is 24p; if IMAX is the rare exception to this rule, then so be it.

Videogames need even greater numbers for two main reasons - (1) because the games are interactive and the response time needs to be higher and (2) the frames are (usually) sampled at one instant in time which leads to temporal aliasing. The higher the frame rate, the lower the temporal aliasing artefacts are.

Film also has temporal aliasing (due to the box filter and the fact that the exposure is only ~1/2 the time period of the frame) but don't take my word for it. Go and read "Computer Graphics. Principles and Practice. Second edition" page 1079.
I said earlier in this thread that I think 30 fps for games is a better idea than 24 fps, because games usually don't have the amount of natural motion blurring required to interpolate and smooth out the motion to a satisfactory, visually coherent degree (or "temporal aliasing" as you like to call it).

I'm sure film does have some "temporal aliasing" or jerkiness that some people might perceive as annoying or distracting. Most people don't though, and some even like it. Again, not saying whether that's right or wrong, but stating the fact that that's the mindset of most moviegoers and filmmakers out there.

Sorry, you are 100% incorrect. DVDs often contain 50 (PAL) or 60 (NTSC) fields-per-second material.
For the main program of a movie DVD?

That is your choice (some like grainy black and white for example) but some of us don't want to be limited by old technology.
For one thing, it's not really a matter of "old technology," as much as it's a matter of an established standard and global/universal preference for a set framerate. In the first place, more isn't necessarily better; running Half-Life at 300 fps isn't going to make it look better than Killzone 2 at 30 fps, or even 20 fps. I would even say that I'm personally quite happy to see Killzone 2 run at 30 fps rather than 60 fps. Remember that filmmakers and people in general aren't as tech-oriented or benchmark-oriented as gamers, and they don't go by the "more is better" dictum of doing things. They judge by their own personal, subjective tastes for how they want something to look; similar to how most art critics are not going to argue that a Picasso is a worse work of art than something detailed/realistic done by Bob Ross.
 
Once all movie theaters convert to digital projection, we'll probably start to see some movies shot at 30 or 60p.

Very few movies are shot with IMAX cameras, although they do look stunning. When they show a standard Hollywood movie on the screen it is bottlenecked at 24p because that is how they shot the movie originally. IMAX is changing its ways, ditching film and going with DLP technology.

Sony is trying to convince Hollywood its LCoS technology is the way forward and, since they own a movie studio, is trying to leverage that by releasing stuff at 4k. The storage requirements are 10x higher than 2k, which means more cost for theater owners. It looks like Sony is going to lose this battle.


DLP is on its way to become the next movie theater technology standard. It handles 3d well and is very reliable. I'm not aware of any 4k DLP system, but I guess at some point they'll be able to do it.
 
Why would filmmakers bother making movies @ 60Hz, facing all the hassle of prototype hardware right through to post production, limiting themselves to only a handful of production studios, which would likely delay the movie's release as the engineers are going to be learning how to use the tools, etc.?

Only to have the movie downconverted to 24Hz (no small feat in itself) so the public can watch it?

I remember reading that cinemas were complaining about the advent of digital projection as the costs would be too much for most of them to upgrade. How many digital theaters are there now? Another move to 60Hz would be no different: too expensive, especially for those theatres that have already made the move to digital.
 
Why would filmmakers bother making movies @ 60Hz, facing all the hassle of prototype hardware right through to post production, limiting themselves to only a handful of production studios, which would likely delay the movie's release as the engineers are going to be learning how to use the tools, etc.?
Statix: You're forgetting that there's plenty of 60hz programming out there, such as news and sports programs, with associated engineers that are fully versed in handling that sort of material.

Only to have the movie downconverted to 24Hz (no small feat in itself) so the public can watch it?

I remember reading that cinemas were complaining about the advent of digital projection as the costs would be too much for most of them to upgrade. How many digital theaters are there now? Another move to 60Hz would be no different: too expensive, especially for those theatres that have already made the move to digital.
The point I'm trying to make is that even if it were completely economical and exceedingly affordable to do 60 hz movies, movie studios and filmmakers would STILL choose 24p, for aesthetic reasons. Mark my words: We're not going to see higher than 24p framerates adopted for majority use in mainstream movies for a long, long time... Maybe not in our lifetimes.

People (filmmakers, filmgoers, film students) are all extremely accustomed to seeing 24p in movies. Anything higher framerate tends to look cheap to them, as someone already mentioned earlier in this thread. For marketing and ticket-sales reasons alone, studios as a whole (forget about the 24p-biased filmmakers individually) are too afraid to release any new big-budget movies in higher than 24p, because they're afraid it won't sell tickets. They're afraid of people complaining that a 30 fps movie looks cheap or shoddy or weird, and the resulting bad hype that would ensue.
 
That's why, like I already said, there's the option to downconvert 60p movies to 24p for copying onto consumer-level theater film reel copies for distribution.
Indeed and I wish they would do it. The temporal aliasing in 24fps film annoys the H out of me.
Yes, there's that option. But filmmakers still chose and choose to go with 24p as the native framerate for their projects.
Which really makes me wonder: do they actually watch their films?


And the equivalent shot at 24 fps will still be 0.512 GB/s, which is STILL a lot.
It's 2.5x less, which is not to be sniffed at.
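Running the same assumed capture parameters (3000x1700 at 36 bits/pixel, the back-of-envelope figures from the quoted estimate) at both frame rates makes the gap concrete:

```python
# Same assumed parameters as the quoted estimate: 3000x1700, 36 bits/pixel.
bytes_per_frame = 3000 * 1700 * 36 // 8   # 22,950,000 bytes per frame

rate_24 = bytes_per_frame * 24
rate_60 = bytes_per_frame * 60

print(f"24 fps: {rate_24 / 2**30:.3f} GiB/s")  # -> 0.513 GiB/s
print(f"60 fps: {rate_60 / 2**30:.3f} GiB/s")  # -> 1.282 GiB/s
print(f"ratio : {rate_60 / rate_24}")          # -> 2.5
```

Both figures are large, but 60 fps raw capture needs exactly 2.5x the sustained bandwidth and storage of 24 fps.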

Peter Jackson, Steven Soderbergh, George Lucas, Bryan Singer (Superman), and others have all the funds they need. When you have the budget they have, and you're operating on a budgetary level that they are operating in, the difference of a factor of 2.5x for digital disk space isn't going to mean that much to them.
Maybe, but the budget clearly isn't unbounded so would have to be taken from something else.

What can I say? I'm not here to speak for others about the objective, technical superiority of 24p over higher framerates, although I did briefly state my personal preference for it earlier in this thread.
On a "personal preference" I cannot argue, but on a technical one, 24fps is hardly superior. Can't you see how awful any camera motion looks in the cinema? I find the stuttering, even with a slow pan or dolly really distracting. Higher temporal sampling and playback rates would do wonders to remove this.

I feel good about my position because people like Steven Spielberg and Peter Jackson feel the same way as I do.
Citations please. :)

It's not always about "technical superiority of higher numbers." It's about a certain intangible something evoking certain emotions and responses in one's visual senses... things that can't be quantified into numbers or black-and-white technical superiority.
Again, that is personal preference. For example, some may like the look of the jagged lines in computer graphics that occur due to undersampling, resulting in aliasing. Personally, I don't, and prefer something that has been sampled at a sufficiently high rate. It's exactly the same in the temporal domain.

IMAX also does a lot of gimmicky stuff that is outside of the film industry standard, like 3D and huge, massive screens. They also show a lot of documentaries and nature footage and things of that sort. I don't know enough about IMAX to comment on what they do and the rationale they had for going with higher framerates for certain material
IIRC from a documentary, it was something like "to make it more realistic".
(although they've shown stuff like Dark Knight, Polar Express, and Matrix on IMAX screens, which are all 24p).
Well those weren't made exclusively for IMAX were they?

And I also know that every movie Blu-Ray, DVD, and movie released in theaters today are 24p;
If the source is 24fps then of course you can't just manufacture the missing frames (without either (a) getting it wrong and/or (b) it being hugely expensive).

I said earlier in this thread that I think 30 fps for games is a better idea than 24 fps, because games usually don't have the amount of natural motion blurring required to interpolate and smooth out the motion to a satisfactory, visually coherent degree (or "temporal aliasing" as you like to call it).
Ha "Like to call it". That is the correct term. FWIW a 24fps camera is (a) using a box filter (which is not very good) and (b) only has the shutter open for ~1/48th of a second. This leads to the unpleasant aliasing artefacts I've talked about. (FWIW In the cinema each frame is flashed twice leasing to a display at a rate of 48fps probably because it may help reduce the temporal effects. It may also help hide the time when the display must be blacked out as the film is advanced).

I'm sure film does have some "temporal aliasing" or jerkiness that some people might perceive as annoying or distracting. Most people don't though, and some even like it. Again, not saying whether that's right or wrong, but stating the fact that that's the mindset of most moviegoers and filmmakers out there.
Often people will just put up with something because they don't know that better options are available.

For the main program of a movie DVD?
If you look at what you wrote you did not mention movies but claimed that it's 24fps for "nearly EVERY primetime television program (except SNL) and EVERY movie/Blu-Ray/DVD. Period."
 
Citations please. :)

The only movie where the low frame rate is really exploited that I can remember is Saving Private Ryan.

The low frame rate and 1/10000s shutter speed makes it crisp and jerky, clearly a cinematographic decision.

In every other movie there is just tons and tons of motion blur, - and jerky pans.

*edit*: Another would be Gladiator.

The high amount of motion blur in movies in general makes it easier to fake scenes with CGI, but again, that's more exploiting technical limitations rather than artistic intent.

Cheers
 
People (filmmakers, filmgoers, film students) are all extremely accustomed to seeing 24p in movies. Anything higher framerate tends to look cheap to them, as someone already mentioned earlier in this thread. For marketing and ticket-sales reasons alone, studios as a whole (forget about the 24p-biased filmmakers individually) are too afraid to release any new big-budget movies in higher than 24p, because they're afraid it won't sell tickets. They're afraid of people complaining that a 30 fps movie looks cheap or shoddy or weird, and the resulting bad hype that would ensue.

Not so sure about that. I think it's something people will eventually get used to once 120hz motion interpolation becomes a standard feature. I thought it looked strange at first but now I prefer it over plain ol' 24p.
 
Then they could just simply downsample/downconvert the framerate to 24p to compensate for those lower-end theater projectors. See, problem solved.

Directors and filmmakers with a vision for their dream project aren't choosing 24p just because there's a low-end, lowest common denominator to account for. That would just be asinine. I mean, if that were the case, then why stop there? Why not shoot their films in 720x480p since that's the resolution that most people will ultimately be watching their DVDs in anyway?

24p is an artistic consideration, period. Don't argue with me on this.

http://en.wikipedia.org/wiki/24p

Have you ever met any? I would certainly agree that they are asses with little to no imagination, especially outside of the simple "let's tell a story" type.

24p is a FINANCIAL consideration. It aligns with where 95+% of their money is coming from. They need to make sure that the shots they are making will work with film projectors and the only reasonable way to do that until the theatre industry either dies or upgrades is to shoot at 24p.

As far as resolution goes, it's generally much easier to downsample resolution with blending than it is to downsample motion/sample rate with blending.
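To see why resampling the frame rate is the harder problem: 60/24 = 2.5, a non-integer ratio, so a naive linear-blend downconversion has to mix pairs of source frames for every other output frame. A hypothetical sketch of the weight calculation (real converters use motion-compensated interpolation, not simple blending):

```python
def source_blend(out_frame, src_fps=60, dst_fps=24):
    """Which source frames (and blend weights) land on a given output frame.
    Illustrative linear-blend resampling only."""
    pos = out_frame * src_fps / dst_fps      # fractional source position
    lo = int(pos)
    w_hi = pos - lo
    if w_hi == 0:
        return [(lo, 1.0)]                   # lands exactly on one source frame
    return [(lo, 1 - w_hi), (lo + 1, w_hi)]  # straddles two -> must blend

for k in range(4):
    print(k, source_blend(k))
# 0 [(0, 1.0)]
# 1 [(2, 0.5), (3, 0.5)]
# 2 [(5, 1.0)]
# 3 [(7, 0.5), (8, 0.5)]
```

Half the 24p output frames end up as 50/50 blends of two 60p frames, which is why a straight 60p-to-24p conversion tends to look smeared or juddery compared with shooting 24p natively.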
 
Peter Jackson, Steven Soderbergh, George Lucas, Bryan Singer (Superman), and others have all the funds they need. When you have the budget they have, and you're operating on a budgetary level that they are operating in, the difference of a factor of 2.5x for digital disk space isn't going to mean that much to them.

So they are perfectly willing to spend their own money for everything, and to upgrade all the theaters, all duplication machines, etc.? All those except Lucas do not make films with their own money, and the only one who has done anything technical (Lucas) had to basically PAY to have a minimal amount of digital projection systems installed.

What can I say? I'm not here to speak for others about the objective, technical superiority of 24p over higher framerates, although I did briefly state my personal preference for it earlier in this thread.

anything visually that can be done in 24p can be done in 48, 96, and 120 frame systems. The opposite is not true. 24p is simply inferior and only around because of historical inertia and cost factors.

I feel good about my position because people like Steven Spielberg and Peter Jackson feel the same way as I do.

they do? you haven't provided any evidence of this fact.

IMAX also does a lot of gimmicky stuff that is outside of the film industry standard, like 3D and huge, massive screens. They also show a lot of documentaries and nature footage and things of that sort. I don't know enough about IMAX to comment on what they do and the rationale they had for going with higher framerates for certain material (although they've shown things like Dark Knight, Polar Express, and The Matrix on IMAX screens, which are all 24p). And I also know that every Blu-Ray, DVD, and movie released in theaters today is 24p; if IMAX is the rare exception to this rule, then so be it.

Standard IMAX is 24Hz primarily because when it was developed they had a hard enough time as is getting 70mm frame stock to run at 3x the speed of normal projection.

Later IMAX HD was introduced which allowed 48 frames per second but little content is available due to cost and time limitations.
 
The point I'm trying to make is that even if it were completely economical and exceedingly affordable to do 60 hz movies, movie studios and filmmakers would STILL choose 24p, for aesthetic reasons. Mark my words: We're not going to see higher than 24p framerates adopted for majority use in mainstream movies for a long, long time... Maybe not in our lifetimes.

They might do it for cost reasons, but that's about it. Stop trying to make an economic decision into an artistic one. It's like people saying: sepia is all anyone will ever use; that color stuff is just for technophiles!

People (filmmakers, filmgoers, film students) are all extremely accustomed to seeing 24p in movies. Anything higher framerate tends to look cheap to them, as someone already mentioned earlier in this thread. For marketing and ticket-sales reasons alone, studios as a whole (forget about the 24p-biased filmmakers individually) are too afraid to release any new big-budget movies in higher than 24p, because they're afraid it won't sell tickets. They're afraid of people complaining that a 30 fps movie looks cheap or shoddy or weird, and the resulting bad hype that would ensue.

Most students are accustomed to 30 and 60 fields per second, as that is what the vast majority of all equipment actually uses. And no, higher frame rates don't look cheap to them or anyone else; it's just that historically a higher frame rate meant lower resolution, WHICH does look cheap.

If the economic factors are the same, everyone will go to a higher sampling frequency. It's quite simple, actually: the ONLY thing holding back higher sampling frequency is cost.

And FYI, 120Hz is the magic number for so many reasons for display/projection solutions.
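For what it's worth, one plausible reading of the "120Hz is the magic number" claim is that 120 is an integer multiple of the common content rates, so a 120Hz display can show each of them by plain frame repetition with no fractional pulldown (PAL's 25/50 fps being the notable exception):

```python
# Which common content frame rates divide evenly into a 120Hz display?
display_hz = 120
content_rates = [24, 25, 30, 50, 60]

for fps in content_rates:
    whole, rem = divmod(display_hz, fps)
    note = f"repeat each frame {whole}x" if rem == 0 else "NOT an integer multiple"
    print(f"{fps:>2} fps: {note}")
# 24 fps: repeat each frame 5x
# 25 fps: NOT an integer multiple
# 30 fps: repeat each frame 4x
# 50 fps: NOT an integer multiple
# 60 fps: repeat each frame 2x
```

Compare a 60Hz display, where 24 fps content needs uneven 3:2 pulldown, the source of the familiar judder.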
 
Directors don't have much say in the matter. It's the theater owners' call. Lucas tried to convince theaters to upgrade their projectors to digital for SW:EP1 and they didn't go for it. "Why pay money for better projectors when I could not pay money and still sell plenty of tickets?"

Knowing that almost all of the theaters will show the movie at 24fps no matter what they do, the directors wisely choose to continue the tradition of filming at 24fps because that way they can use specific techniques to reduce the crappiness of the format and occasionally take advantage of it (see Private Ryan).

The only thing that scares movie theater owners out of their collective comas is the huge gains in home theater quality. That's why we're seeing a small resurgence of 3D movies. It's something you can't get at home (yet), and it's been long enough for people to have forgotten the stigma 3D movies picked up the first time they got popular (lots of cheap gimmick flicks followed by a wave of 3D pornos).
 
I see that some of you are still trying to argue with me on this point. 24 fps is, for the vast majority of the time, a visual consideration, not an economic one. Just go ahead and ask any aspiring filmmaker, film student, any big-name/mainstream established director, and the higher-ups of all the various major movie studios. They'll all tell you the same thing: that 24 fps is currently far and away the preferred look (again, right or wrong, that's subjective) for most primetime television shows and theatrical-release films. They'll all likely say how ridiculous the suggestion is that big-name theatrical releases should switch over to or try experimenting with 30 fps or 60 fps.

If you're truly that unwilling to believe me, then by all means do not take my word for it. Here's what you should do: Take it up with all the cameraphiles/videophiles and amateur filmmakers at either the forums of DVXuser.net or REDuser.net. Go ahead, register at one of those sites, and post a thread asking about why movies don't use 60 fps instead of the traditional 24 fps. Then watch the ridicule and mockery ensue.

Go ahead, try it. I quadruple-dawg-dare you. :)

they might do it for cost reasons. but thats about it. stop trying to make an economic decision into an artistic one. Its like people going: sepia is all anyone will ever use, that color stuff is just for technophiles!
It's simply the truth. People are simply too used to the look and feel of 24 fps film for movies, and whether that's right or wrong (that's not what I'm debating here), the ultimate goal of just about any state-of-the-art digital camera intended for filmmaking is to try and replicate the classic look and feel of 24 fps film. That's just the FACT. People (audiences and filmmakers alike) have grown used to seeing 24 fps, and cannot stand anything higher than that.
most students are accustomed to 30 and 60 fields per second as that is what the vast majority of all equipment actually uses. And no, higher frame rates don't look cheap to them or anyone else, its just that historically higher frame rate meant lower resolution WHICH does look cheap.
When did I say that higher framerates necessarily look cheap on an absolute, objective basis? Of COURSE it's related to how people have been conditioned while growing up, watching various different programs, associating 24 fps with high-quality movies and higher framerates with live material and lower-quality soap operas or Jerry Springer. That's been my point all along: most people have become conditioned to like 24 fps. And that's why so many filmmakers, film students, and film studios do NOT want to change the film standard from 24 fps to 30 fps or higher. They just don't.

And yes, I realize that historically, poor film students have had to use lower-end prosumer cameras that were only capable of 30 fps. However, they were always searching and yearning for ways to achieve the 24 fps, and emulate the general look and feel of 24 fps film. The release of the Panasonic DVX100 camera changed everything, as it was one of the first prosumer-level digital-video cameras to offer the SPECIAL FEATURE of 24 fps. Go ahead and do some google searches on the DVX100, and you'll see what kind of a revolution and inspiration it was when it came out for so many aspiring filmmakers and film students.

If the economic factors are the same, everyone will go to a higher sampling frequency. It's quite simple, actually: the ONLY thing holding back higher sampling frequency is cost.

And fyi 120Hz is the magic number for so many reasons for display/projection solutions.
With all due respect, I'm sorry, but that's just a naive and incorrect statement, one that lacks awareness of the current mindset of many filmmakers, filmgoers, and film studios all around the world. 24 fps is the clear choice (choice, not restriction) for them. 60 fps has already been suggested as a new standard before, years ago, but it was handily and summarily passed over, ignored, and rejected by all the movie studios.

anything visually that can be done in 24p can be done in 48, 96, and 120 frame systems. The opposite is not true. 24p is simply inferior and only around because of historical inertia and cost factors.
You guys all have to start realizing that this is not about numbers, not simply about better specs, and not about the bigger/newer/higher-is-better philosophy that so many technophiles and gamers seem to hold.

This is NOT about what 24 fps technically can't do in relation to 60 fps. This is about 24 fps LOOKING DIFFERENT from 30 fps and 60 fps. It's actually as simple as that. Whether it's right or wrong, cinematographers love 24 fps and have gotten used to the look and feel of 24 fps. So much so, that even jumping up 6 frames to 30 fps looks detestable to them.

24p is a FINANCIAL consideration. It aligns with where 95+% of their money is coming from. They need to make sure that the shots they are making will work with film projectors and the only reasonable way to do that until the theatre industry either dies or upgrades is to shoot at 24p.

As far as resolution goes, it's generally much easier to downsample resolution with blending than it is to downsample the motion/sample rate with blending.
Come on, man. You're going to try and tell me that these movie studios, releasing movies with budgets of near half a billion dollars, can't spare the time and resources to convert a one-and-a-half-hour movie from 60 fps to 24 fps for distribution to movie theaters? It's that difficult, huh? Really? I have a program on my crappy PC that can convert framerates, and I've done it for entire movies before. You're saying I can do it, but all these huge, rich corporations can't?
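For reference, the "press a button" conversion being argued about here really is trivial to sketch: naive 60→24 fps downsampling just blends the source frames overlapping each output frame's display interval. A minimal, illustrative sketch (real studio pipelines use motion-compensated interpolation, which is where the cost and quality debate actually lives; `blend_downsample` is a hypothetical name and `frames` here are per-frame scalar values standing in for pixel data):

```python
import math

def blend_downsample(frames, src_fps=60, dst_fps=24):
    """Naive frame-rate conversion by weighted averaging: each output
    frame mixes the source frames that overlap its display interval.
    This is the crude 'one button' approach, not a motion-compensated
    conversion of the kind a studio would actually pay for."""
    ratio = src_fps / dst_fps          # 2.5 source frames per output frame
    out = []
    for i in range(int(len(frames) / ratio)):
        start, end = i * ratio, (i + 1) * ratio
        acc = total = 0.0
        j = math.floor(start)
        while j < end and j < len(frames):
            w = min(j + 1, end) - max(j, start)  # overlap with source frame j
            acc += w * frames[j]
            total += w
            j += 1
        out.append(acc / total)
    return out
```

The blending is what produces the smeared, double-image artifacts in cheap conversions, which is precisely the quality objection raised in the reply below the quote.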

[Spielberg and Peter Jackson agree with you about 24 fps?] you haven't provided any evidence of this fact.
How about when Peter Jackson was doing test runs for the brand-spanking-new, state-of-the-art RED ONE digital camera and used it to create a short little ~10 min. film project called "Crossing the Line" (i.e. NOT for theatrical or commercial distribution)? How come Mr. Peter Jackson very consciously chose and made the decision to film it all in 24 fps?

Note that the RED ONE is state-of-the-art, and is perfectly capable of shooting in framerates of 30 fps, 60 fps, even 120 fps.

Note also that Peter Jackson was not at all hamstrung or limited by any commercial-distribution or old/outdated theater-release considerations while making this little ~10 min. short film. It was just a personal pet project suggested to him by the makers of the RED ONE as an exhibition of their new camera's image quality. He could have EASILY shot it all at 30 fps or 60 fps or even 120 fps if he so chose.

Care to explain that one to me, guys?
 
I see that some of you are still trying to argue with me on this point. 24 fps is, for the vast majority of the time, a visual consideration.

It's economic. Really, it is; you can disagree all you want, but the argument against switching to something like 48 or 60 has time and again been purely economic. Everyone who has used a true theater-quality 48/60 system raves about it, but won't use it due to the enormous cost factors involved in getting the actual theaters to switch to the higher-spec gear.

They'll all tell you the same thing: that 24 fps is currently far and away the preferred look (again, right or wrong, that's subjective) for most primetime television shows and theatrical-release films. They'll all likely say how ridiculous the suggestion is that big-name theatrical releases should switch over to, or try experimenting with, 30 fps or 60 fps.

It's not the preferred look. It's simply that the alternatives are too expensive. If it is, you should be able to provide proof of such...

Go ahead, register at one of those sites, and post a thread asking about why movies don't use 60 fps instead of the traditional 24 fps. Then watch the ridicule and mockery ensue.

I guess that's why the RED cameras only support 24 fps... oh wait, they don't? Well, I'll be damned...

It's simply the truth. People are simply too used to the look and feel of 24 fps film for movies, and whether that's right or wrong (that's not what I'm debating here), the ultimate goal of just about any state-of-the-art digital camera intended for filmmaking is to try and replicate the classic look and feel of 24 fps film. That's just the FACT. People (audiences and filmmakers alike) have grown used to seeing 24 fps, and cannot stand anything higher than that.

Which classic look and feel? 35mm? Panavision? 70mm? Extended 35? You are just arguing BS at this point.

When did I say that higher framerates necessarily look cheap on an absolute, objective basis?


Historically, a higher frame rate required a lower resolution; that's not really a factor anymore with 4K+ cameras available that are capable of >60 fps.



With all due respect, I'm sorry, but that's just a naive and incorrect statement, one that lacks awareness of the current mindset of many filmmakers, filmgoers, and film studios all around the world. 24 fps is the clear choice (choice, not restriction) for them. 60 fps has already been suggested as a new standard before, years ago, but it was handily and summarily passed over, ignored, and rejected by all the movie studios.

There is no choice; it's currently an economic requirement to shoot 24 fps (and nowadays generally 23.976) due to the limitations of the theaters.

This is NOT about what 24 fps technically can't do in relation to 60 fps. This is about 24 fps LOOKING DIFFERENT from 30 fps and 60 fps. It's actually as simple as that. Whether it's right or wrong, cinematographers love 24 fps and have gotten used to the look and feel of 24 fps. So much so, that even jumping up 6 frames to 30 fps looks detestable to them.

Oh really? Then I'm sure you have all kinds of proof!


Come on, man. You're going to try and tell me that these movie studios, releasing movies with budgets of near half a billion dollars, can't spare the time and resources to convert a one-and-a-half-hour movie from 60 fps to 24 fps for distribution to movie theaters? It's that difficult, huh? Really? I have a program on my crappy PC that can convert framerates, and I've done it for entire movies before. You're saying I can do it, but all these huge, rich corporations can't?

What movie has a budget of 500 million dollars? Most movies are in the <$100 million range, and the vast majority of the budget goes to talent, both in front of and behind the camera, and to FX.

Doing a framerate conversion with quality is extremely costly, but then again, you probably know nothing about quality and think it's just pressing a button. There is a reason companies like Criterion exist, or that half the Blu-rays released so far aren't worth buying. Doing a high-quality frame rate conversion takes a lot of time, a lot of money, and a lot of people.


How about when Peter Jackson was doing test-runs for the brand-spanking-new, state-of-the-art RED ONE digital camera, and he used the camera to create a short little ~10 min. film project called "Crossing the Line" (i.e. NOT for theatrical or commercial distribution), how come Mr. Peter Jackson very consciously chose and made the decision to film it all in 24 fps?

Because he was doing test runs? Gee, I wonder what he was testing for. Could it be for use in theatrical releases that will have to be at 24 fps? Gee, I wonder.

The market that Jackson works in requires 24 fps because that is the distribution system; it's how he makes his money.
 
It's about 1/48th of a second. The rest of the time is needed to physically advance the film. Trying to go to a faster frame rate or a longer exposure would require higher forces on the film.
Okay. In which case 24 fps film does offer something that faster framerates can't, and it will affect the look. The longest shutter speed you'll get on 60 fps material is 1/60th of a second, probably slightly faster to accommodate capture, sharpening the imagery and reducing blur. Depending on how much content is recorded at the slowest possible shutter speed on film, it could have a notable impact. 120 Hz film could never get that all-blurred-and-confusing 'film look' ;)
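The shutter arithmetic behind this exchange can be made concrete. Assuming the 180-degree shutter described above (exposure for roughly half the frame interval while the film advances), the numbers work out as follows; function names and the pixel-speed figure are illustrative only:

```python
def max_exposure(fps, shutter_fraction=0.5):
    """Longest shutter time per frame, in seconds. A film camera needs
    roughly half the frame interval to advance the film (the classic
    180-degree shutter), so 24 fps tops out near 1/48 s. A purely
    digital 60 fps capture could in principle expose the full 1/60 s
    (shutter_fraction=1.0)."""
    return shutter_fraction / fps

def blur_px(speed_px_per_s, fps, shutter_fraction=0.5):
    """How far (in pixels) an object smears across the frame during one
    exposure -- the motion blur that gives film its 'look'."""
    return speed_px_per_s * max_exposure(fps, shutter_fraction)
```

For an object crossing the frame at 480 px/s, 24 fps film at 1/48 s smears it across 10 px per frame, while even a fully open 1/60 s shutter at 60 fps only manages 8 px, which is the poster's point: no 60 fps capture setting can reproduce the longest exposures 24 fps allows.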
 