No. Sorry. Either IMAX or Omnimax (or perhaps both) runs at higher frame rates.
With standard 24fps film there is plenty of unpleasant temporal aliasing (jerky pans, wheels appearing to spin backwards, etc.). There are technical reasons (breaking/shredded film!) and cost reasons (more film!) for not going to higher frame rates for mainstream releases.
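If you're curious about the arithmetic behind the backwards-wheel illusion, here's a minimal sketch of the sampling math (the function name and numbers are just illustrative, not from any real tool):

```python
# Minimal sketch of the "wagon wheel" effect: sample a spoked wheel at
# 24 fps and it can appear to creep backwards. Purely illustrative numbers.

def apparent_step(spokes, revs_per_sec, fps=24.0):
    """Spoke spacings the wheel seems to advance per frame, folded into
    (-0.5, 0.5]; a negative value reads as backwards rotation."""
    step = (spokes * revs_per_sec / fps) % 1.0
    return step - 1.0 if step > 0.5 else step

# A 12-spoke wheel at 1.9 rev/s truly advances 0.95 spacings per frame,
# but the eye reads that as drifting BACKWARDS 0.05 spacings per frame:
print(apparent_step(12, 1.9))        # ~ -0.05 -> looks reversed
print(apparent_step(12, 1.9, 60.0))  # ~  0.38 -> at 60 fps it reads forward
```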
Why do you think it is? 24fps was settled on many years ago because it's pretty much the lowest frame rate you can get away with. Fewer frames to deal with is as much about economies of scale as anything else. Back in the day, I'd guess it came down to factors such as the cost of film stock and the technological limits of projectors.
Wrong, wrong, wrong, WRONG. If the cost of additional film were really what's keeping movie studios from higher frame rates (it's NOT)... do you people really believe that blockbuster movies with budgets of nearly HALF A BILLION DOLLARS are worried about the material cost of film stock?

If the cost of film stock were such an issue, then how do you explain all the movies that were shot DIGITALLY, with digital video cameras, still being 24 Hz? How do you explain that the big-budget Superman Returns, shot digitally with NO PHYSICAL FILM, is still only 24 Hz? As are MANY other big-budget movies shot with digital cameras.

And how do you explain the latest state-of-the-art digital camera, the RED ONE, designed to be the be-all, end-all of professional filmmaking cameras, with a maximum resolution that dwarfs the HDTV standard of 1080p? It was designed to shoot at 24 Hz, but it's ALSO capable of up to 120 Hz (120 fps)... YET some of the greatest, most esteemed filmmakers working today (e.g., Peter Jackson, Steven Soderbergh, George Lucas) who have embraced state-of-the-art digital cameras like the RED ONE have kept shooting at 24 Hz, when they could've easily used a 60 Hz or 120 Hz mode if they so chose. How do you explain that?
24 Hz, or 24p, is an ARTISTIC consideration and decision made by filmmakers. That's a FACT. Moviegoers and filmmakers alike are accustomed to the look and feel of 24 Hz motion; this long-held aesthetic preference is what has made 24 fps the longstanding, unshakable standard for virtually everything you see on TV and in theaters today. I've found that videogamers are generally SO oblivious and naive to this fact; they all assume that "more is better" or "bigger numbers are better" on an absolute basis. That's just not the case for me, or for MANY others.
In terms of recent movie-making, imagine how the budgets for CG would balloon if you had to render 60 frames per second rather than 24.
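Back-of-the-envelope: 60/24 = 2.5, so a two-hour fully rendered feature goes from about 24 × 7200 = 172,800 frames to 60 × 7200 = 432,000 frames, and render-farm time scales roughly in step (roughly, because per-frame costs aren't perfectly linear).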
CG films are one thing, but what about all the movies that are grounded in realism and contain ZERO CGI whatsoever?
I suppose the reverse of your question would be: why do TVs display at 50 Hz or 60 Hz? My guess is that the reasoning is just as far removed from the creative process as 24fps is for the cinematic one.
Let me tell you why: to support the occasional video material that IS 60 Hz or 50 Hz, such as some documentary and news programs, and sports programming (the rates themselves were originally tied to AC mains frequency: 60 Hz in North America, 50 Hz in Europe). EVERYTHING else you see is either 30fps or 24fps. That's nearly EVERY primetime television program (except SNL) and EVERY movie/Blu-ray/DVD. Period. And 24fps material only gets onto a 60 Hz screen in the first place via 3:2 pulldown; see the sketch below.
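Here's a minimal sketch of how that 24-to-60 mapping works (function name and frame labels are just illustrative; interlaced NTSC actually repeats fields rather than whole frames, but the cadence is the same idea):

```python
# Minimal sketch of 3:2 pulldown: every pair of 24 fps film frames is
# held for 3 and then 2 display refreshes, so 4 film frames fill 10
# refreshes and 24 fps comes out to exactly 60 Hz (24 * 10/4 = 60).

def pulldown_3_2(film_frames):
    """Map 24 fps frames onto a 60 Hz refresh sequence."""
    refreshes = []
    for i, frame in enumerate(film_frames):
        refreshes.extend([frame] * (3 if i % 2 == 0 else 2))
    return refreshes

print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -- 10 refreshes
```

The uneven 3,2 cadence is also exactly why panning shots judder on a 60 Hz TV: alternate film frames are held on screen for different lengths of time.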
By the way, HD cameras are slowly but surely experimenting with 60 Hz at 1080p. Temporal resolution is slowly being accepted as just as important as image resolution. I can foresee a time when digital cinema becomes more cost-effective, and then I can see 60 Hz becoming much more of a factor, with PCs basically feeding the projectors.
You really think that digital video cameras today aren't already capable of frame rates far beyond 60 Hz? At resolutions greater than 1080p? From a technical standpoint, 60 Hz is already nothing for cameras.
Here are some specs for the latest state-of-the-art line of RED digital HD cameras. As you can see, some of these cameras can shoot at up to 120 Hz (120 fps), at resolutions that go FAR beyond 1080p.
http://reduser.net/forum/showpost.php?p=321241&postcount=28
Not that it matters in the least, because most filmmakers using any of these cameras will opt to stick with 24p (24 Hz) as a very conscious artistic and aesthetic decision.
Neither format is based on any research or consideration into what actually looks best, and people are so conditioned to certain styles now that an honest, unbiased perception will be hard to come by. 60 Hz films will look different, but personally I hate the 24 fps stutter of the cinema and long for higher refresh rates!
Good luck convincing any filmmaker, or any aspiring filmmaker or film student out there today, of your preference. There may not be much logic behind why 24p (24fps) is such an established preference for nearly everyone who considers himself a filmgoer, but people all over the world have gotten very, very accustomed to its look and feel... right or wrong. In fact, this preference is SO deep-rooted that even 30 fps programming/video looks terrible to them.
I personally prefer the look and feel of 24p over 30p (or anything higher, for that matter) because it's further removed from reality... more surreal, more "magical" if you will. In that sense, 24p is associated with the magic and experience of watching a brand-new, high-quality movie, whereas 30p (30 fps) or higher tends to look too close to real life. People don't go to the movies to watch real life; that's my mindset, and I think the mindset of everyone who feels the same way, whether consciously or subconsciously. This is coming from a longtime movie fan, of course, but like most other people, I'm just very much accustomed to seeing 24p motion in movies.