120Hz technology in future consoles?

There's a problem with that: lag.

Let's say you have frame1 and frame2, generated by the console. If your TV (or whatever) is going to interpolate a frame between frame1 and frame2, let's call it iframeA.

So that would mean the game has already generated frame1 and frame2 while your TV interpolates between them and displays iframeA. As you can see, the game is always a frame or two ahead.
1 or 2 frames of video delay, which amounts to ~1/15th of a second, isn't really that significant in today's LCD/HDTV terms. You see some HDTVs with video delay/lag of up to a full second, and in those cases it's really noticeable and affects gameplay. 1 or 2 frames of delay is probably a best-case number, though.
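
As a rough illustration of that delay figure, here is a minimal Python sketch (the function name and the two-frame buffer are assumptions for the example, not measurements of any particular TV):

```python
def interpolation_latency_ms(source_fps, buffered_frames=2):
    """Added display lag from holding back real frames so the TV can
    interpolate between them (hypothetical back-of-envelope model)."""
    frame_time_ms = 1000.0 / source_fps
    return buffered_frames * frame_time_ms

# Two buffered frames of 30 fps console output:
print(round(interpolation_latency_ms(30, buffered_frames=2), 1))  # 66.7 ms
# ...which is roughly the ~1/15th of a second mentioned above.
```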

Still, there's plenty of reasons why I, and many in the industry, still aesthetically prefer 24 fps for movies. And why I also prefer 30 fps for games. More does NOT equal better here, to my eyes.
 
Still, there's plenty of reasons why I, and many in the industry, still aesthetically prefer 24 fps for movies. And why I also prefer 30 fps for games. More does NOT equal better here, to my eyes.

I would be interested in a list of those reasons.
<Not being sarcastic. Genuinely interested.>
 
So now TVs have built-in CPUs too? Sounds like all those different standards and format incompatibilities are a job for a really fast, low-power CPU.
 
I would be interested in a list of those reasons.
<Not being sarcastic. Genuinely interested.>

Same here.

Normally, I can't tell if a game is running at 30 or 60 fps, but when it's a PC game where you can tweak the graphical quality and alter the frame rate, I prefer 60 because it feels faster. I'd like to hear it from someone who prefers 30 fps.
 
I would be interested in a list of those reasons.
<Not being sarcastic. Genuinely interested.>
There's a reason 24p (or 24 fps) is touted as a special, highly coveted feature in some consumer and professional digital video cameras now. Camera technology (and the movie/television industry in general) isn't trending towards higher framerates like 60 fps, but towards the lower framerate standard of 24 fps. This is because a framerate of 24p looks more cinematic, more aesthetically pleasing, and more filmlike... whether that's valid or not... or good or not.

I find that games running at 30 fps look closer to film/cinema than 60 fps. More cinematic in feel, and more aesthetically pleasing. 24 fps, however, is a little too choppy without the natural motion blur of film to smooth the motion.

Have you ever wondered why EVERY movie released in theaters today... EVERY Blu-Ray... almost EVERY network TV show... they all run at only 24 fps? Why do you think that is?
 
Statix said:
I find that games running at 30 fps look closer to film/cinema than 60 fps.
No doubt, because cinema runs at that FPS in the first place. However, having seen a rare few clips of 60hz film/cinema, I disagree that 24 looks better on the same content.
 
Have you ever wondered why EVERY movie released in theaters today... EVERY Blu-Ray... almost EVERY network TV show... they all run at only 24 fps? Why do you think that is?

Why do you think it is? 24fps was settled upon many years ago because it's pretty much the lowest frame rate you can get away with. Having fewer frames to deal with is as much about economies of scale as anything else. In the olden days I would guess it came down to factors such as the cost of film and the technological limits of projectors. In terms of recent movie-making, imagine how the budgets for CG would balloon if you had to render 60 frames per second rather than 24.
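
To put a number on that render-cost point, here is a hypothetical back-of-envelope calculation (the two-hour runtime is invented for illustration):

```python
def frames_to_render(runtime_minutes, fps):
    """Total frames a fully CG film would need at a given framerate."""
    return runtime_minutes * 60 * fps

# A hypothetical two-hour CG feature:
print(frames_to_render(120, 24))  # 172800 frames at 24 fps
print(frames_to_render(120, 60))  # 432000 frames at 60 fps, 2.5x the work
```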

I suppose the reverse to your question would be, why do TVs display at 50Hz or 60Hz? My guess is that the reasoning is as far from the creative process as 24fps is for the cinematic one.

By the way, HD cameras are slowly but surely experimenting with 60Hz@1080p. Temporal resolution is slowly becoming accepted as just as important as image resolution. I can foresee a time when digital cinema becomes more cost-effective, and then I can see 60Hz becoming much more of a factor, with PCs basically feeding the projectors.
 
Have you ever wondered why EVERY movie released in theaters today...
No. Sorry. Either IMAX or Omnimax (or perhaps both) run at higher frame rates.

With standard 24fps film, there is lots of unpleasant temporal aliasing (jerky pans, wheels going backwards, etc.). There are technical (breaking/shredded film!) and cost reasons (more film!) for not going to higher frame rates for mainstream releases.
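
The wheels-going-backwards effect is plain temporal aliasing, and it's easy to sketch (a hypothetical Python example; the spoke count and rotation speed are made up):

```python
def apparent_spoke_motion_deg(rev_per_sec, spokes, fps):
    """Per-frame spoke movement the eye perceives, after aliasing.

    Spokes are indistinguishable from one another, so motion wraps
    modulo the spoke-to-spoke angle; a negative result means the
    wheel appears to rotate backwards.
    """
    spoke_angle = 360.0 / spokes            # e.g. 45 deg for 8 spokes
    true_step = 360.0 * rev_per_sec / fps   # real rotation between frames
    wrapped = true_step % spoke_angle
    if wrapped > spoke_angle / 2:
        wrapped -= spoke_angle              # perceived as backward motion
    return wrapped

# An 8-spoke wheel at 2.9 revolutions/sec, filmed at 24 fps:
print(apparent_spoke_motion_deg(2.9, 8, 24))  # about -1.5: spins "backwards"
```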
 
I suppose the reverse to your question would be, why do TVs display at 50Hz or 60Hz? My guess is that the reasoning is as far from the creative process as 24fps is for the cinematic one.
AFAIK the early standards were linked to the AC frequency of the power lines feeding the CRTs. 120Hz et al. are just continuations of those archaic standards, which have to accommodate backwards compatibility. No format is based on any research or consideration into what actually looks best, and people are so conditioned to certain styles now that an honest, unbiased perception will be hard to come by. 60Hz films will look different, but personally I hate the 24 fps stutter of the cinema and long for higher refresh rates! I imagine the actual optimum refresh rate will be linked to screen size and viewing distance: n degrees of motion across the retina per timeslice x is perceived as smooth.
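
That last idea can be sketched numerically (a hypothetical model; the speed and distance are invented for illustration, and real smoothness perception is far more complicated):

```python
import math

def degrees_per_frame(object_speed_cm_s, viewing_distance_cm, fps):
    """Visual angle an on-screen object sweeps per displayed frame."""
    deg_per_sec = math.degrees(math.atan2(object_speed_cm_s, viewing_distance_cm))
    return deg_per_sec / fps

# An object crossing the screen at 50 cm/s, viewed from 2.5 m:
print(degrees_per_frame(50, 250, 24))  # larger per-frame jump at 24 fps
print(degrees_per_frame(50, 250, 60))  # smaller jump at 60 fps, so smoother
```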
 
Last weekend, I spoke to a friend who's involved in 3D TV technology using "real" 120Hz TVs (not interpolated ones). He said it's amazing and hoped it would become mainstream. It's not his company's primary business, but it's an up-and-coming direction/trend. I believe Sharp is working on something similar for Blu-ray too.

EDIT: Hmm... maybe he meant 2 x 60Hz images. The topic came up when we were talking about 120Hz TVs.
 
An interesting observation I've had while quizzing people who are watching motion-interpolating TVs in stores: people have been unconsciously conditioned over time to associate higher frame rate TV with lower production value. The most common response I get when I ask them what they think is "It makes the show feel cheap somehow." That's a reasonable response given that most people are not consciously aware of frame rate, and given that over their entire lives there has been a pretty consistent correlation:
24Hz : Big budget, Hollywood movies
30Hz : TV sitcoms and drama
60Hz : Live sports, live action reporting, amateur handycams

It will be interesting to see if this unconscious bias kills motion interpolation or if tech companies striving to sell ever more expensive stuff will manage to overcome it.
 
24 fps material is shot on film.
60Hz material is shot on video. Until the advent of triple-CCD digital cameras, colour reproduction and contrast were vastly inferior to film.

So while you had the higher framerate, you had inferior image quality in each individual frame.

The problem is solved on the production side of things, I'd say. You can get a state-of-the-art HD camera today for less than $40,000.

The problem is on the reproduction side. Colour reproduction and contrast deteriorate as the update frequency increases for both LCD and PDP TVs. So again, higher frequencies result in a more washed-out picture.

Cheers
 
No. Sorry. Either IMAX or Omnimax (or perhaps both) run at higher frame rates.

With standard 24fps film, there is lots of unpleasant temporal aliasing. (Jerky pans, wheels going backwards etc). There are technical (breaking/shredded film!) and cost reasons (more film!) for not going to higher frame rates for mainstream releases.
Why do you think it is? 24fps was settled upon many years ago as it's pretty much the lowest frame rate you can get away with. Fewer frames to deal with is as much about economies of scale as anything else. In the olden days I would guess it would be about factors such as the cost of film and maybe technological limits of projectors.
Wrong, wrong, wrong, WRONG. If the cost of additional film were really what's keeping movie studios from going to higher framerates (which it's NOT)... do you people really believe that blockbuster movies with budgets approaching HALF A BILLION DOLLARS are worried about the material cost of film stock? If the cost of film stock were such an issue, then how do you explain all the movies that were shot DIGITALLY, with digital video cameras, still being 24hz? How do you explain that the big-budget Superman Returns, shot digitally with NO PHYSICAL FILM, is still only 24hz? As are MANY other big-budget movies shot with digital cameras. And how do you explain the latest state-of-the-art digital camera, the RED ONE, designed to be the end-all, be-all of professional filmmaking cameras, with a maximum resolution that dwarfs the HDTV standard of 1080p? It was designed around 24hz but is ALSO capable of up to 120hz (120 fps)... YET some of the greatest, most esteemed filmmakers working today (e.g., Peter Jackson, Steven Soderbergh, George Lucas) who have shot major films with the state-of-the-art RED ONE have done so in its 24hz mode, when they could easily have used the 60hz or 120hz modes if they chose to. How do you explain that?

24hz or 24p is an ARTISTIC consideration and decision made by filmmakers. That's a FACT. Moviegoers and filmmakers alike are accustomed to the look and feel of the motion of a 24hz film; this is a long-held aesthetic preference that has led to 24 fps being the longstanding, unshakable standard for everything you see on TV and in theaters today. I've found that videogamers are generally SO oblivious and naive to this fact; they all think that "more is better" or "bigger numbers are better" on an absolute basis. This is just not the case for me, and MANY others.

In terms of recent movie-making, imagine how the budgets for CG would balloon if you had to render 60 frames per second rather than 24.
CG films are one thing, but what about all the movies that are based in realism and contain ZERO CGI whatsoever?

I suppose the reverse to your question would be, why do TVs display at 50Hz or 60Hz? My guess is that the reasoning is as far from the creative process as 24fps is for the cinematic one.
Let me tell you why: To support the occasional video material that IS 60hz or 50hz, such as some documentary and news programs, and sports programming. EVERYTHING else you see is either 30fps or 24fps. That's nearly EVERY primetime television program (except SNL) and EVERY movie/Blu-Ray/DVD. Period.

By the way HD cameras are slowly but surely experimenting with 60Hz@1080p. Temporal resolution is slowly being more accepted as being just as important as image resolution. I can foresee a time when digital cinema becomes more cost effective and then I can see 60Hz becoming much more of a factor when PCs are basically feeding the projectors.
You really think that digital video cameras today aren't already capable of framerates far beyond 60hz? At resolutions greater than 1080p? From a technical standpoint, 60hz is already nothing for cameras.

Here are some specs relating to the latest state-of-the-art line of RED digital HD cameras. As you can see, some of these cameras are capable of shooting at up to 120hz (120 fps), at resolutions that go FAR beyond 1080p.

http://reduser.net/forum/showpost.php?p=321241&postcount=28

Not that it matters in the least, because most filmmakers using any of these cameras will opt to stick with 24p (24hz), as a very conscious artistic and aesthetic decision.

No format is based on any research or consideration into what actually looks best, and people are so conditioned to certain styles now an honest, unbiased perception will be hard to come by. 60 Hz films will look different, but personally I hate the 24 fps stutter of the cinema and long for higher refresh rates!
Good luck convincing any filmmaker or aspiring filmmaker/student out there today of your preference. There may not be that much logic overall as to why 24p (24fps) is such an established preference for nearly everyone who would consider himself a filmgoer, but people all over the world have gotten very, very accustomed to the look and feel of 24p (24 fps)... right or wrong. In fact, this preference is SO deep-rooted that even 30 fps programming/video looks terrible to them.

I personally prefer the look and feel of 24p over 30p (or anything higher than that, for that matter) because it's further removed from reality... more surreal, and more "magical" if you will. In that sense, 24p is associated with the magic and experience of watching a brand new, high-quality movie. Whereas 30p (30 fps) or higher tends to look too close to reality/real life. People don't go to the movies to watch real life--that's my mindset and I think the mindset of everyone who feels the same way, whether it be on a conscious or subconscious level. This is coming from a longtime movie fan, of course, but like most other people, I'm just very much accustomed to seeing 24p motion for movies.
 
How do you explain that the big-budget Superman Returns, shot digitally with NO PHYSICAL FILM, is still only 24hz?

Because 95% of all theatres still use film projectors, and those run at 24fps. Even if you shoot digitally, you have to make film copies.

Cheers
 
Because 95% of all theatres still use film projectors and those run at 24fps. Even if you shoot it digitally you have to make film copies.

Cheers
Then they could simply downconvert the framerate to 24p to accommodate those lower-end theater projectors. See, problem solved.
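
Dropping frames to go from a high-framerate master down to 24p is trivial when the ratio is an integer, as this minimal sketch shows (real pulldown for non-integer ratios like 30-to-24 is more involved):

```python
def decimate(frames, src_fps, dst_fps):
    """Keep every Nth frame; assumes src_fps is an integer multiple of dst_fps."""
    step = src_fps // dst_fps   # e.g. 120 -> 24 keeps every 5th frame
    return frames[::step]

one_second = list(range(120))              # one second of 120 fps frames
print(len(decimate(one_second, 120, 24)))  # 24
```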

Directors and filmmakers with a vision for their dream project aren't choosing 24p just because there's a low-end, lowest common denominator to account for. That would just be asinine. I mean, if that were the case, then why stop there? Why not shoot their films in 720x480p since that's the resolution that most people will ultimately be watching their DVDs in anyway?

24p is an artistic consideration, period. Don't argue with me on this.

http://en.wikipedia.org/wiki/24p
 
Directors and filmmakers with a vision for their dream project aren't choosing 24p just because there's a low-end, lowest common denominator to account for. That would just be asinine. I mean, why not shoot their films in 720x480p since that's the resolution that most people will ultimately be watching their DVDs in anyway?

90% of directors couldn't care less; they have people for all the technical stuff. Cinematographers will care, but that doesn't change the fact that most movies are shot on film @ 24fps because there is a huge investment in infrastructure/tool chain.

Cheers
 
90% of directors couldn't care less; they have people for all the technical stuff. Cinematographers will care, but that doesn't change the fact that most movies are shot on film @ 24fps because there is a huge investment in infrastructure/tool chain.

Cheers
There's a huge investment in the industry infrastructure for 24p because that's the format that the industry, and the audience, has chosen. There's also a huge, established infrastructure supporting framerates such as 30p and 60p, but those are reserved for news programs, game shows, sports, and other live programming, for a reason.

Bottom line is that if you suggest to any aspiring filmmaker or film student out there that they use 60p instead of 24p, they're apt to laugh at you outright. There's a reason the Panasonic DVX digital camera was all the rage: it was one of the first prosumer-level cameras amateurs could reasonably afford that offered 24p (24 fps) as a special feature.

http://en.wikipedia.org/wiki/24p

Did you read it? That article is an excellent case against 24fps.

Cheers
The goal of Wikipedia, or any other so-called encyclopedic source, is to provide information and present all the various viewpoints on a subject. It is not to "make a case against X thing." If a wiki article is "making an excellent case against 24p," then that entry is flawed and biased. The rule of the site is for contributors to adopt an objective, "no point-of-view" tone and approach for every subject as a whole.

I did read that wiki page, and for good reason it presents some of the arguments that exist out there both for and against 24p. It's called being thorough and objective. The entry as a whole actually struck me as mildly favorable toward 24p overall.

EDIT: Besides, I'm NOT making the case either FOR or against 24p. I'm not here to sing its praises for everyone else, or to shove it down everyone's throats. If you don't like it, fine; I can't tell you what to like or not to like. All I'm doing is providing information and telling it how it is. And how it is, is that 24p (24 fps) is a conscious aesthetic choice by the vast majority of filmmakers. That's all I'm trying to convey here.
 