The Great Framerate Non-Debate

What I'm effectively pointing out is that the 'cinematic experience' argument for preferring a lower framerate is a bit far-fetched, given that practically no games actually attempt it.
That may be true, but the catalyst for this discussion was a dev stating that's what they were going for, as I understand it.

Look at it from this angle: how many movies are filmed from a first-person perspective? It's quite rare to ever have that view at all, except perhaps for short sequences where the director wants to portray a specific point, etc.
That's for a reason that has nothing to do with camera limits. Film tells a story through the eyes of a third-party observer, and deliberately shows events from multiple POVs to build up the whole story. It's much easier to understand and relate to a person when you can see them instead of inhabiting their body. Even at 120 fps, a first-person movie would likely be a bit poop. ;) Although of course some have gone there too via the handycam movie, and cinema-goers have enjoyed 24 fps horror. The low framerate probably helps with the confusion and visceral response.

I get that people are happy with 30fps if it means we get better graphics. But I don't quite buy the argument that in games, 30fps feels better for cinematic experiences.
It depends what one's definition of cinematic is. Clearly a game can't use funky, ever-changing camera angles. However, a game can adopt a cinematic representation: 'what if this was a real thing and a guy with a camera followed the protagonist around and filmed events'. That's where The Order can aim for cinematic quality. Take a steampunk story in Victorian London and have the majority of the camera work be a follow-cam, along with cutscenes. Make it look like you're watching a movie by including choppiness and blur and various post effects to simulate a camera lens recording the action instead of a human onlooker. The same look as The Order could be recreated in RL with actors, props, and a cameraman following the action in the same way. It probably wouldn't win awards for pacing or cinematography, but technically The Order would be cinematic in terms of what is required to produce a cinema film and its optical look (even if not its visual structure).

Only I don't think it's primarily about targeting a specific audience.
I think cinematic rendering is done because it can be done. Photorealistic rendering has been, and still is, a holy grail of sorts, one that's particularly distant in real time. I think it's a natural extension of rendering history that we end up mimicking film, warts and all.

Because it makes no sense at all when my barbarian walks around a mountain and looks down into the valley, having his naked-eye vision disturbed by lens flare, apparently through a six-bladed aperture and at least eight lens surfaces, seemingly uncoated. Hmm, maybe a pre-war Tessar design, only the field of vie....What the Hell?
That's only true for first-person games. For third-person games, a camera following the action with a camera's limits is perfectly justifiable, and helps recreate on screen how live action would look if so captured. That's the aim of a lot of devs at the moment - to get what's on screen from the console looking as close as possible to what's on screen from the DVD/BRD/STB/airwaves.

Motion blur is its own problem, and it's both hilarious and sad to see games adding distortion and chromatic aberration at a time when both are largely cured in photography.
Curiously, I thought that for a while about CA and distortion, but it's not true. These still exist; devs just go over the top with them. The right amount of simulated lens limitations helps create a sense of authenticity.

Edit: I'll add another example to the discussion. What if a game dev wants to recreate the aesthetic of an Aardman Animations claymation, such as with a Wallace and Gromit game? If they can produce the visuals in photorealistic quality, how would the framerate affect the player's visceral response? At HFR, it'd look not like clay animation but like little, real-life plasticine people. The only way to nail that Aardman look would be to cap the framerate at 12 fps. Then it would look just like an animation.
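As a side note on how a dev might actually pull that off without tanking everything else, here's a minimal sketch (my own assumption, not anything a studio has confirmed doing): keep rendering, camera, and input at the display rate, and only quantise the character poses to a 12 Hz tick, the way stop-motion holds each pose across several projected frames. The `sample_pose` call and `character` object are hypothetical stand-ins for a real engine API.

```python
# Minimal sketch: render/input at 60 Hz, but advance character poses at 12 Hz,
# so each pose is held for five rendered frames (stop-motion cadence).
ANIM_RATE = 12.0  # claymation-style pose rate, poses per second

class SteppedAnimator:
    def __init__(self, rate=ANIM_RATE):
        self.step = 1.0 / rate
        self.anim_time = 0.0   # time the currently held pose was sampled at
        self.clock = 0.0       # real elapsed time

    def update(self, dt):
        """Call once per rendered frame; returns the animation time to pose with."""
        self.clock += dt
        # Quantise: only move the pose forward in 1/12 s increments.
        while self.clock - self.anim_time >= self.step:
            self.anim_time += self.step
        return self.anim_time

# Per-frame usage in a hypothetical 60 Hz render loop:
#   t = animator.update(frame_dt)
#   character.pose = sample_pose(character.clip, t)   # hypothetical engine call
```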
 
Wouldn't that mean that everything except infinity is out of focus? So your eyes are good at seeing the edge of the visible universe but crap at everything else?

I would guess that my eyes are actually adjusted to 3-5m or so, which makes everything from 1m to infinity appear sharp. (That is also what is typically done with glasses for nearsighted people who can still focus their eyes, because overcorrecting could mean the person never relaxes their focusing muscle at any distance.) Simply put, humans aren't unaware of DOF limits only because the brain processes them away (which it does), but also because we have very wide-angle lenses with pretty small apertures.
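For anyone who wants to sanity-check that, here's a rough sketch of the standard thin-lens depth-of-field sums. All the "eye" numbers - 17mm focal length, a 2mm daylight pupil, roughly 3 arcminutes of blur still counted as "sharp" - are back-of-the-envelope assumptions, and the result is very sensitive to them, which is exactly why the brain's processing matters too.

```python
import math

def dof_limits(f_mm, f_number, focus_mm, coc_mm):
    """Near/far limits of acceptable sharpness for an ideal thin lens."""
    hyperfocal = f_mm ** 2 / (f_number * coc_mm) + f_mm
    near = focus_mm * (hyperfocal - f_mm) / (hyperfocal + focus_mm - 2 * f_mm)
    if focus_mm >= hyperfocal:
        return near, math.inf
    far = focus_mm * (hyperfocal - f_mm) / (hyperfocal - focus_mm)
    return near, far

f = 17.0                                  # assumed eye focal length, mm
pupil = 2.0                               # assumed daylight pupil diameter, mm
coc = f * math.tan(math.radians(3 / 60))  # blur circle for ~3 arcminutes of acuity
near, far = dof_limits(f, f / pupil, focus_mm=3000, coc_mm=coc)
far_txt = "infinity" if math.isinf(far) else f"{far / 1000:.1f} m"
print(f"'sharp' from roughly {near / 1000:.1f} m out to {far_txt}")
```

With those (generous) assumptions it lands in the same ballpark as the "roughly 1m to infinity" intuition; tighten the acuity or open the pupil and the zone shrinks fast.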

Shifty Geezer said:
That's only true for first-person games. For third-person games, a camera following the action with a camera's limits is perfectly justifiable, and helps recreate on screen how live action would look if so captured.
True enough. However, personally I tend to think of the third-person perspective as "me" looking at the scene, as opposed to "a camera" capturing the scene and "me" looking at that captured footage. And if it is "me", well, then these lens effects are just as inappropriate as in a first-person perspective.

Curiously, I thought that for a while about CA and distortion, but it's not true. These still exist; devs just go over the top with them. The right amount of simulated lens limitations helps create a sense of authenticity.
Both distortion and CA are kept quite low by current lens designs. But what is happening today in photography is that these aberrations are automatically corrected, either by in-camera processing or by the RAW converter. This allows lens designs to optimize for size/weight/price/resolution instead. Purists, of course, object. But all manufacturers, even Leica, do it today with their new gear. It just makes sense.
Longitudinal CA, common with teles at very large apertures, is typically not corrected, but then again it is rather unusual and not something that rendering tries to emulate.
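For the curious, "corrected by the RAW converter" mostly amounts to resampling: lateral CA is fixed by radially rescaling the red and blue channels so their magnification matches green, and distortion by a similar radial remap. A minimal sketch below - not any vendor's actual pipeline, and the per-channel scale factors (which a real converter would read from a lens profile) are made up.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def correct_lateral_ca(img, scale_r=1.0015, scale_b=0.9985):
    """img: float array (H, W, 3). Radially rescale R and B about the centre
    so their magnification matches G. A scale > 1 shrinks a channel the lens
    rendered slightly too large."""
    h, w, _ = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float64)
    out = img.copy()
    for ch, s in ((0, scale_r), (2, scale_b)):  # 0 = R, 2 = B; green is the reference
        # Inverse mapping: sample the channel at radially scaled coordinates.
        src_y = cy + (yy - cy) * s
        src_x = cx + (xx - cx) * s
        out[..., ch] = map_coordinates(img[..., ch], [src_y, src_x],
                                       order=1, mode='nearest')
    return out
```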
 
60fps on a screen which supports black frame insertion is my new gaming nirvana. It's like being back on CRT again.

Given controller lag and display lag, the difference between 30 and 60 fps is trivial. That is a 16 ms difference. Controller and TV add a minimum of 100 ms.
 
Controller lag is 4ms if that (a last-gen figure), and my TV adds 16ms ... The best lag on a 60fps game, I believe, was measured at 50ms on PS3, and 83.3ms for 30fps. Good 60fps games manage 66ms, and most 30fps games are over 100ms.
 
Given controller lag and display lag, the difference between 30 and 60 fps is trivial. That is a 16 ms difference. Controller and TV add a minimum of 100 ms.
BS, see the Gamasutra test:
push button on controller, see result onscreen = 50 msec

1. controller -> 2. console processes input -> 3. create framebuffer -> 4. send to TV -> 5. display result
1&2&3&4&5 = 50 msec together, yet you say just 1&5 alone are a minimum of 100 msec.
I think you need to reassess your info.

They sampled a few games; the results were:
60fps game = 50 msec best case (3 frames @ 1/60)
30fps game = 100 msec best case (6 frames @ 1/60)
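To make that arithmetic explicit, here's a toy button-to-photon budget. The 4ms controller and 16ms display figures are the ones quoted above; the three-frames-deep pipeline is an assumption rather than a measurement of any particular game.

```python
# Back-of-the-envelope button-to-photon latency budget (illustrative only).
def button_to_photon_ms(fps, pipeline_frames, controller_ms=4, display_ms=16):
    """Controller poll + N frames of game/render/flip pipeline + display lag."""
    return controller_ms + pipeline_frames * (1000.0 / fps) + display_ms

print(button_to_photon_ms(60, 3))   # ~70 ms  (4 + 3 * 16.7 + 16)
print(button_to_photon_ms(30, 3))   # ~120 ms (4 + 3 * 33.3 + 16)
```

The point it illustrates: because the lag is dominated by a pipeline several frames deep, going from 60fps to 30fps typically widens the end-to-end gap by around 50 ms, not the 16 ms a single-frame comparison suggests.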
 
But LCDs don't switch off between frames like CRTs (a static image would be constantly on, aka infinite refresh);
they go from one colour straight to another (response time).

It doesn't matter. The point is that the eye can detect a change of pixel intensity that lasts for 1/120th of a second. This gives some credibility to those who claim there's an advantage to frame rates above 60 FPS.
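And for why black frame insertion reads as "CRT-like": when your eye tracks motion on a sample-and-hold display, the image smears across the retina for as long as each frame stays lit. A quick sketch, with idealised persistence values (full hold vs. 50% BFI) as assumptions:

```python
# Approximate eye-tracking smear on a sample-and-hold display.
def tracking_blur_px(speed_px_per_s, fps, persistence_fraction=1.0):
    """Smear in pixels per displayed frame while the eye tracks the motion."""
    return speed_px_per_s * (persistence_fraction / fps)

speed = 1920  # an object crossing a 1080p screen in one second
print(tracking_blur_px(speed, 60, 1.0))   # ~32 px of smear, plain sample-and-hold
print(tracking_blur_px(speed, 60, 0.5))   # ~16 px with 50% black frame insertion
```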
 
The Lord of the Rings was filmed at 24fps, and yet I could quite easily spot the fake rocks, the fake trees, the fake water, the fake snow, the "fake" CG characters, etc... the flaws, in short.
My suspension of disbelief was "broken" even though it was shot at 24fps.

Really poor makeup, poor visual effects and/or practical effects, and above all poor acting are still the main reasons why many movies look "fake".
Framerate isn't the only factor in decreasing suspension of disbelief, obviously, but when those defects you mentioned are on the screen it's far easier to spot them at 60fps.
 
It's looking like it all comes down to whether you feel

Filmic - i.e. stuttery/juddery/etc. - is superior

Or

Realistic - i.e. smooth motion - is superior.

Personally I hate filmic, even for old films that have no choice but to be "filmic." I go out of my way to find good 24/30 fps -> 60 fps video converters on PC just to avoid that "filmic" look. It isn't always perfect, but it is always better than the high-fps converters in TVs. And it's a ton better than 24/30 fps source material, IMO.

Thank you, no more stutter/judder for me. BTW - I also don't view 24 FPS films in the theatre anymore because it annoys me so much.

It's the same reason I'll only play games at 30 fps if I have no other choice. Some genres are more tolerable than others, but it's a matter of tolerance rather than it being a "good" experience.

It's also why I rarely game on consoles anymore. I detest 30 fps. I had hoped that PS4/XB1 would do away with 30 FPS, but it doesn't appear that will be the case, with some developers choosing 30 FPS just so they can claim the best per-frame graphics and an F-U to smooth gameplay. I'm almost at the point where, even if it's a game from a franchise I liked in the past, I'll pass it by and not play it if it's 30 fps.

Regards,
SB
 
Given controller lag and display lag, the difference between 30 and 60 fps is trivial. That is a 16 ms difference. Controller and TV add a minimum of 100 ms.
Not everyone uses TVs; I use a PC monitor with a 2ms response time.
 
To be fair, low fps doesn't have to be stuttery/juddery.

It's looking like it all comes down to whether you feel

Filmic - i.e. stuttery/juddery/etc. - is superior

Or

Realistic - i.e. smooth motion - is superior.

Personally I hate filmic, even for old films that have no choice but to be "filmic." I go out of my way to find good 24/30 fps -> 60 fps video converters on PC just to avoid that "filmic" look. It isn't always perfect, but it is always better than the high-fps converters in TVs. And it's a ton better than 24/30 fps source material, IMO.

Thank you, no more stutter/judder for me. BTW - I also don't view 24 FPS films in the theatre anymore because it annoys me so much.

It's the same reason I'll only play games at 30 fps if I have no other choice. Some genres are more tolerable than others, but it's a matter of tolerance rather than it being a "good" experience.

It's also why I rarely game on consoles anymore. I detest 30 fps. I had hoped that PS4/XB1 would do away with 30 FPS, but it doesn't appear that will be the case, with some developers choosing 30 FPS just so they can claim the best per-frame graphics and an F-U to smooth gameplay. I'm almost at the point where, even if it's a game from a franchise I liked in the past, I'll pass it by and not play it if it's 30 fps.

Regards,
SB

You seem to handle Ryse and Dynasty Warriors (together with all its fps issues from waaaay back in the PS2 days) perfectly well, and that was just a few months ago, so why suddenly all the hate aimed at 30 fps?




I'm sure there are and will be good examples where simply having 30 fps brings about a better presentation than 60fps. I think too many people are, as some have mentioned but many ignore, placing their criteria purely on gameplay, and in that case 60fps > 30fps will probably always be true (although I personally think there may still be examples where this does not hold true).

However, presentation-wise, it's totally different. What if the devs are purposefully sacrificing gameplay for presentation?

Shadow of the Colossus had terrible fps when fighting the statues. The framerate was shitty as hell. It made gameplay deteriorate, without a doubt.
However, would it have been better presentation-wise if the FPS had been 60 throughout the ENTIRE game? Probably not.

This argument reminded me of a design choice implemented in Devil May Cry 4, as well as the Monster Hunter series, that stuck in my head.

When players did a hard-landing attack, the devs purposefully froze the screen to give the illusion of the attack landing "hard". For the players, freezing the frame certainly deteriorates gameplay, as it deprives the player of information for a few frames and is surely a source of unresponsiveness. But is this feature better from a presentation perspective? Most probably yes.

People are forgetting that in certain cases, less is indeed more.
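For reference, the "hit-stop" trick described above can be sketched roughly like this - not the actual DMC4 or Monster Hunter code, and the frame count and zero time-scale are illustrative choices:

```python
# Minimal hit-stop sketch: on a heavy hit, gameplay time freezes for a few
# frames while rendering and input polling carry on at full rate.
class HitStop:
    def __init__(self):
        self.frames_left = 0

    def trigger(self, frames=6):
        """Call when a heavy attack connects."""
        self.frames_left = max(self.frames_left, frames)

    def scaled_dt(self, dt):
        """Feed the renderer the real dt, but feed gameplay this scaled dt."""
        if self.frames_left > 0:
            self.frames_left -= 1
            return 0.0          # gameplay freezes for this frame
        return dt

# Per-frame usage in a hypothetical loop:
#   if heavy_hit_connected: hitstop.trigger()
#   world.update(hitstop.scaled_dt(real_dt))   # gameplay may be frozen
#   renderer.draw(world)                       # rendering never stops
```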
 
Not everyone uses TVs; I use a PC monitor with a 2ms response time.
You're confusing response time and input lag.

"Response time" is a measurement of the time it takes for the visual elements on a panel to switch between colors. It technically contributes to input lag, but typically to an extremely minor extent. High response times can contribute to "LCD motion blur", although (especially when eye-tracking moving content) sample-and-hold effects can be more prevalent on modern displays.

I'm not sure how input lag measurements on various displays attempt to take into account image transmission and whatnot, but the lowest claims I've ever seen for flat panel monitor input lags are something like 8ms.

To be fair, low fps doesn't have to be stuttery/juddery.
That depends on how you define "stuttery." 30fps is low enough that even with typical film motion blur (i.e. 180-degree shutter), it can look somewhat choppy in fast motion. That doesn't imply "judder" in the sense of uneven frame times, but it's hardly perfectly smooth.
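For reference, the shutter-angle arithmetic behind that: per-frame exposure (and therefore blur) time is (angle / 360) × (1 / fps).

```python
# Blur/exposure time per frame for a given shutter angle and frame rate.
def blur_ms(shutter_deg, fps):
    return (shutter_deg / 360.0) * 1000.0 / fps

print(blur_ms(180, 24))   # ~20.8 ms of blur per frame: the classic film look
print(blur_ms(180, 30))   # ~16.7 ms: a "filmic" 30fps game blurring half the frame time
```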
 
Framerate isn't the only factor in decreasing suspension of disbelief, obviously, but when those defects you mentioned are on the screen it's far easier to spot them at 60fps.

The same can be said for higher definition, which also makes it easier to spot defects, or rather the effects.
The Hobbit - because it's The Hobbit that spawned the 48fps controversy - IMO didn't succeed in "hiding" the effects in the first place, which is why it looked so dang fake.

Also, defects, or rather imperfections, make things look more believable and authentic, because life is full of imperfections, so they actually are "useful" for movies... and not all movies get that.
 
The same can be said for higher definition, which also makes it easier to spot defects, or rather the effects.
The Hobbit - because it's The Hobbit that spawned the 48fps controversy - IMO didn't succeed in "hiding" the effects in the first place, which is why it looked so dang fake.
Obviously 24fps doesn't hide the especially bad CGI, but it does largely eliminate the "it looks like actors walking around a set" effect.

The issue here may be one of articulation and interpretation. People say that low framerates make the film look more real, and high framerates look fake. It sounds absurd at a glance, but as an intuitive interpretation in the film-viewing experience it makes perfect sense; a perfect reproduction of the appearance of actors walking around a set is arguably counterproductive toward a good representation of some epic shenanigans in Middle Earth.
 
I'm sure there are and will be good examples where simply having 30 fps brings about a better presentation than 60fps. I think too many people are, as some have mentioned but many ignore, placing their criteria purely on gameplay, and in that case 60fps > 30fps will probably always be true (although I personally think there may still be examples where this does not hold true).

Of course, we have a technical reality to deal with: higher frame rate => fewer graphical effects per frame.
And since this is true, there is a trade-off to be made. I don't think there is much of a discussion to be had there.
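Put in the plainest terms, the trade-off is just the per-frame time budget:

```python
# Everything - game logic, rendering, post effects - has to fit in 1000/fps ms.
for fps in (30, 60):
    print(f"{fps} fps -> {1000.0 / fps:.1f} ms per frame")
```

Whatever doesn't fit into the 16.7 ms budget either gets cut or pushes the game to 30fps.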

"Cinematic" rendering however, is another issue entirely.
The defining property of games (computer or physical) is interactivity. It is OK for games to be inspired by film, but what defines good cinema/film is visual storytelling and compelling scripts and acting. Emulating that requires skill/money/time, and it is so much easier to evoke "cinematic" by running a piece of shader code. So rather than being inspired by the qualities that can make film great (whether or not those can be transplanted into good gaming is another question), they emulate the visual artifacts of film production in the bygone century, which, adding insult to injury, costs rendering capability that could have gone into either a more compelling setting or better frame rates. It's similar to adding hiss, crackle, and phase errors to digital audio because people have positive memories of listening to music on LPs, but worse, because games aren't film. The artifacts are completely alien to the medium.

As I said before, I believe the main reason for cinematic rendering is that it finally can be done, and it is considered cool. It is the natural progression of a decades long trend, beyond the point where it makes sense any more. I hope games eventually reject this and build on their own qualities and heritage.
 
Obviously 24fps doesn't hide the especially bad CGI, but it does largely eliminate the "it looks like actors walking around a set" effect.

True, but what if filming in 48fps requires using the camera, the special effects, and so on in a completely different/new way to deliver the right results?
We can't write off 48fps because one director failed.

The issue here may be one of articulation and interpretation. People say that low framerates make the film look more real, and high framerates look fake. It sounds absurd at a glance, but as an intuitive interpretation in the film-viewing experience it makes perfect sense; a perfect reproduction of the appearance of actors walking around a set is arguably counterproductive toward a good representation of some epic shenanigans in Middle Earth.

You have used two key words here: reproduction & representation.
 
True, but what if filming in 48fps requires using the camera, the special effects, and so on in a completely different/new way to deliver the right results?
We can't write off 48fps because one director failed.

Whether Jackson failed or not is very much a matter of opinion.
To my ears, his detractors sound similar to those who opposed CDs 30 years ago. "The medium is too revealing of recording errors". Or even "It's not what I'm used to, and therefore wrong".

At the end of the day, the justification for 24fps lies in the mechanical limitations of cranking film a century ago. There is no question whatsoever that the low frame rate of film will eventually be a thing of the past. The technical reasons for low frame rates today are all about industry inertia, but inertia is a strong force. The process could easily take decades.
 
True, but what if filming in 48fps requires using the camera, the special effects, and so on in a completely different/new way to deliver the right results?
Actually, I have no doubt that you could get 48fps looking much better than it does in The Hobbit's HFR presentation. Case in point: I thought it looked bizarrely over-motion-blurred when I watched it. I learned after the fact that they used a 270-degree shutter as a compromise for the 24fps version. Yeah...

"It might work better if you alter your style" is hardly a convincing argument that 48fps is strictly superior to 24fps in film, though. Especially in a world where editors deliberately occasionally pull shenanigans like dropping framerate (i.e. to 12fps) to emphasize or surrealize moments.

You have used two key words here: reproduction & representation.
Yes, I noticed that as well when I wrote it.
 
I'm of the view that framerate is particularly important for shooters or anything that's multiplayer-driven. I'm a little upset that DriveClub is 30fps, but it's not a deal breaker for me because the game looks great.

Contrary to popular opinion, I'd like to see more games use the technique GG used in Shadow Fall's multiplayer to get higher fps (although at a higher average, not 40-50). If a game is designed with 960x1080 in mind (I hope I'm remembering SF's resolution correctly) right from the start, you're either going to have a game that potentially does nearly twice the graphical effects at 30fps, or nearly twice the framerate of a game rendering at 1920x1080. The perceptible resolution drop isn't SO bad, especially when you consider it's less noticeable at standard viewing distances.
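The arithmetic behind that approach, using the resolutions quoted above (and ignoring vertex/CPU costs, which don't scale with pixel count, so this is only a first approximation):

```python
# Pixel-count budget for full 1080p vs. the half-width framebuffer.
full = 1920 * 1080
half = 960 * 1080
print(full / half)             # 2.0 - twice the per-pixel budget at the lower resolution
print(full * 30 == half * 60)  # True - same number of pixels shaded per second either way
```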
 