The Great Framerate Non-Debate

One is free to produce a game that intentionally causes nausea, but the developer has to be transparent about it and responsible for the consequences.
Somewhat hyperbolic. The world isn't full of dizzy, spewing gamers struggling against their brain-melting games. 30 fps (heck, we even drop down to lower framerates these days) can by and large be played just fine. Yes, higher framerates are better for gameplay (no one is arguing against that), but you needn't be so dramatic about lower framerates. Unless you have found yourself excluded from the gaming world because you can't play low-framerate games and you want the gaming industry to make their wares accessible to you? ;)
 
I was kind of with the guy in the video for a while, until he said that reviewers should talk about framerate and that games should get a lower score for being 30fps. Reviewers do bring up framerate, but only when necessary (like a noticeably unstable framerate). To say that games should score lower because they run at 30fps instead of 60fps is totally moronic, and sentiments like that are bad for the industry.
 
The funny thing is that many successful games, like Uncharted, GTA4 and Crysis, use LOD, HDR and animation-blending systems to enhance quality based on the same rules (the contrast sensitivity function, CSF) that ultimately push for high fps.

Understanding these fundamental rules improves the budgeting of a project: problems become predictable, and you can judge the need for motion blur or other blending systems to compensate for low fps, or use the same information to ease the computational burden while still achieving a good presentation. In a way much of this is already done, but in separate, collaborative efforts rather than in an integrated or well-understood way.

I have never seen viewing distance treated as a fundamental rule in budgeting a game project. DreamWorks used http://pdiff.sourceforge.net/ to optimize their rendering, but they were conservative and assumed a 90° FoV. The BBC in 2003 defined settings to provide a film look even with higher-fps digital cameras: film look is not only jerky motion.


Then there is gameplay. The most important feature of the post-PS2 era was the separation of the graphics and input clocks in game engines. That allowed input to run at 60 Hz in many 30fps games to bring latency down, but the elapsed-time feedback you see still depends on frames per second.
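To illustrate what that clock separation looks like, here is a minimal sketch (hypothetical structure, not any particular engine): input and simulation tick at 60 Hz while the renderer only presents at 30 fps, so controller latency is bounded by the ~16.7 ms input step rather than the ~33.3 ms frame time.

```python
# Minimal sketch of decoupled input/render clocks (hypothetical engine loop).
import time

INPUT_DT = 1.0 / 60.0   # 60 Hz input/simulation step
RENDER_DT = 1.0 / 30.0  # 30 fps presentation step

def poll_input():            # stand-in for reading the controller
    return {}

def simulate(dt, inputs):    # stand-in for a fixed-step game update
    pass

def render():                # stand-in for drawing and presenting a frame
    pass

def game_loop(run_seconds=1.0):
    start = time.perf_counter()
    next_input = next_render = start
    while time.perf_counter() - start < run_seconds:
        now = time.perf_counter()
        if now >= next_input:        # high-rate input/simulation tick
            simulate(INPUT_DT, poll_input())
            next_input += INPUT_DT
        if now >= next_render:       # lower-rate render tick
            render()
            next_render += RENDER_DT
        time.sleep(0.0005)           # yield; a real engine would wait on vsync

if __name__ == "__main__":
    game_loop()
```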

HFR and HDR compared:
https://www.youtube.com/watch?feature=player_detailpage&list=UUA2R1Ru-1mvh-VKQfrgXb_g&v=fueYodX8Vkc

Personally, my friends and I feel annoyed at 30fps, but at the very least I want the decision to discard 60fps to be well understood and defined within a game project. Maybe in a few years high fps will be the norm, or at least no project will fly without a hard look at the 3D-CSF implications.
 
That TotalBiscuit video is a bunch of elitist PC gamer drivel masquerading as a legitimate (but actually quite pointless) discussion.
 
Recently I have gone through an interesting experience related to this topic. I got a 4k monitor for my PC and have been forced to choose between playing at 4k at around 30fps or playing at 1440p/1080p at 60 fps and in every case I have chosen to play at 4k because the higher visual quality of the image meant more to me than the higher frame rate.

What kind of game were you playing? How close were you sitting to the screen? A 4k monitor is all well and good at the end of the day, but considering most of us console players play in a living room (the console being hooked up to a TV), we are likely sitting too far from the screen to effectively realize and fully appreciate the difference between 4k and full HD, hence why so few people are actually vocal about resolution differences between Xbox One and PS4 games, or about still getting sub-native resolutions (692p last gen, 900p this gen).

At typical viewing distances, the difference between 720p and 1080p is smaller than some people think, especially if the pixels are fast moving, which again favours framerate over tiny details for the majority of games. If I am sitting in front of a monitor, however, resolution comes more into play, especially if I'm playing point-and-click games or games that move at a slower pace, where maximum detail is beneficial. The further away from the screen you sit, the less pixels matter (since they appear smaller).
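To put rough numbers on the viewing-distance point, here is a quick sketch of the angular-size argument; the 55-inch screen, 2.5 m couch distance and 1-arcminute acuity figure are my own illustrative assumptions, not data from the post.

```python
# How big one pixel looks (in arcminutes) from a given viewing distance.
import math

def pixel_arcminutes(diagonal_in, res_w, res_h, distance_m):
    aspect = res_w / res_h
    width_m = diagonal_in * 0.0254 * aspect / math.sqrt(1.0 + aspect ** 2)
    pixel_pitch_m = width_m / res_w                    # physical size of one pixel
    return math.degrees(math.atan2(pixel_pitch_m, distance_m)) * 60.0

# Assumed 55-inch 16:9 TV viewed from 2.5 m; ~1 arcminute is a common
# figure for the eye's acuity limit.
for label, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080), ("2160p", 3840, 2160)]:
    print(f"{label}: {pixel_arcminutes(55, w, h, 2.5):.2f} arcmin per pixel")
```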
 
I can offer some input from the photographic world where electronic viewfinders are starting to supplant optical viewfinders in system cameras.
It is notable that even old geezers in their 60s and 70s reject low frame rate EVFs. 45Hz is considered workable but also clearly annoying, 60Hz is acceptable to some depending on what they shoot, and 120Hz is considered OK. Many want OLED EVFs with higher frame rates still, 240Hz being the next step, but the main issue is read-out speed from the sensor, which given the increasing resolution of EVFs limits refresh rates. With new sensors being developed with 4K video recording in mind, this may change.

I have a personal theory that console gamers are like film enthusiasts in the sense that they have a positive emotional connection with low frame rates. As an ex-competitive Quake player on PC, I can't understand how low frame rates are considered acceptable in something as interactive as gaming, particularly in the first person perspective. I held on to a top CRT for a long time because 60Hz just wasn't acceptable.

Clearly, what you are used to is critical in how you value frame rate. I submit the photographic community as a reference group since they largely don't give a damn about the technical specs of their EVFs, but have the naked eye as their reference point as opposed to referring/comparing to other screen technologies. And it turns out their demands are way higher than those of computer users particularly if their subjects are mobile. I suspect VR technology will push frame rate expectations far beyond 30Hz as well, for similar reasons.
 
The funny thing is that many successful games, like Uncharted, GTA4 and Crysis, use LOD, HDR and animation-blending systems to enhance quality based on the same rules (the contrast sensitivity function, CSF) that ultimately push for high fps.

Understanding these fundamental rules improves the budgeting of a project: problems become predictable, and you can judge the need for motion blur or other blending systems to compensate for low fps, or use the same information to ease the computational burden while still achieving a good presentation. In a way much of this is already done, but in separate, collaborative efforts rather than in an integrated or well-understood way.

The BBC in 2003 defined settings to provide a film look even with higher-fps digital cameras: film look is not only jerky motion.
Ummm, from that very same pdf.

BBC said:

3 Conclusions

It is now possible to get video cameras to mimic film performance in a convincing way. But several aspects of the video camera’s performance must be preset for this to happen, listed here in probable order of importance.
• Film motion (judder): operate the camera in progressive scan mode at the desired frame rate (25fps for television). The shutter should be about 50% to mimic the 180° of the normal film camera. The resulting performance gives motion judder that closely resembles that of film, but see below as well.
The BBC cite 25 fps and a 50% shutter as the most important change needed for video to mimic the look of film. Ergo, applied to computer games, for that cinema experience a game should aim for 24 fps with motion blur equal to 50% of the frame time. According to the experts at the BBC.
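For concreteness, that "50% of frame time" rule is trivial arithmetic; a quick sketch for a few common framerates:

```python
# The "180-degree shutter" rule the BBC quote implies: exposure (blur) time
# is half the frame interval.
for fps in (24, 25, 30, 60):
    frame_ms = 1000.0 / fps
    blur_ms = frame_ms * 0.5   # 50% shutter
    print(f"{fps} fps: frame {frame_ms:.1f} ms, blur {blur_ms:.1f} ms")
```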

I'll add that their take on DOF goes against plenty of gamers who feel DOF shouldn't be present in games as it means blurring part of the image that you may choose to look at. That's exactly what film does, and so should be present in any game trying to recreate the film look, but it'll offend some gamers just as lower framerates will.

OT edit: And one last additional remark: that 2003 report is likely outdated in its recommendations now, as everything is shot digitally and colour graded. It can all be done in post. In fact, we have full creative freedom to develop styles not possible in native film or video, which is much more in keeping with CG's capabilities. Once we have cameras with depth, we can shoot everything sharply (within the limits of the optics) and apply DOF in post.
 
I get the reference to cinematic experiences... but, in all fairness: we don't play movies, we play games. This is one crucial difference. Sitting in a movie theater, you will never question the feel of anything because you are taking on an entirely passive role. Not to mention that the director or cinematographer has complete control over how it's filmed to reduce the amount of judder you get, for example in landscape pans etc. It's not quite the same with games, where you are effectively controlling a character, or a car, and are immersed to the point where you are actually connected to what you're doing on screen.

Playing a game at 30fps would be like going through life, seeing through some fancy glasses that only show you an update every 30th of a second. How great would that be? Not.

Of course if you're just staring at stationary objects like trees or flowers, it could be displayed at 5fps without anyone caring. However doing the same while moving your head or viewing anything with half decent motion and you'll painfully notice the lack of information being displayed.

Also, for the hundredth time: a 30fps game with 30 absolutely sharp still frames is not the equivalent of video-sourced film footage shot with a shutter speed of 1/30th (assuming 30Hz footage). Anyone disagreeing should press pause on any movie where an object is moving at high speed to see how blurred it is, compared to doing the same in a game without motion blur, where any similar object will be razor sharp in every frame. Yes, there are techniques that add motion blur, but it's not quite the same, is it? Unless you run the game at a much higher framerate and simulate shutter speeds, but then you might as well just output the higher framerate you have...
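For what it's worth, that "render faster and simulate the shutter" approach amounts to averaging sub-frames over half of each output frame. A rough sketch, where the renderer is just a stand-in and the sub-frame count is arbitrary:

```python
# Hypothetical accumulation motion blur: average several sub-frames rendered
# across the first half of each 30fps output frame (a 180-degree shutter).
# This is why it costs more than simply outputting the higher framerate.
import numpy as np

def accumulate_motion_blur(render_subframe, out_fps=30, subframes=8, shape=(8, 8, 3)):
    shutter = 0.5                          # fraction of the frame the "shutter" is open
    frame_dt = 1.0 / out_fps
    accum = np.zeros(shape, dtype=np.float64)
    for i in range(subframes):
        t = i * (frame_dt * shutter) / subframes   # sample times inside the open shutter
        accum += render_subframe(t)
    return accum / subframes               # averaged exposure for one output frame

# render_subframe(t) stands in for the game's renderer evaluated at time offset t.
frame = accumulate_motion_blur(lambda t: np.full((8, 8, 3), 0.5))
print(frame.shape)
```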
 
Mostly, temporal aliasing artifacts, like a car wheel that appears to spin backwards, are the result of limited motion resolution, and these are usually ignored.
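The wagon-wheel effect is plain sampling theory: any spoke-passing rate above half the framerate aliases down to a lower (possibly negative, i.e. backwards) apparent rate. A quick sketch with made-up numbers:

```python
# Fold a true temporal frequency back into the Nyquist band of a 30 Hz sampler.
def apparent_hz(true_hz, sample_hz=30.0):
    nyq = sample_hz / 2.0
    return ((true_hz + nyq) % sample_hz) - nyq   # negative = appears to run backwards

for f in (5.0, 14.0, 16.0, 28.0, 31.0):
    print(f"{f:5.1f} Hz spoke rate -> appears as {apparent_hz(f):6.1f} Hz at 30 fps")
```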
 
There's no point arguing that low framerates aren't as smooth as high framerates, because everyone's in agreement with that! This discussion comes down to a subjective appreciation of the value of games: whether one feels that games should be entirely about visual fluidity and responsiveness, or whether one feels that a visual aesthetic can be a value in itself. For those who prefer the former, lower-framerate games are not acceptable. For those who feel the latter, lower-framerate games are acceptable. You can't argue that either side is wrong in their subjective appreciation of a medium. If someone prefers more eye candy at lower framerates (or resolutions), that's their prerogative, and vice versa.

Developers have the choice of whom/what to target. They can aim for a higher framerate and appeal more to those who prefer higher framerates and less to those who like eye candy. They can target a more cinematic experience, all juddery and blurry (according to some people's sentiments), and attract those who value that experience at the risk of alienating those who prefer a smooth, crisp game. But it's ultimately their choice, both developer and consumer. The idea that games of a certain framerate should be banned is as ludicrous as banning games lacking a particular graphical technique, or outlawing a style of music (too brash/too boring), or wanting the death of a particular type of TV programme, or feeling a style of art should be taboo.

Whatever happened to 'live and let live'? Why is it so hard to accept that different folk have different values and there's nothing wrong in them all having access to the content/experiences that they value (within moral/legal limitations, not to be discussed here!)?
 
I'm not arguing the subjective element. In fact, I've practically given up on that. What I'm effectively pointing out is that the 'cinematic experience' argument for why one would prefer a lower framerate is a bit far-fetched, given that there are practically no games that attempt this. They end up at 30fps because they run out of resources (either by design choice or otherwise), so dropping and locking the framerate to the next best option (30fps) is the only viable solution. The 30fps we get usually does not have any fancy motion blur to mimic any movie experience out there; what we effectively get is an even choppier experience than what we see in most theaters, despite those being at 24fps.

Look at it from this angle: how many movies are filmed from the first-person perspective? It's quite rare you ever get that view at all, except perhaps for short sequences where the director wants to make a specific point. Filmmakers are well versed in the limitations of their own equipment, and use them to good effect at the same time. Which is precisely why movies get away with 24fps most of the time. Most scenes are filmed from a stationary camera, and zoom effects are slow and controlled, as are landscape pans etc. High-speed action sequences (car chases) are the exception of course, but there too you have the natural blurring of the footage that makes it less jerky, and the viewer is usually focused on the action rather than being immersed in being there and driving.

In games, we rarely get the cinematic experience by design, because you are controlling a character and anything other than a consistent viewing angle will make the gameplay difficult and inconsistent. Some games have attempted this (Resident Evil) with static backgrounds where the camera is placed to good effect, but in the majority of games you have a fully controllable moving camera that highlights framerate shortcomings rather well.

I get that people are happy with 30fps if it means we get better graphics. But I don't quite buy the argument that in games, 30fps feels better for cinematic experiences.
 
My last comment on temporal aliasing was aimed at efficiency: a game designed at 30 Hz has a maximum temporal sampling limit. One can still exceed it, but in doing so aliasing happens. Besides the signal-theory limits, the 3D-CSF helps in saving resources.

People are free to develop personal visions even at great financial cost; you can pursue something even if you are starving. Some people will say that no game is worth more than a life, but they are also free to make that choice. When money changes hands and budgets and time-to-market are priorities, so is efficiency. Above all, this generalized efficiency is my goal. It also depends on HFR displays, so I also push for 144Hz, low-persistence displays. The stereoscopic 3D gaming display case is the same: it needs HFR, but one is free to use even 7fps.
I'm not trying to lock anyone in; I'm just citing technical consequences, and advances on the technical side, as the sub-forum name suggests.
 
Developers have the choice of whom/what to target. They can aim for a higher framerate and appeal more to those who prefer higher framerates and less to those who like eye candy.
Only I don't think it has to do with targeting a specific audience, primarily.
I think cinematic rendering is done, because it can be done. Photorealistic rendering has been, and still is, a holy grail of sorts, particularly distant in real time. I think it is a natural extension of rendering history that we end up mimicking film, warts and all.

Because it makes no sense at all when my barbarian walks around a mountain and looks down into the valley having his naked eye vision disturbed by lens flare, apparently through a six-bladed aperture and at least eight lens surfaces, seemingly uncoated. Hmm, maybe a pre-war Tessar design, only the field of vie....What the Hell?

I've had surgery for cataracts on both eyes, and thus literally cannot focus my eyes at all. Typically I wear glasses that focus my eyes at infinity. Even so, everything from a meter on out to infinity is perceived as being sharp. The DOF effects of games bear no relation at all to how eyes work, at best they are a misrepresentation of photographic lenses. They have no place in a game simulating a character looking.

Motion blur is its own problem, and it's hilarious and sad both to see games adding distortion and chromatic aberrations at a time when both are largely cured in photography.

I think cinematic rendering is done because it is currently considered a cool thing to do.
It makes little to no sense for games, cut scenes being a possible exception.
 
a display must have a refresh rate of twice that since a 60Hz cycle consists of 1/120th of a second of darkness and 1/120th of a second of light.

But LCDs don't switch off between frames like CRTs do (a static image would be constantly lit, aka infinite refresh); they go from one colour straight to another (response time).

Even so, everything from a meter on out to infinity is perceived as being sharp.
Wouldn't that mean that everything except infinity is out of focus? So your eyes are good at seeing the edge of the visible universe but crap at everything else?
 
But LCDs don't switch off between frames like CRTs do (a static image would be constantly lit, aka infinite refresh); they go from one colour straight to another (response time).
Yeah; supposing that you don't have high-frequency backlight control, attaining even a simple 60Hz ~square-wave strobe on an LCD requires a 120Hz panel, since the panel doesn't intrinsically strobe at its refresh rate.
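A back-of-the-envelope version of that, with the obvious numbers:

```python
# Why a 60 Hz square-wave strobe needs a 120 Hz-capable panel (assuming no
# strobing backlight): the panel must change state twice per strobe cycle.
strobe_hz = 60
duty = 0.5                                  # 50% light, 50% dark
on_ms = 1000.0 / strobe_hz * duty           # 8.33 ms of light per cycle
off_ms = 1000.0 / strobe_hz * (1.0 - duty)  # 8.33 ms of darkness per cycle
panel_updates_per_s = 2 * strobe_hz         # on->off and off->on each cycle = 120
print(on_ms, off_ms, panel_updates_per_s)
```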

But you generally don't intentionally strobe things like that, and flicker fusion threshold isn't the exact issue with LCD refreshes (though it's related, sort of).
 
Wouldn't that mean that everything except infinity is out of focus? So your eyes are good at seeing the edge of the visible universe but crap at everything else?
Sort of. As far as focus in the human eye is concerned, "70 yards in front of you" and "infinity" are extremely close approximations of each other.
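In optical terms (my numbers; accommodation demand is 1/distance in diopters, and roughly ±0.3 D is a commonly assumed depth of focus for the eye):

```python
# Accommodation demand in diopters is 1/distance(m); infinity is 0 D.
# With an assumed ~+/-0.3 D depth of focus, 70 yards (~64 m) is optically
# indistinguishable from infinity.
for label, d_m in [("1 m", 1.0), ("10 m", 10.0), ("70 yards", 64.0)]:
    print(f"{label}: {1.0 / d_m:.3f} D away from infinity focus")
```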
 
The notion that people prefer 30fps to 60fps just because they can have more eye candy is ridiculous when you consider that the same people also prefer 24fps in movies vs 48fps.

The real problem is this: while a higher framerate improves how you look at content, the content remains the same, and because of those viewing improvements all the little flaws that weren't apparent at 24/30fps are too obviously fake to ignore, destroying suspension of disbelief.

The problem applies to other video media such as cartoons. I remember when anime switched to being made digitally and all the camera pans were done at 60fps. All that new smoothness only accomplished one thing, allowing the brain to see the true nature of what it was watching: not an environment with depth but just a flat painting being scrolled around with some cutout drawings on top of it. It was AWFUL.

Fortunately the problem has since been corrected and camera animations are done at 24-30fps.
 
The real problem is this: while a higher framerate improves how you look at content, the content remains the same, and because of those viewing improvements all the little flaws that weren't apparent at 24/30fps are too obviously fake to ignore, destroying suspension of disbelief.

The Lord of the Rings was filmed at 24fps, and yet I could quite easily spot the fake rocks, the fake trees, the fake water, the fake snow, the "fake" CG characters etc. ... the flaws, in short.
My suspension of disbelief was "broken" even though it was shot at 24fps.

Really poor makeup, poor visual and/or practical effects, and above all poor acting are still the main reasons why many movies look "fake".
 
Everything is a trade-off. I don't expect not to see compromises on fixed hardware. It's just a reality of game design and consoles in general.
 