The Great Framerate Non-Debate

I watched/listened to most of it and I agree that 60FPS is always preferable to 30FPS in video games.

The problem is that consoles, especially the new ones, have very limited resources.
Sony and MS are trying to show games that look like a generational leap to get gamers to buy the consoles, because better-looking graphics are one of the main attractions of new consoles (especially in the first few years).

30FPS lets them keep 1080p, higher-resolution textures, better AA, more NPCs, longer draw distances, etc. Compare that to 60FPS, where all of these things can suffer large cuts; that makes a large visual difference.
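Just to put rough numbers on the trade-off (plain arithmetic, nothing measured from a real game): every frame at 60FPS has to fit into half the time a 30FPS frame gets.

```python
# Per-frame time budgets; simple arithmetic, not profiled from any actual title.
for fps in (30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms for all CPU and GPU work per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms: targeting 60FPS halves the budget for
# rendering, AI, physics and everything else.
```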

Better graphics are also immediately noticeable to average gamers in screenshots, YouTube videos and just about all media of the game, whereas 60FPS is not noticeable in most media content and usually only becomes apparent when the player actually plays the game.

Having said that, I personally would be fine with sacrificing graphics to achieve 1080p/60fps in most games I play. Sadly, most developers will not share that opinion, as they know they will get more attention for being pretty.
 
Recently I have gone through an interesting experience related to this topic. I got a 4K monitor for my PC and have been forced to choose between playing at 4K at around 30fps or playing at 1440p/1080p at 60fps, and in every case I have chosen to play at 4K, because the higher visual quality of the image meant more to me than the higher frame rate. I think this is a scenario most PC users have never faced, since a high-end machine will play any game at 1080p/60, so for them the situation is purely hypothetical rather than a real decision about their gaming experience. Before I got the 4K monitor I think I would have been on the 60fps side, but now I can see why most devs will choose 30fps and a higher-quality image rather than weaker visuals and 60fps.

TB can offer some interesting opinions, but his PC bias is rather absurd sometimes. I think his opinion on this also shows his bias towards pure gameplay experiences (which rely less on visuals) and his dislike for narrative-focused games (which tend to rely more on visuals).
 
Back when PC game developers actually pushed the systems, it was pretty common for PC gamers to set everything to max even if that meant running the game at 20fps or less (the exception being multiplayer games). The whole 1080p/60fps standard only happened due to the demise of high-end PC developers.
 
As we know from Watch Dogs, modded Skyrim, modded GTA, and Crysis 1, graphics are very, very important to a lot of gamers. You who are reading this post may not feel that way, but many people do, and that affects sales, which are very important to developers.
 
Ruiadas, your experience is slightly different from the one the console guys have: they face a tiny bit of scaling to run at 1080p, while you face an absolutely massive amount of scaling running a 1080p game on a 4K monitor.
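To put hypothetical numbers on that (assuming, purely for illustration, a 900p console render upscaled to a 1080p set versus a native 1080p image stretched across a 4K panel):

```python
# Pixel-count ratios for the two upscaling cases; the resolutions are illustrative.
def pixels(w, h):
    return w * h

console_ratio = pixels(1920, 1080) / pixels(1600, 900)   # 900p shown on a 1080p TV
monitor_ratio = pixels(3840, 2160) / pixels(1920, 1080)  # 1080p shown on a 4K monitor
print(f"900p -> 1080p: {console_ratio:.2f}x the pixels")  # ~1.44x
print(f"1080p -> 4K:   {monitor_ratio:.1f}x the pixels")  # 4.0x
```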

And being on PC, you surely have the option of 4K@60Hz if you lower some settings (shadows/HBAO/AF/AA etc.). I am always baffled by the "lowering resolution is always the first port of call to improve fps" mentality.

My situation is similar to yours, except I went Eyefinity rather than 4K.
 
Sure, obviously I take all the configuration options into account; I was just trying to communicate that when I was faced with the real-world decision of IQ vs framerate, I sacrificed fps for better IQ. Looking at the TB video, and a lot of the discussion I see on IQ vs framerate, I think people are approaching the topic having never been forced to make the decision of high fps vs better IQ, so it is purely an imagination game for them. From that point it is way easier to say 'I'd never accept 30fps!' than it is when you are sitting there messing with graphics options in a game menu that affect your actual experience.
 

http://www.youtube.com/watch?v=eXJh9ut2hrc

TL;DR: "I think 60fps is better than 30fps and if you don't agree you're obviously either lying or stupid".

My take on the "issue": better framerate enhances gameplay but destroys immersion for photorealistic games, particularly during cutscenes.

Heh, I watched that when it was first posted and for the most part I agree with him 100%. I don't exactly agree with his vitriol that it is impossible for people to appreciate 30 FPS. Some people can, I'm sure. But, I'm certainly not one of them.

A game doesn't become any less "filmic" by being 60 FPS or even 120 FPS. But it sure does get a lot easier to control and a lot more immersive. Of course, if by "filmic" you mean stuttery/jerky/unresponsive/etc., then it's hard to argue with that.

While I can certainly "make do" with 30 FPS, it's never a completely enjoyable experience. The input is never as crisp or responsive, and as such the experience is always ruined in some way. Also, the lack of fluidity (as in real life) of 30 FPS compared to 60 FPS or higher just makes the game world that much less believable.

Yes, at 30 FPS, the graphics can still be impressive. Ryse is absolutely mind blowing when it comes to graphics presentation. But once you start playing it, the 30 FPS limit slams right into your face and your mind (lack of crisp controls). ALL 30 FPS games suffer from this.

Luckily I game mostly on PC, so I have a choice. I can do things to maintain 60 FPS. And in almost all cases, even if I have to reduce some graphical settings, the end result is a better perception of the game, even if a pixel-by-pixel comparison shows the lower-FPS version to have technically better visuals. That non-fluidity of movement that comes with 30 FPS, even when masked by heavy motion blur, just makes it less visually appealing and significantly less appealing from a control standpoint.

I'll regularly drop down from 2560x1600 (Windowed gameplay is fantastic). Or drop graphical settings that aren't terribly noticeable when playing the game. Super high quality SSAO looks fantastic, for example, but isn't distinguishable from lower levels when playing the game and it comes with a significant performance penalty.

I give HUGE kudos to Square-Enix for giving multiple resolution options for Final Fantasy 14 on PS4. Same goes to the developers who ported Sacred 2 to the PS3. Give your customers the option to run at 60 fps or 30 fps.

I'd be willing to bet that if they did that, however, most people would run the game at 60 FPS after first admiring it in 30 FPS. Well, any game that actually relies on presentation of motion sequences and/or somewhat quick responsiveness. But as he noted, even something as non-action-oriented as Civilization 5 looks so much better when played at 60 FPS than it does when played at 30 FPS. There is a fluidity of movement (panning the playing field, watching units move, etc.) that more closely matches reality, which 30 FPS can never achieve.

Regards,
SB
 
I think any TV listed as supporting 24 Hz actually supports it natively (it shows a 24 Hz refresh in the signal info). Of course devs aren't going to support it, just as they won't support 50 Hz, because you can't be 100% sure users will get the right experience.

I did a check and I found many manufacturers to be very unclear in their specs. The 120Hz ones are of course fine, and most modern TVs are 120Hz, but the 60Hz ones... no, not unless it's explicitly specified.

I think most modern TVs get it right...

Most modern TVs are 120Hz, so yeah, they get it right. If we look at products that are 1-2 years old (which are the TVs people actually own), I find that if they didn't explicitly advertise 24p pulldown, they don't get it right.


Yeah, and we're hearing guys like TB assuming that dropping the res will allow games to hit 60fps. They're obviously ignoring the fact that in many cases it's the CPU that is suffering, and dropping the res only helps to a certain extent. It's understandable, because they're running top-of-the-line CPUs and it probably gives them a skewed perspective that the GPU is all that matters.
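A toy model of why dropping res only goes so far (the millisecond numbers are invented purely for illustration, not measured from any game): the frame takes as long as the slower of the CPU and GPU work, and lowering resolution mostly shrinks only the GPU side.

```python
# Toy model: frame time is gated by whichever of CPU or GPU finishes last.
# All figures are made up for illustration.
def frame_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

cpu_ms = 25.0                   # simulation, AI, draw-call submission...
gpu_ms_1080p = 30.0
gpu_ms_720p = gpu_ms_1080p * (1280 * 720) / (1920 * 1080)   # ~13.3 ms if GPU cost scaled with pixel count

print(frame_ms(cpu_ms, gpu_ms_1080p))   # 30.0 ms -> ~33 fps, GPU-bound
print(frame_ms(cpu_ms, gpu_ms_720p))    # 25.0 ms -> ~40 fps, now CPU-bound and still short of 16.7 ms
```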
 


From these charts [image attachments not reproduced here], they conclude this:

"So 60Hz is enough to avoid visible flicker in any situation, and can be lower in controlled environments (e.g. movie theater or VR)"

I'm not so sure. While a 60Hz flicker is just rapid enough to be undetectable, a display would need a refresh rate of twice that, since a 60Hz flicker cycle consists of 1/120th of a second of darkness and 1/120th of a second of light.
 
Well, in recent research the flicker threshold is around 80-85Hz.
The link has a nice view on persistence and saccade movement.

Another issue, IIRC, in 24p cinema mode is that some cheap 120Hz TVs do 3:2 pulldown to 60Hz and then interpolate to 120Hz, which gives too many artefacts versus a 5:5 conversion directly to 120Hz.
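A quick sketch of the two cadences I mean (film frames labelled A-D; the 3:2 and 5:5 repeat patterns are the standard ones, and the "double every 60Hz frame" step is my assumption about how the cheap sets get to 120Hz):

```python
from itertools import cycle

film = ["A", "B", "C", "D"]                 # four consecutive 24fps film frames

# 3:2 pulldown to 60Hz: frames shown 3,2,3,2,... times (uneven cadence),
# then a cheap 120Hz set simply doubles every 60Hz frame.
repeats = cycle([3, 2])
sixty_hz = [f for frame in film for f in [frame] * next(repeats)]
via_60 = [f for frame in sixty_hz for f in [frame] * 2]

# 5:5 conversion straight to 120Hz: every film frame shown exactly 5 times.
direct = [f for frame in film for f in [frame] * 5]

print("".join(via_60))   # AAAAAABBBBCCCCCCDDDD -> 6/4/6/4 pattern, i.e. judder
print("".join(direct))   # AAAAABBBBBCCCCCDDDDD -> uniform 5/5/5/5 cadence
```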

EDIT: Look of HFR, from the HPA Charles Poynton seminar.
 
While I can certainly "make do" with 30 FPS, it's never a completely enjoyable experience. The input is never as crisp or responsive, and as such the experience is always ruined in some way. Also, the lack of fluidity (as in real life) of 30 FPS compared to 60 FPS or higher just makes the game world that much less believable.
This gets complicated with less precise control schemes, IMO. Plenty of console games especially can be reasonably well acclimated to at 30fps, to the point that the real impact on enjoyment is extremely minor. By contrast, I'd never say that about something like an M&K FPS, because responsiveness at 30fps is easily below that acclimation threshold.

Sometimes hopping into a Halo game can feel rather clunky at first after playing a 60fps console shooter, but after a minute or two it doesn't feel like all that significant of an issue to enjoyment (aside from circumstances with stutter or nasty spikes). It oddly felt like even less of an issue 5 years ago when I was switching between 30fps Halo and high-FPS PC shooters, probably because my muscle memory or whatever wasn't having the direct mapping issues that occur when switching from a 60fps dual-analog shooter to a 30fps dual-analog shooter.

The point being, if we're talking about a particularly sluggish console shooter, the actual experiential damage inflicted by the choice of 30fps may very well be practically negligible to a lot of people (even if the differences in framerate are very obvious to said people). If said people also mildly prefer the chop of 30fps for that art style, I don't find it hard to believe that (even all other things being equal) 30fps could be more enjoyable than 60fps, within that audience and that game. Even in the context of games, I know people who say they prefer the look of 30fps to that of 60fps.

//=====================

In any case, TB's argument isn't particularly great, largely because it never gets anywhere. His only direct response to the claim that 30fps is beneficial to the aesthetic is that framerate doesn't affect the aesthetic (I can understand that some people would claim that it would have a strictly beneficial impact on the aesthetic, but simply claiming that it has no impact makes me wonder what his definition of "aesthetic" is). And when he actually tries to explore that notion, his train of ideas looks like this:
1-Lower framerates seemingly can have a pleasing aesthetic impact in film.
2-This is saved because of film motion blur.
3-Motion blur in film isn't comparable to motion blur in games because <stuff about games being less responsive>.
He NEVER gets around to completing his argument regarding aesthetic impact. Yes, 30fps games are less responsive than 60fps games... but that is at best a fairly small partial response to the initial question about the overall aesthetic. He never really concludes an argument in direct response to the claim.

//=====================

But the issue in all this discussion that really annoys me (largely in reference to the GAF firestorm)? It's that these arguments against the "cinematic framerate" people are made by people claiming that their opinion is an objective fact. How do they go about this? Well, they make a bunch of objectively factual statements, detailing how motion appears smoother at 60fps, or how 60fps games are more responsive according to a variety of various measures from resolving detail to input lags.

That's all well and good, but the question of "which framerate is better" doesn't actually lay out a particular means of assessing an answer. It's a very nonspecific question. Which framerate is better according to what? The implied rubric that the "higher is always better" people tend to use, which puts value on smoother motion and higher responsiveness, is not laid out by the question; their answer is an objective response only to a qualified question that specifically calls for the application of this rubric.

But obviously these rubrics aren't called out of thin air; they're used because people think they're a good way of assessing "better" in terms of what is most enjoyable. After all, enjoyability is usually the basic reason we play video games. But as far as I can tell, you won't see someone justifying their choice of rubric in those terms, possibly because it questions their claim that their opinion is actually an objective fact; if the question is simply which is most enjoyable to you as a player, then even if all of humanity agreed that 60fps was always superior to 30fps (which it doesn't), it would still ultimately be a subjective response. In particular, in light of the typical arguments made by the "60fps is objectively better than 30fps" crowd, although the details of their rubric are all objectively assessable, the choice of rubric and the value system that goes with it are made via subjective preference. A person who prefers the look of 30fps could call forth their own rubric, and depending on how they cast its components, it may very well be just as objectively assessable as the other rubric; they would simply value the smoothness of framerates in a different way (one which prefers the lack of smoothness at 30fps over the smoothness of 60fps), and perhaps place less emphasis on the value of responsiveness.

I mostly tend to agree that, in the context of video games, 60fps is nigh-on strictly better than 30fps. I can't think of any situation where I'd take 30fps over 60fps, all other things being equal. But within the unqualified question of "which framerate is better," preferring 60fps is my opinion. It is not an objective fact that one is better, unless I qualify the question to use a more specific definition of "better".

I generally try not to be picky about things like this if there's going to be extreme unanimity with respect to the choice of rubric in a certain circumstance (i.e. "this battery that lasted for ten hours was 'objectively better' than this otherwise functionally identical battery that only lasted for two hours"). But this is an argument directed at an opposing viewpoint using a different rubric to assess "better"; it's trivially obvious that there isn't unanimity with respect to the choice of rubric. Defending your value system in these cases is perfectly reasonable, and choosing to ignore the small minority on this particular sort of issue is also arguably reasonable. Claiming that the opposing opinion is merely a goofy subjective preference in the face of your "objective fact" is silly.

I give HUGE kudos to Square-Enix for giving multiple resolution options for Final Fantasy 14 on PS4. Same goes to the developers who ported Sacred 2 to the PS3. Give your customers the option to run at 60 fps or 30 fps.
Interesting that you and TB brought this up, since according to DF's analysis, the 720p mode still goes through the normal internal 1080p rendering process, and has nothing to do with trading IQ for performance. Their video suggests that framerates are basically identical between the game's two modes.

That said, there are console games out there that let you trade IQ and performance. The PS3 is notable for this, particularly with racing games. The Gran Turismo games are the common example, but not the only one.
 
The consoles have limited resources, especially CPU power.

Thus, I believe that 60fps does not only mean a sacrifice of graphical fidelity (like resolution), but also, for instance, the fidelity of physics simulation, destruction, number of NPCs, sound simulation, the animation system, AI, etc.; these are all things that have a direct impact on the actual gameplay experience.

Hence, it can be concluded that for certain (ambitious) games on the consoles, 60fps means at the end of the day a lesser gameplay experience in addition to lower graphical fidelity.

Imo, gameplay is not only defined by the responsiveness of the controller input like some people suggest in their posts.
 
To me, a higher framerate first and foremost means different "playability" but not different gameplay.
I have never seen differences in the gameplay, in the core mechanics, in the rule set, when a game runs at 60fps versus the same game at 30fps.
 
To me, a higher framerate first and foremost means different "playability" but not different gameplay.
I have never seen differences in the gameplay, in the core mechanics, in the rule set, when a game runs at 60fps versus the same game at 30fps.
In an extremely strict sense, varying degrees of feedback can be construed as a varying ruleset. Sort of.

And to some degree it might depend on how you think of the word "gameplay." Many people would blend your notions of "playability" and "gameplay" into what they mean when they say "gameplay."
 
What about a discussion of CPU limitations in achieving 60fps?

I wouldn't be surprised if many titles this generation aren't able to achieve 60fps not just because of rendering quality, but also because of the CPU and the simulations/AI/physics etc. running on it. In this situation, given the consoles' anemic CPUs, would Totalbiscuit want the devs to provide options to reduce simulation in order to achieve 60fps?
 
In an extremely strict sense, varying degrees of feedback can be construed as a varying ruleset. Sort of.

And to some degree it might depend on how you think of the word "gameplay." Many people would blend your notions of "playability" and "gameplay" into what they mean when they say "gameplay."

Let me put it this way: in a game I can't simply equate "what you do" (gameplay) with "how you do it" (playability), and yet I know they are deeply connected and often indivisible.
It's complicated, actually... or rather, I probably can't explain it appropriately :LOL:
 
The consoles have limited resources, especially CPU power.

Thus, I believe that 60fps does not only mean a sacrifice of graphical fidelity (like resolution), but also, for instance, the fidelity of physics simulation, destruction, number of NPCs, sound simulation, the animation system, AI, etc.; these are all things that have a direct impact on the actual gameplay experience.
Of all the things you brought up, only the animation has to be tied to the framerate (and even there you could run simpler iterations).
Everything else can run pretty much independently; most multiplayer games have server ticks at 10 to 20 iterations per second - that's the main game logic running there.
It's not hard to do the same thing for singleplayer.
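Something like the classic fixed-timestep loop is what I have in mind (a bare-bones sketch; the 20Hz tick rate and the stub update/render functions are just placeholders):

```python
import time

TICK_RATE = 20                  # simulation ticks per second (illustrative, like a server tick)
DT = 1.0 / TICK_RATE            # fixed simulation step of 50 ms

def update(dt):                 # stub: physics, AI, game logic
    pass

def render(alpha):              # stub: draw the world, blending between ticks by 'alpha'
    pass

def game_loop():
    previous = time.perf_counter()
    accumulator = 0.0
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Step the simulation at its fixed rate, independent of render framerate.
        while accumulator >= DT:
            update(DT)
            accumulator -= DT

        # Render as often as the display allows (30, 60, 144 fps...),
        # interpolating between the last two simulation states.
        render(accumulator / DT)
```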

Hence, it can be concluded that for certain (ambitious) games on the consoles, 60fps means at the end of the day a lesser gameplay experience in addition to lower graphical fidelity.
The only thing that needs to scale up with framerate is the presentation. Other things could be scaled up to improve gameplay.

Imo, gameplay is not only defined by the responsiveness of the controller input like some people suggest in their posts.
Nope, it's not, but you can throw a game away if you need the responsiveness and don't have it.
 
A game doesn't become any less "filmic" by being 60 FPS or even 120 FPS.
Hang on. It has to get more filmic the nearer to 24 fps it gets. That's one of the major properties of film that define its perception.
But it sure does get a lot easier to control and a lot more immersive. Of course, if by "filmic" you mean stuttery/jerky...
Yes, because that's how film looks.
/unresponsive/etc.
No, because films don't have a quality of responsiveness.

The argument I used in the other thread - if you want to produce a game where someone walks into the room and sees you in front of the TV and asks what film you're watching instead of what game you're playing, you are going to want a lower framerate. Higher framerate immediately identifies the on-screen content as a game (unless you use frame interpolation on films). That's not to say 30/24 fps is in any way better for gameplay, but if a developer chooses the cinematic aesthetic as their priority, they have to go lower framerate. When HFR in cinema becomes commonplace, if ever, and everyone becomes accustomed to films being higher framerate, then filmic games can also be higher framerate. Until then, to become like a movie, games have to get as close as possible to a movie's qualities, which means similar optics and refresh, which also means all the negatives of cinema like jerky movement, difficulty following action, and messy (though also at times pretty) visual artefacts.

While I can certainly "make do" with 30 FPS, it's never a completely enjoyable experience. The input is never as crisp or responsive.
Input is usually sampled at 60 fps even if the on-screen update only shows every other frame of that.
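i.e. something along these lines (a toy sketch; the 60Hz poll with a present every other iteration is the scenario being described, everything else is made up):

```python
import time

POLL_HZ = 60                      # controller state read every ~16.7 ms
PRESENT_EVERY_N = 2               # but a new image is pushed only every other poll -> 30 fps

def poll_input():                 # stub: read pad/keyboard state
    return {}

def simulate(inputs):             # stub: apply inputs to the game state
    pass

def present():                    # stub: hand a finished frame to the display
    pass

tick = 0
while True:
    simulate(poll_input())        # input consumed at 60 Hz...
    if tick % PRESENT_EVERY_N == 0:
        present()                 # ...even though the screen only updates at 30 Hz
    tick += 1
    time.sleep(1.0 / POLL_HZ)     # crude pacing; a real loop would sync to the display
```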

There is a fluidity of movement (panning the playing field, watching units move, etc.) that more closely matches reality, which 30 FPS can never achieve.
Absolutely, which is why higher framerates can't get close to the filmic look, because film isn't smooth. ;)
 