Why 30 or 60 Frames/Second?

oramay

Newcomer
I've been wondering why when people talk about frame rates, it's almost always either 30 or 60. What makes other values like 37 or 51 seemingly irrelevant?
 
I think it's a matter of precedent. For years, decades even, televisions (and as such many early monitors) refreshed at 60Hz and displayed roughly 30 frames per second, since each broadcast frame was split into two interlaced fields. On computer monitors, 60Hz (usually the maximum refresh rate at the native resolution on earlier models) typically translates into one frame per refresh rather than half a frame, since the computer generates the image in real time.

I'm positive someone could explain it in more depth than that, but I'm quite sure that's essentially where 30/60FPS came from, especially when speaking of consoles, where 30/60FPS has become the dominant reference point in game development discussions: for decades, TVs displayed their media at a maximum of 30FPS at 60Hz. It has basically just become the accepted norm, and as such what viewers find comfortable; 30FPS is generally as low as you'd want to go before the stutter becomes apparent to the viewer/gamer.

Of course, you don't have to lock the framerate to the refresh rate (V-Sync), and contrary to what some may say, higher framerates do indeed look smoother even though the set can't physically display more than its maximum refresh rate. But then you open yourself up to screen tearing, which happens when a new frame is swapped in partway through the display's refresh, and that can be a big distraction to people, especially if you happen to have a large TV.
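To make the V-Sync point concrete, here is a minimal sketch in plain Python (my own illustration with made-up timings; a real game would be talking to the GPU and the display driver). With vsync on, the buffer swap waits for the next refresh boundary; with it off, the renderer swaps whenever a frame happens to finish, which is exactly how a swap can land partway through a scan-out and tear.

```python
import time

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ      # ~16.7 ms between scan-outs

def render_frame():
    """Stand-in for real rendering work; here it just burns ~4 ms."""
    time.sleep(0.004)

def run(vsync, seconds=1.0):
    frames = 0
    next_scanout = time.perf_counter() + REFRESH_INTERVAL
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        render_frame()
        frames += 1
        if vsync:
            # Hold the finished frame until the next vertical blanking
            # interval, so the display never shows a half-updated buffer.
            delay = next_scanout - time.perf_counter()
            if delay > 0:
                time.sleep(delay)
            next_scanout += REFRESH_INTERVAL
        # Without vsync the swap happens immediately, possibly mid-scan-out:
        # the top of the screen shows the old frame, the bottom the new one.
    print(f"vsync={vsync}: ~{frames / seconds:.0f} fps rendered")

run(vsync=True)    # capped near 60
run(vsync=False)   # runs as fast as render_frame allows
```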

Edit: for some reason I thought I was in the PC gaming section. I've been tainted!
 
It's the legacy of the refresh rate limitations of the display device (i.e. your old-fashioned TV).

Edit: Too late. I really should have managed that faster, considering SugarCoat gave a much more comprehensive answer...
 
Gears of War, from what I heard, is 45 fps. Anyone dare to tackle that?

I don't think it's possible to impose a frame lock that's not in sync with a display's VBI (vertical blanking interval), if that's what you're getting at. As I already noted, it's certainly possible to let the framerate run ahead of the refresh rate, but that's usually unwise since you can't control it.
 
Yep, this is a legacy of old TVs: they used the AC mains frequency for synchronization, so regions with 220V/50Hz mains got 25fps and regions with 110V/60Hz got 30fps.
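As a tiny worked example of that relationship (my own illustration, ignoring NTSC's slight offset to 59.94/29.97): an interlaced TV draws one field per mains cycle and needs two fields for a complete frame.

```python
# One field per AC mains cycle; two interlaced fields make one full frame.
for region, mains_hz in [("PAL/SECAM (220-240 V)", 50), ("NTSC (110-120 V)", 60)]:
    fields_per_second = mains_hz
    frames_per_second = fields_per_second / 2
    print(f"{region}: {fields_per_second} fields/s -> {frames_per_second:.0f} frames/s")
```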

http://www.akky-jp.com/english/jijyou/index.html

http://en.wikipedia.org/wiki/Broadcast_television_systems#Frames

This is quite interesting.

This 25/50 versus 30/60 frame split even carries over into HD resolutions.

In America, for example, ESPN HD broadcasts in 720p at 60 frames per second, while in the UK and Continental Europe, BBC News 24 HD is broadcast in 1080i at 50Hz.

So it really is based on the "type" of electricity available in a given region/territory, then?

Also, are there any devices out there that are universal (i.e. can handle frames/hertz in multiples of 24, 25, 30, 50, 60 and even 100-120)? In short, full compatibility between both PAL and NTSC display standards?

Is the PS3 one of them (does it fully support both PAL and NTSC at both SD and HD resolutions)?

Thanks.
 
Most TVs can handle both these days, if I'm right. I believe it's more of a technical "lock" that makes playing NTSC/PAL content impossible than it being technically impossible these days.
 

Today it's just legacy; even relatively old CRT TVs can handle both PAL and NTSC (or PAL60).

I don't know why the EU uses HD resolutions at 50Hz; maybe for better compatibility with the old PAL material they have in archives. Higher fps is always better.

And yes, the PS3 has no problems with NTSC or PAL DVDs, or with HD material at any standardized frequency (24/50/60).
 
I'm still coming across lots of PAL CRTs that can't handle NTSC (but can handle 60Hz PAL just fine).
 
What would be the benefit of, let's call it, a dynamic FPS range? If a developer could aim for 35 or 40 fps, would this affect how a console game is made and developed?

Perhaps most importantly, would it make a difference at all?
 

A lot of developers would happily target sub-60 framerates if they could, but the reality is that if you don't hit a number which is a factor of 60, then you either get tearing or a juddering frame rate (the eye/brain can be fooled to some degree by a constant framerate, even if it's quite low - but an update that is not smooth is noticeable).
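To make the judder part concrete, here is a small sketch (my own illustration, not from the post): it works out how many 60 Hz refreshes each frame stays on screen when a fixed frame rate is locked to vsync. Rates that divide 60 give a constant hold time; 45 fps gives an uneven 1-1-2 pattern, and that irregular cadence is the judder.

```python
def hold_times(source_fps, display_hz=60, frames=9):
    """How many display refreshes each frame stays on screen under vsync."""
    # Frame i finishes at time (i+1)/source_fps but can only appear at the
    # next refresh boundary, i.e. refresh number ceil((i+1)*hz/fps).
    # Integer arithmetic keeps the ceiling exact.
    appears = [((i + 1) * display_hz + source_fps - 1) // source_fps
               for i in range(frames + 1)]
    return [appears[i + 1] - appears[i] for i in range(frames)]

print("60 fps:", hold_times(60))   # [1, 1, 1, ...]          -> smooth
print("30 fps:", hold_times(30))   # [2, 2, 2, ...]          -> smooth
print("45 fps:", hold_times(45))   # [1, 1, 2, 1, 1, 2, ...] -> judder
```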

So yes, if a display technology could cope with that, without being annoying for the end user (i.e. it didn't flicker) then it may well solve a few issues for developers.

Mind you I expect what would happen is that developers would just get a bit lazier about last minute optimisations :)
 
Gosh darn those lazy devs standing in the way of progress. Just like 'em, all of them!

Seriously speaking though, I wonder how you would come about such tech.
 

It wouldn't be too hard with most devices - ideally you'd set the output at a multiple of the input and repeat frames to make up the difference. If you wanted to get smart you could try interpolating the intermediate frames to give smoother motion, but that tends to have artefacts even on "simple" input stuff like a panning picture with text over the top - an average 3D video game with a semi-transparent HUD and lots of different motion would probably bring that to its knees.
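A quick sketch of the "repeat frames to make up the difference" idea (illustrative only; real TVs and scalers do this in hardware). Mapping source frames onto display refreshes by simple repetition turns 24 fps input on a 60 Hz panel into the familiar 3:2 pulldown cadence, and 30 fps input into a clean double.

```python
def repeat_onto_refreshes(frames, source_fps, display_hz=60):
    """Map a list of source frames onto display refreshes by repetition."""
    refreshes = len(frames) * display_hz // source_fps
    # At refresh r the current source frame is the one whose display
    # interval contains the refresh time r/display_hz.
    return [frames[r * source_fps // display_hz] for r in range(refreshes)]

print(repeat_onto_refreshes(list("ABCD"), source_fps=30))
# ['A', 'A', 'B', 'B', 'C', 'C', 'D', 'D']              (each frame doubled)
print(repeat_onto_refreshes(list("ABCD"), source_fps=24))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']    (3:2 pulldown)
```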

Another problem would be the simple practical fact that not everyone has such a TV, so you'd need a standardised fallback (such as running at 30 I suppose... maybe not so bad).

But anyway this is kind of a dead-end... we want higher refresh rates in both display technology and output devices, not support for esoteric lower rates!
 
If you've got progressive, persistent displays (e.g. TFTs), it's no problem. You buffer the input and write it to the screen on each refresh. With CRTs this isn't an option because of flicker. Modern TVs are able to sync down to 24 Hz. Anything in between could be possible if you could communicate the frame rate effectively. Perhaps a future system could involve a refresh signal from the image source to tell the screen to update?

A smooth 40-50 fps frame rate might be a nice compromise between detail and speed, but the complexity of display hardware that supports it, plus the need for fall-back solutions, is a major limit. Ideally devs should target 30 or 60 fps and make the games work perfectly with that. If they're hitting 45 fps, IMO they should decide to either add more stuff and slow it down, or reduce stuff and speed it up.
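For what it's worth, here is a very rough sketch of the "refresh signal from the image source" idea (all names and thresholds below are invented for illustration; conceptually it is close to what later shipped as variable refresh rate). A sample-and-hold panel keeps showing its last frame and rescans whenever the source presents a new one, as long as the panel's minimum scan-out time has elapsed, so an "awkward" rate like 45 fps needs no pulldown at all.

```python
import time

class PersistentPanel:
    """Sample-and-hold display: keeps showing the last frame it was given."""
    MIN_INTERVAL = 1 / 60            # panel can't physically scan out faster

    def __init__(self):
        self.frame = None
        self.last_refresh = 0.0

    def present(self, frame):
        """Called by the source whenever a new frame is ready."""
        now = time.perf_counter()
        if now - self.last_refresh < self.MIN_INTERVAL:
            return False             # too soon after the last scan-out: hold
        self.frame = frame
        self.last_refresh = now      # the *source* decides when to refresh
        return True

panel = PersistentPanel()
for i in range(5):
    time.sleep(1 / 45)               # source rendering at an "awkward" 45 fps
    ok = panel.present(f"frame {i}")
    print(i, "shown" if ok else "held for next refresh")
```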
 
The reason for 50Hz Digital TV in PAL regions is quite simple. It's so existing content can be played back without the need for frame rate conversion.
 
Around 30 fps is roughly the minimum for the human eye to perceive movement as fluid. 60 is simply "twice as good", if you like, with no special significance beyond that.
 