What is with the fixation on 60fps *or* 30fps with consoles?

kyleb

Veteran
I'm at a loss to understand why console developers seem to focus on making their games run at either 60fps or 30fps. I respect the fact that fast action games can look a bit choppy at 30fps and more framerate can overcome that, but from years of PC gaming I have found that anything over about 45fps is lost on me, and I imagine I am not alone in that. So I am curious: why do developers fixate on either 60fps or 30fps instead of working to find a nice compromise in between?
 
To sync with the refresh rate of your TV when interlacing (not an issue with LCDs and plasmas, I believe).

And that's 25 or 50 for PAL (100Hz sets just double it up).
 
Actually (repeating myself) PAL 60 Hz is quite common these days.

Both the Cube and Xbox 1 have switches for 50 or 60 Hz PAL (and older consoles before these supported it).

Only really old PAL sets don't accept 60 Hz AFAIK.
 
pipo said:
Actually (repeating myself) PAL 60 Hz is quite common these days.

Both the Cube and Xbox 1 have switches for 50 or 60 Hz PAL.

Only really old PAL sets don't accept 60 Hz AFAIK.

Yes, that is true, but the guy is asking something else. :D
 
Seeing how this was split from another thread,

Phil said:
Just to add to what london-boy just replied: if your TV refreshes at 60 Hz and your game runs at 45 frames per second, the game's framerate and the TV's refresh rate wouldn't be properly synced (running in parallel), thus producing tearing.

If your game however runs at 30 fps and your TV refreshes at 60 Hz, then in a simplified example, the TV would be receiving an update every second refresh, thus no fluctuations and, as a result, no tearing. A game running at 60 fps on a TV with a refresh rate of 60 Hz would be running in parallel - as a result, you wouldn't have any tearing either.

If your game runs at 45 fps and your TV refreshes at 60 Hz, the fundamental problem is that they both start in parallel (the TV's 1st refresh = the game's 1st frame), but from that point on they'd keep missing each other when the refreshes occur.

My example of using PAL refresh rates was an unlucky one - living in PAL territory I sometimes use those numbers while forgetting that the majority uses NTSC and the popular 30 / 60 fps. Hope that answers your question.

In conclusion, it's not a "fixation". It's a display/hardware issue. :smile:
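
To make the arithmetic above concrete, here is a rough sketch (plain Python, assuming nothing more than perfectly even frame times and a fixed 60Hz scanout): it prints how far into a refresh interval each frame finishes. At 30fps and 60fps every frame lands exactly on a refresh boundary; at 45fps most frames finish mid-refresh, which is the mismatch that shows up as tearing when vsync is off.

Code:
# Rough sketch (idealised timing): where does each frame finish relative
# to the 60Hz refresh boundaries?
REFRESH_HZ = 60

def frame_offsets(fps, frames=6):
    # Frame n finishes at n/fps seconds, i.e. n * 60/fps refresh periods in.
    # The fractional part says how far into a refresh interval it lands
    # (0.0 means it lines up exactly with a refresh boundary).
    return [round((n * REFRESH_HZ / fps) % 1.0, 3) for n in range(1, frames + 1)]

for fps in (30, 45, 60):
    print(fps, "fps:", frame_offsets(fps))

# Output:
# 30 fps: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
# 45 fps: [0.333, 0.667, 0.0, 0.333, 0.667, 0.0]
# 60 fps: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]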
 
But... but... but... this gen will work on PC monitors! :D So maybe they can unlock it for us PC monitor users... or not :cry:

I was reading up on the eye... it seems frames tend to blur together at 24-30, but the "typical" eye can notice a difference all the way up to 72fps. Yet motion blur, done correctly, can simulate the sense of speed and fluidity lost at lower framerates.
 
You all seem to be saying the same thing, and none of it makes sense to me. Yes, I hate tearing, that is why I use vsync. That doesn't preclude me from running at 45fps on a 60Hz display though. Yeah, it means one frame lasts for one refresh and the next frame lasts for two, back and forth like that, and it takes an extra framebuffer's worth of memory to do it, but it is still 45fps and it still looks smoother than 30fps.

Also, I have to contest this:
Acert93 said:
I was reading up on the eye... it seems frames tend to blur together at 24-30, but the "typical" eye can notice a difference all the way up to 72fps. Yet motion blur, done correctly, can simulate the sense of speed and fluidity lost at lower framerates.
I have tested this with multiple friends and they didn't notice the difference between a game running at 40fps or 50fps let alone all the way up to 72fps, so I'm not sure how whoever wrote what you said came to that conclusion, but it doesn't jive with the experiments I have run.
 
kyleb said:
You all seem to be saying the same thing, and none of it makes sense to me. Yes, I hate tearing, that is why I use vsync. That doesn't preclude me from running at 45fps on a 60Hz display though. Yeah, it means one frame lasts for one refresh and the next frame lasts for two, back and forth like that, and it takes an extra framebuffer's worth of memory to do it, but it is still 45fps and it still looks smoother than 30fps.

Also, I have to contest this:

I have tested this with multiple friends and they didn't notice the difference between a game running at 40fps or 50fps let alone all the way up to 72fps, so I'm not sure how whoever wrote what you said came to that conclusion, but it doesn't jive with the experiments I have run.

Well, the problem so far is that in the instances where a game's framerate dropped to 45fps, it was because vsync was OFF. Therefore it showed tearing.

I don't think there is anything inherently wrong with running at 45fps with vsync on (meaning the console would output a normal 60Hz signal but show certain frames twice). The only problem I might predict is a sort of precision issue - controls-wise I mean - for fast games. If some frames are shown twice, we might run into control precision issues, though I really don't know for sure.
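
For what it's worth, here is a small sketch of that "show certain frames twice" idea (just idealised Python, assuming a steady 45fps renderer strictly vsynced to 60Hz, not modelled on any particular console): every frame waits for the next refresh boundary, so nothing tears, but frames end up being held for an uneven 2-1-1 cadence of refreshes.

Code:
# Sketch (idealised, not any real console): a steady 45fps game vsynced to 60Hz.
# Frame n is ready after n/45 s, i.e. n*60/45 refresh periods, and is flipped
# onto the screen at the next refresh boundary.
REFRESH_HZ = 60
FPS = 45

def refreshes_per_frame(frames=9):
    held = []
    prev_flip = 0
    for n in range(1, frames + 1):
        flip = (n * REFRESH_HZ + FPS - 1) // FPS   # ceil(n*60/45): first refresh after it's ready
        held.append(flip - prev_flip)              # refreshes the previous image stayed on screen
        prev_flip = flip
    return held

print(refreshes_per_frame())
# -> [2, 1, 1, 2, 1, 1, 2, 1, 1]: 9 frames over 12 refreshes (45fps), no tearing,
#    but on-screen time alternates between ~33ms and ~17ms instead of a steady 22ms.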
 
kyleb said:
I have tested this with multiple friends and they didn't notice the difference between a game running at 40fps or 50fps let alone all the way up to 72fps, so I'm not sure how whoever wrote what you said came to that conclusion, but it doesn't jive with the experiments I have run.

Every eye is different.

Also, you do get to a point where your eye cannot see more frames, but the sensation of fluidity still improves. The difference between 24 and 30 is significantly more noticeable than between 50 and 62. Although both are 25% jumps, and 50-to-62 adds twice as many frames, your eye is more sensitive at the lower end.

I don't doubt that you cannot tell the difference. I know a lot of people who cannot. I can play games at lower framerates (I prefer higher though). I can tell the difference; I am extremely sensitive to bouncy framerates. 70fps one moment, 45fps the next, back up and down... egad! I would take a solid 30fps over that. If the game allows me to, I tend to lock my fps so I don't have to deal with that much.
 
Yes, I understand that every eye is different. However, if "the 'typical' eye can notice a difference all the way up to 72fps", it seems unlikely that neither I nor any of the friends I have tested would get anywhere close to that.
london-boy said:
Well, the problem so far is that in the instances where a game's framerate dropped to 45fps, it was because vsync was OFF. Therefore it showed tearing.
But I'm talking about with vsync on, so no tearing. Like in Doom3 on the PC your framerate is capped at 60fps in gameplay, but you can still have "r_swapinterval 1" and run at 60fps even if your refresh rate is 85hz or whatever.
london-boy said:
I don't think there is anything inherently wrong with running at 45fps with vsync on (meaning the console would output a normal 60Hz signal but show certain frames twice). The only problem I might predict is a sort of precision issue - controls-wise I mean - for fast games. If some frames are shown twice, we might run into control precision issues, though I really don't know for sure.
I can't say I have ever seen such an issue.
 
kyleb said:
Yes, I understand that every eye is different. However, if "the 'typical' eye can notice a difference all the way up to 72fps", it seems unlikely that neither I nor any of the friends I have tested would get anywhere close to that.

But I'm talking about with vsync on, so no tearing. Like in Doom3 on the PC your framerate is capped at 60fps in gameplay, but you can still have "r_swapinterval 1" and run at 60fps even if your refresh rate is 85hz or whatever.

I can't say I have ever seen such an issue.

I don't know, we should ask some devs why this "locked 45fps (or any framerate other than 30 or 60fps)" thing has never been used. There must be reasons.

Doom3 only runs at 60fps or lower, doesn't it? Your monitor refresh rate doesn't matter. You're talking about the console itself outputting some frames twice. Bit different.
 
Yeah, Doom3 is capped at 60fps, that is my point. If Doom3 can run without tearing locked at 60fps on an 85Hz display, then it seems obvious that a console game could run without tearing locked at 45fps on a 60Hz display, eh?

And yeah, as for asking developers, that is why I figured I should give this question its own thread and hope some might drop in and share some info.
 
kyleb said:
Yeah, Doom3 is capped at 60fps, that is my point. If Doom3 can run without tearing locked at 60fps on an 85Hz display, then it seems obvious that a console game could run without tearing locked at 45fps on a 60Hz display, eh?

And yeah, as for asking developers, that is why I figured I should give this question its own thread and hope some might drop in and share some info.

I think it all comes down to PC monitors being much more flexible than TVs when it comes to different refresh rates. Monitors can show whatever resolutions they support at many different refresh rates. So maybe your monitor isn't actually refreshing at 85Hz when playing Doom3. Mine, for example, switches from 75 to 60 when I start a game up (NVIDIA drivers default the refresh to 60Hz when starting a game... I still don't know how to overcome that).

With TVs, you have 60Hz and that's it, so anything that doesn't divide evenly into 60 will give you tearing.
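
As a side note, the framerates that do divide evenly into a 60Hz output - and so give every frame the same whole number of refreshes - can be listed with a throwaway snippet like this (pure arithmetic, no claim about any real hardware):

Code:
# Which constant framerates map to a whole number of 60Hz refreshes per frame?
REFRESH_HZ = 60
even_rates = [(REFRESH_HZ // n, n) for n in range(1, REFRESH_HZ + 1) if REFRESH_HZ % n == 0]
for fps, refreshes in even_rates:
    print(f"{fps}fps -> each frame shown for {refreshes} refresh(es)")
# 60, 30, 20, 15, 12, 10, 6, 5, 4, 3, 2 and 1 fps all work out evenly;
# anything else (like 45) has to hold some frames longer than others, or tear.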

I have a headache now.
 
kyleb said:
Yeah, Doom3 is capped at 60fps, that is my point. If Doom3 can run without tearing locked at 60fps on an 85Hz display, then it seems obvious that a console game could run without tearing locked at 45fps on a 60Hz display, eh?

And yeah, as for asking developers, that is why I figured I should give this question its own thread and hope some might drop in and share some info.

Basically they're simplifying.

You don't have to tear to have an average framerate between 30 and 60, but instantaneously (for any given frame) the framerate must be an exact submultiple of 60Hz - i.e. each frame is held for a whole number of refreshes - or it will tear.

So, for example, a game running at an average framerate of 45fps would do the following:

XXX.XXX.XXX.

Where X is a new frame and . is the previous one held for another 60th (three new frames every four refreshes = 45fps).

IME patterns like the above look worse in motion than just clamping to 30fps.

On PCs, devs have no control over monitor refresh rates or game framerates, so they just let you deal with it. On console, I have complete control of both.

Back in the 2D sprite days, 60fps was pretty much the norm; since we moved to 3D, it seems like 30fps has become the norm.
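
Patterns like the one above are easy to reproduce with a toy sketch (idealised Python, assuming perfectly regular frame times - not anyone's actual tools): mark each 60Hz refresh with X if a new frame is presented on it and . if the old one is held. The in-between rates repeat frames at irregular intervals, which is the juddery look being described, while 30fps holds every frame for exactly two refreshes.

Code:
# Toy sketch (idealised timing): mark each 60Hz refresh with X if a new frame
# is presented on it, or . if the previous frame is simply held again.
import math

REFRESH_HZ = 60

def pattern(fps, refreshes=12):
    # Frame n is ready at n/fps seconds and flips on the next refresh boundary.
    flips = {math.ceil(n * REFRESH_HZ / fps) for n in range(1, refreshes + 1)}
    return "".join("X" if r in flips else "." for r in range(1, refreshes + 1))

for fps in (30, 40, 45, 50, 60):
    print(f"{fps:>2}fps: {pattern(fps)}")

# 30fps: .X.X.X.X.X.X   (a new frame every other refresh - perfectly regular)
# 40fps: .XX.XX.XX.XX
# 45fps: .XXX.XXX.XXX
# 50fps: .XXXXX.XXXXX
# 60fps: XXXXXXXXXXXX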
 
I see disagreements on various forums quite frequently regarding differing test results.
It should be noted that whenever the testing methodology differs, or you have a different test pool, you are likely to see a variation in the results.

Researchers will generally test from a large pool of specifically targeted individuals to provide the necessary diversity they need. And for acuity testing they are usually testing for the extremes.

Example: 500 people of various ages, races, incomes, education levels, professions, and genders are selected in adequate numbers. They are shown a black then a dark grey screen at specific frame rates. Then they are shown a white followed by a black screen, at the same specific frame rates. This might show how contrast and frame rate perception interact without additional stimuli.

You were testing your friends - a small pool - and were looking at the results in the confines of playing a game. This is more like the black and grey test above, with minimal contrast shifting, but it adds motor skills, story interpretation, situational awareness, etc., which would likely reduce their ability to judge frame rate differences even further.

In determining the importance of frame rates in games to you and your friends, which testing is more relevant to you?
 
kyleb said:
I have tested this with multiple friends and they didn't notice the difference between a game running at 40fps or 50fps let alone all the way up to 72fps, so I'm not sure how whoever wrote what you said came to that conclusion, but it doesn't jive with the experiments I have run.

An obvious example showing that eyes do perceive differences can be seen by looking at a TV with a refresh rate of 50Hz as opposed to 100Hz. The latter is "flicker free" - which is why we even have 100Hz TVs to begin with (I presume NTSC regions have the equivalent in 120Hz TVs :?: )

There's not a single person I know who can't tell the difference between a flickering 50Hz TV and the 100Hz "flicker-free" ones. You can run this experiment on any monitor where you can change the refresh rate.
 
Flicker would be an example of an extreme contrast shift, which is important when setting hardware refresh rates but is not likely to be as applicable to software screen refreshes.
 
Of course, the point was merely that the eye does perceive differences (whether through contrast shifts or motion) between 50Hz and, say, 75+ Hz.
 