What is with the fixation on 60fps *or* 30fps with consoles?

ERP said:
Basically they're simplifying.

You don't have to tear to have average framerates between 30 and 60, but instantaneously (for any given frame) it must take an exact sub-multiple of 60Hz or it will tear.

So, for example, a game running at an average frame rate of 45fps would do the following:

XXX.XXX.XXX.XXX.

Where X is a new frame and . is the previous one held for the next 60th.

IME patterns like the above look worse in motion than just clamping to 30fps.

On PCs, devs have no control over monitor refresh or game framerates, so they just let you deal with it. On console I have complete control of both.

Back in the 2D sprite days 60fps was pretty much the norm; since we moved to 3D it seems like 30fps has become the norm.

Thanks ERP, that explains a lot and should answer kyleb's questions.
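
For anyone who wants to see that cadence spelled out, here is a quick sketch of my own (illustrative only; it assumes vsync and perfectly even frame times, which real games don't deliver) that marks which refreshes get a new frame:

#include <cstdio>

// Illustrative only: which refreshes of a vsynced display get a new
// frame when a game renders at a constant `fps` below the refresh
// rate `hz`. 'X' = new frame presented, '.' = previous frame held
// for another refresh.
void cadence(int fps, int hz) {
    printf("%2d fps on %2d Hz: ", fps, hz);
    for (int r = 0; r < hz; ++r) {
        int cur  = r * fps / hz;        // frame index shown this refresh
        int prev = (r - 1) * fps / hz;  // frame index shown last refresh
        putchar(r == 0 || cur != prev ? 'X' : '.');
    }
    putchar('\n');
}

int main() {
    cadence(30, 60); // X.X.X. : every frame held for exactly two refreshes
    cadence(40, 60); // X.XX.XX : two quick frames, then one held
    cadence(45, 60); // X.XXX.XXX : one refresh in every four repeats
}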
 
I agree; it is widely accepted that for long-term use of a computer screen, 72Hz and above are best.


But I am hoping that research findings can be applied specifically to what they were intended for. In other cases, I would hope they are presented as a possible factor to be looked at, not as categorically factual, given the variation in circumstances.
 
As to the issue of 45fps causing input issues, the input polling rate can be separate from the display rate.
Forza, for example, had a 30fps display rate and a 120ips input polling rate.

This should hypothetically work well. And I suspect (don't know for sure) that they did blind testing and found it to be an acceptable solution.
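
For what it's worth, here is a hypothetical sketch of the idea (the 120/30 numbers come from the Forza example above; the function names are made up, and a real console game would drive this from hardware timers and vsync rather than sleeps):

#include <algorithm>
#include <chrono>
#include <thread>

static void poll_input()   { /* read the pad, advance the input state */ }
static void render_frame() { /* draw and present; vsync paces this in practice */ }

int main() {
    using namespace std::chrono;
    const auto input_step  = microseconds(1000000 / 120); // ~8.3ms per poll
    const auto render_step = microseconds(1000000 / 30);  // ~33.3ms per frame

    auto next_input  = steady_clock::now();
    auto next_render = next_input;

    for (;;) {
        auto now = steady_clock::now();
        if (now >= next_input)  { poll_input();   next_input  += input_step; }
        if (now >= next_render) { render_frame(); next_render += render_step; }
        std::this_thread::sleep_until(std::min(next_input, next_render));
    }
}

The point is simply that the two clocks are independent: input is consumed four times for every frame that reaches the screen.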

Yet the fact that vision is mostly a psychological function hampers this when it should help it.
For example, if you know the game is 30 and you have a preference for 60, you are likely to be more sensitive to frame rate drops; and even if you weren't, you can carry enough bias with you into the "experiment" game that you would perceive that part of the game to be flawed regardless.

It should help because our minds average visual information anyway, and when we are playing a racing game, for illustration purposes, we are making inputs based on our mind's eye.

When you are going into a turn, you project where you are going to be in the future, not where you are at the exact moment. So if the game input can be received at the speed at which your hands can input the information (120 IPS), the game can adjust your virtual position accordingly; the screen then updates at 30, which should be sufficient to provide you with the information you will need to do the mental "picturing" of the necessary future adjustments.

My apologies if this isn't very clear; I will probably have to restate this...
 
Flicker of refresh is clearly a whole different ballgame. What I am talking about is using Quake3 and binding the function keys to various settings of "cg_maxfps 30", "cg_maxfps 40" and so on, then flicking between them and asking whether it looks smoother or choppier when doing so. It seems to me that this would be most relevant to a discussion of what framerate is necessary to get the impression of fluid motion in a videogame.

Back to my question:
ERP said:
Basically they're simplifying.

You don't have to tear to have average framerates between 30 and 60, but instantaneously (for any given frame) it must take an exact sub-multiple of 60Hz or it will tear.

So, for example, a game running at an average frame rate of 45fps would do the following:

XXX.XXX.XXX.XXX.

Where X is a new frame and . is the previous one held for the next 60th.
I understand this; that is what I was getting at when I said "it means one frame lasts for one refresh and the next frame lasts for 2, and back and forth like that."
ERP said:
IME patterns like the above look worse in motion than just clamping to 30fps.
I can't argue with your experience, but I can't say mine has been the same either. I suppose I should try to test this on my friends and see what their opinions are.
 
You can see fluid motion at 30fps on an interlaced screen. People have been watching racing for years at that rate. Keep in mind that's broadcast, not a game, with the natural blurring that occurs with what has been a mostly analog production chain.
 
SirTendeth said:
It should help because our minds average visual information anyway, and when we are playing a racing game, for illustration purposes, we are making inputs based on our mind's eye.
That makes sense as to why 45fps seems smoother than 30fps to me even with vsync, regardless of refresh rate; I'm curious why this wouldn't be the case for others.
 
Within games, the data (not screen) refresh rate has two predominant aspects that I can think of: Data Updates and Cohesive Immersion.

With Data Updates, the only thing that is relevant to your mind is what has changed in a scene and what has stayed the same. This is context-sensitive.
Example 1: If you are in a flight simulation and you are on the runway just looking at a side display map, it doesn't matter to you whether the screen is showing this static information at 30 or 60fps.

Example 2: In the same flight sim, if you are on a bombing run in the sky, you would want to watch the crosshairs to drop the bomb effectively, but still be aware of your surroundings in case of enemy aircraft. You want the information updates to match the display device. If the scene stays relatively static you devote more attention to the crosshairs, but the moment an enemy appears in the distance you want that relayed on screen, so that you can divert your attention to the new threat if necessary.

Data updates from the machine to the screen and then to your eye can never exceed the refresh of the display device. Internally the machine may be changing system states far faster than your display can handle; sucks for you. But unless it is output on screen, or through audio or rumble cues (less relevant to this discussion), you can't react directly to it. We can only process the information that we are given and use indirect reactions (interpolation) to try to compensate for perceived holes in that information.

One key piece that I think gets overlooked is that the game world, i.e. game state changes, are occurring at incredible speeds. Because of this we can falsely assume that we play first-person shooters better at 120fps on a display that refreshes at 60Hz, when in reality we are playing better because the frames have more applicable variation in relation to the game state.
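
To illustrate, here is a minimal sketch (hypothetical names and numbers) of the common fixed-timestep pattern, where the simulation ticks far faster than the display, so whatever frame does get shown carries the freshest state:

#include <chrono>

struct GameState { double t = 0.0; };

// Stand-in for real game logic; ticks the world forward by dt seconds.
static void simulate(GameState& s, double dt) { s.t += dt; }

int main() {
    using namespace std::chrono;
    const double tick = 1.0 / 240.0;  // simulation step, faster than any display
    GameState state;
    double accumulator = 0.0;
    auto last = steady_clock::now();

    for (;;) {
        auto now = steady_clock::now();
        accumulator += duration<double>(now - last).count();
        last = now;
        while (accumulator >= tick) {  // run however many ticks have elapsed
            simulate(state, tick);
            accumulator -= tick;
        }
        // render(state); present();  // vsync would pace this at 30 or 60Hz
    }
}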

This is long so I will continue with Cohesive Immersion in my next post.
 
SirTendeth said:
You can see fluid motion at 30fps on an interlaced screen. People have been watching racing for years at that rate. Keep in mind that's broadcast, not a game, with the natural blurring that occurs with what has been a mostly analog production chain.

That's because it is (a) actually 60Hz (field rate) and (b) the TV cameras are doing some amount of temporal filtering which you don't (usually) get in 3D games.
 
Phil said:
At what frequency is your monitor updating when you lock the framerate to 45 fps?
I'm not sure I am getting the intent of your question, but it would be updating at 45Hz in that case. Granted, it would still be refreshing at 60Hz, for instance; but in such a case it would simply be displaying the same information as the last refresh for 15 of those refreshes each second.
 
kyleb said:
I can't argue with your experience, but I can't say mine has been the same either. I suppose I should try to test this on my friends and see what their opinions are.
ERP is correct.

If there is motion in the scene, the eye/brain will track that motion and try to predict where it will be. If you drop into an X XX X XX pattern, then the motion will not be at regular locations, because on the repeated frames there will be no motion at all. It stands out like the proverbial injured opposable digit.

This is also why animation on an LCD monitor doesn't look as smooth as on a CRT. The LCD holds each frame constant for (virtually) all of the frame period, whereas a CRT flashes it very briefly. The eye thus sees the LCD motion as stop-start-stop-start. One approach is to use a flashing backlight to overcome this. (There is some stuff on the net if you want to google, e.g.: http://slashdot.org/article.pl?sid=05/07/27/1536250&tid=129&tid=1)
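
A back-of-envelope illustration of the size of that hold effect (my numbers, purely illustrative):

#include <cstdio>

// An object tracked by the eye on a hold-type display smears across the
// retina by roughly (speed) x (hold time). Illustrative numbers only.
int main() {
    const double speed = 960.0;      // px/s: crossing a 960px screen in one second
    const double hold  = 1.0 / 60.0; // s: full-frame hold on a 60Hz LCD
    printf("perceived smear: about %.0f px\n", speed * hold); // ~16 px
    return 0;
}

A CRT, flashing each frame for only a millisecond or two, keeps that smear proportionally smaller.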
 
SimonF, that's pretty much what I said.

"You can see fluid motion at 30fps on an interlaced (read: 60 fields) screen. People have been watching racing for years at that rate. Keep in mind that's broadcast, not a game, with the natural blurring (temporal filtering) that occurs with what has been a mostly analog production chain."

But I don't mind the clarification, because it will help people understand that just because something is loosely relevant doesn't make it an absolute, due to the difference in variables.
 
I think Phil may be interested in your refresh rate too, to see if it would provide a better subdivided cadence at 40fps synced than at 30fps synced.
 
LCD and CRT just highlight the extreme contrast again.

LCD is on/off, with nearly no perceptible fade, whereas a CRT lights the phosphors of an area, including some outside the designated target area, and then they fade. It is predominantly the fading that provides a virtual subframe that smooths out our perception of the image.
 
Phil said:
An obvious example to show that eyes do perceive differences can be seen by looking at a TV that has a refresh rate of 50Hz as opposed to 100Hz. The latter is "flicker free", which is why we even have 100Hz TVs to begin with (I presume you have them in NTSC regions as well, as 120Hz TVs :?: )

One of the main reasons for wanting 100Hz screens is that people want larger TVs, and the flickering gets worse the larger the CRT TV is. It's all a problem of pumping enough power into the phosphors <shrug>

The downside, of course, is that the motion gets jerkier on a 100Hz telly (unless you put in a lot of CPU power)
 
SimonF, you keep bringing up very interesting points that are keeping me from getting to my second key to game immersion.

Size of the screen is a big issue from both a technical and a psychological viewpoint. From the technical side, the refresh is a problem with CRTs because the draw from top to bottom doesn't occur instantaneously; this scanning process becomes more noticeable the larger you make the set and the slower it scans.

I will be addressing the psychological side, which impacts CRTs and any other display device that has a refresh rate, in my post on Cohesive Immersion.
 
Simon F said:
ERP is correct.

If there is motion in the scene, the eye/brain will track that motion and try to predict where it will be. If you drop into an X XX X XX pattern, then the motion will not be at regular locations, because on the repeated frames there will be no motion at all. It stands out like the proverbial injured opposable digit.
Understood on the fact that every fourth refresh will show no motion at 45fps displayed at 60Hz, but from my experience that is better than every other refresh showing no motion, as is the case at 30fps displayed at 60Hz. I have a couple of friends coming over tonight to test their perceptions of those situations and I will report back with the results.

SirTendeth said:
I think Phil may be interested in your refresh rate too, to see if it would provide a better subdivided cadence at 40fps synced than at 30fps synced.
The tests with Q3 I mentioned were years ago on a CRT at 85Hz, but these days I tend to game on my plasma running at 60Hz, and still 45fps comes off as smoother than 30fps.
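
Running both of those setups through the cadence sketch from earlier in the thread (same caveat: it assumes perfectly even frame times, which a cg_maxfps cap doesn't guarantee):

cadence(45, 60); // X.XXX.XXX : one refresh in every four repeats
cadence(45, 85); // nearly alternating X. with one double-X every 17
                 // refreshes, i.e. close to an even ~42.5fps

So on the 85Hz CRT a 45fps cap plays out almost like a slightly lower but even rate, while at 60Hz it alternates one- and two-refresh holds.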
 
SimonF, the point that you made about 100Hz being jerkier, and ERP's post, are both related to the sync cadence, which leads into Cohesive Immersion.

We play games for various reasons, most of which involve immersion to meet those goals.

To take a virtual breather, the game needs to provide us with a world that has enough information or stimuli to distract us from our current surroundings and the thoughts that go with them. This varies for each individual, and the requirements change as that individual evolves.

In an action game we are pulled out of the virtual experience when we feel we have inadequate information to be successful at our mental goal. Ironically, we are likely to be more successful the more we can use in the game the innate skills we get from dealing with everyday life, and for this to happen our mind must perceive the game as being roughly the same as the real-world counterpart to the situation.

When it comes to frame rate, we will match it to reality, which is analog. It matches poorly. But most people take hardware limitations into account and so proceed to treat the game as a near equivalent anyway. This leads to using motor and mental skills that we have developed in reality, which draws us further into a state of immersion.

An individual may develop a preference for a greater screen refresh rate and/or a greater scene refresh rate, if they are exposed to such.

This can lead to difficulties in getting to a higher state of immersion.
One can also be drawn, to varying degrees, out of immersion when the frame rate stutters.
This can also happen when the game changes states without notifying you by way of updating the screen, or by lag, etc.

An uneven sync cadence (like xxaxxaxxa) is, in and of itself, a phenomenon non-congruent with reality, and so it can hamper an individual's immersion if they take note of the pattern.
 
kyleb, because cadence is predominantly a psychological deterrent to immersion, the preference will vary greatly among individuals. This is much like the preference for a particular storyline.

It typically won't be a technical deterrent to your ability to play the game, and in fact is likely to be the opposite. You personally prefer the updates made to the game world to be relayed to the screen more frequently, for possibly technical, and definitely psychological, reasons.

So for you, the fact that the screen is static at unrealistic intervals is not very important in achieving the goal of immersion.
 
SirTendeth said:
You can see fluid motion at 30fps on an interlaced screen. People have been watching racing for years at that rate. Keep in mind that's broadcast, not a game, with the natural blurring that occurs with what has been a mostly analog production chain.

But people are not watching a 30fps rate in the broadcast. It is effectively 60fps, since each broadcast frame contains two different fields that are displayed to the viewer separately.
 