Fixed framerate: 60 fps is enough

60 Hz max and the best IQ that can be done at that?

  • Just no
  • 30 Hz minimum and higher IQ would be better
  • I don't care

  Total voters: 226
DiGuru said:
Yes, but if you are going to use motion blur anyway, you won't see distinct copies anymore. Just an extended blur, broken up over multiple frames. And in that case, the only difference between high and very high framerates would be, that the distance covered per frame is smaller. The thing you see wouldn't change, as you are already compensating for it.
Sure, but any motion blur technique that you use will have its limitations. You still want to have high framerates to hide these limitations. For example, no matter how good your blurring technique is, it will break down when the player's eyes follow an object that is moving with respect to the screen (of course, it's no better to not blur in this case; the only remedy is very high framerates).

Beyond this, exactly at what framerate each individual motion blur algorithm breaks down will depend entirely upon the game scenario. For example, the stochastic algorithm is still based upon discrete frames, and thus it won't be able to hide the aliasing if there is too much motion from frame to frame. The "smudging" algorithm won't be able to handle paths of motion that curve, and thus things moving in circles will show problems first (not necessarily fans, but rather a ball on a string type scenario).
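
To make the stochastic idea a bit more concrete, here is a toy CPU-side sketch of the underlying technique: accumulate several jittered time samples into one frame. The 1-D "screen", the object path and all the numbers are made up purely for illustration; this is not taken from any actual renderer.

Code:
#include <cstdio>
#include <random>

int main() {
    const int width   = 32;     // a 1-D "screen"
    const int samples = 16;     // time samples accumulated into one frame
    float pixels[width] = {0.0f};

    std::mt19937 rng(42);
    std::uniform_real_distribution<float> jitter(0.0f, 1.0f);

    // Object position over the shutter interval [0,1): moves from pixel 4 to pixel 20.
    auto position_at = [](float t) { return 4.0f + 16.0f * t; };

    for (int s = 0; s < samples; ++s) {
        float t = (s + jitter(rng)) / samples;        // jittered time sample
        int p = static_cast<int>(position_at(t));
        if (p >= 0 && p < width)
            pixels[p] += 1.0f / samples;              // accumulate, i.e. average over time
    }
    for (int i = 0; i < width; ++i)
        std::printf("%c", pixels[i] > 0.0f ? '#' : '.');  // prints a smeared streak, not a dot
    std::printf("\n");
    return 0;
}

With one sample per frame you get a single sharp dot each frame (and temporal aliasing); with many jittered samples the dot becomes a streak covering the distance moved during the frame, which is the blur that hides the frame-to-frame jumps.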
 
thop said:
But more seriously I think 60 FPS is plenty, for the eye at least.
There is no limit to the improvement in visual effect from higher and higher framerates. It's not because our eyes can respond quickly, it's because they can't. When the framerate is much higher than our eyes' response, we see the multiple discrete frames blended together. We see "ghosts" of objects moving with respect to the screen.
 
I never quite figured this out by myself, so if there's someone who can enlighten me, please...

I used to play Doom3 with vsync on, with forced monitor refresh rate of 85Hz, and with gameplay FPS capped at 60fps. My framerate was usually quite stable at 60fps, and gameplay seemed quite smooth and nice.

But how the hell is this possible? The monitor refreshes at 85Hz and the game at 60Hz? How do they manage to operate together? Let's say the game is artificially capped at 60Hz but in reality would be fast enough to update at, for example, 100fps. How does the game manage to cap the framerate and still keep the screen updates smooth? I would think this could easily lead to a situation where, in the duration of one second, the game would first update the screen 60 times in sync with the monitor refresh... and after that hold the same image for 25 more monitor refreshes, as the cap of 60 frames/second has been reached. That would be a pretty poor implementation :) but I cannot think of a practical way to do it better.

How is the refresh rate cap actually implemented?
 
jimpo said:
How is the refresh rate cap actually implemented?

That depends. In the simplest case without a cap, the game switches in a new frame to be displayed when it is ready, which might even happen halfway down a refresh. So you get the same frame repeatedly until the next one is ready. With a 60 Hz cap, you switch display buffers 60 times each second and hold off rendering the next frame until a buffer is free again. And if you check the vertical retrace signal when switching them, you are in sync with the monitor refresh.

When you split the game into two separate parts, one that is running the game logic and making changes and one that is rendering them, you can render lots of frames while updating the interactions less often. In that case, the frames can be different from one another (with animated effects), but the position of the objects only changes when the game logic updates them.

And lastly, you have multiple frames in flight that are worked on by different stages, like the game, the driver and the monitor. The number of those buffers and the frequency with which the game logic updates things determine your latency. That is the time between moving the mouse or pressing a button and seeing a reaction to them on the screen.
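
As a rough illustration of that split (a minimal sketch, not any particular engine's code), here is a loop where the game logic ticks at a fixed 60 Hz while "rendering" runs at roughly the refresh rate; the sleep stands in for waiting on the vertical retrace, and all names and numbers are illustrative assumptions.

Code:
#include <chrono>
#include <thread>
#include <cstdio>

int main() {
    using Clock = std::chrono::steady_clock;
    const auto logic_step = std::chrono::microseconds(16667);  // 1/60 s game-logic tick
    const auto refresh    = std::chrono::microseconds(11765);  // ~1/85 s, stands in for a vsync wait

    auto start     = Clock::now();
    auto next_tick = start;
    int ticks = 0, frames = 0;

    while (Clock::now() - start < std::chrono::seconds(1)) {
        // Run as many fixed logic updates as have become due; object positions change here.
        while (Clock::now() >= next_tick) {
            ++ticks;                          // update_game_state(1.0 / 60.0);
            next_tick += logic_step;
        }
        // "Render" the current state, then wait for the next refresh (the vsync stand-in).
        ++frames;                             // draw_frame();
        std::this_thread::sleep_for(refresh);
    }
    std::printf("logic ticks: %d, frames: %d\n", ticks, frames);  // roughly 60 and roughly 80-85
    return 0;
}

The number of frames queued up between the draw call and the screen, plus the 1/60 s logic step, is what ends up as the input-to-screen latency mentioned above.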
 
jimpo said:
I used to play Doom3 with vsync on, with forced monitor refresh rate of 85Hz, and with gameplay FPS capped at 60fps. My framerate was usually quite stable at 60fps, and gameplay seemed quite smooth and nice.

But how the hell is this possible?
Two possibilities:

1. vsync wasn't actually on.
2. Triple buffering was enabled.
 
High fps along with high Hz hides lack of motion blur "for free". This could be replaced with motion blur and a camera tracking what you're looking at to blur the right spot though.
 
Chalnoth said:
jimpo said:
I used to play Doom3 with vsync on, with forced monitor refresh rate of 85Hz, and with gameplay FPS capped at 60fps. My framerate was usually quite stable at 60fps, and gameplay seemed quite smooth and nice.

But how the hell is this possible?
Two possibilities:

1. vsync wasn't actually on.
2. Triple buffering was enabled.
Probably neither of the two.

I don't even understand why this should need explaining. One frame is displayed twice, then a bunch of frames are displayed once each, then one frame is displayed twice... and so on... as long as the GPU is able to render a frame in less than 1/85th of a second, you'll get a steady 60fps with vsync and double buffering with an 85Hz refresh.
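
If you want to convince yourself, here is a quick simulation of that pattern. It assumes rendering is effectively instant and that the 60fps cap simply presents the newest frame that is due at each refresh, which is a simplification of what the game actually does.

Code:
#include <cstdio>

int main() {
    const double refresh_hz = 85.0;   // monitor refresh rate
    const double cap_fps    = 60.0;   // the game's internal frame cap
    int last_shown = -1, new_frames = 0;

    for (int k = 0; k < 85; ++k) {                        // one second of refreshes
        double t = k / refresh_hz;                        // time of this refresh
        int newest = static_cast<int>(t * cap_fps);       // newest frame finished by time t
        bool is_new = (newest != last_shown);
        if (is_new) { ++new_frames; last_shown = newest; }
        std::putchar(is_new ? 'N' : 'R');                 // N = new frame, R = repeated frame
    }
    std::printf("\n%d new frames across 85 refreshes\n", new_frames);  // prints 60
    return 0;
}

It prints 60 new frames and 25 repeats per second of refreshes, i.e. exactly the "one frame displayed twice every few frames" pattern described above, averaging 60fps.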
 
Vadi said:
High fps along with high Hz hides lack of motion blur "for free". This could be replaced with motion blur and a camera tracking what you're looking at to blur the right spot though.
Not even close.

Consider this:
Imagine an object takes one second to cross the screen at 640x480. In order to get motion blur "for free," that object would need to move no more than one pixel per displayed frame. That would require 640 frames per second!
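
Spelled out as a back-of-the-envelope calculation (using the numbers from the example above):

Code:
#include <cstdio>

int main() {
    const double screen_width_px = 640.0;  // horizontal resolution
    const double crossing_time_s = 1.0;    // time the object takes to cross the screen
    const double max_step_px     = 1.0;    // at most one pixel of motion per displayed frame

    double speed_px_per_s = screen_width_px / crossing_time_s;  // 640 pixels per second
    double required_fps   = speed_px_per_s / max_step_px;       // 640 frames per second
    std::printf("required frame rate: %.0f fps\n", required_fps);
    return 0;
}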
 
Thowllly said:
Probably neither of the two.

I don't even understand why this should need explaining. One frame is displayed twice, then a bunch of frames are displayed once each, then one frame is displayed twice... and so on... as long as the GPU is able to render a frame in less than 1/85th of a second, you'll get a steady 60fps with vsync and double buffering with an 85Hz refresh.
The framerate wouldn't be sitting at 60fps in this case, would it? This can't happen because the video card can't do anything if it is double buffering and waiting for the screen to refresh. If you look at framerate graphs with vsync enabled and triple buffering disabled, you'll notice that the framerate is always the screen's refresh divided by some positive integer.
 
jimpo said:
How is the refresh rate cap actually implemented?

Heh, alright I don't like how anyone else has explained it yet so will show you. But I will use 80Hz because its lowest common multiple with 60 is much smaller: 240, versus 1020 for 85 and 60. First I'm showing a normal FPS limit of 60FPS.

V stands for a V-sync
N means a new frame starts being drawn
X means the game is allowed to start a new frame
+ means a new frame is being drawn
- means the buffer is halted (waiting)
D means the buffer holds the frame currently being displayed (used in the last diagram)
A means buffer A is visible
B means buffer B is visible

Since a frame takes 4 cycles of the 240-cycle model, a frame will consist of N+++, and then if the buffer hasn't been swapped yet, - will follow, showing it's waiting for a buffer swap before it can draw again.

Under a double-buffer model there are two buffers and you can only swap right before a V-sync.

Code:
V--V--V--V--V--V--V--V--    V-Syncs
X---X---X---X---X---X---    When can you draw a new frame
N+++--N+++--N+++--N+++--    Drawing a frame?
AAAAAABBBBBBAAAAAABBBBBB    What buffer is being shown?

You can see that this way you only get half the refresh rate (40 fps), even if the time it takes the GPU to actually draw a frame is less than 1/80th of a second.

Now take Doom 3: it only renders a new frame every 1/60th of a second, but say your computer renders each of those frames in 1/80th of a second.

Code:
V--V--V--V--V--V--V--V--     V-Syncs
X---X---X---X---X---X---     When can you draw a new frame
N++-N++--N++N++-N++--N++     Is frame being drawn
AAABBBBBBAAABBBAAAAAABBB     What buffer is being shown

Here we can see that for every 4 V-syncs, 3 of them show new frames, so with a refresh rate of 80Hz and 3/4 new frames we get 60 new frames shown per second. These frames are spaced out unevenly, though: the perceived frame rate for 2 of those new frames is 80 FPS and for the third it is 40 FPS, since it took twice as long to get out. In general it's kinda hard to detect that.

This is assuming, btw, that Doom 3 decides whether it can render not by asking "has 1/60th of a second passed since starting the last frame?" but rather "are we supposed to be drawing a frame or not?" (handled by having the message to draw a new frame sent at the beginning of the draw cycle, not at the end, and dropping draw messages if too many build up until it has drawn some of them).

Triple buffering basically does the same thing by eliminating the wait cycles for a drawing buffer, but it increases the latency of seeing that frame. Below you can see that, drawing at 60 FPS, buffer A is shown twice as much as B and C, and we are still able to see 3 new frames for every 4 V-syncs.
Code:
V--V--V--V--V--V--V--V--    V-Syncs
AAAAAABBBCCCAAAAAABBBCCC    What buffer is being shown?
DDDDDD--N+++DDDDDD--N+++    Drawing a frame on A?
N+++--DDD---N+++--DDD---    Drawing a frame on B?
----N+++-DDD----N+++-DDD    Drawing a frame on C?

Hopefully this explains the two different ways it's possible to see 60fps smoothly with an 85Hz refresh rate.

Edit: BTW, what I said implies either the game state is being updated faster than 60FPS, or there is a buffer of the current game state being drawn that gets swapped after every frame (this would be your third buffer).

Edit 2: It's much preferable, I would say, to buffer the game state rather than a video frame, since there's less of a performance penalty (those frame buffers are pretty big, and game state normally isn't a huge amount of the memory being used by the game).
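
For anyone who wants to play with the model, here is a small simulation of the double-buffered Doom 3 case above. The exact queueing rules (when a draw request is dropped, when a render may start) are my guesses at what the diagrams imply, so treat it as a sketch of the idea rather than a faithful reproduction.

Code:
#include <cstdio>

int main() {
    const int cycles_per_second = 240; // LCM of 80 and 60: 240 cycles = one second
    const int vsync_period      = 3;   // 240 / 80 -> a V-sync every 3 cycles
    const int draw_period       = 4;   // 240 / 60 -> the game asks for a frame every 4 cycles
    const int render_cycles     = 3;   // assume the GPU needs ~1/80 s per frame

    bool back_busy      = false;  // back buffer is being rendered into
    bool back_has_frame = false;  // back buffer holds a finished, unpresented frame
    int  busy_until     = 0;      // cycle at which the current render finishes
    int  pending        = 0;      // queued draw request (extras are dropped)
    int  new_frames     = 0;

    for (int c = 0; c < cycles_per_second; ++c) {
        if (back_busy && c >= busy_until) { back_busy = false; back_has_frame = true; }

        // V-sync: swap buffers if a finished frame is waiting.
        if (c % vsync_period == 0 && back_has_frame) { back_has_frame = false; ++new_frames; }

        // The 60 FPS limiter issues a draw request; excess requests are dropped.
        if (c % draw_period == 0 && pending < 1) pending = 1;

        // Start rendering as soon as a request is waiting and the back buffer is free.
        if (pending > 0 && !back_busy && !back_has_frame) {
            pending    = 0;
            back_busy  = true;
            busy_until = c + render_cycles;
        }
    }
    std::printf("new frames shown in one simulated second: %d\n", new_frames);  // 59 (~60 in steady state)
    return 0;
}

Over one simulated second it reports 59 new frames (60 per second once the startup transient has passed), which matches the 3-new-frames-per-4-V-syncs pattern in the second diagram.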
 
60 fps should be the minimum for action/racing games on consoles. On PC, however, most monitors should be capable of at least 120Hz, so why not more.

The upper threshold would be the point where the frame rate is so high that it would make us blind looking at it. If there is such a point, I want to know what it is.
 
Chalnoth said:
Thowllly said:
Probably neither of the two.

I don't even understand why this should need explaining. One frame is displayed twice, then a bunch of frames are displayed once each, then one frame is displayed twice... and so on... as long as the GPU is able to render a frame in less than 1/85th of a second, you'll get a steady 60fps with vsync and double buffering with an 85Hz refresh.
The framerate wouldn't be sitting at 60fps in this case, would it?

The average framerate would be 60fps.

This can't happen because the video card can't do anything if it is double buffering and waiting for the screen to refresh. If you look at framerate graphs with vsync enabled and triple buffering disabled, you'll notice that the framerate is always the screen's refresh divided by some positive integer.

When the game doesn't limit the framerate, then yes, the framerate will (usually) be the screen's refresh divided by some positive integer. But it doesn't work like that if the game limits the framerate artificially. If the GPU is capable of rendering at the full refresh rate, then you'll instead get a framerate equal to the framerate limit of the game.

This is pretty obvious if you just think about it, but I've also actually tested it. 85hz+db+vsync in D3 gives 60fps; when the framerate drops, it drops directly from 60fps to 47fps. If D3 hadn't been fps limited then it would have gone from 85 to 47. Anytime I would normally get 85 I get 60 instead.

You can't seriously think that it's impossible to get more than 47.5fps (with vsync, 85hz & db) even if gpu would be able to render the scene in less than, say, one millionth of a second?
 
And for those curious, here is what the average FPS would be if the game is limited to 60 FPS but buffers the game states for drawing.

[attached graph: fps.gif]
 
Nice explanation, Cryect!



On another note, so far we have only discussed how many fps would satisfy us on its own. But, of course, there is a trade-off between the fps and the image quality. As long as we assume that we can get the maximum fps anyway, we want to enable all the eye candy we can get, use the highest resolution our monitor can support and the maximum amount of AF/AA our card (or the game) supports.

And as long as you have the best videocard and play a game that is made to run on a pretty old videocard as well, you can do so. But that changes if you turn things around: if you play a game that offers almost unlimited eye candy at the expense of frame rate, you need to find a setting in between.

Like: do I want transparent water or a higher resolution? HDR lighting instead of anti-aliasing? More geometry and better bump and normal mapping, or deeper fog, or more particle effects? And on and on.

For someone who likes to play multiplayer FPS, winning might require the very highest framerate that can be reached (if the game logic ramps up with the framerate and isn't limited by the server), or the lowest latency (to be able to pull that trigger just a fraction faster). But with most newer FPS, that won't be an issue anymore, except for very low frame rates.

And for playing something that allows that maximum eye-candy and isn't time critical (like EQ2 or such), you might opt for nicer effects and turn up your resolution and AF/AA until the game starts to slow down noticeably. But you can only do that as long as the game is built with more options than you can enable at the same time.

So, if we turn the question around: how much and which effects and other improvements are you prepared to sacrifice to reach your preferred frame rate?
 
Thowllly said:
You can't seriously think that it's impossible to get more than 47.5fps (with vsync, 85hz & db) even if gpu would be able to render the scene in less than, say, one millionth of a second?
Actually, this means that the CPU is also preparing frames more quickly than 85 fps (i.e. it's taking less than 1/85th of a second to prepare each frame).

Regardless, for this to work well it requires that there be a decent buffer in the driver between the output from the CPU and what the GPU reads.
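
To put a rough number on what that buffering costs, here is a trivial estimate: each buffered frame between the game and the display adds about one frame time of delay. The three-frames-in-flight figure is just an example, not a measured value for any particular driver.

Code:
#include <cstdio>

int main() {
    const double fps              = 60.0;  // the game's frame rate
    const int    frames_in_flight = 3;     // e.g. CPU buffer + GPU queue + scan-out (illustrative)

    double frame_time_ms = 1000.0 / fps;
    double latency_ms    = frames_in_flight * frame_time_ms;
    std::printf("~%.1f ms from input to display with %d frames in flight\n",
                latency_ms, frames_in_flight);  // ~50 ms
    return 0;
}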
 
Cryect said:
jimpo said:
How is the refresh rate cap actually implemented?

Heh, alright I don't like how anyone else has explained it yet so will show you.

Whoah, some explanation. I am at work right now so my brain is sleeping, will have to read that again back at home :) Thanks a lot!
 
Hmm, a long time ago I wrote a little article called 30 vs 60 FPS. In it I gave some physiological information about the eye, how it perceives things, etc. etc. Too bad it had a lot of inaccuracies, but I think I was on the right track. Here are a few things that I have learned over the years:

The human eye is very, very sensitive. While Chalnoth says that it takes our eye 1/8th of a second to get a response, that isn't exactly true. It may take 1/8th of a second for our eye to move to a certain part of the screen and focus there, but we are still receiving tons of information even while the eye is in motion. The rods and cones can detect changes that are as little as one photon, and these rods and cones then transmit that information to the brain. So while the eye takes that time to move and refocus, it is still receiving information that our brain can process. For example, I may be staring at the center of my monitor, but I can still perceive my fingers moving on the keyboard, the Post-it notes on the edge of my monitor, and out of my extreme periphery I can see the bookshelves next to my desk. In this time, I am getting a constant stream of information from my eyes, as they essentially receive an infinite amount of information from the universe (infinite in that light is both a particle and a wave - really confusing stuff if you ask me).

So, now that we have it out of the way that our eyes are not necessarily limiting us, let's go a little further. Chalnoth talks about motion blur, and he has some solid reasoning behind this. For computers though, if they are rendering scenes fast enough, and the monitor is refreshing itself fast enough, we will start to see some basic blur. The reason behind blur is that the eye cannot focus on fast moving objects. So that fast moving snowflake heading towards your car's headlights will give the blur effect (there are other things at play here also, such as the eye trying to adjust to a very bright object, and the effect of neurotransmitters still stimulating a nerve ending). The human eye cannot focus quickly enough, so we see this effect. In film, the effect can really be seen, but that is mostly due to the camera filming at 24 fps (which is really low). Though in a theater that 24 fps film is refreshed at 3x that, so we don't see any of the flicker that running at a really slow 24 refreshes would exhibit.

For the next step, let's go to Imax. This has a couple of really interesting features to it. First of all it is filmed at a full 60 fps, and then it is refreshed at either 120 Hz or 180 Hz (I am not sure which, but one of the two). The developers of Imax discovered two things: the first was that 24 fps caused massive artifacts in filming and then projecting onto the huge screen, the second was that at 60 Hz refresh the flashing that was experienced at the peripheral vision (due to the fact that it is made up almost entirely of very sensitive rods vs cones) made people nauseous after a while. Once they increased the refresh rate, the nausea went away.

Ok, that was step two, let's take another example leading to our conclusion. I believe ILM did an experiment with people on a roller coaster simulation. First they filmed the coaster ride at 60 fps, then at 120 fps. They then put a bunch of people in front of a screen and showed both films (both refreshed at 120 Hz). Apparently, with several groups of people, the incidence of motion-induced nausea was significantly higher for the 120 fps movie. It is theorized that the higher fps gave a more visceral experience. Not highly scientific, but interesting nonetheless.

What I am leading up to is that the eye can detect even very small changes in the visual spectrum, so the higher the fps, the better that experience will be (and the more realistic for the eye). Now, I believe 60 fps should be the minimum for a good overall experience (and this is game dependent in many ways), but essentially all applications would see improvements if they rendered at this speed. Now, 85 is a nice number because the frames can sync with the refresh rate (giving nearly flicker-free enjoyment), all the while not exposing the visual artifacts of the monitor rendering one frame at the top of the refresh, and going to the next frame in the middle of that same refresh. This will give a good gaming experience for nearly everyone, and it is at a refresh rate that most monitors can handle at decent resolutions.

Now, my dream setup would be a monitor that could handle 1600x1200 at 120 Hz, and run the application at that resolution and framerate, and add in 4X AA and 8X AF to the mix. I think that would give the greatest experience that current technology could deliver. Unfortunately, a monitor like that is VERY expensive, and most games aim for the 60 fps mark in terms of delivering speed and quality.

Now, if future hardware could deliver a monitor that can do a 240 Hz refresh, and video cards could render at high quality with AA at 240 fps, then I think that if you put the 120 and 240 computers side by side, we would still notice a difference between the two. Most likely though, we are starting to hit the point of diminishing returns. It isn't economically feasible to produce a monitor that could do 240 Hz at 1600x1200 at this time, though an SLI setup could possibly deliver such frames at 4X AA on an older title (but not on Doom 3 obviously... since it is locked at 60 fps).

Anyway, don't underestimate your visual subsystem's overall abilities. It is a very precise and sensitive group of structures. On the computer, higher FPS will always give a better illusion of continuity, so there really shouldn't be a cap at 60 fps on any application when it comes to rendering. I must admit I much prefer Counter-Strike: Source running at 1024x768 at 120 Hz / 120 FPS to Doom3 at 120 Hz / 60 FPS (it is night and day when it comes to smoothness, and in a firefight with quick motions I find that I am far more accurate in CS than in Doom).

Just my simple observations here.

Edit: made some simple corrections.
 
JoshMST said:
The human eye is very, very sensitive. While Chalnoth says that it takes our eye 1/8th of a second to get a response, that isn't exactly true. It may take 1/8th of a second for our eye to move to a certain part of the screen and focus there, but we are still receiving tons of information even while the eye is in motion. The rods and cones can detect changes that are as little as one photon, and these rods and cones then transmit that information to the brain. So while the eye takes that time to move and refocus, it is still receiving information that our brain can process.
No, definitely not. The 1/8th of a second is not the time it takes the eye to move and refocus. It is the time (approximately: it varies depending upon brightness and color) that it takes one receptor to "expose" and send its signal to the brain. The reason why you can detect changes faster than this is that the rods and cones are not synchronized, such that you're always getting updated data from your eyes to your brain. This means that you can detect quick motion in a shorter time, but can't resolve detail until you focus for about 1/8th of a second (note: cones dominate our vision for anything we are focusing on).

It also means that many frames of animation are summed up in our eyes.

So, now that we have it out of the way that our eyes are not necessarily limiting us, let's go a little further. Chalnoth talks about motion blur, and he has some solid reasoning behind this. For computers though, if they are rendering scenes fast enough, and the monitor is refreshing itself fast enough, we will start to see some basic blur.
No. You will see no blur. You will just see more frames. The frames themselves are still completely discrete. This is akin to temporal aliasing. If you want to get rid of this aliasing, you actually have to do something to break it up, not just increase the number of frames.

the second was that at 60 Hz refresh the flashing that was experienced at the peripheral vision (due to the fact that it is made up almost entirely of very sensitive rods vs cones) made people nauseous after a while. Once they increased the refresh rate, the nausea went away.
This is because the rods (if I remember my terminology correctly: I'm talking about whichever the sensors in the eye are that detect mostly differences in brightness) are mostly at the periphery of vision, and respond much more quickly than cones. This is what allows the flicker to become visible.

So, I suppose to talk about the human visual system, you do have to talk about its two different parts: the rods and cones. The rods respond to brightness, and respond very quickly. These dominate our peripheral vision and our vision in the dark. Cones respond to color, much more slowly, and dominate the vision of objects we focus upon. They therefore are the dominant form of vision that is important when talking about games. Rods are more important for flicker, cones for the image that is displayed.

But regardless, this is not important. The important thing is that rendering multiple discrete frames adds aliasing. The only way to break up that aliasing is not to render discrete frames, i.e. motion blur.
 
Chalnoth said:
That's an understatement. It takes the eye about 1/8th of a second to get a full image.
That is a misunderstanding. Temporal summation in the visual system indeed happens in a 1/8-second window, but that is a different thing.

1. Our eyes naturally follow objects that we're focusing on.
True.

2. Quick movements will take less than that 1/8th of a second to register because we don't need to wait for everything in our vision to update
Any visual stimulus needs less than that to be registered. You are stuck with the erroneous idea that the window of temporal integration is the time needed to form a visual perception.

The reason for the fact that moving objects do not look blurred, despite the 1/8-s temporal integration, is that the brain analyses form and motion independently.
 