Fixed framerate: 60 fps is enough

60 Hz max and the best IQ that can be done at that?

  • Just no
  • 30 Hz minimum and higher IQ would be better
  • I don't care

  • Total voters
    226
AFAICS, smudging out triangles as a motion blur technique has serious correctness issues: if you have 2 objects, with the 1st object staying behind the 2nd object all the time, and both objects moving in the same direction on the screen, smudging object 2 may allow object 1 to incorrectly shine through, for some rather odd rendering artifacts.
 
Yeh, 60 fps is noticeable. If I'm playing AI then I can get away with <60, but if I'm playing 15-yr-old skool kids online who will chant "0wn3d" if they kill me, then definitely 70+ :LOL:
 
arjan de lumens said:
AFAICS, smudging out triangles as a motion blur technique has serious correctness issues: if you have 2 objects, with the 1st object staying behind the 2nd object all the time, and both objects moving in the same direction on the screen, smudging object 2 may allow object 1 to incorrectly shine through, for some rather odd rendering artifacts.
Well, it depends upon how you write to the z-buffer with the "smudged" triangle. Depending upon how you choose to do it, you'll probably get these artifacts roughly half the time. But the point is that these problems should be pretty small if the framerate remains fairly high.
 
One more for "depends". For competitive multiplayer shooters, more is definitely needed. For other types of games, 60 is probably just fine.
 
arjan de lumens said:
Padman said:
Reality doesn't have motion blur.....
Not by itself, but the human eye adds a bit of motion blur on its own.
That's an understatement. It takes the eye about 1/8th of a second to get a full image. So, why don't things look blurry?

1. Our eyes naturally follow objects that we're focusing on.
2. Quick movements will take less than that 1/8th of a second to register because we don't need to wait for everything in our vision to update (cells in the eye are continually sending their signals).

So, if you want to see why nothing a computer can display is enough (without motion blur), just wave your hand between your face and a wall. Then wave it in front of the monitor. Note the difference.

What motion blur will get us, then, is the ability for a game to simulate properly what the real world would look like, provided that the player's eyes are always focused on the center of the screen. Unfortunately, there's no remedy for objects moving across the screen looking blurry, if the player chooses to focus on those objects. This is why we really need high framerates as well as motion blur for interactive games.
 
60 fps is fine for me. The only time I would say it would be too low is in the case of fast-paced multiplayer games (Quake, UT) where input polling is tied to frame rate - in that instance more is always better (as it lowers latency).
 
Diplo said:
60 fps is fine for me. The only time I would say it would be too low is in the case of fast-paced multiplayer games (Quake, UT) where input polling is tied to frame rate - in that instance more is always better (as it lowers latency).

Yes, but that is about to change. Take Doom 3: even if you render 180 fps, only 60 frames will be unique, since the game logic is executed 60 times per second. And it should even be the case that if your framerate drops to 30, your latency stays the same as at 180 fps. So the only difference should be visual.
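The decoupling described above can be sketched as a standard fixed-timestep loop. This is an illustration of the idea, not Doom 3's actual code, and all the names are hypothetical:

```python
TICK = 1.0 / 60.0  # fixed game-logic timestep: 60 updates per second

def simulate(frame_times, update_logic, render):
    """Drive logic at a fixed TICK regardless of how many frames are drawn.

    frame_times: timestamps of successive rendered frames, in seconds.
    Input would be polled inside update_logic, so latency tracks the
    60 Hz tick rate, not the render rate.
    """
    accumulator, previous = 0.0, frame_times[0]
    for current in frame_times[1:]:
        accumulator += current - previous
        previous = current
        while accumulator >= TICK:   # catch up in fixed-size steps
            update_logic(TICK)
            accumulator -= TICK
        render(accumulator / TICK)   # leftover fraction, usable for interpolation
```

Feeding this loop frame timestamps at 180 fps produces 180 render calls but only about 60 logic updates per second; at 30 fps the logic still ticks 60 times a second, so input latency is unchanged, exactly as described above.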
 
Chalnoth said:
arjan de lumens said:
Padman said:
Reality doesn't have motion blur.....
Not by itself, but the human eye adds a bit of motion blur on its own.
That's an understatement. It takes the eye about 1/8th of a second to get a full image. So, why don't things look blurry?

1. Our eyes naturally follow objects that we're focusing on.
2. Quick movements will take less than that 1/8th of a second to register because we don't need to wait for everything in our vision to update (cells in the eye are continually sending their signals).

So, if you want to see why nothing a computer can display is enough (without motion blur), just wave your hand between your face and a wall. Then wave it in front of the monitor. Note the difference.

What motion blur will get us, then, is the ability for a game to simulate properly what the real world would look like, provided that the player's eyes are always focused on the center of the screen. Unfortunately, there's no remedy for objects moving across the screen looking blurry, if the player chooses to focus on those objects. This is why we really need high framerates as well as motion blur for interactive games.

Theoretically, this sounds perfect. But can we really notice the difference in practice at high fps? And if so, up until what framerate? Your eyes aren't really all that fast for small movements, and have a chemical refresh as well. And if you are going to use motion blur anyway, there won't be any difference in the received motion picture between high speed and really very high speed.

So, at what fps do you think a further increase wouldn't be noticeable?
 
DiGuru said:
Theoretically, this sounds perfect. But can we really notice the difference in practice at high fps? And if so, up until what framerate?
Try the hand-waving example. Essentially, how we'll know the difference depends entirely upon what is being rendered. The problems that you'll notice with framerates not being high enough, provided they are well above the playable range, will be essentially just like any other form of aliasing. They'll be most noticeable when you're viewing an object moving in a very periodic way, such as a spinning fan.

But, for an example that's more common, just try strafing near a corner. You should notice what looks essentially like a ghosting effect, provided your framerate is pretty high: instead of just one corner moving smoothly across your vision, you'll see multiple copies of that same corner. This is an artifact of our eyes summing up multiple frames, with each frame being taken at a discrete time. This has nothing to do with the speed of our eyes, really. The frames are already being rendered much faster than our eyes can see them. The problem is that they're still discrete.

So, to put it simply, no matter how high a framerate you have, there will always be some situation where it'll be noticeable that you're still rendering discrete frames. The only way to stop this is to either break up the discrete frames (the stochastic rendering example I gave earlier), or use some method to not render discrete frames (the "smudging" motion blur I mentioned earlier).
 
Chalnoth said:
DiGuru said:
Theoretically, this sounds perfect. But can we really notice the difference in practice at high fps? And if so, up until what framerate?
Try the hand-waving example. Essentially, how we'll know the difference depends entirely upon what is being rendered. The problems that you'll notice with framerates not being high enough, provided they are well above the playable range, will be essentially just like any other form of aliasing. They'll be most noticeable when you're viewing an object moving in a very periodic way, such as a spinning fan.

But, for an example that's more common, just try strafing near a corner. You should notice what looks essentially like a ghosting effect, provided your framerate is pretty high: instead of just one corner moving smoothly across your vision, you'll see multiple copies of that same corner. This is an artifact of our eyes summing up multiple frames, with each frame being taken at a discrete time. This has nothing to do with the speed of our eyes, really. The frames are already being rendered much faster than our eyes can see them. The problem is that they're still discrete.

So, to put it simply, no matter how high a framerate you have, there will always be some situation where it'll be noticeable that you're still rendering discrete frames. The only way to stop this is to either break up the discrete frames (the stochastic rendering example I gave earlier), or use some method to not render discrete frames (the "smudging" motion blur I mentioned earlier).

Yes, but if you are going to use motion blur anyway, you won't see distinct copies anymore, just an extended blur broken up over multiple frames. And in that case, the only difference between high and very high framerates would be that the distance covered per frame is smaller. The thing you see wouldn't change, as you are already compensating for it.

There has to be a certain frame rate at which it isn't visible anymore; going beyond that does not improve it. You would introduce another kind of choppiness if the number of frames isn't in sync with the screen refresh, so you would want it to be the same or an integer multiple. Otherwise your computed motion blur wouldn't track with the frames displayed.

And since motion blur is a compensation for the visual effect in the first place, you could actually get away with a lower framerate than the minimum visible one, compared to when you wouldn't use it.

The latency depends on two things: the frequency of game state updates (processing the input) and the number of intermediate screen buffers/processing steps. If you use free-running game logic in a separate thread or process, it would make sense to add motion blur to cover the time between frames, but adding more frames wouldn't make a difference; that would mostly just add more latency and duplicate frames, which would even be counterproductive and make the motion blur less effective.

Especially with multiplayer games, it is quite desirable to have the game logic run at a fixed speed. So motion blur would be a good thing to compensate for the distinct frames, but it makes no sense to go above that speed.
 
It'd be nice if ATI or anyone else with a hardware accumulation buffer shipped a driver mode, say "VSync w/ Motion Blur". In this mode the video card would use a triple-buffer scheme: one buffer is the current frame being calculated, one is the frame being shown, and one is a running accumulator.

Ideally each of these buffers would be floating point and the RAMDAC would support dividing by a number; then the card would just swap between showing the current frame and the current accumulator on each VSync. Each time the video card finishes drawing a frame, it just adds it to the accumulator.

Of course this would be slower than just showing all the frames individually, but considering how many old games this would work great for, it would be nice. It's not hard to get 300+ fps in Quake 3 or older games, so you could easily blend 2 to 3 frames for each displayed frame at 85 Hz, or 4 to 5 for those using LCDs at 60 Hz.

Edit: Also, if this mode added sub-pixel jitter to the rendering, you would get additional AA as well.
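The accumulator idea above can be sketched in software. This is a toy illustration, not an actual driver feature; frames are represented as flat lists of greyscale pixel values:

```python
def accumulate(subframes):
    """Average several rendered subframes into one displayed frame,
    mimicking the hardware accumulation buffer plus the RAMDAC divide."""
    n = len(subframes)
    acc = [0.0] * len(subframes[0])
    for frame in subframes:          # "add each finished frame to the accumulator"
        for i, px in enumerate(frame):
            acc[i] += px
    return [v / n for v in acc]      # the divide the RAMDAC would perform
```

A bright pixel that moves one position per subframe averages into an even smear along its path, which is exactly the motion-blur trail the scheme is after.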
 
DiGuru said:
So, at what fps do you think a further increase wouldn't be noticeable?

I've conducted quasi-scientific tests :) (single blind) on Q3 players, that unequivocally showed that 125 fps was not enough to be unnoticeable.

Actually, it's pretty easy to calculate: assume a 90-degree FOV and a 1600-pixel horizontal resolution, then work out how fast your angular motion can be before you exceed a single pixel of movement frame-to-frame. Change the number of pixels allowed to jump to find different allowed angular velocities. This will show you that the frame rate necessary to avoid artifacting is high.
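That back-of-the-envelope calculation can be written out directly (a sketch with hypothetical names; the angle is measured at the screen centre, where pixels subtend the smallest angle):

```python
import math

def max_pan_dps(fov_deg=90.0, width_px=1600, fps=60.0, jump_px=1.0):
    """Max camera pan rate (degrees/second) before the image shifts
    more than jump_px pixels between consecutive frames."""
    half_fov = math.radians(fov_deg / 2.0)
    # Angle subtended by jump_px pixels at the centre of the screen:
    px_angle = math.degrees(math.atan(2.0 * jump_px * math.tan(half_fov) / width_px))
    return px_angle * fps
```

With the defaults this comes out to roughly 4 degrees per second at 60 fps; turning around in a shooter is orders of magnitude faster, which is the point: the frame rate needed to keep frame-to-frame jumps below a pixel is enormous.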

There's a difference between movies and games in whether motion blur can be used to alleviate the visual problems of limited frame rate. As Chalnoth pointed out, in games it is typically not possible to say what the player will track with his eyes. When an attacker sweeps in from the left, it is a terrible assumption that the player will keep his eyes fixed at the center of the screen, and to blur the attacker accordingly. Nor can the game assume we follow that attacker and won't instead follow attacker B sweeping in from the right. Motion blur, with trivial exceptions such as blurring the audience in racing games (where we can assume the player largely tracks the tarmac), just doesn't make sense in games.

PS. Limited framerate is a HUGE problem in motion pictures too. It is handled, sometimes skillfully, sometimes less so, simply by not letting any object of interest move quickly across the field of view.
I quote Martin Scorsese:
"As it is, filmmakers know that any pictorial element that moves across the frame too briskly will fragment into blurred, jagged, “strobing” pieces. So we have rules (frequently ignored) about how quickly any given lens can be panned, or how quickly an object can be allowed to travel from one side of the frame to the other in order to prevent these motion distortions. We are forced to “pan” moving objects (which keeps them stationary in the frame) in order to prevent this strobing, or accept these distortions and hope that sound effects will carry the viewer’s suspension of disbelief past these visual anomalies."
 
Entropy said:
DiGuru said:
So, at what fps do you think a further increase wouldn't be noticeable?

I've conducted quasi-scientific tests :) (single blind) on Q3 players, that unequivocally showed that 125 fps was not enough to be unnoticeable.

What monitor were you using that was capable of more than 125 Hz? I can't think of any that support decent resolutions above 120 Hz, and of course the lower the resolution, the less fps is needed, as you have shown.
 
Cryect said:
Entropy said:
DiGuru said:
So, at what fps do you think a further increase wouldn't be noticeable?

I've conducted quasi-scientific tests :) (single blind) on Q3 players, that unequivocally showed that 125 fps was not enough to be unnoticeable.

What monitor were you using that was capable of more than 125 Hz? I can't think of any that support decent resolutions above 120 Hz, and of course the lower the resolution, the less fps is needed, as you have shown.

Most higher-end CRTs support refresh rates in the 160-200 Hz region.
We used Sony and Mitsubishi 21" and 22" monitors.

The problem of the monitors not going much higher, and the inability of our systems at the time to reliably produce a rock-solid 200 fps (= averages of roughly 500 fps), was the reason we couldn't test beyond 125 Hz with any quality. The tests, as far as we could go, were crystal clear though: 10 out of 10 all around.

OLED is a technology that may come to the rescue in the reasonably near future. Note the 0.01-millisecond response time of the Sony OLED. But of course, that requires an interface capable of a lot higher bandwidth than the dismal DVI.
 
FPS is overrated. I played Wing Commander on the Amiga at probably 5 fps and had lots of fun ;)

But more seriously, I think 60 fps is plenty, for the eye at least. I hear that some games feel better to some players at 100 fps or more, and I think they might be right. Makes no difference to me, though. I'd certainly like to conduct a blind test some day: a game played at 5 different (random) framerates, to check whether those gamers really notice the difference, or if it's just some psychological trick when they can see the framerate in the upper-right corner.
 