Fixed framerate: 60 fps is enough

60 Hz max and the best IQ that can be done at that?

  • Just no
  • 30 Hz minimum and higher IQ would be better
  • I don't care

  Total voters: 226
Sandwich said:
How would you notice, if all the feedback you get is 60 frames per second? You'll never see the other 140 frames, so how is that better?
I said you won't see 200 discrete (whole) frames per second. If you have v-sync disabled you will see multiple partial frames per monitor refresh.
 
Chalnoth said:
There is no integration going on here that would affect that one bit.

What would you consider the physics engine that is handling movement?

Bolloxoid said:
Sandwich said:
How would you notice, if all the feedback you get is 60 frames per second? You'll never see the other 140 frames, so how is that better?
I said you won't see 200 discrete (whole) frames per second. If you have v-sync disabled you will see multiple partial frames per monitor refresh.

Unfortunately that doesn't look very nice due to the tearing on fast-moving objects.
 
Cryect said:
Chalnoth said:
There is no integration going on here that would affect that one bit.
What would you consider the physics engine that is handling movement?
Sure, but that's only going to affect the path that things follow. It's not going to affect how smoothly they follow those paths.

I said you won't see 200 discrete (whole) frames per second. If you have v-sync disabled you will see multiple partial frames per monitor refresh.
Unfortunately that doesn't look very nice due to the tearing on fast-moving objects.
It also isn't a very good way of looking at it: if you are looking at any relatively small region of the screen, that region is still going to be refreshing at 60 Hz.
 
Chalnoth said:
Cryect said:
Chalnoth said:
There is no integration going on here that would affect that one bit.
What would you consider the physics engine that is handling movement?
Sure, but that's only going to affect the path that things follow. It's not going to affect how smoothly they follow those paths.

I do not think the physics engine would become significantly less accurate at 'only' 60 FPS when compared to 200.
I could see a problem when a game depends heavily on numerical methods for differential equations, but I would imagine that objects like bullets simply follow P1 = P0 + V*Dt. Straightforward stuff.

Sure, vehicles may be slowed by drag (Fd = f(v)) and stuff like that, but it's still all relatively simple. As long as the game isn't crawling at 1 fps, the physics engine should function OK.
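
To put rough numbers on that, here is a minimal sketch of the kind of step described above (the drag coefficient and speeds are made-up illustration values, not from any actual engine): constant-velocity motion with P1 = P0 + V*dt lands on exactly the same positions whether you step at 60 Hz or 200 Hz, while a velocity-dependent drag term picks up only a small, step-size-dependent error.

```cpp
#include <cmath>
#include <cstdio>

// Illustrative explicit-Euler steps, not taken from any real engine.
// Constant velocity: P1 = P0 + V*dt is exact for any dt.
// Linear drag dv/dt = -k*v: Euler picks up an error that shrinks with dt.
int main() {
    const double k  = 2.0;    // assumed drag coefficient (1/s)
    const double v0 = 30.0;   // initial speed (m/s)
    const double T  = 1.0;    // simulate one second

    for (double hz : {60.0, 200.0}) {
        const double dt = 1.0 / hz;
        const int steps = static_cast<int>(std::lround(T * hz));
        double pos = 0.0;         // constant-velocity "bullet"
        double v_drag = v0;       // dragged "vehicle"
        for (int i = 0; i < steps; ++i) {
            pos    += v0 * dt;            // exact regardless of dt
            v_drag += -k * v_drag * dt;   // O(dt) integration error
        }
        std::printf("%3.0f Hz: bullet at %.3f m, dragged speed %.3f m/s (exact %.3f)\n",
                    hz, pos, v_drag, v0 * std::exp(-k * T));
    }
}
```

Both rates land within a few percent of the exact drag solution, which is in line with the point above: the step size matters far more to heavy numerical methods than to simple kinematics.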

What about user input? Say a 60 Hz display, with vsync and triple buffering. I think players can perceive a latency of 17 ms between pressing the button and the shot appearing on screen. At 60 fps the latency between doing and seeing is 17 to 33 ms.
At 200 fps (on the 60 Hz display) that latency would be 5 to 17 ms, depending on when you press during the vertical retrace.
In theory the extra 140 frames should make things feel twice as smooth. But in practice?
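
As a sanity check on those figures, here is a toy frame-timing loop under assumed simplifications (input sampled at the start of each rendered frame, frames rendered back to back, each finished frame picked up at the next 60 Hz refresh thanks to triple buffering, no extra driver queueing). The exact worst case depends on how frame completions happen to line up with refreshes, so the numbers only roughly match the ranges above, but the halving of the average latency comes out clearly.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Toy model, not a measurement of any real driver stack:
//  - frames start back to back every 1/fps s and sample input at their start
//  - a finished frame is shown at the next 60 Hz refresh (triple buffering,
//    so rendering never blocks on the display)
//  - latency = time from a button press to the first refresh showing a frame
//    whose input was sampled after the press
int main() {
    const double refresh = 1.0 / 60.0;
    for (double fps : {60.0, 200.0}) {
        const double frame = 1.0 / fps;
        double lo = 1e9, hi = 0.0, sum = 0.0;
        const int presses = 200000;
        for (int i = 0; i < presses; ++i) {
            const double t        = i * 0.9999e-4;  // sweep press times
            const double sampled  = std::ceil(t / frame - 1e-9) * frame;
            const double finished = sampled + frame;
            const double shown    = std::ceil(finished / refresh - 1e-9) * refresh;
            const double lat = shown - t;
            lo = std::min(lo, lat); hi = std::max(hi, lat); sum += lat;
        }
        std::printf("%3.0f fps on a 60 Hz display: %4.1f to %4.1f ms, mean %4.1f ms\n",
                    fps, lo * 1e3, hi * 1e3, sum / presses * 1e3);
    }
}
```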
 
Sandwich said:
I do not think the physics engine would become significantly less accurate at 'only' 60 FPS when compared to 200.
...
There are examples of raw accuracy differences (Q3 jumping distances, as mentioned above), and IIRC in CS there are weapons that often fire faster than your updaterate, giving you different weapon recoil at different rates. However, I think this is beside the point, as these are implementation details and not that common. What matters is game mechanics, where in some games the game-state updaterate is very important, as discussed below.
What about user input? Say a 60 Hz display, with vsync and triple buffering. I think players can perceive a latency of 17 ms between pressing the button and the shot appearing on screen. At 60 fps the latency between doing and seeing is 17 to 33 ms.
At 200 fps (on the 60 Hz display) that latency would be 5 to 17 ms, depending on when you press during the vertical retrace.
In theory the extra 140 frames should make things feel twice as smooth. But in practice?
The most obvious way update rates affect game mechanics is the amount of latency added to reaction time. I would say this makes a small but not insignificant statistical difference in several games. For example, every CS vet knows a connection with 20ms latency gives a clear statistical advantage over a player on 40ms network latency. Likewise, a client running at 100Hz throughout has a corresponding advantage over a 50Hz client, although the difference in apparent reaction time is only half that of the other example (and assuming a 100Hz server).

But there are several other factors in CS dependent on the client updaterate that can add up under the right circumstances. Say two opponents come into view of each other, both running. To avoid the accuracy penalties built into the engine, they must both stop before shooting (unless it's close range, in which case the penalties will be statistically acceptable). It's been a while since I played so I can't say how long this takes (it takes a brief moment to stop), but I do know that on, say, a 50Hz client (or connection) it will be harder to time this correctly (due to temporal quantization) than on a 100Hz client, so to be sure to hit you will have a noticeable amount of latency added on top of the other latency differences. And I think I could come up with a few more examples from CS of factors that can sometimes add up in latency.
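
For a rough sense of the temporal quantization mentioned above (rates chosen as examples): an event that happens at a random moment between client updates waits, on average, half an update interval before the game can even register it, and at worst a whole interval, on top of any network latency.

```cpp
#include <cstdio>

// Extra delay introduced purely by update-rate quantization: an event landing
// at a random moment between updates waits half an interval on average and up
// to a full interval in the worst case. The rates below are just examples.
int main() {
    for (double hz : {50.0, 100.0}) {
        const double interval_ms = 1000.0 / hz;
        std::printf("%5.0f Hz updates: up to %4.1f ms extra, ~%4.1f ms on average\n",
                    hz, interval_ms, interval_ms / 2.0);
    }
}
```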

Then there's the issue of just the display rate. While this has been discussed a lot above, I don't think anyone has outright mentioned the effect display rate can have on reaction time, besides the "raw" latency. I instinctively think that the "biological" latency can be improved by a higher display rate in the case of aiming in shooting games: with a higher rate, you can pick up the motion vector of a target faster. I can't say I know much theory around this, though.

Anyway, I guess one of my main points is that I believe many non-gamers or casual gamers don't appreciate how much difference a few added milliseconds here and there make for a really good player, something Entropy has also been on to above. As an example, if you watch a recording of a top CS match, the time taken actually aiming (between reaction and shooting) is (IIRC) mostly 2-3 frames, sometimes a single frame, at 100Hz. And the team captain of one of the top CS teams said not that long ago (12-18 months?) that CRTs were still superior to flat displays for their purposes. In other words, display and rendering rate, as well as client game, network and server rates, *can* make a noticeable difference, but not in general, and definitely not for all users.

Hope this doesn't sound too confused; it's been a while since I discussed these things.
 
Looking at the poll, it seems that it is about 50/50. That is interesting, as I was really expecting the score here on B3D to be much in favor of the upper two options, with "I want to tune it myself" being a close third.
 
tobbe said:
For example, every CS vet knows a connection with 20ms latency gives a clear statistical advantage over a player on 40ms network latency.
Sure, but this is a totally different issue since the network delay is only the round-trip time for a ping. It's not the total delay that you see in the game, since you don't have infinite bandwidth. If lots of stuff changes in one frame, it can take a lot longer than 20ms or 40ms to get all that info updated.
 
Chalnoth said:
tobbe said:
For example, every CS vet knows a connection with 20ms latency gives a clear statistical advantage over a player on 40ms network latency.
Sure, but this is a totally different issue since the network delay is only the round-trip time for a ping. It's not the total delay that you see in the game, since you don't have infinite bandwidth. If lots of stuff changes in one frame, it can take a lot longer than 20ms or 40ms to get all that info updated.
For CS and several similar games, an update frame will fit into a single UDP packet, so bandwidth hardly affects latency in that way. The server updaterate does, but it's the same for all players.

In any case I only put that in for illustration: if a 20ms difference is noticeable, then a 10ms difference (the next example) could be noticeable.
 
tobbe said:
For CS and several similar games, an update frame will fit into a single UDP packet, so bandwidth hardly affects latency in that way. The server updaterate does, but it's the same for all players.
I'm not sure I believe that this is true all the time. It may be true for a fair portion of the gameplay, but every once in a while you're going to have to send a significant update to all players (for example, in a large firefight). That is what will be limited by bandwidth.

In any case I only put that in for illustration: if a 20ms difference is noticeable, then a 10ms difference (the next example) could be noticeable.
There's no logical connection between those two things.
 
Chalnoth said:
tobbe said:
For CS and several similar games, an update frame will fit into a single UDP packet, so bandwidth hardly affects latency in that way. The server updaterate does, but it's the same for all players.
I'm not sure I believe that this is true all the time. It may be true for a fair portion of the gameplay, but every once in a while you're going to have to send a significant update to all players (for example, in a large firefight). That is what will be limited by bandwidth.
For each player you need something like pos[3] and view[3], and a few other bits (fire, crouch, death, weapon switch/pickup/drop), plus a few extra bytes for your own player (possibly a reduction of health/armour, etc). With, say, 24 bits for each pos/view component, that's about 20 bytes per player. A really large firefight in CS would be 10 players, so ~200 bytes per downstream update. That's with all 10 in view, btw, because the server does a decent visibility cull of players to make things harder for z-buffer-type cheats.

The bandwidth is a ballpark figure (I don't have the protocol details), but there definitely aren't any big synchronization updates going on.
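
Spelling out that arithmetic as a sketch (the field sizes are the assumed ones from the post, not the actual Half-Life/CS wire format):

```cpp
#include <cstdio>

// Back-of-the-envelope update size using the assumed field widths from the
// post above (24-bit position/view components, a handful of flag bits). This
// is not the real Half-Life/CS protocol, just the same rough arithmetic.
int main() {
    const int pos_view_bits = 6 * 24;        // pos[3] + view[3]
    const int flag_bits     = 16;            // fire/crouch/death/weapon etc.
    const int per_player    = (pos_view_bits + flag_bits + 7) / 8;   // ~20 bytes
    const int players       = 10;            // "really large firefight"
    const int own_extra     = 8;             // health/armour etc. for yourself
    const int update_bytes  = players * per_player + own_extra;

    std::printf("~%d bytes per player, ~%d bytes per downstream update\n",
                per_player, update_bytes);
    for (int rate : {20, 50, 100})           // example server update rates
        std::printf("at %3d updates/s: ~%d bytes/s downstream\n",
                    rate, update_bytes * rate);
}
```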
 
tobbe said:
For each player you need something like pos[3] and view[3], and a few other bits (fire, crouch, death, weapon switch/pickup/drop), plus a few extra bytes for your own player (possibly a reduction of health/armour, etc). With, say, 24 bits for each pos/view component, that's about 20 bytes per player. A really large firefight in CS would be 10 players, so ~200 bytes per downstream update. That's with all 10 in view, btw, because the server does a decent visibility cull of players to make things harder for z-buffer-type cheats.

The bandwidth is a ballpark figure (I don't have the protocol details), but there definitely aren't any big synchronization updates going on.
So, first of all, if you use the above stuff, you're going to be using a custom number system. I doubt games do that. More likely it'll be 32 bits each for pos/view (maybe 16-bit ints for view: it's not very hard to pack those). There will also be bits for what type of weapon is used. But that doesn't change things significantly. The main thing is that each action (fire, crouch, death, etc.) needs to be attached to a time step to keep everything moving properly. So it's actually quite a bit more than that.

For example, I just joined a game of UT2k4 with 20 players. With a netspeed of 10000, my incoming bandwidth was sitting right around 9300 bytes/sec or so, with hundreds of "bunches" being sent each second. With the game taking up one quarter of my total available bandwidth (yeah, I know, crappy internet connection), I seriously doubt that bandwidth was not an issue.
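
Taking those figures at face value (and treating netspeed as a bytes-per-second cap; the 20 Hz server update rate below is an assumption for illustration, not something stated in the post):

```cpp
#include <cstdio>

// Arithmetic on the UT2k4 figures quoted above. The 20 Hz server update rate
// is an assumption for illustration, not a measured value.
int main() {
    const double observed_bps = 9300.0;    // reported incoming bytes/s
    const double netspeed_cap = 10000.0;   // configured netspeed (bytes/s cap)
    const int    players      = 20;
    const double assumed_hz   = 20.0;      // assumed server update rate

    const double per_update = observed_bps / assumed_hz;
    const double per_player = per_update / players;

    std::printf("using %.0f%% of the netspeed cap\n",
                100.0 * observed_bps / netspeed_cap);
    std::printf("~%.0f bytes per update, ~%.0f bytes per player per update "
                "(at an assumed %.0f updates/s)\n",
                per_update, per_player, assumed_hz);
}
```

Whether that per-player figure is dominated by positions, timestamps or other state obviously depends on the protocol details being argued about here.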
 
60 fps minimum (60+ sustained) framerate.

I don't care what average fps or what max fps I'm getting in a game, only the minimum counts. Because it occurs when the going gets tough (lots of monsters or other players on-screen), and that's when I need the smoothest possible display and best visibility to stay alive ;)

After that, IQ maxed out. I can't see any difference between 60 fps (minimum) and 100 fps (minimum), although some people can, so I prefer to burn the fillrate on IQ. But I sure can tell when an "average 60 fps" lets me down in a tight spot :p

[Edit: tpyo.]
 
Personally, I think 100 FPS minimum would be great - but being realistic I could settle for 60. What would be very good for fooling you about your framerate is motion blur. Determine whether an object has moved by a certain number of pixels between frames; if so, render it recursively to a texture a number of times equal to the number of pixels it has traversed. That way you will find it much, much harder to notice a drop in framerate. The only downside is that in many situations that is likely to drop your overall framerate anyway :?
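
A toy, CPU-side version of that idea (a real renderer would do this with render-to-texture passes on the GPU; the 9x9 "object", image size and weights below are all made up for illustration): when the object has moved N pixels since the previous frame, draw it N times along the motion vector, each pass weighted 1/N, so it ends up smeared over its path.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Toy accumulation-style motion blur on a CPU framebuffer.
struct Image {
    int w, h;
    std::vector<float> px;
    Image(int w_, int h_) : w(w_), h(h_), px(w_ * h_, 0.0f) {}
    // Add a (2*half+1)-wide box of brightness at (cx, cy).
    void splat(float cx, float cy, int half, float weight) {
        for (int y = std::max(0, int(cy) - half); y <= std::min(h - 1, int(cy) + half); ++y)
            for (int x = std::max(0, int(cx) - half); x <= std::min(w - 1, int(cx) + half); ++x)
                px[y * w + x] += weight;   // accumulate, don't overwrite
    }
};

void draw_blurred(Image& img, float x0, float y0, float x1, float y1) {
    const float dist = std::hypot(x1 - x0, y1 - y0);        // pixels moved since last frame
    const int passes = std::max(1, int(std::ceil(dist)));   // one pass per pixel of motion
    for (int i = 0; i < passes; ++i) {
        const float t = (passes == 1) ? 1.0f : float(i) / float(passes - 1);
        // interpolate along the motion vector and accumulate with equal weight
        img.splat(x0 + (x1 - x0) * t, y0 + (y1 - y0) * t, 4, 1.0f / passes);
    }
}

int main() {
    Image frame(64, 16);
    draw_blurred(frame, 8.0f, 8.0f, 48.0f, 8.0f);   // object moved 40 px this frame
    for (int x = 0; x < frame.w; ++x)               // crude dump of the centre scanline
        std::putchar(frame.px[8 * frame.w + x] > 0.05f ? '#' : '.');
    std::putchar('\n');
}
```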
 
Dave B(TotalVR) said:
The only downside is that in many situations that is likely to drop your overall framerate anyway :?
For this reason I see motion blur as a way to reduce temporal aliasing instead of increasing your effective framerate (i.e. beyond monitor's refresh). That is, when the framerate is already high, use motion blur to remove some of the aliasing artifacts that are seen.
 
Indeed, the other problem is working out when it can be used, because I imagine it wouldn't be so easy to predict the amount of time it's going to take to render the next frame. I guess you could go by how long it took to render the last one, but hmm.... :|
 
Dave B(TotalVR) said:
Indeed, the other problem is working out when it can be used, because I imagine it wouldn't be so easy to predict the amount of time it's going to take to render the next frame. I guess you could go by how long it took to render the last one, but hmm.... :|

That's one reason why you would want to have a fixed frame rate :D
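
On the "go by how long the last frame took" idea: one simple approach (sketched here with made-up numbers, not taken from any particular engine) is to keep a smoothed estimate of recent frame times and only spend the extra blur passes when that estimate leaves headroom under the frame budget. A fixed frame rate makes the budget check trivial, which is the point above.

```cpp
#include <cstdio>

// Sketch of a "go by how long the last frame took" heuristic: keep a smoothed
// frame-time estimate and only spend extra blur passes when there is headroom
// under the target budget. All numbers are illustrative.
struct FrameTimePredictor {
    double estimate_ms = 16.7;   // start at the 60 Hz budget
    double alpha       = 0.5;    // smoothing factor; higher reacts faster

    void record(double last_frame_ms) {
        estimate_ms += alpha * (last_frame_ms - estimate_ms);
    }
    bool headroom_for(double extra_ms, double target_ms = 16.7) const {
        return estimate_ms + extra_ms < target_ms;
    }
};

int main() {
    FrameTimePredictor pred;
    const double frames_ms[] = {8.0, 9.0, 10.0, 18.0, 9.0};  // measured frame times
    for (double ms : frames_ms) {
        pred.record(ms);
        std::printf("estimate %5.1f ms -> 4 ms of extra blur passes: %s\n",
                    pred.estimate_ms, pred.headroom_for(4.0) ? "yes" : "skip");
    }
}
```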
 

We are too human. However, I can easily see the difference between 60 fps and 165 fps in many games, while it's not as noticeable in others.

Even within the same genre, Vampire Survivors at 165 fps feels very different from playing it at 60 fps, and when you play it at 30 fps it seems like a slide show.

However, in Army of Ruin, which is another auto-shooter, I prefer to play at 165fps but 60fps is okay.
 
I can easily tell the difference between 60Hz and 144Hz just moving the mouse around the desktop. Even my mother who is >60yo and doesn't know how to use a computer can tell. My cat who liked to track my cursor around could probably tell but she never admitted it.

In games it depends on how fast things are moving across the screen and whether or not the game has good motion blur. Good motion blur can make it harder to notice.
 