Sandwich said:
I do not think the physics engine would become significantly less accurate at 'only' 60 FPS when compared to 200.
...
There are examples of raw accuracy differences (Q3 jumping distances, as mentioned above), and IIRC in CS some weapons fire faster than your updaterate, giving you different weapon recoil at different rates. However, I think this is beside the point, as these are implementation details and not that common. What matters is game mechanics, where in some games the game state updaterate is very important, as discussed below.
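Just to illustrate the first kind of difference: when the physics is integrated once per rendered frame, the result depends on the frame time. Here's a quick Python sketch of a jump under per-frame (semi-implicit Euler) integration. The jump velocity and gravity numbers are invented for the example, not Q3's actual constants or code, so take it purely as an illustration of the mechanism.

# Sketch: a jump integrated once per frame. The apex the simulation reaches
# depends on the timestep, so different frame rates give slightly different
# jump heights (and hence distances). Constants are made up, not Q3's.

def jump_apex(fps, jump_velocity=270.0, gravity=800.0):
    dt = 1.0 / fps
    height, velocity, apex = 0.0, jump_velocity, 0.0
    while True:
        velocity -= gravity * dt   # gravity applied once per frame
        height += velocity * dt    # then the position update
        apex = max(apex, height)
        if height <= 0.0:          # landed
            return apex

for fps in (60, 125, 200):
    print(fps, "fps -> apex", round(jump_apex(fps), 2))

# The apex creeps toward the analytic value (v^2 / 2g = 45.56 here) as fps
# rises; the per-frame error is small, but it is where framerate-dependent
# jump distances come from.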
What about user input? Assume a 60 Hz display, vsync on, triple buffering. I think players can perceive a latency of 17 ms between pressing the button and seeing the shot fire on screen. At 60 fps the latency between doing and seeing is 17 to 33 ms.
At 200 fps (still 60 Hz) that latency would be 5 to 17 ms, depending on when you press relative to the vertical retrace.
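As a back-of-envelope check on those numbers, here's a quick Python sketch. The model is deliberately simplified (input sampled at the start of a render frame, one full frame time to render, shown at the next vsync; triple-buffer queueing, driver latency and scanout ignored), so treat the output as rough bounds rather than exact figures.

# Simplified input-to-screen latency bounds for a given render rate on a
# fixed 60 Hz display. This is only a model of the arithmetic above, not a
# measurement of any particular engine or driver.

def latency_bounds_ms(render_fps, refresh_hz=60):
    frame_ms = 1000.0 / render_fps    # time to render one frame
    refresh_ms = 1000.0 / refresh_hz  # time between vsyncs
    best = frame_ms                   # frame completes just as a vsync arrives
    worst = frame_ms + refresh_ms     # frame just misses a vsync, waits for the next
    return best, worst

for fps in (60, 200):
    best, worst = latency_bounds_ms(fps)
    print(f"{fps:>3} fps on a 60 Hz display: ~{best:.0f} to ~{worst:.0f} ms")

# Gives ~17-33 ms at 60 fps and ~5-22 ms at 200 fps; the 5-17 ms estimate
# above corresponds to assuming the freshly rendered frame always makes the
# very next vsync.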
In theory the extra 140 frames per second should make motion appear twice as smooth. But in practice?
The most obvious way update rates affect game mechanics is the latency they add to reaction time. I would say this makes a small but not insignificant statistical difference in several games. For example, every CS vet knows a connection with 20 ms latency gives a clear statistical advantage over a player on 40 ms network latency. Likewise a client running at 100 Hz throughout has a corresponding advantage over a 50 Hz client, albeit the difference in apparent reaction time is only half of the other example (and assuming a 100 Hz server).

But there are several other factors in CS dependent on client updaterate that under some circumstances can add up. Say two opponents come into view of each other, both running. To avoid the accuracy penalties built into the engine, they must both stop before shooting (unless it's close range, in which case the penalties will be statistically acceptable). It's been a while since I played so I can't say how long this takes (it's a brief moment to come to a stop), but I do know that on, say, a 50 Hz client (or connection) it will be harder to time this correctly (due to temporal quantization) than on a 100 Hz client, so to be sure to hit you will have a noticeable amount of latency added, on top of the other latency differences. And I think I could come up with a few more examples from CS of factors that can sometimes add up in latency.
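To put a rough number on the temporal quantization point: anything you try to time (like releasing the movement key and waiting until you have actually stopped) gets delayed until the next client tick, which adds half a tick on average and up to a full tick in the worst case. A quick Python sketch of that, ignoring everything else in the CS netcode (interpolation, cl_cmdrate vs cl_updaterate, the server tick), so it is only the order of magnitude:

# Extra delay from quantizing a timed action onto a client tick grid.
# Only models the wait for the next tick; real netcode adds more on top.

def quantization_delay_ms(tick_hz):
    tick_ms = 1000.0 / tick_hz
    return tick_ms / 2.0, tick_ms     # (average, worst case) added delay

for hz in (50, 100):
    avg, worst = quantization_delay_ms(hz)
    print(f"{hz:>3} Hz client: +{avg:.0f} ms average, +{worst:.0f} ms worst case")

# A 50 Hz client adds roughly 10 ms on average (20 ms worst) per timed step,
# a 100 Hz client roughly half that; a few such steps in one firefight add up.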
Then there's the issue of the display rate itself. While this has been discussed a lot above, I don't think anyone has outright mentioned the effect display rate can have on reaction time beyond the "raw" latency. I instinctively think that the "biological" latency can be improved by a higher display rate when aiming in shooting games: with a higher rate, you can pick up the motion vector of a target faster. I can't say I know much theory around this, though.
Anyway, I guess one of my main points is that I believe many non-gamers or casual gamers don't appreciate how much difference some added milliseconds here and there make for a really good player, something Entropy has also been on to above. As an example, if you watch a recording of a top CS match, the time spent actually aiming (between reaction and shooting) is (IIRC) mostly 2-3 frames, sometimes a single frame, at 100 Hz. And the team captain of one of the top CS teams said not that long ago (12-18 months?) that CRTs were still superior to flat-panel displays for their purposes. In other words, display and rendering rate, as well as client game, network and server rates, *can* make a noticeable difference, but not in general, and definitely not for all users.
Hope this doesn't sound too confused; it's been a while since I discussed these things.