Hmm, a long time ago I wrote a little article called 30 vs 60 FPS. In it I gave some physiological information about the eye, how it perceives things, and so on. Too bad it had a lot of inaccuracies, but I think I was on the right track. Here are a few things that I have learned over the years:
The human eye is very, very sensitive. While Chalnoth says that it takes our eye 1/8th of a second to get a response, that isn't exactly true. It may take 1/8th of a second for our eye to move to a certain part of the screen and focus there, but we are still receiving tons of information even while the eye is in motion. The rods can respond to as little as a single photon, and the rods and cones are constantly transmitting that information to the brain. So while the eye takes that time to move and refocus, it is still receiving information that our brain can process. For example, I may be staring at the center of my monitor, but I can still perceive my fingers moving on the keyboard, the Post-it notes on the edge of my monitor, and out of my extreme periphery I can see the bookshelves next to my desk. The whole time, I am getting a constant stream of information from my eyes; they essentially never stop sampling the light coming in from the world around me.
So, now that we have it out of the way that our eyes are not necessarily the limiting factor, let's go a little further. Chalnoth talks about motion blur, and he has some solid reasoning behind this. For computers, though, if they are rendering scenes fast enough, and the monitor is refreshing itself fast enough, we will start to see some basic blur. The reason behind the blur is that the eye cannot track fast-moving objects. So that fast-moving snowflake heading toward your car's headlights will give the blur effect (there are other things at play here too, such as the eye trying to adjust to a very bright object, and the effect of neurotransmitters continuing to stimulate the nerve endings). The human eye simply cannot keep up, so we see this effect. In film, the effect can really be seen, but that is mostly due to the camera filming at 24 fps (which is really low). In a theater, though, that 24 fps film is flashed at three times that rate, so we don't see the flicker that a bare 24 refreshes per second would exhibit.
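As an aside, the arithmetic behind that "3x" is simple enough to sketch out. This is just my own illustration (the function name is mine): a projector with a two- or three-bladed shutter flashes each film frame two or three times, so the flicker rate the eye sees is much higher than 24, even though motion is still only sampled 24 times a second.

```python
# Flicker rate vs. motion sampling rate for projected film.
# The function name is my own, just to make the numbers concrete.
def flashes_per_second(film_fps, flashes_per_frame):
    return film_fps * flashes_per_frame

print(flashes_per_second(24, 2))  # 48 flashes/s (double-bladed shutter)
print(flashes_per_second(24, 3))  # 72 flashes/s (triple-bladed shutter)
# Motion is still sampled at only 24 fps either way; only the flicker rate changes.
```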
For the next step, let's go to IMAX. This has a couple of really interesting features to it. First of all, it is filmed at a full 60 fps, and then it is refreshed at either 120 Hz or 180 Hz (I am not sure which, but one of the two). The developers of IMAX discovered two things: the first was that 24 fps caused massive artifacts when filming and then projecting onto the huge screen; the second was that at a 60 Hz refresh, the flashing perceived in peripheral vision (which is made up almost entirely of very sensitive rods rather than cones) made people nauseous after a while. Once they increased the refresh rate, the nausea went away.
Ok, that was step two; let's take another example leading to our conclusion. I believe ILM did an experiment with people watching a roller coaster simulation. First they filmed the coaster ride at 60 fps, then at 120 fps. They then put a bunch of people in front of a screen and showed both films (refreshed at 120 Hz). Apparently, across several groups of people, the incidence of motion-induced nausea was significantly higher for the 120 fps film. The theory is that the higher frame rate gave a more visceral experience. Not highly scientific, but interesting nonetheless.
What I am leading up to is that the eye can detect even very small changes in what it sees, so the higher the fps, the better that experience will be (and the more realistic it will look to the eye). Now, I believe 60 fps should be the minimum for a good overall experience (and this is game dependent in many ways), but essentially all applications would improve if they rendered at that speed. 85 is a nice number because the frames can sync with the refresh rate (giving nearly flicker-free enjoyment), all while avoiding the tearing artifact of the monitor drawing one frame at the top of a refresh and switching to the next frame partway through that same refresh. This will give a good gaming experience for nearly everyone, and it is a refresh rate that most monitors can handle at decent resolutions.
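To make that tearing point concrete, here is a little toy model (the function and numbers are mine, just a sketch): it assumes the renderer swaps buffers the instant a frame is done, with nothing synced, and the monitor scans top to bottom at a fixed rate, then asks how far down the screen each swap lands.

```python
from fractions import Fraction

def tear_positions(render_fps, refresh_hz, frames=8):
    """Fraction of the way down the screen (0 = top, 1 = bottom) at which
    each buffer swap lands during scan-out when nothing is synced."""
    frame_time = Fraction(1, render_fps)   # time to render one frame
    scan_time = Fraction(1, refresh_hz)    # time for one monitor refresh
    return [float((k * frame_time) % scan_time / scan_time)
            for k in range(1, frames + 1)]

# 60 fps on an 85 Hz monitor: the swap point drifts through the scan,
# so the tear line wanders up and down the screen.
print([round(p, 2) for p in tear_positions(60, 85)])

# 85 fps on the same 85 Hz monitor: every swap lands at the same phase.
# Lock that phase to the vertical blank (vsync) and the tear disappears.
print([round(p, 2) for p in tear_positions(85, 85)])
```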
Now, my dream setup would be a monitor that could handle 1600x1200 at 120 Hz, and run the application at that resolution and framerate, and add in 4X AA and 8X AF to the mix. I think that would give the greatest experience that current technology could deliver. Unfortunately, a monitor like that is VERY expensive, and most games aim for the 60 fps mark in terms of delivering speed and quality.
Now, if future hardware could deliver a monitor that can do a 240 Hz refresh, and video cards could render at high quality with AA at 240 fps, then I think that if you put the 120 Hz and 240 Hz machines side by side, we would still notice a difference between the two. Most likely, though, we are starting to hit the point of diminishing returns. It isn't economically feasible to produce a monitor that could do 240 Hz at 1600x1200 at this time, though an SLI setup could possibly deliver frame rates like that with 4X AA on an older title (but not on Doom 3, obviously, since it is locked at 60 fps).
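Just to put some rough numbers on "diminishing returns" (my own back-of-the-envelope arithmetic): each doubling of the frame rate buys you a smaller and smaller reduction in the time between frames.

```python
# Frame-time savings from each doubling of the frame rate.
for low, high in [(30, 60), (60, 120), (120, 240)]:
    saved_ms = (1 / low - 1 / high) * 1000
    print(f"{low} -> {high} fps: frames arrive {saved_ms:.1f} ms closer together")
# 30 -> 60 saves ~16.7 ms per frame, 60 -> 120 saves ~8.3 ms, 120 -> 240 only ~4.2 ms.
```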
Anyway, don't underestimate your visual subsystem's overall abilities. It is a very precise and sensitive group of structures. On the computer, higher FPS will always give a better illusion of continuity, so there really shouldn't be a 60 fps cap on any application when it comes to rendering. I must admit I much prefer Counter-Strike: Source running at 1024x768 at 120 Hz / 120 fps to Doom 3 at 120 Hz / 60 fps (it is night and day when it comes to smoothness, and in a firefight with quick motions I find that I am far more accurate in CS than in Doom).
Just my simple observations here.
Edit: made some simple corrections.