The framerate I mentioned above is what's needed if you have a white vertical line on a black background, moving fast horizontally. With a high enough framerate it would be smeared into one gray area, but with a "too low" framerate you would see discrete lines. The limit I gave is where the errors from not having infinite fps are hidden by the blurring from spatial filtering.
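To put a rough number on that limit, here's a minimal sketch (the screen width, crossing time and blur width are my own example figures, not something from the thread):

[code]
# Assumption: the spatial filter blurs over roughly one pixel, and the
# discrete copies of the line merge once successive frame positions are
# no more than that blur width apart.

def fps_for_smear(speed_px_per_s, blur_width_px=1.0):
    # Per-frame step must be <= blur width:
    #   speed / fps <= blur_width   =>   fps >= speed / blur_width
    return speed_px_per_s / blur_width_px

# A line crossing a 1600-pixel-wide screen in half a second:
print(fps_for_smear(1600 / 0.5))   # 3200.0
[/code]

Note that the required fps scales directly with the speed in pixels per second, which is exactly why this particular limit is tied to screen resolution: the same on-screen motion covers more pixels per second at a higher resolution.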
But of course it's theoretical, and not the framerate you would need in practical cases. It's just that this limit is the only one I can see that correlates directly to screen resolution. If you decide that a lower framerate is enough, then it's not directly correlated to screen resolution, and looking at the same object(*) at a different resolution shouldn't need a different framerate.
*) Here I realized that while I still stand by that statement, it's not the full story. The catch is that using a higher resolution often changes how you play.
Higher resolution =>
=> enemies are visible at a greater distance =>
=> you're tracking smaller objects (measured in mm) =>
=> you need higher fps
So while the higher resolution doesn't change anything directly, the changed playing style that comes with it might. This is kind of related to an (at first sight strange) comment I used to hear a lot. A lot of people didn't like playing at high resolution, because everything got so small and hard to hit. It took me some time before I realized that they'd started to shoot stuff from a greater distance without thinking about it.
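To put some numbers on that chain (all of them made up for illustration: a 0.5 m wide target, a 90 degree FOV, a 400 mm wide screen, and "visible enough to engage" meaning about 4 pixels wide):

[code]
import math

FOV_DEG   = 90.0    # horizontal field of view
SCREEN_MM = 400.0   # physical width of the monitor
TARGET_M  = 0.5     # width of the enemy in the game world

def target_pixels(distance_m, horizontal_res):
    # How many pixels wide the target is at a given distance.
    view_width_m = 2 * distance_m * math.tan(math.radians(FOV_DEG / 2))
    return TARGET_M / view_width_m * horizontal_res

def target_mm(distance_m):
    # Physical width on the monitor; note that this is resolution independent.
    view_width_m = 2 * distance_m * math.tan(math.radians(FOV_DEG / 2))
    return TARGET_M / view_width_m * SCREEN_MM

for res in (640, 1600):
    d = 1.0
    while target_pixels(d, res) > 4:   # push the distance out until it's ~4 px wide
        d += 1.0
    print("%4d px: engage at ~%.0f m, where the target is %.1f mm wide"
          % (res, d, target_mm(d)))
[/code]

With these numbers the 640-wide screen lets you engage at roughly 40 m (target about 2.5 mm on screen), the 1600-wide screen at roughly 100 m (about 1.0 mm on screen). So the higher resolution doesn't shrink the target you were already shooting at; it lets you start shooting at targets that are physically smaller on the screen, and tracking those is where the higher fps comes in.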
I'm not sure which way you're arguing with the "pixel"/"picture element" comment. It seems as if the argument is that higher resolution flickers less. (Notice that the "picture element" size is constant for a monitor.) I definitely agree with that for interlaced monitors. On a TV with picture elements that are blurry enough, you don't see the 25/30 Hz flicker, just the 50/60 Hz.
For low framerates (where I would call the error a "stroboscope effect" rather than "flicker"), I would consider it relevant to the calculation I made in the last post. But not so much when doing the "high res => different gaming" reasoning in this post.
[Edit]
This post was mostly a reply to what demalion said above.
MistaPi:
What I was saying was: while there are ways to deduce a reason that higher res would need higher fps, the arguments aren't strong enough to be of much concern. Possibly with the exception of the "different gaming" argument in this post.
There are different kinds of errors in real-time gfx, and they disappear/get less annoying at different framerates. So it's natural that we can deduce very different "good framerates". One error is when you sense the time steps in the gfx (there was one frame, and there's another). This one is removed already at comparatively low fps (be it 25, 60 or 90 fps, all depending on the viewer). This is when the motion starts to feel "fluid", and it's what I would say is the most important fps.
Then you have the error that, whatever framerate you have, it's possible to have a bright object fly past the view at such a high speed that it leaves a trail of distinct objects. To remove this you could need a very high fps.
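That's the same relation as in the first sketch, just turned around: fix the framerate and look at how far apart the copies in the trail end up (again with made-up example numbers):

[code]
def trail_spacing_px(speed_px_per_s, fps):
    # Distance the object travels between two consecutive frames,
    # i.e. the gap between the discrete copies you see in the trail.
    return speed_px_per_s / fps

# A bright object crossing a 1600-pixel screen in a quarter of a second,
# at a framerate most people would already call fluid:
print(trail_spacing_px(1600 / 0.25, 120))   # ~53 px between copies
[/code]

53 pixels apart is clearly a row of separate objects rather than a smear, and to get the spacing down to the ~1 pixel blur width you're back up at thousands of fps.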