True, but I didn't include this test to imply that the eye can distinguish a frame rate of 220 fps; it's simply more evidence pointing toward the eye's capability to see high frame rates, be it 60, 100, 120, etc.
It really doesn't.
First you need to understand how eyes see images. They collect light, and the "signal" they send to the brain is continuous: at any instant it takes into account the past x milliseconds. Compare this to how video cameras work, where the signal is discrete and covers one specific timeslice of collected light (so no event can show up in two signals).
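Here is a minimal sketch of that difference (not a physiological model; the 20 ms window, the 25 ms exposure, and all intensities are illustrative assumptions):

```python
import numpy as np

dt_ms = 1.0                       # simulate in 1 ms steps
t = np.arange(0, 100, dt_ms)      # 100 ms of "scene" light
light = np.zeros_like(t)
light[50] = 100.0                 # one very bright 1 ms flash at t = 50 ms

# Camera: discrete, non-overlapping 25 ms exposures.
# The flash lands in exactly one frame and no other.
exposure_ms = 25
frames = light.reshape(-1, int(exposure_ms / dt_ms)).sum(axis=1)
print("camera frames:", frames)   # -> [0, 0, 100, 0]

# Eye-like integrator: a sliding 20 ms window. The flash contributes
# to every instant for the next 20 ms, so it is "seen" that whole time.
window_ms = 20
kernel = np.ones(int(window_ms / dt_ms))
perceived = np.convolve(light, kernel, mode="full")[: len(t)]
print("instants (ms) where the flash is still visible:",
      np.flatnonzero(perceived > 0))   # -> 50 .. 69
```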
This way, a bright signal that lasts only a very short time will be seen for that whole x milliseconds, and is in effect the same as a much dimmer signal that lasts longer. This is how CRT screens work -- if you have ever photographed one with a short exposure, you know that only a small portion of the screen is lit (and only a single pixel is lit brightly) at any instant. But because the beam sweeps the entire surface of the screen in less time than x, the whole screen appears lit.
So, if the signal is bright and distinct enough, it will be seen, regardless of how short it is. This has nothing to do with how quick a change your eyes can register, or how many fps you need.
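To make that brightness/duration tradeoff concrete, here is a toy calculation; the 20 ms window is an assumption, and real vision is only roughly this linear for flashes shorter than the integration window (this is essentially Bloch's law):

```python
# Within the integration window, what matters is intensity x duration.
window_ms = 20          # assumed integration window

def integrated_energy(intensity, duration_ms):
    """Light collected within one integration window."""
    return intensity * min(duration_ms, window_ms)

# A 1 ms flash at intensity 100 ...
print(integrated_energy(100, 1))    # -> 100
# ... integrates to the same value as a 20 ms flash at intensity 5.
print(integrated_energy(5, 20))     # -> 100
```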
As for that, your brain really cannot process much more than a few tens of fps worth. But it is good at noticing irregularities, and pays attention to them. So how many fps you need depends very much on how well you can fake that the screen is changing like the real world. That's why a movie at 24 fps with good motion blur can look better than a desktop at 60 fps with no blur at all.
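As a sketch of what "good motion blur" means in rendering terms, here is the standard accumulation approach: render several sub-frames inside each displayed frame's time interval and average them. The scene, the sub-frame count, and the hypothetical render() stand-in are all assumptions for illustration:

```python
import numpy as np

def render(t):
    # Stand-in scene: a bright 4-pixel-wide object moving across a
    # 1D "screen" of 100 pixels at 600 pixels/second.
    img = np.zeros(100)
    x = int(600 * t) % 100
    img[x:x + 4] = 1.0
    return img

def blurred_frame(frame_index, fps=24, subframes=8):
    # Average sub-frames covering this frame's exposure interval, so the
    # 24 fps frame carries the same smear of motion the eye would integrate.
    t0 = frame_index / fps
    dt = 1.0 / (fps * subframes)
    return np.mean([render(t0 + i * dt) for i in range(subframes)], axis=0)

frame = blurred_frame(0)
print("pixels touched by the moving object:", np.flatnonzero(frame))
```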
So how many fps do first-person games need? Probably more than 120 if you do it naively. But if you spend time on psychovisual optimizations, you might well beat that 120 fps in smoothness with 30.