You are wrong.
It's easy to see the difference between a 120 Hz and a 60 Hz refresh rate. Just move the mouse cursor on the Windows desktop (in both 120 Hz mode and 60 Hz mode), and the difference is obvious. We have several 120 Hz monitors at our office, and everyone sees the difference.
Also, I am personally very sensitive to light flickering. Bad CRT monitors at 60 Hz made my head explode after just briefly watching the screen, and the headache lasted for many hours. I can clearly see a CRT flickering at 60 Hz, and I can even see it at 85 Hz (the default refresh rate for most CRTs). So I mostly ran my CRTs at 100 Hz, and I could still see the flickering at night (dark room, no lights). Our eyes accumulate light (we do not see discrete images), so 100 Hz flickering should be completely invisible if we could only see 25 (accumulated) frames per second.
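To make that last point concrete, here's a minimal sketch of the naive accumulation model: an idealized eye that just averages incoming light over each 1/25 s window. The sample rate and the square-wave CRT light output are my own illustrative assumptions. Under that model the 100 Hz flicker integrates away completely, which is exactly why the "25 accumulated fps" explanation can't be the whole story:

```python
import numpy as np

fs = 100_000                     # simulation sample rate (samples/second)
flicker_hz = 100                 # CRT refresh / flicker rate
k = np.arange(fs)                # one second worth of sample indices

# On/off square wave: light is "on" during the first half of each cycle
light = ((k * 2 * flicker_hz // fs) % 2 == 0).astype(float)

accum_fps = 25
window = fs // accum_fps         # samples per 1/25 s "perceived frame"
frames = light[: (fs // window) * window].reshape(-1, window)
per_frame_brightness = frames.mean(axis=1)

# Under this model every perceived frame has the same brightness, i.e. zero
# frame-to-frame modulation -> the 100 Hz flicker "should" be invisible.
print(per_frame_brightness.min(), per_frame_brightness.max())   # both 0.5
```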
Some (old?) sources claim 25 fps is enough if the image frames are motion blurred (incoming light is accumulated over the frame period). 25 fps is pretty good for motion-blurred video, but you will see a striking difference compared to 60 fps in scenes with a lot of sideways movement. This is one of the (many) reasons why stereoscopic (3D) videos currently look pretty bad. The eyes seem to notice the low frame rate more on stereoscopic content. And I am not talking about active shutter-glass flickering here, I am talking about the actual frame rate of the stereoscopic image stream (24/25 fps for both eyes doesn't seem to be enough; we need to shoot at 60 fps x 2 at least).
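A rough back-of-the-envelope way to see why sideways movement exposes the frame rate (the pan speed, resolution and FOV below are assumptions, just to show the order of magnitude): the per-frame image displacement during a pan is simply the pan speed divided by the frame rate, and at 24/25 fps the jumps get large enough to read as judder.

```python
# Rough back-of-the-envelope numbers (pan speed, resolution and FOV are
# assumptions) for how far the image jumps between consecutive frames
# during a sideways pan at different frame rates.
def per_frame_step_px(pan_deg_per_s, screen_px, screen_fov_deg, fps):
    """Pixels the image shifts between two consecutive frames during a pan."""
    px_per_deg = screen_px / screen_fov_deg
    return pan_deg_per_s * px_per_deg / fps

# Example: a moderate 30 deg/s pan on a 1920-pixel-wide image covering ~50 deg
for fps in (24, 60):
    print(fps, "fps:", round(per_frame_step_px(30, 1920, 50, fps), 1), "px/frame")
# 24 fps -> 48 px jumps, 60 fps -> ~19 px. The bigger the jump, the more the
# motion reads as discrete steps (judder) instead of smooth movement.
```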
Hmm... well, the first mistake is to claim that someone "sees" XY Hz.
Human vision doesn't work like a camera; there isn't anything remotely similar to a video system capturing a fixed number of frames per second across the whole capture area.
First, there's the fact that everything depends on luminance and contrast, even "refresh rate perception". Whether it's a high-contrast, high-luminance monitor, whether there's lots of daylight coming in or all the windows are shut, etc., makes a real difference. And it's not linear...
Then, there's the fact that the perception of both precision (equivalent to a camera's resolution) and speed (equivalent to refresh rates) depends a lot on the position within the field of vision.
Central vision is indeed quite slow, as evolution has adapted it for precision rather than speed. The ~25 fps figure is roughly correct for central vision, even if it varies a lot from person to person.
As you go from central to peripheral vision, precision is disregarded in favor of speed. Evolution, once again, made our peripheral vision a sensor system against predators and hazards, so it's a lot faster than central vision. AFAIK, it can go up to ~120 Hz.
So what does all of this have to do with office monitors and people suddenly starting to notice the difference between 60 Hz and 120 Hz monitors?
Size.
Whereas 10 years ago we'd have 15" to 19" monitors, the panels are a lot bigger now.
And larger panels mean a larger proportion of our vision is occupied by the monitor.
It means that not only is our central vision capturing the monitor content, but so is some of our fast-as-hell peripheral vision.
Having our peripheral vision looking at the monitor is what makes people notice the difference between medium and high frame/refresh rates, and what causes headaches and other problems.
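To put some rough numbers on the size argument (the 60 cm viewing distance and 16:9 aspect ratio are assumptions; old 15-19" panels were 4:3, i.e. even narrower), here's roughly how much horizontal visual angle different monitor sizes cover at a typical desk distance:

```python
# A quick estimate (60 cm viewing distance and 16:9 panels are assumptions)
# of the horizontal visual angle a monitor covers -- the "size" argument
# above, expressed in numbers.
import math

def horizontal_visual_angle_deg(diagonal_in, aspect=(16, 9), distance_cm=60.0):
    """Horizontal angle subtended by a monitor at the given viewing distance."""
    w, h = aspect
    width_cm = diagonal_in * 2.54 * w / math.hypot(w, h)
    return 2.0 * math.degrees(math.atan(width_cm / 2.0 / distance_cm))

for size in (17, 19, 24, 27, 34):
    print(size, "inch:", round(horizontal_visual_angle_deg(size), 1), "deg")
# Roughly 35 deg for a 17" panel vs. ~53 deg for a 27" one, so a much larger
# slice of the fast peripheral retina ends up watching the screen.
```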
Sorry for going slightly off-topic, but I think it's important not to generalize "vision" as if we had a pair of digital cameras attached to our brain.
Furthermore, given all this, I think it would be a very interesting research subject to study variable level of detail (anti-aliasing, texture resolution, etc.) across the screen, based on how close the pixels are to the center.
It could make a sizeable performance difference in games where people are always looking at the center of the screen (FPS games, for example), and of course it would only work for large screens on desktops (where the FOV is mostly occupied by the screen).
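For what it's worth, here's a hypothetical sketch of what such position-dependent level of detail could look like: an extra texture mip/LOD bias driven by each pixel's distance from the screen centre. The falloff curve, the size of the sharp central region and the maximum bias are all made-up parameters, not anything from an existing engine:

```python
# A hypothetical sketch of position-dependent level of detail: derive an
# extra mip/LOD bias from each pixel's distance to the screen centre.
def lod_bias(px, py, width, height, inner=0.3, max_bias=3.0):
    """Extra mip-level bias: 0 at the screen centre, max_bias in the corners."""
    # Normalised radial distance from the centre (0 at centre, ~1 in corners)
    dx = (px - width / 2.0) / (width / 2.0)
    dy = (py - height / 2.0) / (height / 2.0)
    r = min(1.0, (dx * dx + dy * dy) ** 0.5)
    if r <= inner:                      # keep the central region at full detail
        return 0.0
    return max_bias * (r - inner) / (1.0 - inner)

# e.g. on a 1920x1080 frame: full detail at the centre, about 3 mip levels
# coarser near the corners, where only peripheral vision ever sees the pixels.
print(lod_bias(960, 540, 1920, 1080), lod_bias(0, 0, 1920, 1080))   # 0.0 3.0
```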