You've lost me. Display tech is limited to 120 Hz, 144 or 240 at a push in the most niche situations. That makes IO polling at screen refresh impossible. And if displays are capped at 120 Hz, there's no point polling IO at 125 Hz if you don't want judder. Just poll at 120 Hz, or poll a lot higher.
I don't see the relevance of the post on 1000 Hz IO polling and screen refresh rates. The suggested frequencies are daft, the screen refresh rates unrealistic (a 125 Hz monitor refresh, when high refresh is typically 120 or 144 Hz), and the technology isn't relevant to anything outside of a laboratory.
Except there is no guarantee that the input device was polled at the same time the frame started rendering. It's similar to how, even if you are rendering at a locked 60 fps on a 60 Hz monitor, you can still get screen tearing if you don't have vsync enabled. Nvidia's G-Sync and the VESA Adaptive-Sync standard deal with that on the display side, but that is much simpler than trying to synchronize device input with a game's rendered frames.
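To make the timing mismatch concrete, here's a minimal sketch (assumptions are mine: an idealized 125 Hz polling clock and a perfectly locked 60 fps game loop, not any real engine or driver) of how old the newest mouse sample is at the moment each frame starts:

```python
# Illustrative only: the OS/driver polls the mouse on its own fixed schedule,
# while the game grabs whatever the latest poll delivered when a frame starts.
# The two clocks are not synchronized, so the sampled input can be anywhere
# from ~0 ms to a full polling interval old, and that offset drifts frame to frame.

POLL_HZ = 125          # assumed USB mouse polling rate
FRAME_MS = 1000 / 60   # a locked 60 fps frame time

poll_interval = 1000 / POLL_HZ

def input_age_at_frame_start(frame_start_ms):
    """Age (ms) of the newest mouse sample when this frame begins rendering."""
    last_poll = (frame_start_ms // poll_interval) * poll_interval
    return frame_start_ms - last_poll

for frame in range(5):
    t = frame * FRAME_MS
    print(f"frame {frame}: starts at {t:6.2f} ms, newest input is "
          f"{input_age_at_frame_start(t):.2f} ms old")
```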
With PCs especially you have a fixed mouse polling rate, yet frame render times for games can vary greatly. Even more so with a budget or midrange graphics card or CPU, where frame render times vary on a frame-by-frame basis.
With a locked 60 fps game your input might be offset from the frame render by as much as a full frame (if, by bad luck, the last input the OS received happened to arrive 2 ms after your frame started rendering, for example), but at least it will be consistent. So not bad on average, though still noticeable to professional gamers.
Throw in a game rendering at 45 frames per second and suddenly a fixed 60 Hz mouse polling rate produces erratic differences between input and output. Again, nothing your average gamer will notice, but it would be noticeable to some.
Now if a game is varying between 30 and 60 frames per second, even your average gamer might start to notice.
Having a high polling rate (say, 500 Hz) means that no matter when a frame starts rendering, you will have an input sample from very shortly before the start of that frame.
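As a rough illustration (again assumptions are mine: uniformly random frame times and an ideal polling clock, no real driver behavior), this little simulation measures how stale the newest input sample is at each frame start for a few polling rates:

```python
# Rough simulation of input "staleness" at frame start under variable frame times.
# The spread of that staleness is the erratic offset described above; at
# 500-1000 Hz polling it collapses to a fraction of a millisecond.
import random

def simulate(poll_hz, frame_ms_range, frames=10_000, seed=1):
    random.seed(seed)
    poll_interval = 1000.0 / poll_hz
    t, ages = 0.0, []
    for _ in range(frames):
        # age of the newest sample available when the frame starts rendering
        ages.append(t % poll_interval)
        t += random.uniform(*frame_ms_range)   # variable frame render time
    return sum(ages) / len(ages), max(ages)

for poll_hz in (60, 125, 500, 1000):
    avg, worst = simulate(poll_hz, frame_ms_range=(16.7, 33.3))  # ~30-60 fps
    print(f"{poll_hz:5d} Hz polling: avg input age {avg:5.2f} ms, "
          f"worst {worst:5.2f} ms")
```

The 60 Hz case bounces anywhere from fresh to roughly 16 ms behind the frame, while 500-1000 Hz keeps the offset to a couple of milliseconds or less regardless of frame time.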
Perhaps it's just not something a non-competitive (non-professional) gamer would appreciate. Just like wondering why professional gamers would lower image quality in an FPS so they can get 200+ fps when they only have a 60 Hz display.
Everything is done to get input delay and display delay as small as possible. I've known some professionals who could reliably distinguish between a wired mouse and a wireless mouse due to the minuscule input lag induced by wireless. In this world, single-digit millisecond differences in response can be noticed.
Again, not something your average gamer is going to notice, ever. So, irrelevant to the majority of gamers. Even those who think they are elite gamers probably wouldn't notice, even though they pay higher prices for those types of devices.
And on consoles it's usually less relevant. Although even in console land you rarely have games that are locked 100% of the time to 30/60 Hz during gameplay segments.
Regards,
SB