In fact, the human eye's "frame capture" is variable and hard to measure because it is a biological process. A camera captures the entire image at once and sends it for processing; in the eye, response times vary with the cellular response of each part of the retina. That's why it's hard to talk about the eye in terms of fps.
What happens is that there is both conscious and unconscious perception of what we see. For example, most people can consciously notice something strange if you insert one extra frame every 24 frames (noticing, say, that a "Jequiti" ad frame flashed by). And most people will not notice if you insert one frame every 30 frames, at most feeling that something odd is going on.
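As a rough illustration (my own arithmetic, not from any study), here is how long a single frame stays on screen at common frame rates, which hints at why one inserted frame among 24 is easier to catch than one among 30:

```python
# Rough illustration: duration of a single frame at common frame rates.
# A single inserted frame is visible for exactly one frame period.
frame_ms = {fps: 1000 / fps for fps in (24, 30, 60, 100, 144)}
for fps, ms in frame_ms.items():
    print(f"{fps:3d} fps -> each frame lasts {ms:.1f} ms")
```

So at 24 fps an odd frame sits on screen for almost 42 ms, versus about 33 ms at 30 fps and under 7 ms at 144 fps.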
On the other hand, if you play a game at 40 fps, it will feel unnatural, and it feels more natural the higher the game's fps gets. Most people do not perceive a difference above 100 fps.
But there is another issue: synchronization. If the monitor runs at 144 Hz but is not synchronized with the game's frame generation, you get annoying artifacts such as tearing and stutter. So the key point is the relationship between the game's fps and the monitor's refresh rate, more than the eyes themselves.
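A minimal sketch of why that mismatch is visible (a simplified model I made up for illustration, assuming each rendered frame only appears at the next monitor refresh tick): at 40 fps on a 144 Hz display, frames don't land on a fixed number of refreshes, so their on-screen time jitters instead of being constant.

```python
import math

# Simplified model (illustrative, not how any real driver works):
# a frame rendered at time t becomes visible at the next refresh tick.
GAME_FPS = 40      # game finishes a frame every 25 ms
REFRESH_HZ = 144   # monitor refreshes every ~6.94 ms

frame_period = 1000 / GAME_FPS
refresh_period = 1000 / REFRESH_HZ

# Time (ms) at which each game frame first appears on screen.
shown_at = [math.ceil((i * frame_period) / refresh_period) * refresh_period
            for i in range(8)]

# How long each frame actually stays visible: not constant!
durations = [b - a for a, b in zip(shown_at, shown_at[1:])]
print([round(d, 2) for d in durations])
```

The durations alternate between roughly 20.8 ms and 27.8 ms instead of a steady 25 ms, and that unevenness is the stutter you feel when fps and refresh rate aren't synchronized.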
As for your question: most people will not notice a difference if a game is running synchronized at 144 Hz/fps or above. However, one should not underestimate the trained eye of, say, competitive gamers. A gamer is expected to be more sensitive than someone who does not play, which raises these limits a bit (not my case).
Your answer is really awesome, man. Now I get it. So if you're not a "professional" and don't play in competitions, 120 Hz is already more than good enough for someone who plays casually, right?