There is a lot of misinformation floating around. The human eye does not see in frames per second; we do not have shutters in our eyes that open and close to let light in. A single frame is a still image. As frames are flashed faster and faster, our brains begin to process that information and perceive it as more or less fluid motion. In real life the eye tracks smooth movement and transmits that information to the brain. The brain will group parts of that information together, or omit information as necessary for the situation. Take driving as an example: while you are calm and at rest, your brain omits what it considers unnecessary data, which is why you may not notice something dart into your path right away. Once the brain is "startled" it begins to process visual information much more rapidly and omits far less, which produces the "slow motion" feeling people report while under duress.
That being said, 24fps was standardized in the past because, while the body is at rest and relaxed, that is roughly the minimum frame rate the brain will process into what we perceive as fluid motion. During gaming, or highly stressful movies, our brains process more and more information more quickly, and it is now widely accepted that frame rates much higher than 24fps yield a more desirable result.
The argument then becomes not what the maximum fps the eye can see is, but what fps will yield the most desirable experience for the situation. Everyone's brain reacts differently to a given situation: for someone who stays very calm during gaming, 60fps may be enough and anything higher may be unnoticeable, while for people who get excited more quickly, 80-120 fps may be noticeably smoother and yield a more desirable experience.
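To put some rough numbers on that, here is a small illustrative sketch (plain Python, with the fps values picked to match the rates mentioned above) showing how frame rate translates into the time each still image stays on screen, which is where the perceived smoothness difference comes from:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds each frame is displayed at a given frame rate."""
    return 1000.0 / fps

# Example rates discussed above: film standard, common gaming targets.
for fps in (24, 60, 80, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

At 24fps each frame sits on screen for about 41.7 ms, versus roughly 16.7 ms at 60fps and 8.3 ms at 120fps, so the gap between successive images shrinks considerably as the rate climbs.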