The effect of motion blur on our perception of frame rate
I brought my trusty CRT monitor, capable of high refresh rates, up from the basement to try my recent motion blur demo on it. The demo looked great on my 60 Hz LCD, so I had high expectations for 120 Hz.
I was shocked by the outcome. As expected, going from 60 to 120 Hz made a huge, noticeable difference in my demo with motion blur off. But with motion blur on, there was no significant difference. For the first time, I couldn't believe my eyes.
As a competitive gamer and someone very passionate about display technologies, I am very familiar with the concept of refresh rate. The more, the better, I always thought. 60 Hz is okay, but 120 is much better! I could always easily tell the difference. Until now.
After thinking about it, it made sense. This is probably the root of all the "but the human eye can only see at most X frames per second" arguments on the internet. It matters what the frames are, or more precisely, how distinct they are. I can easily tell the difference in frame rate when consecutive frames are very distinct, but I couldn't when motion blur made them less so.
You can experience the motion blur demo for yourself if you have a WebGL-capable browser.
The examples are abundant. Take movies. At 24 frames per second, the motion looks acceptably smooth. But take any video game running at 24 frames per second and it's unplayable. The difference is motion blur. Each frame of film captures light for roughly 1/24th of a second. In a video game, each frame is typically rendered as if with an instant exposure time. For frames like these, yes, the higher the frame rate, the smoother the motion appears.
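To make the difference concrete, here is a minimal sketch (in TypeScript, and not the code behind the demo) of what each kind of frame records for a simple one-dimensional motion. The speed, frame count, and the assumption that the shutter stays open for the whole frame interval are made up purely for illustration.

```typescript
// Sketch only: contrast an instant-exposure frame, which freezes the object
// at a single point, with a film-style frame, which smears the object across
// every position it visits while the shutter is open.

const speed = 240;              // object speed in pixels per second (made-up value)
const fps = 24;                 // film frame rate
const frameDuration = 1 / fps;
const exposure = frameDuration; // assumption: shutter open for the whole frame interval

const position = (t: number) => speed * t; // simple 1D constant-velocity motion

for (let frame = 0; frame < 4; frame++) {
  const t = frame * frameDuration;

  // Instant exposure: the frame records a single position. Consecutive frames
  // land 10 px apart here, so the eye sees discrete jumps.
  const instant = position(t);

  // Film-style exposure: the frame records a smear covering everything the
  // object passed through during the exposure. Consecutive smears tile the
  // motion path with no gaps, so the motion reads as continuous even at 24 fps.
  const smearStart = position(t);
  const smearEnd = position(t + exposure);

  console.log(
    `frame ${frame}: instant at ${instant.toFixed(1)} px, ` +
    `blurred smear from ${smearStart.toFixed(1)} to ${smearEnd.toFixed(1)} px`
  );
}
```

Real-time motion blur techniques generally approximate that smear, for example by accumulating several sub-frame samples or stretching geometry along its velocity, rather than by actually integrating light over time.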
I am planning to confirm my findings on a 120 Hz LCD monitor soon, but I am certain that the results will be similar.
I think you might find that film frames (shot at 24fps) will be exposed to light for only 1/48th of a second. The shutter is usually open for only ‘half a frame’.