Video is created by showing lots of still pictures quickly in a row. In filming, each of those still pictures is called a frame. The number of frames that appear each second is called the frame rate. It’s measured in fps, or frames per second.
In the United States the most common frame rate for television is 29.97 fps (though it is commonly approximated to 30 for ease of discussion). In Europe it is 25 fps. The reason the numbers are different goes all the way back to when television was first created. Back then, there were no digital circuits; everything was analogue. And when TV signals were broadcast, the engineers needed something to use as a timing reference to keep the video in sync.
The agreed-upon method was to use the frequency of the AC mains electricity. In the US this was 60 Hz (the AC current cycles 60 times per second). Engineers used half of that frequency, 30 fps, to drive the timing circuit that kept the video in sync. In Europe, electricity flows at 50 Hz, so half of that is 25 fps.
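The halving described above is simple arithmetic, sketched here for clarity (the function name is just illustrative; historically the mains frequency set the interlaced field rate, and two fields made one full frame):

```python
# Sketch: deriving early TV frame rates from the AC mains frequency.
# The mains frequency set the field rate; one full frame was built
# from two interlaced fields, so the frame rate is half the mains.
def frame_rate_from_mains(mains_hz):
    return mains_hz / 2

print(frame_rate_from_mains(60))  # US mains: 30.0 fps
print(frame_rate_from_mains(50))  # European mains: 25.0 fps
```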
After colour television was introduced, it turned out that in the US the new colour signal suffered from interference with the sound carrier signal, which degraded the picture. To solve the problem, engineers reduced the frame rate very slightly. This eliminated the interference and left the US with a frame rate of 29.97 fps, though almost everyone still refers to it as 30 fps.
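The exact size of that reduction is not stated above; as an aside, the NTSC colour standard scaled the nominal 30 fps by a factor of 1000/1001, which is where the 29.97 figure comes from:

```python
# The NTSC colour standard scaled the nominal 30 fps by 1000/1001
# (a fact from the standard, not stated in the text above).
ntsc_fps = 30 * 1000 / 1001
print(round(ntsc_fps, 2))  # 29.97
```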
These standards still carry over to video streaming online, but there are applications where 15 fps, 24 fps, and even 60 fps are used. The higher the frame rate, the smoother the video will look, which is ideal for fast-action video. The trade-off is that a higher frame rate means more data, larger files, and greater bandwidth requirements.
A rule of thumb is to encode at the lowest frame rate you can while maintaining the integrity of your video. This helps reduce bandwidth requirements and spares viewers with limited Internet connections problems like buffering and long load times.
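The trade-off described above is roughly linear for raw video: doubling the frame rate doubles the data. A minimal sketch with illustrative numbers (real streams are compressed, so the relationship is approximate, not exact):

```python
# Sketch: how frame rate scales the raw (uncompressed) data rate.
# Numbers are illustrative only; compression makes real-world
# bitrates much lower, but the scaling with fps is still roughly linear.
def raw_bitrate_mbps(width, height, bits_per_pixel, fps):
    # bits per frame times frames per second, converted to megabits
    return width * height * bits_per_pixel * fps / 1_000_000

for fps in (15, 24, 30, 60):
    print(f"{fps} fps -> {raw_bitrate_mbps(1280, 720, 24, fps):.1f} Mbps raw")
```

At 720p with 24 bits per pixel, 30 fps works out to about 663.6 Mbps raw, and 60 fps to twice that, which is why lower frame rates (and compression) matter so much for limited connections.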