Frame Rates

Frame rates refer to the number of individual images displayed per second when a piece of video is played back.

It is often written as a number followed by fps (frames per second), e.g. 25fps. Depending on your delivery requirements, the country or broadcaster that you are working with, and the look that you want to achieve, there are a number of different frame rate options when delivering programmes.

It’s important to ensure that the frame rate does not change during the output from the edit, or during a programme. The list below outlines a number of different frame rate options.

24 fps

24 full-frame images per second is the standard for many feature film deliveries. This frame rate is typically used for recording and for mastering for cinema and high-end drama. DCPs, the Digital Cinema Packages used to deliver films to cinemas, can use this frame rate in either the InterOp or SMPTE standard, and can also be delivered at higher frame rates.

50Hz vs 60Hz

TV refresh rates were historically linked to mains electricity grid frequencies, and as a result two main frame rates exist in TV: 25 fps (derived from the 50 Hz grids used mainly in Europe, Africa and parts of Asia) and 30 fps (derived from the 60 Hz grids used mainly in the Americas and parts of Asia). These two standards divided the world in the analogue TV era. NTSC originated in North America and is essentially based on a frame rate of 30 fps, which goes back to the 60 Hz design of the US power grid.

PAL was used in Europe and is based on 25fps, which was tied to the 50 Hz European mains frequency.

When NTSC introduced colour television, the frame rate was lowered slightly to 29.97 fps. To keep timecode in step with real time at this rate, "drop-frame timecode" (as opposed to "non-drop-frame timecode") skips the frame numbers 00 and 01 at the start of every minute, except for every tenth minute; no actual picture frames are discarded. Fractional variants also exist for 24 and 60 fps (23.98 and 59.94 fps).
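As a worked illustration of the counting rule described above, the standard drop-frame arithmetic can be sketched in Python. This is a minimal sketch; the function name is our own:

```python
def to_drop_frame(frame_number: int) -> str:
    """Convert a frame count at 29.97 fps to SMPTE drop-frame timecode.

    No picture frames are dropped; only the timecode labels 00 and 01
    are skipped at the start of each minute, except every tenth minute.
    """
    fps = 30                            # nominal counting rate
    drop = 2                            # labels skipped per drop minute
    frames_per_min = fps * 60 - drop    # 1798 labels in a drop minute
    frames_per_10min = fps * 600 - 9 * drop  # 17982: nine of ten minutes drop

    d, m = divmod(frame_number, frames_per_10min)
    if m > drop:
        frame_number += drop * 9 * d + drop * ((m - drop) // frames_per_min)
    else:
        frame_number += drop * 9 * d

    frames = frame_number % fps
    seconds = (frame_number // fps) % 60
    minutes = (frame_number // (fps * 60)) % 60
    hours = frame_number // (fps * 3600)
    # Drop-frame timecode is conventionally written with a semicolon
    return f"{hours:02d}:{minutes:02d}:{seconds:02d};{frames:02d}"
```

At the one-minute mark the timecode jumps from 00:00:59;29 straight to 00:01:00;02, while at the ten-minute mark it reads 00:10:00;00 as normal.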

Frames and fields

All the frame rates mentioned above refer to full-frame images. However, broadcasts are actually delivered as either progressive (p) or interlaced (i). 25p corresponds to 50i: an interlaced signal carries 50 fields per second, and each pair of fields combines into one full frame, giving 25 frames per second.

For more information on Interlacing see the article on Progressive and Interlace.

It is important to understand what the final delivery requirements will be when configuring your camera equipment. If the sound is recorded separately from the video you should apply the same settings to the sound recording equipment to keep the pictures and sound in sync.

When dealing with fractional frame rates, such as 24 vs 23.98 fps, it's important to be consistent across your production settings; mismatches can lead to pictures and sound drifting out of sync, flicker and other post-production problems.
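To see why a 24 vs 23.98 mismatch matters, the drift can be calculated directly. A minimal sketch, assuming sound is recorded against an exact 24 fps clock but played back at 24000/1001 fps (the precise value behind "23.98"):

```python
from fractions import Fraction

fps_exact = Fraction(24000, 1001)   # "23.98" fps as an exact rational
fps_nominal = 24

duration = 3600                     # one hour of real time, in seconds
frames = fps_nominal * duration     # frames produced at exactly 24 fps

# Playing those frames back at 24000/1001 fps takes slightly longer:
playback = frames / fps_exact
drift = float(playback - duration)
print(f"drift after one hour: {drift:.2f} s")
```

The mismatch is only 0.1%, but it accumulates to about 3.6 seconds per hour, which is far more than enough to put dialogue visibly out of sync.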

Frame Rate Conversion

If the production does not have a final delivery specification or has not decided on a frame rate for acquisition, it may be possible to record at a given frame rate and then apply frame rate conversion to the material. This is not advised, as the conversion process can cause image artefacts or problems with picture and sound synchronisation. In addition, some frame rate conversions change the running length of a film.
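The change in running length is easy to quantify for the common case of a speed-up conversion, where every source frame is simply played at the destination rate. A minimal sketch (the function name is our own):

```python
def retimed_runtime(runtime_s: float, src_fps: float, dst_fps: float) -> float:
    """Runtime after playing every source frame back at the destination rate."""
    return runtime_s * src_fps / dst_fps

# A 90-minute feature shot at 24 fps, sped up for 25 fps delivery:
new_runtime = retimed_runtime(90 * 60, 24, 25)
print(new_runtime / 60)   # minutes
```

The 24-to-25 fps speed-up shortens a 90-minute film to 86.4 minutes, roughly 4% shorter, and the audio must be retimed (and usually pitch-corrected) to match.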

HFR - Higher frame rate

HFR describes recording, and playing back, material at much higher frame rates than are usually delivered for film and TV. For films this could be 48, 60 or 120 (or more) frames per second; for television, 120 fps is already under discussion for UHD, with even higher rates proposed for high-quality VR experiences.

The advantage of this technology is that it reduces motion blur, which results in a sharper image. A disadvantage is that, combined with growing resolutions, HFR inflates data rates and file sizes, which increases the challenge of moving, managing and processing the material. One of the early cinematic uses of HFR was Peter Jackson's Hobbit trilogy, shot at 48 fps.
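The data-rate impact scales linearly with frame rate, which can be sketched with a back-of-the-envelope calculation. A minimal example, assuming UHD resolution and roughly 20 bits per pixel (an assumed figure, in the region of 10-bit 4:2:2 sampling):

```python
def uncompressed_rate_gbps(width: int, height: int,
                           bits_per_pixel: float, fps: float) -> float:
    """Approximate uncompressed video data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# UHD (3840x2160) at an assumed 20 bits per pixel, across frame rates:
for fps in (24, 60, 120):
    print(fps, round(uncompressed_rate_gbps(3840, 2160, 20, fps), 1), "Gbps")
```

Going from 24 fps to 120 fps multiplies the uncompressed data rate fivefold, from roughly 4 Gbps to about 20 Gbps, before any increase in resolution or bit depth is even considered.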