Frame rate (fps) 29.97 vs 30

  • NorwayStock 27 Jan 2013 07:31
    When generating time lapse sequences, what is the preferred frame rate? I would like to hear what buyers (editors) prefer.
    When shooting footage I always make the timeline 29.97 fps, or 25 fps if I want the European format. But for a time lapse I can choose the rate when generating the sequence: 30p, 29.97p, 25p and so on...
    Until now I have generated the time lapse sequences in 30p, but I wonder if this is the most appropriate. Would it be better to generate at 29.97 to match regular footage?
    Any opinions?
  • markoconnell 27 Jan 2013 07:40
    I make mine 29.97, but I don't think the difference between rendering at that and 30 fps is significant. They're both effectively 30 fps; the 29.97 is an NTSC thing: drop frame timecode. An antique that insists on persisting.

    "The Difference Between Frame Rate and Timecode:
    The frame rate of your film or video describes how rapidly frames are photographed or played back. It refers to the physical speed of image capture and playback. Timecode is merely a method of labeling frames with unique identifiers to easily find them again later. It is a convenient way of giving each frame a name that can be referred to later without having to verbally describe and visually search for it. Even though frame rate and timecode are independent, people commonly confuse the two, which can lead to frustrating problems in post-production. Before you start a project, be certain that you understand the difference between these two terms."

    FCP User Manual

    Excellent longer explanation here:

    "Re: Will 30 fps work when specs call for 29.97?
    by Tim Kolb on Feb 16, 2011 at 2:21:53 pm

    Alex Udell asks: "Isn't 29.97 shorthand for 'drop frame timecode,' which is a method of counting frames, not a method of drawing them?"

    Drop frame timecode was invented to accurately count elapsed time for 29.97 frames per second...that's why drop frame timecode exists.

    Video frame rates were originally tied to the electrical system the television system was using... The USA (as well as other countries) had a 60 Hz power grid, so 30 fps (60 interlaced fields) made for easy synchronization and minimized conflicts between frame refresh and power frequency. PAL is 25p/50i because the electrical system in the countries that adopted it ran at 50 Hz instead of 60 Hz. (PAL also has 100 more lines of resolution... France actually broke away with SECAM and was experimenting with broadcasting 1000-plus-line images right after World War II... but that's a story for another time.)

    When it came time to add color to black and white in the NTSC... (PAL standards were established later and therefore were better prepared for color images and until the recent advent of digital television, PAL SD images simply slaughtered NTSC SD for accurate, high quality color rendition...but once again, I digress)...a compromise had to be made if that huge installed base of Black and White televisions were not to be made instantly obsolete. Americans may be the most backwards compatible culture there is in some of these areas (some would say that "compatible" is not a necessary component to that description), and TV was one of the biggest examples.

    In order to create color images that color TVs would see, but maintain B&W images that older sets would see, the color would have to be a "layer" that the black and white sets could simply ignore. If you look at your waveform monitor while viewing an analog NTSC signal, you'll see a small vertical rectangle that straddles the line at 0, sitting between the lines on the back porch of the horizontal sync... that's the "burst"... the phase reference for the modulated color signal. If you look at a waveform and see no burst, the image you are viewing will be black and white, as the "color layer" that gets laid on top isn't there...

    Well... this color information was... more information. Since the outgoing signal couldn't simply be structured as a color signal natively, the color info was added to the information stream... which took some space. Since the FCC wasn't willing to re-issue broadcast licenses for a "channel and a half" of bandwidth... the solution had to be something done serially.

    Slowing the frame rate by 3/100ths of a frame per second (from 30 down to 29.97 fps) allowed the necessary space to stick in this murky color overlay so that it could be used by color sets, while the framerate was still usable by the existing 30 fps black and white sets. It's why "analog component video" in NTSC terms is still one Y' channel (the black and white signal) plus the metaphorical longitude-and-latitude color coordinate system, "Cb and Cr" (digital) or "Pb and Pr" (analog... Sony also labels them R-Y and B-Y), to add a given hue and saturation to each luma pixel (S-VHS separated composite into "Y" and "C" in the cable, but it was still just a more cleanly handled 'composite' signal), as opposed to an RGB additive system, which would store the information with far better quality albeit a higher data load, all things being equal...

    The problem with 29.97 frames per second is that you still need 30 frame numbers to account for the images that occur every second, but you end up with an accumulated error...about 3.6 seconds every hour, amounting to well over a minute a day. Mistakes that would happen because of the error could cost lots of money in botched air slots, etc. A minute is two commercials and each commercial is revenue for the TV station and when it comes to money...broadcasters respond.

    By removing ("dropping") the first two frames (XX;XX;XX;00 and XX;XX;XX;01) from every minute EXCEPT each tenth minute, you create an incremental correction that keeps the studio clock and SMPTE timecode on the same page throughout the broadcast day.

    When it comes to web-based video, I have no idea why the OP's client specified 29.97 other than they have some old TV dog like me who has no clue that the web doesn't use fractional frame rates since there is no "color under" in Flash or H264...therefore I'd think deploying a video on the web would favor 30 fps even...but there may be some reason why the client uses DF timecode internally or something.

    If the program starts out being broadcast on television in a formerly "NTSC" system (there is no NTSC and PAL in HD, only framerates, even though some professional equipment manufacturers still label their gear that way)... then you have to have 29.97 and DF timecode. Even though digital television no longer needs to allow for the stupidest video framerate decision ever made (oops... did I type that out loud?), we have to be the most backwards compatible culture ever...

    Therefore the framerate compromise we (NTSC countries) made in 1953...still haunts us today.

    Don't even get me started on why American NTSC black is 7.5 IRE gray...

    It may sound like I'm old and cranky, but really...

    oh, excuse me...

    I need to get my walker over to the window and yell at some kids on my lawn...

    Director, Consultant
    Kolb Productions,"
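
    The drift and the drop-frame correction Tim Kolb describes can be checked with a short sketch (my own code, not part of the quoted post; the function name is made up):

    ```python
    from fractions import Fraction

    NTSC_FPS = Fraction(30000, 1001)   # the real rate, ~29.97002997 fps

    # Drift: hand out labels at a flat 30 per second and you overshoot real time.
    real_frames_per_hour = 3600 * NTSC_FPS   # ~107892 frames actually shown
    counted_per_hour = 3600 * 30             # 108000 labels handed out
    print(round(counted_per_hour - real_frames_per_hour))   # ~108 frames/hour, ~3.6 s

    def to_drop_frame_tc(frame: int) -> str:
        """Frame count -> 29.97 drop-frame timecode (hh;mm;ss;ff).

        Skips labels ;00 and ;01 at every minute except each tenth minute,
        exactly as described in the quote above.
        """
        fps = 30
        per_10min = 10 * 60 * fps - 9 * 2   # 17982 real frames per 10 minutes
        per_min = 60 * fps - 2              # 1798 real frames in a dropping minute

        tens, rem = divmod(frame, per_10min)
        # minute 0 of each 10-minute block drops nothing
        dropped_minutes = 0 if rem < 60 * fps else 1 + (rem - 60 * fps) // per_min
        frame += 2 * (tens * 9 + dropped_minutes)   # re-insert the skipped labels

        ff = frame % fps
        ss = (frame // fps) % 60
        mm = (frame // (fps * 60)) % 60
        hh = frame // (fps * 3600)
        return f"{hh:02d};{mm:02d};{ss:02d};{ff:02d}"

    print(to_drop_frame_tc(1800))     # 00;01;00;02 -> labels 00 and 01 were skipped
    print(to_drop_frame_tc(107892))   # 01;00;00;00 -> one real hour lands on the hour
    ```

    Note the last line: 107892 frames is one wall-clock hour at 29.97, and the dropped labels bring the timecode out at exactly one hour, which is the whole point of the scheme.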
  • NorwayStock 28 Jan 2013 19:34
    Thanks for your reply, Mark! Interesting reading.
    I took a scan of the best-selling time lapse clips at P5, and both 29.97 and 30 fps clips show high sales numbers, so I conclude that this is not so important. I think I'll stay with 30 fps.
  • jason 28 Jan 2013 22:59
    Everyone has an answer as to why we in the US use 29.97 fps instead of 30 fps. So here's the reason 30 fps is no longer used.

    The frame rate of television actually was exactly 30 frames per second at one point in time. That all changed when color television was introduced. When a signal carrying color information was added to the television transmission, there was a big problem: the color carrier signal was beating against the sound carrier signal, because they were very close together in the spectrum, and it made the picture unwatchable. The quick fix they came up with was to reduce the framerate by about 0.03 fps, which moved the interference between the two signals out of phase.

    We have been stuck with this frame rate ever since.
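
    For anyone who wants the actual numbers behind that fix, here is a sketch using the standard NTSC color constants (my own addition, not from jason's post): the horizontal line rate was redefined as the 4.5 MHz sound carrier divided by 286, and with 525 lines per frame that forces the rate just below 30 fps.

    ```python
    # Standard NTSC color relationships (assumed constants, not from the post above)
    sound_carrier_hz = 4_500_000                  # audio intercarrier
    line_rate_hz = sound_carrier_hz / 286         # ~15734.27 Hz horizontal rate
    frame_rate = line_rate_hz / 525               # 525 lines per frame
    color_subcarrier_hz = line_rate_hz * 455 / 2  # ~3.579545 MHz

    print(round(frame_rate, 5))       # 29.97003 fps
    print(round(30 - frame_rate, 5))  # 0.02997 fps slower than the old 30
    ```

    The half-line offset of the subcarrier (455/2 times the line rate) is what interleaves the color and luma spectra, and tying everything to the sound carrier is what dragged the frame rate off 30.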
  • wideweb 29 Jan 2013 19:08
    Actually, the exact rate is 30000/1001 ≈ 29.97003 fps, so the difference is 30 - 29.97003 ≈ 0.02997 fps, which is where the "0.03" figure comes from.
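
    The exact figure is a ratio rather than a round decimal, which is easy to verify (a quick sketch of my own):

    ```python
    from fractions import Fraction

    ntsc = Fraction(30000, 1001)   # the exact NTSC frame rate
    print(float(ntsc))             # ~29.97003 fps
    print(float(30 - ntsc))        # ~0.02997 fps, i.e. roughly 0.03, not 0.024
    ```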