24 fps: Where Does It Come From?

Back in the day (the turn of the last century) there were no such things as camera batteries or sound men. Men were men and cameras were hand cranked. As they had evolved from still cameras they were sort of still-camera Gatling guns, capturing still frames as fast as you cared to crank. Somewhere past 14 fps something magical happened: persistence of vision started to fuse the images, so rather than a fast slide show it started to look like motion. So cameras were built to move one linear foot of film per two cranks, which meant if you cranked at “coffee grinder” speed you hit 60 feet per minute, which comes out to 16 fps, just north of that 14 fps effect. Cranking faster improved the persistence-of-vision thing, but producers didn’t like you blowing through all that expensive film, and besides, there was really only one stock available and it was slow, about 24 ASA. Cranking faster meant less light per frame. Sometimes you cranked even slower than 14 fps to squeeze a bit more exposure out of it.

This is where it gets a little weird. Projectionists, who were also often hand cranking their projectors, had a habit of cranking faster. Faster meant faster turnaround in seating, which meant more $, and even better persistence of vision without annoying flicker. Sure, the action was sped up, but the whole thing was new; no one seemed to complain. In fact, by 1925 the Society of Motion Picture Engineers (SMPE, now known as SMPTE) had codified it, recommending:

60 feet per minute (16 fps) for camera speed and 80 feet per minute (21.3 fps) for projection. It seems weird now to pick a different speed for display than for capture, but to review: faster cameras cost more money, faster projectors made money, and after all, producers were paying for everything.

Proposed standard cranking and projection speed, circa 1927, from SMPE


Anyway, someone decided it would be a great idea to add sound. How hard could it be? In fact, several companies tried to be the first to bring sound to the movies, hoping to capture the market. Funny thing is, they all insisted on capturing at the same frame rate they displayed at. If you didn’t, the pitch would be all wrong and everybody would sound silly. And forget about music. Some picked 80 feet per minute (the already established speed for projection), some picked 85 feet per minute, and some picked 90 feet per minute. The first to get a working system was Warner Brothers’ Vitaphone. It was used in 1927’s “The Jazz Singer,” which was the first feature-length film with sync dialog and is considered the official start of the “Talkies.”


Western Electric’s Bell Telephone Laboratories (and their Vitaphone system), along with other systems, listed with their taking (capture) and projection speeds (S.M.P.E. 1927)


The Vitaphone engineers had picked 90 feet per minute, or 24 fps, as their capture and projection speed. If one of the others had been first, we could easily be shooting 21.33 fps or 22.67 fps as a standard today. So sometimes you get lucky.
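These feet-per-minute figures all convert to frame rates the same way, because standard 4-perf 35mm film runs 16 frames per linear foot. A quick sketch of the arithmetic (the function name is mine, not from any period source):

```python
# 35mm 4-perf film carries 16 frames per linear foot.
FRAMES_PER_FOOT = 16

def feet_per_minute_to_fps(feet_per_minute):
    """Convert film transport speed (ft/min) to frames per second."""
    return feet_per_minute * FRAMES_PER_FOOT / 60

print(feet_per_minute_to_fps(60))  # 16.0   -> silent-era camera standard
print(feet_per_minute_to_fps(80))  # ~21.33 -> silent-era projection standard
print(feet_per_minute_to_fps(90))  # 24.0   -> Vitaphone, and today's standard
```

Movietone's original 85 ft/min works out the same way to roughly 22.67 fps.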
Except the Vitaphone system was terrible. It sounded good, but that’s all that could be said about it. The sound was recorded on 16″ disk records separate from the film. They could only be played 20-30 times before they were no good, and they could break, so you had to send lots of duplicate disks with each roll of film to the projectionist. A disk only covered one reel, so at every reel change you had to cue up another record. And synchronizing the needle with the head of the roll was a pain in the ass. And if you broke the film for some reason and spliced it back together, everything past that point was out of sync. During recording, the camera had to be motor powered from the mains, and the disks had to be made in a recording booth adjacent to the set. In fact, it was such a bad system that it was abandoned 5 years after it was implemented. And it only lasted that long because all the theaters that wanted sound had bought into that technology and had these crazy phonograph contraptions connected to their projectors, and weren’t eager to throw them away just after having bought them. Movietone, which used technology that put the audio as an optical track on the film, had many advantages, but it was a little late out of the gate. Because Vitaphone was first, the engineers of Movietone decided to match the Vitaphone frame rate.

“Originally we recorded at a film speed of 85 feet per minute. After Affiliation with the Western Electric Company, this was changed to 90 feet per minute in order to use the controlled motors already worked out and used in the Vitaphone system.  There are a large number of both Vitaphone and Movietone installations scheduled and in operation, and sufficient apparatus is involved to make it impractical to change the present practice of sound reproducing.  In connection with the Society’s standard, I have been unable to find any New York theater which is running film at 85 feet a minute; the present normal speed is 105 feet and on Sundays often 120 feet per minute is used in order to get in an extra show”

Earl I. Sponable, Technical Director, Fox-Case Corporation, New York City (“Some Technical Aspects of the Movietone,” S.M.P.E. #31, September 1927, page 458)

Soon enough Movietone lost ground as well, as technology changed but all subsequent sound systems stuck with the now established 24fps. So blame a sound man. Or thank him. Your choice.


One of the first sound men checking a Vitaphone recording with a microscope while recording. Sort of a human playback head. (page 308 from Transactions of S.M.P.E. August 1927)  It turns out this man is George Groves.


Postscript: Now, of course, we often mean 23.976 fps when we say 24 fps. This one we can’t blame on sound. 23.976 fps as a camera frame rate can be blamed on the introduction of color to standard-def television broadcasts in the 1950s, the death of film as a capture medium, and by extension the death of telecine as a post process.

When TV started, it did not match the 24 fps established by film. This is because engineers wanted to use the 60 Hz cycle of our 110 V, 60 Hz household power to drive the frame rate. 60 Hz meant 60 fields, or 30 frames per second, and was pretty easy to implement. Once color came along in the 1950s, they wanted a standard that would be backwards compatible with black-and-white TVs. Engineers could no longer use the 60 Hz rate of household electricity to drive frame rates and still make the color and luminance signals play nice, so they settled on a very close rate of 59.94 Hz. This resulted in a frame rate of 29.97 fps instead of the previous 30 fps, something the black-and-white receivers would still work with.
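That “very close” rate wasn’t arbitrary: 59.94 Hz is exactly 60 Hz scaled by 1000/1001, which is where all the awkward decimals in broadcast rates come from. A small sketch using exact fractions (purely illustrative):

```python
from fractions import Fraction

# NTSC color scaled the 60 Hz field rate by exactly 1000/1001.
field_rate = Fraction(60, 1) * Fraction(1000, 1001)  # 60000/1001 Hz
frame_rate = field_rate / 2                          # 30000/1001 fps

print(float(field_rate))  # 59.9400599... -> the "59.94 Hz" rate
print(float(frame_rate))  # 29.9700299... -> the "29.97 fps" rate
```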

Telecine: in order to get film onto TV you had to do a step called telecine. The film was played back and captured, essentially by a video camera. Getting 24 fps to fit into 30 fps was done via a clever math solution called 3:2 pulldown. There are two fields to a standard-def frame, and thus 60 fields per second. 3:2 pulldown would use one film frame to make three fields (1.5 frames) of video, then the second film frame made two fields (1 frame) of video, then the third frame made three fields again, and so on. Doing this, 24 fps fits quite nicely into 30 fps broadcast. And anything shot at 24 fps but shown on a 29.97 fps system would look like it had been shot at 23.976 fps, even though the camera had been running at 24 fps, as anything that ran through the telecine went through a 0.1% slowdown to conform to the 29.97 fps broadcast standard. Somewhere in the transition to High Definition, 23.976 became codified as a standard, not only for broadcast, but as a capture speed. As cameras more and more were digital and not film, they would choose 23.976 as the actual camera frame rate rather than 24 fps, since they could no longer expect the 0.1% slowdown to happen upon transfer from film to video, as had happened to film in telecine rooms. No telecine, no slowdown, which meant it had to be implemented in the actual camera speed.
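The cadence described above can be sketched in a few lines of Python (my own illustration, not a broadcast-grade implementation): alternating film frames contribute three fields, then two, so 24 film frames become exactly 60 fields.

```python
from fractions import Fraction

def pulldown_32(film_frames):
    """Expand a list of film frames into interlaced fields via 3:2 pulldown."""
    fields = []
    for i, frame in enumerate(film_frames):
        # Even-indexed frames yield 3 fields, odd-indexed frames yield 2.
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

one_second = list(range(24))      # one second of 24 fps film
fields = pulldown_32(one_second)
print(len(fields))                # 60 fields = 30 video frames

# And the 0.1% slowdown: 24 fps material conformed to a 30000/1001 fps
# system ends up running at exactly 24000/1001 fps.
print(float(Fraction(24, 1) * Fraction(1000, 1001)))  # 23.976023...
```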

So, hate 23.976 fps? Blame a sound man, color TV, the death of film, and the whole accidental way we pick our standards.


For those interested in reading more, I highly recommend reading online records of the Journal of Society of Motion Picture Engineers, made available by the Media History Project. http://mediahistoryproject.org/technical/



Why I hate UAV copters

Drones. UAVs. Octocopters. Call them what you want. They are the new disruptive technology in a lot of applications, but I am specifically going to talk about them as they apply to my industry, as a camera platform for dramatic, narrative, or commercial work. You can see the allure: they let you get shots that otherwise would be difficult, or in some cases impossible, via traditional methods. And camera movement is the best way to add production value to your shoot.
And I hate them. I hate them like I hate Steadicam. What’s that, you say? Hate Steadicam? What kind of Luddite or backwards filmmaker are you? Let me explain. I do not hate the Steadicam device per se, and I completely agree that Steadicam allows for shots that could not be obtained any other way. I might even be convinced to use one some day. Here is what I hate about Steadicam: people act like it is the solution to everything and will make everything awesome. A Steadicam is not awesome sauce you get to spread all over your shoot. It has weaknesses, just like any camera platform. Let’s review. A Steadicam cannot provide a stable horizon on a static shot, especially after it has been moving, which is why you use Steadicam in the first place. There are operators who can mitigate this, but it is inherently difficult with this system, yet directors insist on designing shots completely blind to the weaknesses of the platform, and Steadicam operators struggle to make the shot work.
Another misconception is that Steadicam systems are fast. Tracks don’t need to be laid, it has the freedom of handheld, you can just go. The fact is often quite the opposite: Steadicam can cause the shoot to slow way down. First of all, the whole system of a Steadicam requires that the rig be balanced. This means a lens change, the addition of a filter, or adding a timecode box all require time out to balance the rig. If you are dealing with a shoot with only one camera body, going from tripod to Steadicam can be a very involved process, and it ties up the camera during that process.
Once you have the camera balanced and on the rig, another thing to keep in mind is that a Steadicam rig with a camera on it is quite heavy. Between the camera, the post, the counterweight, wireless transmitters, arm and vest, it can tax the best operator. This means the operator needs to park it on a stand or docking station when not actually executing the shot. This makes blocking and lighting the shot a bit more difficult as it is best done while the operator is wearing the sled, which you want to keep to a minimum to keep him or her fresh.
Also, as the camera has potentially 360 degrees of movement, lighting can be a challenge. Nowhere is safe, and lights need to either be rigged directly into the ceiling if possible, be hidden somehow, or travel with the camera. Again, all this can be done, but none of it is in the category of “fast.”
So, let’s review: I hate Steadicam because people think it is the secret sauce that will make their shoot better, but they are completely ignorant of its weaknesses. There is one other thing I don’t like about Steadicam, and it occurs even when people understand its weaknesses: the urge to do the “trick shot,” which is an exercise in “look what I can do” rather than filmmaking that drives the story. Sometimes you can do a trick shot and have it move the story at the same time, and people like me can enjoy both aspects of it. But showing off that you know how to use a tool doesn’t mean you have made a great story.
Flying camera platforms may not be Steadicams, but they might as well be. They do and will give you shots that otherwise would have been at the very least difficult, or possibly impossible, before. And those shots have the potential to be amazing. And just like with Steadicams, people will misunderstand and assume that as long as you use a drone, the shot will automatically be amazing. Misunderstanding the tool you are using will result in wasted time, a frustrated crew, and mediocre filmmaking, just as it always has. But with drones there are two new aspects. One is something that is generally happening with all gear in the industry, something optimistically called the “democratization of filmmaking,” but which in practical terms means that good working gear can be purchased for prices closer to a car than a house. There is good and bad in this, but one side effect is that there are a lot more players in the market. Generally this shakes out with those who have skill, or the potential to learn skills adeptly, ending up on top, as it has always been, but without money being the gatekeeper it once was, which means the entry level of the market is crowded, like the beginning of a marathon.
Drones especially seem to fit this category. A few years ago the technology just wasn’t there to make a working drone at any cost. Now the parts and information are out there such that a professional rig can be built from parts ordered online at a very reasonable cost. In fact, turnkey solutions exist for under $800. Back in the 90’s a Steadicam probably cost upwards of $40,000-60,000, and that didn’t include the camera, just the platform. So a lot more people are getting into drones than were ever into Steadicam. Drones are so new that there are no “old hands” at it. Everyone is at the start of the marathon, and it’s crowded.
The other new aspect to drones as a camera platform is the safety issue. This is what really makes me dislike them. Back in the day, a careless Steadicam operator could possibly hurt themselves, damage their rig and the camera, and possibly the nearest person, be that an assistant or actor, although this was quite rare. I know of no stories of this happening directly, although I always think of the emergency ripcord on the Steadicam vests of guys I would assist for, which when pulled would cause the vest to split open and fall away, allowing the operator to shed the rig in seconds in case of a catastrophic event like falling into a large body of water with 80lbs of gear strapped to them. Again, I never heard of anyone having to exercise that option, but it was there.
Drones, on the other hand, are often 20-40 pounds of flying danger, often with eight very-high-speed sharp rotors driven by high-energy, high-capacity lightweight batteries, all under wireless control. Often built from scratch by the operator. Some of them have fail-safes, where if wireless control is lost they will return to the original launch site and descend. That’s great, but only if those automated systems are solid. Again, many of these things are being built from scratch, with the code written, or at least tweaked, by the builder. If the drone loses flight stability, be it from a large gust of wind, operator error, or hardware or software malfunction, you have a potentially lethal falling object that can kill you and others, whether by plain blunt trauma from 20 lbs falling on your head, by cutting you open with the eight high-velocity Ginsu knives it uses to fly, or by burning you when one of the high-capacity batteries ruptures and spews a jet of flame and energy. Look on YouTube and you will find several UAV/drone failures, often triggered by a gust of wind, and possibly complicated by navigational hazards like nearby buildings the drone can hit on its way down, so that its structural integrity is compromised well before it hits you. Now imagine that the price of entry is so low that people with only a passing interest get into it. Before you know it the sky is dark with flying lawn mowers driven by mediocre do-it-yourselfers who think they have the secret sauce to awesome filmmaking.

This is an evolving topic, and the good news is that there has been some attempt to regulate them in a way I approve of. Up until recently there was a big question mark over whether all kinds of drones were illegal, and where the FAA stood on it. It was like the Wild West. It seemed like before the rules got codified it was an “anything goes” approach, which seems very dangerous to me.
Making them illegal seemed untenable. They were so cheap and offered allure to so many people that enforcement seemed almost impossible. Also, if they were illegal, there would be no regulatory control over them. Just this month the FAA has been authorizing individual companies to be certified for flight, exempting them from normally required regulations as long as they fit a certain category of flight, including flying only over a “sterile” environment, i.e. the controlled set. Licenses, permits, and special rules are the way to go. And prosecution of those who refuse to play by the rules. Individual drone operators need to apply for “certification” in order to be legal. This is because the technology is cheap, readily available, and dangerous.
Drone camera platforms need to be safe, legal, and somewhat rare. I don’t hate drones so much as I hate the idea of people flying homemade, unregulated rigs over my head because that will somehow make the shot “cool.” By making them sensibly regulated, they will then (in most cases) be operated by sensible, trained operators, and only when they are the appropriate tool for the job.

P.S. Don’t get me started on the Movi or other handheld gimbal systems.

ProRes Camera Test: BMPCC, F3, F5, and RED Epic MX

Recently I decided to test some of the cameras I often use and/or have access to. These included the Blackmagic Pocket Cinema Camera, a Sony F3, a Sony F5, and a RED Epic MX. Obviously these cameras record natively in different codecs and resolutions, but I decided to even the playing field somewhat by using an external recorder (a Convergent Design Odyssey 7Q) so that everything was recorded in ProRes HQ, except for the Pocket Cinema Camera, which both records ProRes HQ natively and only outputs via HDMI (Highly Dodgy Media Interface), which I hate and prefer not to use.

In all the tests I used the same lens, a RED 18-50mm short zoom, which was PL mount so it fit on all the cameras, all of which had PL mounts. Only on the Pocket Cinema Camera did that significantly change the field of view, as it has more of a Super 16mm sensor size compared to the 35mm sensor size of the others, which I compensated for by zooming out to approximate the same field of view. The RED Epic can of course natively record up to 5K, but in this case I was just recording the 1080 output from a 4KHD recording, as 4KHD most closely matches the 35mm format size.

Also, I did not record LOG on the F3 or F5, or output REDLog on the Epic. I did do both “Film” and “Video” modes on the Pocket Cinema Camera, but I did not run them through Resolve and color correct them as I might if I were actually using the camera. I wanted to look at the footage without doing any alterations other than what I might do in camera. For the F3 and F5, which have scene files that affect the look of the camera, I arbitrarily picked Abel Cine’s JR45Cine scene file for both, anticipating that those would be the scene files I would use were I to shoot with these cameras. For the Epic, I usually shoot with saturation set to 1.2 rather than the default 1.0, and that is what I did here. White balance was preset to 5600 for all cameras.
It was a somewhat unsettled day weather-wise, so take any ability to hold detail outside the window with a grain of salt, as these shots were not all done simultaneously; I was using one lens for all the cameras, and I had only two tripods in play.

Click on any below for full frame.

No surprises here. The Epic and F5 look the best. The F5 skin tones look a bit too red, but the whites look a little truer, I think. The Blackmagic Pocket Cinema Camera looks like it needs to be graded some whether you shoot Video or Film mode, which makes me think that if I am going to have to grade it anyway to make it look its best, I would shoot in “Film” mode in the future and take it through Resolve.

Below are enlargements of the focus target. Again, no surprises. The Sony F5 and RED Epic are the best for resolving power and lack of moiré. I was slightly surprised at the F3’s moiré, and not surprised but slightly disappointed by the Blackmagic Pocket Cinema Camera’s sharpness performance. Again, this is off a screen grab from the ProRes HQ 1080 recording.

If I were to pursue this test further, I would explore scene files on the F3 and F5 to see if I could get truer skin tones out of them while maintaining their range. With the RED I might play with white balance a little, maybe a little lower, to get rid of some of the tones that seem a bit too warm. The next test I had planned, though, was to test my RED Epic after it had its sensor upgraded to the “Dragon” sensor.

My Love/Hate Relationship with Apple and Straying Outside the Walled Garden

The Unrest

A few months ago I was in the market for a small tablet. I was feeling a little claustrophobic in the Apple world, as my laptop, my phone, my wireless router, etc. were all Apple products. I had heard some good things about the new Android operating system. The family has a wireless-only iPad 2 that mostly lives in the house. I was annoyed that Apple doesn’t put a GPS in the wireless-only version of the iPads; this meant that when we took the iPad out on a road trip and tethered it to my phone, the maps program didn’t work worth a damn. Really, Apple? Would that have been so hard? Other Android wi-fi-only tablets do that and come in cheaper. One that rose to the surface was the Google Nexus 7. Hell, if you are going to jump ship from Apple, Google seemed a good bet. Now, they don’t make the tablet, Asus does, but Google writes the code for the software on the Nexus 7. OK, looks good. Better resolution than the iPad, a narrower profile (almost giant-iPhone-like), and more open source. This was also a big draw. I have felt that tablets are artificially handicapped. Why can’t I use them as a phone? My over-40 eyes would love a giant iPhone. And I have no shame; I would absolutely hold a tablet to my head to make a phone call. Why not? It’s not like talking to no one and waving my arms like a crazy person looks any better, as people with Bluetooth earbuds do.

Fine. I did the research and settled on a Nexus 7 with both wi-fi and cell service. I ordered it, got it set up with AT&T, and settled in buying all the apps that had sister apps over in the Apple world. I was able to download apps so I could make free phone calls to and from my Nexus using Google Voice. Very cool. And of course I had Google Maps, not the travesty that Apple Maps has made of itself. (I once had Apple Maps insist I drive into Boston Harbor to get to the waste treatment facility, which, while on the coast, is attached to the mainland.) Great. Maybe I had the new “One Device”: it could be my phone, my GPS, my calendar, etc., and it was big enough to use and just small enough that I could, depending on what I was wearing, get it into a pocket. Now all I had to do was learn the operating system.

The Bumpy Ride

Android OS is… interesting. There would be a few cool new things, then some inexplicable dumbness. I was able to get apps that synced all the calendars on my iPhone and MacBook Pro, as well as my contacts. Not too hard. So far so good. The Nexus being a Google product, it tries really hard to shove Gmail down your throat. OK, fine. I set up a Gmail account. But around here I realized I had left the serenity of the “walled garden” that Apple provides. There is a Gmail-only email app on the Nexus that seems nice enough, but that is not my primary email, so I am never going to use that app. There is also a generic email app, but it seems really limited in functionality. After doing some research, I determined that “Aquamail” was the best, most flexible, most powerful email app out there. I got it and set it up for my multiple email addresses. It has lots of “under the hood” settings about layout, etc., most of which were bad. I kept thinking, “This tablet is bigger than my iPhone, why is it harder to check my email on this thing than it is on that?” Everything was busy, hard to read, hard to keep track of emails, etc. After much futzing I got it to where I was “OK” with it, but not really happy. Plus, Aquamail is made by some Russian developer, and I couldn’t get out of my mind that he could have put a back door in it and now some Russian hacker has access to all my emails. Also, now I have three different email apps, all not great, all checking emails and downloading them. Battery life suffers. I mess around with various settings and improve it somewhat.

On the Road

I have a big trip planned where I will drive up the east coast, vacation with my family, then bus and fly back for work, then fly and bus back and rejoin the vacation already in progress. I figure it is going to be a big “sea trial” for this new Nexus 7. We drive up the east coast using the Nexus 7 as our GPS. Even plugged in, it cannot keep up with the power demands of the tablet. Battery levels drop during the whole day. If we had driven another hour we would have lost navigation, as it couldn’t charge itself fast enough. Something has got to be wrong, I think. I jigger with email fetch settings and some other settings. The next day it seems a little better, although we don’t drive as far. Not good. One of the allures of a tablet is longer battery life than my phone. I know GPS use taxes the thing, but IT WAS PLUGGED IN, into a 2 A USB port, so it should have been fine.

During my vacation I decide the following things: although a bit busy, I like widgets. I like Google Chrome insofar as it integrates with the GPS of the tablet, gives you a welcome web page with info about things to do in your area, and tracks travel plans as long as the tickets were booked via your Gmail account. Voice recognition is surprisingly good. Battery life remains poor. And surprisingly, I miss having a back-facing camera. The Nexus has a front-facing camera, but it ships with no apps to use it, which is odd. Of course, Skype and other apps can use it, so it is not like it is useless.

Airplane mode

I then bus, then fly, back to DC. On the bus I read “Treasure Island,” a free ebook installed on the Nexus to get me hooked on that feature. I do this because I couldn’t figure out how to download a movie via the “Google Play” store. Turns out you can do it, but the setup is counter-intuitive. When I get to the airport I have several hours to kill, so I decide to watch a movie, streaming “Zero Dark Thirty” via the 3G network, taking a huge hit in my monthly usage, but whatever, I chalk it up to the learning curve. On the plane I go back to reading the e-book. The rest of the travel is uneventful. “Zero Dark Thirty” excepted, I finally realize that by not buying an Apple product, not only am I locked out of the iTunes library, I am locked into the Google library, which is pretty weak. The movie selection is not great.

Anyway, during my stay in DC I figure out how to download movies rather than stream them, so I download two for viewing on my travel back. I watch part of one on the plane. When we land I take my tablet out of “airplane mode,” and that is where the trouble begins. The screen starts blinking and flashing, at first, I assume, as all the apps come online and try to call out to the world. I deplane. At baggage claim the tablet is still stuck in some sort of subroutine, and unresponsive. I reboot the thing, thinking that will fix it. It takes an extremely long time to boot. In fact, it never finishes. It just gives me the Google Nexus logo, which on this tablet is suspiciously like a giant “X.” I restart the boot. Nothing. I start using my iPhone to look up how to troubleshoot a Nexus. I boot in safe mode. It still hangs. I do more research. My bus comes. The entire 2-hour bus ride I spend killing my iPhone battery trying to troubleshoot, chat with Asus technicians, etc., on how to get this thing running again. I finally pull the nuclear option and opt to wipe all my data on it. This does not fix it. Let me repeat: in just over two weeks, without downloading any exotic apps, I manage to brick my Nexus to the point where even a hard reset cannot save it. The only hope, apparently, is to connect it to a PC. A PC?! That is it. I am done. I get an RMA from both Asus and B&H, which is where I bought it. I get my money back. And start over.


Nexus “X” of death

The Aftermath

It turns out the Nexus going belly up was a blessing in disguise. I ran out and bought an iPad Mini, no longer upset about the price or feeling claustrophobic living within the iOS environment. It was like coming home to an old friend. As Steve Jobs said, “It just works.” No buggy email apps, good battery life, an excellent library in iTunes, a nice size. I have even come around to Apple Maps, as it seems to have improved somewhat. The one thing I lost is the ability to call in or out like a phone on the Mini. Some version of this can be done, I think, by jailbreaking it, or maybe for calls over wi-fi only. I have found I don’t care so much, as long as everything else works. My one regret is not making one phone call on my Nexus 7 in a public place while it was still working, preferably surrounded by hipsters in a bar or coffee house, to see the ensuing confusion in their eyes as to whether it was lame or cool.