24 fps: Where Does It Come From?

Back in the day (turn of the last century) there was no such thing as camera batteries or sound men. Men were men and cameras were hand cranked. As they had evolved from still cameras, they were sort of still-camera Gatling guns, capturing still frames as fast as you cared to crank. Somewhere past 14 fps something magical happened: persistence of vision started to fuse the images, so rather than a fast slide show it started to look like motion. So cameras were built to move one linear foot of film per two cranks, which meant if you cranked at "coffee grinder" speed you hit 60 feet per minute, which (at 16 frames per foot of 35mm film) comes out to 16 fps, just north of that 14 fps effect. Cranking faster improved the persistence-of-vision thing, but producers didn't like you blowing through all that expensive film, and besides, there was really only one stock available and it was slow, about 24 ASA. Cranking faster meant less light per frame. Sometimes you cranked even less than 14 fps to squeeze a bit more exposure out of it.

This is where it gets a little weird. Projectionists, who were also often hand cranking their projectors, had a habit of cranking faster. Faster meant faster turnaround in seating, which meant more $, and even better persistence of vision without annoying flicker. Sure, the action was sped up, but the whole thing was new; no one seemed to complain. In fact, by 1925 the Society of Motion Picture Engineers (now known as SMPTE) had codified it, recommending 60 feet per minute (16 fps) for camera speed and 80 feet per minute (21.3 fps) for projection.

It seems weird now to pick a different speed for display than for capture, but to review: faster cameras cost more money, faster projectors made money, and after all, producers are paying for everything.
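For reference, here is a quick sketch of the feet-per-minute arithmetic used throughout this piece. It relies only on the fact that 4-perf 35mm film runs 16 frames per linear foot:

```python
# 4-perf 35mm film has 16 frames per linear foot, so converting a transport
# speed in feet per minute to frames per second is just ft/min * 16 / 60.

def fps_from_feet_per_minute(feet_per_minute, frames_per_foot=16):
    return feet_per_minute * frames_per_foot / 60.0

for speed in (60, 80, 85, 90):
    print(f"{speed} ft/min -> {fps_from_feet_per_minute(speed):.2f} fps")

# 60 ft/min -> 16.00 fps  (SMPE camera recommendation)
# 80 ft/min -> 21.33 fps  (SMPE projection recommendation)
# 85 ft/min -> 22.67 fps  (early Movietone)
# 90 ft/min -> 24.00 fps  (Vitaphone, and the speed we inherited)
```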


proposed standard cranking and projection speed circa 1927 from SMPE

Anyway, someone decided it would be a great idea to add sound. How hard could it be? In fact, several companies tried to be the first to bring sound to the movies, hoping to capture the market. The funny thing is they all insisted on capturing at the same frame rate they displayed at. If you didn't, the pitch would be all wrong and everybody would sound silly. And forget about music. Some picked 80 feet per minute (the already established speed for projection), some picked 85 feet per minute, and some picked 90 feet per minute. The first to get a working system was Warner Brothers' Vitaphone. It was used in 1927's "The Jazz Singer," which was the first feature-length film with sync dialogue and is considered the official start of the "talkies."


Western Electric's Bell Telephone Laboratories (and their Vitaphone system), as well as other systems, with their taking and projection speeds listed (SMPE 1927)

 

The Vitaphone engineers had picked 90 feet per minute, or 24 fps, as their capture and projection speed. If one of the others had been first, we easily could be shooting 21.33 fps or 22.67 fps as a standard today. So sometimes you get lucky.
Except the Vitaphone system was terrible. It sounded good, but that's all that could be said about it. The sound was recorded on 16″ disk records separate from the film. They could only be played 20-30 times before they were no good, and they could break, so you had to send lots of duplicate disks with each roll of film to the projectionist. A disk only covered one reel, so at every reel change you had to cue up another record. And synchronizing the needle with the head of the roll was a pain in the ass. And if you broke the film for some reason and spliced it back, everything past that point was out of sync. During recording, the camera had to be motor powered from the mains, and the disks had to be made in a recording booth adjacent to the set. In fact, it was such a bad system that it was abandoned 5 years after it was implemented. And it only lasted that long because all the theaters that wanted sound had bought into that technology and had these crazy phonograph contraptions connected to their projectors, and weren't eager to throw them away just after having bought them. Movietone, which used technology that put the audio as an optical track on the film, had many advantages, but it was a little late out of the gate. Because Vitaphone was first, the engineers of Movietone decided to match the Vitaphone frame rate.

“Originally we recorded at a film speed of 85 feet per minute. After Affiliation with the Western Electric Company, this was changed to 90 feet per minute in order to use the controlled motors already worked out and used in the Vitaphone system.  There are a large number of both Vitaphone and Movietone installations scheduled and in operation, and sufficient apparatus is involved to make it impractical to change the present practice of sound reproducing.  In connection with the Society’s standard, I have been unable to find any New York theater which is running film at 85 feet a minute; the present normal speed is 105 feet and on Sundays often 120 feet per minute is used in order to get in an extra show”

Earl I. Sponable, Technical Director, Fox-Case Corporation, New York City ("Some Technical Aspects of the Movietone," S.M.P.E. #31, September 1927, page 458)

Soon enough Movietone lost ground as well as technology changed, but all subsequent sound systems stuck with the now-established 24 fps. So blame a sound man. Or thank him. Your choice.

Vitaphone

One of the first sound men checking a Vitaphone recording with a microscope while recording. Sort of a human playback head. (page 308 from Transactions of S.M.P.E. August 1927)  It turns out this man is George Groves.

 

Postscript: Now of course, we often mean 23.976 fps when we say 24 fps.  This one we can’t blame on sound.  23.976 fps as a camera frame rate can be blamed on the introduction of color to standard-def television broadcasts in the 1950’s, and the death of film as a capture medium, and by extension the death of telecine as a post process.

When TV started, it did not match the 24 fps established by film.  This is because engineers wanted to use the 60Hz cycle of our 110v 60Hz household power to drive the frame rate.  60Hz meant 60 fields, or 30 frames per second, and was pretty easy to implement.  Once color came along in the 1950's, they wanted a standard that would be backwards compatible with black and white TVs.  Engineers could no longer use the 60Hz rate of household electricity to drive the frame rate and still have the color and luminance signals play nice, so they settled on a very close rate of 59.94Hz.  This resulted in a frame rate of 29.97 fps, down from the previous 30 fps, something the black and white receivers would still work with.
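If you like seeing the arithmetic, all of the "weird" numbers come from the same 1000/1001 scaling of the original whole-number rates. A quick sketch:

```python
# NTSC color scaled the old whole-number rates by 1000/1001 (a ~0.1% slowdown).
for nominal in (60, 30, 24):
    print(f"{nominal} -> {nominal * 1000 / 1001:.3f}")

# 60 -> 59.940  (field rate)
# 30 -> 29.970  (frame rate)
# 24 -> 23.976  (film's rate after the same 0.1% slowdown)
```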

Telecine: in order to get film onto TV, you had to do a step called telecine, where the film was played back and captured, essentially, by a video camera.  Getting 24 fps to fit into 30 fps was done via a clever bit of math called 3:2 pulldown.  There are two fields to a standard-def frame, and thus 60 fields per second.  3:2 pulldown would use one film frame to make three fields (1.5 frames) of video, then the second film frame made two fields (1 frame) of video, then the third frame made three fields again, and so on.  Doing this, 24 fps fits quite nicely into 30 fps broadcast.  And anything shot at 24 fps but shown on the 29.97 fps system would look like it had been shot at 23.976 fps, even though the camera had been running at 24 fps, because anything that ran through the telecine went through a 0.1% slowdown to conform to the 29.97 fps broadcast standard.  Somewhere in the transition to High Definition, 23.976 became codified as a standard, not only for broadcast but as a capture speed.  As cameras became more and more digital rather than film, they would choose 23.976 as the actual camera frame rate rather than 24 fps, since they could no longer expect the 0.1% slowdown to happen upon transfer from film to video, as it had in telecine rooms.  No telecine? No slowdown, which meant it had to be implemented in the actual camera speed.
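Here is a minimal sketch of that 3:2 cadence, just to make the field counting concrete (the frame labels are mine, purely for illustration):

```python
# 3:2 pulldown: alternate giving each film frame 3 fields, then 2 fields.
# Four film frames become ten fields, i.e. five interlaced video frames,
# so 24 film frames per second fill exactly 30 video frames per second.

def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        count = 3 if i % 2 == 0 else 2  # 1st, 3rd, ... frames get 3 fields; 2nd, 4th, ... get 2
        fields.extend([frame] * count)
    return fields

fields = three_two_pulldown(["A", "B", "C", "D"])
print(fields)            # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(fields) // 2)  # 5 video frames from 4 film frames
```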

So, hate 23.976 fps? Blame a sound man, color TV, the death of film, and the whole accidental way we pick our standards.

 

For those interested in reading more, I highly recommend the online records of the Journal of the Society of Motion Picture Engineers, made available by the Media History Project: http://mediahistoryproject.org/technical/

 

 

Why I hate UAV copters

Drones. UAVs. Octocopters. Call them what you want. They are the new disruptive technology in a lot of applications, but I am specifically going to talk about them as they apply to my industry, as a camera platform for dramatic, narrative, or commercial work. You can see the allure: they let you get shots that would otherwise be difficult, or in some cases impossible, via traditional methods. And camera movement is the best way to add production value to your shoot.
And I hate them. I hate them like I hate Steadicam. What's that, you say? Hate Steadicam? What kind of Luddite or backwards filmmaker are you? Let me explain. I do not hate the Steadicam device per se, and I completely agree that Steadicam allows for shots that could not be obtained any other way. I might even be convinced to use one some day. Here is what I hate about Steadicam: people act like it is the solution to everything and will make everything awesome. A Steadicam is not awesome sauce you get to spread all over your shoot. It has weaknesses, just like any camera platform. Let's review. A Steadicam cannot provide a stable horizon on a static shot, especially after it has been moving, which is why you use Steadicam in the first place. There are operators who can mitigate this, but it is inherently difficult on this system, yet directors insist on designing shots completely blind to the weaknesses of the platform, and Steadicam operators struggle to make the shot work.
Another misconception is that Steadicam systems are fast: tracks don't need to be laid, it has the freedom of handheld, you can just go. The fact is often quite the opposite. Steadicam can cause the shoot to slow way down. First of all, the whole system requires that the rig be balanced. This means a lens change, the addition of a filter, or adding a timecode box all require time out to balance the rig. If you are dealing with a shoot with only one camera body, going from tripod to Steadicam can be a very involved process, and it ties up the camera the whole time.
Once you have the camera balanced and on the rig, another thing to keep in mind is that a Steadicam rig with a camera on it is quite heavy. Between the camera, the post, the counterweight, wireless transmitters, arm and vest, it can tax the best operator. This means the operator needs to park it on a stand or docking station when not actually executing the shot. This makes blocking and lighting the shot a bit more difficult as it is best done while the operator is wearing the sled, which you want to keep to a minimum to keep him or her fresh.
Also, as the camera has potentially 360 degrees of movement, lighting can be a challenge. Nowhere is safe, and lights need to either be rigged directly to the ceiling if possible, be hidden somehow, or travel with the camera. Again, all of this can be done, but none of it falls into the category of "fast."
So, let's review: I hate Steadicam because people think it is the secret sauce that will make their shoot better, but they are completely ignorant of its weaknesses. There is one other thing I don't like about Steadicam, and it occurs even when people understand its weaknesses: the urge to do the "trick shot," which is an exercise in "look what I can do" rather than filmmaking that drives the story. Sometimes you can do a trick shot and move the story at the same time, and people like me can enjoy both aspects of it. But showing off that you know how to use a tool doesn't mean you have made a great story.
Flying camera platforms may not be Steadicams, but they might as well be. They do and will give you shots that otherwise would have been at the very least difficult, or possibly impossible, before. And those shots have the potential to be amazing. And just like with Steadicam, people will misunderstand and assume that as long as you use a drone, the shot will automatically be amazing. Misunderstanding the tool you are using will result in wasted time, a frustrated crew, and mediocre filmmaking, just as it always has. But with drones there are two new aspects. One is something that is happening with all gear in the industry, something optimistically called the "democratization of filmmaking," which in practical terms means that good working gear can be purchased for prices closer to a car than a house. There is good and bad in this, but one side effect is that there are a lot more players in the market. Generally this shakes out as it always has: those who have skill, or the potential to learn skills adeptly, end up on top, but without money being the gatekeeper it once was, the entry level of the market is crowded, like the beginning of a marathon.
Drones especially seem to fit this category. A few years ago the technology just wasn't there to make a working drone at any cost. Now enough parts and information are out there that a professional rig can be built from parts ordered online at a very reasonable cost. In fact, turnkey solutions exist for under $800. Back in the 90's a Steadicam probably cost upwards of $40,000-60,000, and that didn't include the camera, just the platform. So a lot more people are getting into drones than were ever into Steadicam. Drones are so new that there are no "old hands" at it. Everyone is at the start of the marathon, and it's crowded.
The other new aspect to drones as a camera platform is the safety issue. This is what really makes me dislike them. Back in the day, a careless Steadicam operator could possibly hurt themselves, damage their rig and the camera, and possibly hurt the nearest person, be that an assistant or actor, although this was quite rare. I know of no stories of this happening directly, although I always think of the emergency ripcord on the Steadicam vests of the guys I would assist for, which when pulled would cause the vest to split open and fall away, allowing the operator to shed the rig in seconds in case of a catastrophic event like falling into a large body of water with 80 lbs of gear strapped to them. Again, I never heard of anyone having to exercise that option, but it was there.
Drones, on the other hand, are often 20-40 pounds of flying danger, often with eight very sharp, high-speed rotors driven by high-energy, high-capacity lightweight batteries, all controlled wirelessly. Often built from scratch by the operator. Some of them have fail-safes, where if wireless control is lost they will return to the original launch site and descend. That's great, but only if those automated systems are solid. Again, many of these things are being built from scratch, with the code written or at least tweaked by the builder. If the drone loses flight stability, be it from a large gust of wind, operator error, or a hardware or software malfunction, you have a potentially lethal falling object that can kill you and others: plain blunt trauma from 20 lbs falling on your head, getting cut open by the eight high-velocity Ginsu knives it uses to fly, or getting burned when one of the high-capacity batteries ruptures and spews a jet of flames and energy. Look on YouTube and you will find several UAV/drone failures, often triggered by a gust of wind, and possibly complicated by navigational hazards like nearby buildings the drone can hit on its way down, so that its structural integrity is compromised well before it hits you. Now imagine that the price of entry is so low that people with only a passing interest get into it. Before you know it, the sky is dark with flying lawn mowers driven by mediocre do-it-yourselfers who think they have the secret sauce to awesome filmmaking.

This is an evolving topic, and the good news is that there has been some attempt to regulate them in a way I approve of. Up until recently there was a big question mark over whether all kinds of drones were illegal, and where the FAA stood on it. It was like the Wild West. It seemed like before the rules got codified it was an "anything goes" approach, which seems very dangerous to me.
Making them illegal seemed untenable. They were so cheap and offered such allure to so many people that enforcement seemed almost impossible. Also, if they were illegal, there would be no regulatory control over them. Just this month the FAA has been authorizing individual companies to be certified for flight, exempting them from normally required regulations as long as they fit a certain category of flight, including flying only over a "sterile" environment, i.e. the controlled set. Licenses, permits, and special rules are the way to go. And prosecution of those who refuse to play by the rules. Individual drone operators need to apply for "certification" in order to be legal. This is because the technology is cheap, readily available, and dangerous.
Drone camera platforms need to be safe, legal, and somewhat rare. I don't hate drones as much as I hate the idea of people flying homemade, unregulated rigs over my head because that will somehow make the shot "cool." By regulating them sensibly, they will (in most cases) be operated by sensible, trained operators, and only when they are the appropriate tool for the job.

P.S. Don't get me started on MoVIs or other handheld gimbal systems.

How to Quiet a Noisy Dragon

My last post about my quick test with the new Dragon sensor had a bit of a surprise, with the Dragon footage looking noisy, especially compared to the previous non-Dragon MX chip.  RED suggested that a fix was on the way, and soon.  They hinted at around a week.  That was June 21st.  My rule of thumb for RED target dates for delivering a product is: take the stated time period, double it, and add two months.  So in this case that would mean somewhere around the first week of September.  Well, RED beat the odds, while still completely missing their target of a week, and on August 6, 2014, a mere 7 weeks later, they released the fix.  The fix is a different way to debayer the RAW footage, selectable in their new beta release of REDCINE.  The feature is called "DEB," or "Dragon Enhanced Blacks," although it could easily be called the "Anti Red Speckle Filter," as it gets rid of the red noise.  It is a checkbox you select right below the Gamma settings in REDCINE.


DEB checkbox just below Gamma Settings in REDCINE.  Here it is not selected, the current default.

The good news is it is retroactive and can be applied to footage you have already shot.  This is great for my purposes, as I don't have to re-do the test.  Here are some quick screen grabs.


same shot from prior test, with DEB applied and not.

DEB dragon Vs Epic-X

RED Epic MX on the left, Dragon with DEB applied on the lower right.

So it definitely improves things. Further testing is warranted, though. I hope to test it against a different camera, like an Arri Alexa or Amira, in the future.  RED is talking about making user-swappable OLPFs, but there's no time frame on that yet.  Once they do put a date on it, don't forget to double it and add two months!

Epic MX vs Epic Dragon.

I just recently got my RED Epic back after RED installed the new “Dragon” chip.  I borrowed an Epic that still had the MX chip and shot side by side tests to see how much better the Dragon chip actually was.  I found a few surprises.

First off, I updated both cameras to the current release build, 5.1.51, and black shaded both of them after the cameras had reached operating temp (adaptive at 65°C).

The weather was a bit unsettled (as it always seems to be when I have time to do these tests), so I decided to put a 35mm RED Pro Prime on the Epic MX and a 50mm RPP on the Dragon and move the cameras a bit to mimic the same frame size at the midground, so I could roll both at the same time and exposure would be identical.  I set them for the same stop, which if I recall was something like f/8 1/3.  According to the false color overlay, the Dragon had more info in the highlights before clipping.  In fact, a small patch of white sky which clipped on the MX was apparently not clipping on the Dragon.  I shot myself in my garage workspace, which has diffused top light and a nice window for testing overexposure.  If I shoot in the afternoon, there is a piece of an apartment building across the street that gets hit with sun, providing an excellent detail/overexposure test element.

People have complained in the past about RED's rendition of skin tones.  I think this is wildly overblown.  I have never had an issue with it, at least under daylight color temps, which is how I shoot the majority of my stuff.  The Dragon is supposed to be much better.  I did not notice much of a difference, nor did I see any problems with the old chip.

One note is that with the new chip come some new settings in REDCine, the app that lets you "develop" the RAW footage from the camera.  Before Dragon there was "REDcolor 3" for color rendition and "REDgamma 3" for gamma.  With Dragon there now is "Dragon Color" and "REDgamma 4."  While I think you should use Dragon Color for rendition off a Dragon chip, don't be fooled into thinking REDgamma 4 is automatically better.  It tends to be a bit crunchier than REDgamma 3, which under normal circumstances helps make a punchier image, but when testing over/under exposure range it may fool you into thinking the Dragon chip has even less range than the regular Epic.  Of course, professional graders probably would just use REDlogfilm (which is flat and holds onto the most range) and make their own curve based on the scene, but in this case I wanted to do a relatively unbiased side-by-side test.


screenshot from REDcine.

Below are some interesting screen grabs:

COLOR RENDITION:


Side by Side Dragoncolor Vs REDcolor3

Dragon on the right, Epic-MX on the left. Dragon set to Dragoncolor, Epic-MX set to REDcolor3. More detail in the Dragon, and slightly less red tint to the skin.

HIGHLIGHTS:

Now here is a detail of the highlights: Dragon on the right, Epic on the left, both set to REDgamma3.  Slightly more highlight detail on the Dragon chip, as expected.

RG3 Dragon vs MX

Don't be fooled into using REDgamma 4 just because it is the newer one.

DRAGON RG4 VS EPIC RG3

Here is the same footage, but with the Dragon chip rendered at REDgamma4.  It is very hard to tell which chip has the advantage in this scenario. The brick looks about the same, maybe a slight advantage to the Dragon, but the leaves look hotter on the Dragon.

Now here is the same thing in REDlogfilm. You will see the Epic MX chip clips to magenta, showing the white is clipping.  Not so in the Dragon, as you might expect.


Epic-MX vs Dragon in REDlogfilm

BIG SURPRISE: NOISE LEVEL:


This is Epic MX on left and Dragon on the right. both at 5600K, 800 ISO, 8:1 compression.  Dragon at 6KHD and Epic-MX at 5KHD.

 

This was the big surprise. The Dragon Epic was sharper, as was to be expected, since its max resolution was higher than the MX's (6KHD in this case vs 5KHD), but there was more noise and pattern in the Dragon footage.  This was alarming, as a year ago the Dragon chip had been advertised as being as quiet at 2000 ISO as the MX chip was at 800 ISO. What the hell? I paid over 8 grand to get MORE noise?

I went over to REDUSER.net and found someone else had done essentially the same test I had done and got the same results. If you are into flogging yourself, here is the link:

http://www.reduser.net/forum/showthread.php?117701-DRAGON-vs-EPIC-MX-Noise-NOT-GOOD

CONCLUSION:

Anyway, the long and short of it was this: it is the new OLPF, along with black shading, that is to blame.  When Dragons first started shipping to anybody (maybe December 2013?) they had what I will call version 1 of the OLPF (Optical Low Pass Filter), which is essentially a piece of custom glass/filter in front of the sensor proper to improve performance, reject IR, and limit moire.  All cameras have some kind of OLPF.  After logging some hours with it in the hands of users, the following characteristics were found: it did in fact seem pretty clean at 2000 ISO, but there were weird magenta flares, which were very visible in low-light situations.  A new OLPF was designed that got rid of the magenta flare, improved highlight performance by at least a stop (I have no idea how they did that, possibly in conjunction with a new black shading algorithm?), and gave even better IR performance.  Downside? Now 2000 ISO is noisy.  And the whole thing blew up on REDUSER once everyone was getting new Dragons in numbers with the new OLPF.

The good news is that RED will, if you wish, put the old OLPF back in your camera.  And they claim to have a firmware build in the works (don't they always) that will tweak the black shading implementation to address this problem, and that it will come out soon, possibly in a week.  But knowing RED, that could mean a month.  They claim that the new OLPF brings so much to the table that they think the future is in the new OLPF with software tweaks. There has even been talk of user-replaceable OLPFs in the future. I can confirm that the IR rejection on this new arrangement is quite good.  I had a shoot with an ND 1.2 and a pola (6 stops of ND, essentially) and a black Marine dress uniform in full sun, and the uniform stayed black.  Unfortunately, I had shot it at 2000 ISO because I had believed that 2000 was the new 800, and that shooting at a higher ISO protected your highlights more. This had been true of the RED One, but is no longer true of the RED Dragon.

Also, I rendered both the Dragon and the MX footage to a 1080 ProRes file.  The noise difference was indiscernible, which was a pleasant surprise.  I did not use any special noise-reducing options on either.  So for the moment I am content to wait for the new firmware and black shading, even if it makes all the testing above obsolete.  But as always: test, test, test.  Sometimes you find a surprise.

 

ProRes Camera Test: BMPCC, F3, F5, and RED Epic MX

Recently I decided to test some of the cameras I often use and/or have access to.  This included the Blackmagic Pocket Cinema Camera, a Sony F3, a Sony F5, and a RED Epic MX.  Obviously these cameras record natively in different codecs and resolutions, but I decided to even the playing field somewhat by using an external recorder (a Convergent Design Odyssey 7Q) so that everything was recorded in ProRes HQ, except for the Pocket Cinema Camera, which both records ProRes HQ natively and only outputs via HDMI (Highly Dodgy Media Interface), which I hate and prefer not to use. In all the tests I used the same lens, a RED 18-50mm short zoom, which was PL mount so it fit on all the cameras, which all had PL mounts.  Only for the Pocket Cinema Camera did that significantly change the field of view, as it has more of a Super 16mm sensor size compared to the 35mm sensor size of the others, which I compensated for by zooming out to approximate the same field of view.  The RED Epic of course can natively record up to 5K, but in this case I was just recording the 1080 output from a 4KHD recording, as 4KHD most closely matches the 35mm format size.

Also, I did not record LOG on the F3 or F5, or output REDLog on the Epic.  I did do both "Film" and "Video" modes on the Pocket Cinema Camera, but I did not run them through Resolve and color correct them as I might if I were actually using the camera.  I wanted to look at the footage without doing any alterations other than what I might do in camera.  For the F3 and F5, which have scene files that affect the look of the camera, I arbitrarily picked Abel Cine's JR45Cine scene file for both, anticipating that those would be the scene files I would use were I to shoot with these cameras.  For the Epic, I usually shoot with saturation set to 1.2 rather than the default 1.0, and that is what I did here. White balance was preset 5600 for all cameras. It was a somewhat unsettled day weather-wise, so take any ability to hold detail outside the window with a grain of salt, as these shots were not all done simultaneously, since I was using one lens for all the cameras and had only two tripods in play.
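As an aside, the field-of-view compensation is just a ratio of sensor widths. Here is a rough sketch; the widths below are approximate, generic Super 35 and Super 16 numbers I'm assuming, not measured specs for these exact cameras and recording modes:

```python
# Matching horizontal field of view across sensor sizes is a simple ratio.
S35_WIDTH_MM = 24.9   # assumed, roughly Super 35
S16_WIDTH_MM = 12.5   # assumed, roughly Super 16 / BMPCC

def equivalent_focal_on_s16(focal_on_s35_mm):
    """Focal length giving about the same horizontal FOV on the smaller chip."""
    return focal_on_s35_mm * S16_WIDTH_MM / S35_WIDTH_MM

print(f"{equivalent_focal_on_s16(36):.0f} mm")  # ~18 mm: the wide end of the 18-50 zoom
```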

Click on any below for full frame.

No surprises here.  The Epic and F5 look the best.  The F5 skin tones look a bit too red, but the whites look a little truer, I think.  The Blackmagic Pocket Cinema Camera looks like it needs to be graded some whether you shoot Video or Film mode, which makes me think that if I am going to have to grade it anyway to make it look its best, I would shoot in "Film" mode in the future and take it through Resolve.

Below are enlargements of the focus target. Again, no surprises.  The Sony F5 and RED Epic are the best for resolving power and lack of moire.  I was slightly surprised at the F3's moire, and not surprised but slightly disappointed in the Blackmagic Pocket Cinema Camera's sharpness performance. Again, this is off a screen grab from the ProRes HQ 1080 recording.

If I were to pursue this test further, I would explore scene files in the F3 and F5 to see if I could get truer skin tones out of them while maintaining their range.  On the RED I might play with white balance a little, maybe set it a bit lower to get rid of some of the tones that seem too warm.  The next test I had planned, though, was to test my RED Epic after it had its sensor upgraded to the "Dragon" sensor.

Jeffrey Brown’s eyeball

The current, almost beaten-to-death fad in cameras these days is trying to get a shallow depth of field, i.e. a shallow focus look.  Of course people are crazy about it, because it looks great for portrait stuff or when run by a professional crew that can maintain critical focus throughout the shot.  But when budgets get tight, and someone whips out a Canon 5D instead of a true video or film camera and isn't properly staffed, you end up with constant hunting for focus, lots of blurry shots, and lots of justification that "it's a look."

In controlled environments it is much less of an issue, or so I thought.  I recently shot some last-minute promos for PBS NewsHour.  They were to be the various correspondents against black limbo, delivering one line while standing. Budget was of course tight, but these were really just head shots, nothing too complicated.  The director was interested in using primes, as they tend to be sharper and tend to give you shallower depth of field due to faster f-stops.  Fine by me.  Of course, since it is black limbo, any shallow depth of field look you are after will have to play out exclusively on their faces, as there is literally nothing else to see in the frame. Mainly I was interested in the sharpness that primes could provide.  They were also interested in 4K acquisition, so we shot on my RED Epic camera using two RED Pro Primes, a 50mm and a 100mm. Again, nothing too complicated.  We set up the lighting and had the talent come in and do their bit to camera.

At 100mm at just under f/2.8, focus was razor-thin.  Even with the talent standing still, keeping it in focus was very hard.  I did not have a First Assistant, partially because of budget and partially because I had not pushed for one, as everything was static and I had not seen a need for one.  I did have someone to deal with the data downloading, but no one skilled in pulling focus except me. I ended up staring at a 17″ monitor from as close as I could get, with my visual world shrinking down to the key-side eyeball and trying to keep it sharp.  One consistent thing I noticed is that even standing still, people will lean forward ever so slightly upon delivering a line, as a sort of body-language emphasis.  It is something you don't normally notice, unless you have less than a 1/4 inch of usable focus. About three people in, we had Jeffrey Brown as talent.

In addition to being a good journalist, he has very distinctive slate-blue eyes.  Everyone always looks to the eyes for focus, but with his eyes it seemed especially critical, as they really "snapped" when in focus, which of course meant that if they went soft, even for a moment, everybody would notice.  And he was a "rocker," leaning just a bit more than some of the other talent, making focus an all-consuming job.  At one point the director had to point out that the framing of the shot was going to hell, as I had ceased to monitor it, devoting all my time to keeping that damned eyeball in focus.  I apologized for the poor framing, saying, "I was distracted by the eyeball," which got a few laughs, especially since Jeffrey Brown thought it was because of his blue eyes, which he is somewhat sensitive about, as he (rightly so) considers himself a journalist, not a "pretty face" for TV.  We had to explain that the distraction was mainly about keeping those eyes in focus on his slightly moving head.  The still below is a crop from the 4K image.

Just look and you can see the focus fall off from the bridge of the nose to his sideburns.  In fact, it appears that I have the front of his eyeball in focus, but the edge, being on a sphere, was just that much further away that it was not in focus.  And as I was shooting 4K, four times the resolution of HD, there was nowhere to hide.  You had it or you didn't.
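For the curious, a rough back-of-the-envelope depth-of-field check bears this out. This is a sketch only; the subject distance and circle of confusion below are my assumptions, not measurements from the shoot:

```python
# Approximate total depth of field using the thin-lens shortcut 2*N*c*s^2/f^2,
# which holds when the subject is much closer than the hyperfocal distance.
# Assumed numbers: ~1 m subject distance for a tight close-up, and a strict
# ~0.0125 mm circle of confusion since the image is being judged at 4K.

def total_dof_mm(focal_mm, f_number, subject_mm, coc_mm):
    return 2 * f_number * coc_mm * subject_mm ** 2 / focal_mm ** 2

dof = total_dof_mm(focal_mm=100, f_number=2.8, subject_mm=1000, coc_mm=0.0125)
print(f"~{dof:.1f} mm total, about {dof / 25.4:.2f} inches")  # roughly a quarter inch
```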

We got through the day, but boy, it was a wake-up call for how even the simplest job can bring unexpected challenges.  No dolly shots, no intentional talent movement, yet focus was the biggest challenge of the day.  We weren't even shooting wide open.

Moral of the story: be careful what you ask for.  There can be such a thing as too shallow a depth of field.  Always ask for an assistant. And 4K and beyond can be very, very unforgiving if you get it wrong. And maybe Victorian head clamps will come back into style.

Dispatches from the field; or how I baked my MBP motherboard in a ship’s galley.

Sometimes I work as a DIT (Digital Imaging Technician) and a Camera Assistant.  In the summer of 2012 I worked on a Discovery Shark Week commercial in the Bahamas.

red at shark week

Sounds like fun, right?  Anyone who has done anything like this knows the clear-eyed view of it: you would be taking a bunch of sensitive electronics near large bodies of corrosive salt water full of bloody chum and hungry sharks, off the coast of a foreign country, with little or no support or even cell service.  It easily could get ugly.

On top of it, my MacBook Pro had started to act up about a week before departure.  It was intermittent, and I was not ready to pull the trigger on buying a new machine for the job.  There were going to be several other laptops on the job which I could press into service, so I would not be completely SOL if it failed.  Of course, my laptop was dialed in with all the drivers and verification software on it that the others might not have.  I did a little research and it appeared to be a problem with the motherboard/video processor inside the MBP, which had finally been determined to be a design flaw, which meant that even out of warranty they would replace the whole board for $300.  Great.  Only they needed 5-7 business days to turn it around, and I didn't have 5 business days before my flight.  So I did some more digging and found the following link, which was very informative.

http://russell.heistuman.com/2010/04/27/cooking-the-books-or-baking-my-macbook-pro-logic-board/

Anyway, the gist of it was that the solder on the motherboard had micro-fractures and needed to be reflowed.  How do you reflow a motherboard?  Well, in the field, you remove the motherboard from the computer, place it on tin foil balls on a cookie tray, and bake it for 8 minutes in an oven at 375 degrees.  You read that right.  Put your motherboard in an oven, turn it on, and leave it there for eight minutes.  Just like baking cookies. OK.  Got it.  Anyway, armed with this, and the fact that my computer had begun to NOT act up, I felt reasonably confident going to sea. I ordered some thermal paste from Amazon to have just in case.  It came two days later, well before my flight.

On the high seas

bug at shark week

Volkswagen shark cage loaded onto the dive boat

Day 1 at sea: weather is bad (choppy, high seas) but the laptop works flawlessly, despite the less-than-ideal conditions of a constantly rocking boat overfilled with our film crew and gear, an underwater dive crew and their gear, the on-camera dive talent and their dive gear, and the ship's crew.  Plus cases and cases of stuff.  I had to stick the laptop under a bunk full of gear to shelter it from potential falling cases, or from falling itself due to the high seas.  There was nothing I could do about the diesel fumes rolling in from the open hatch behind me, though.  Bonine was doing a good job of suppressing my urge to puke all over everything, although it was akin to the sensation of Bonine holding the door shut against ugly, ugly illness.  You know it is there trying to get in, but for the moment I was OK.  So far so good.

Day 2: Seas are rougher.  There is a hurricane off Florida and we are getting the edge of it.  Mostly no rain, but lots of high seas.  And the laptop starts locking up.  I have to give up and press a PA's laptop into service.  We make it through the day, which gets called early because the seas are predicted to be even worse tonight.  We steam back to port.

At port and into the oven

motherboard 2

MBP motherboard pre-baking

Day 3: We are in port and standing down for the day.  Time to fix my laptop.  Without seas lurching me 30 degrees in each direction, I now feel ready to take out the dozens of tiny screws and whatnot that hold my laptop together.  I mean, it's already dead, right? Besides, I had read all about this.  There is wifi in the harbor (miracles!), so one last look at the interwebs before I crack open the machine.

I get out the screws of varying sizes and head shapes, then remove the motherboard and all the various pins and ribbon cables.  I tell the ship's cook to preheat the oven in the galley to 375 and ask if she has a cookie sheet and some tinfoil.  She asks me twice what we are cooking, mainly because she thought she had misheard.  Had I done this before, she asked.  No, but I had read about it on the internet.

Good news: the ship's cook advises me that since we are at port we are using shore power, which is more reliable, which means we are more likely to actually hit 375 degrees.  Good, except that 375 is sort of a seat-of-the-pants guess anyway.  At least we aren't heaving back and forth like a drunken amusement park ride like yesterday.

motherboard

375 degrees for eight minutes.

OK, into the oven.  After eight minutes we take it out and let it "rest" for 20 minutes.  Once it is no longer hot to the touch, I grab it and try to remember where all those little ribbon cables and screws go.  Turns out there are two plastic bits I should have removed before the board went in; they are receivers for some of the screws that hold the board to the shell.  They melted slightly. One is still functional, but the other no longer works.  No matter, there are lots of other screws holding it on.  I think I will be OK minus one screw. Trying to remember how to apply the thermal paste to the processors, I am generous.  I want the processors to conduct heat effectively, as the inability to dissipate heat was probably what got the board into trouble in the first place. I get the thing put back together and, with not a little bit of fear, boot it up.  It boots!  But one of the memory cards isn't working.  No problem: power down and re-seat the memory.  Reboot.  OK, looks good… no wait, no wifi.  OK, I missed a ribbon cable.  Take the whole thing apart again, find the missing cable, and attach it.  Reassemble.  Reboot.  It lives!  It works. But running Temperature Monitor I see my machine is running hot.  But now I can log onto the internet and look up how to properly apply thermal paste.  Oops.  Looks like you want to put on the thinnest layer possible, using a razor blade.  OK, back into the machine: take the whole thing apart, pull the motherboard again, scrape off the thermal paste, and re-apply.  Reassemble everything.  I am beginning to get very familiar with all the bits.  Reboot again, and I think I am back up.  No overheating, no malfunctions, and only one screw left over.

I spend the rest of the day sitting on the stern (where the wifi is strong) ordering hundreds of dollars of lubes and cleaners and miscellaneous stuff for cleaning all my other gear after I get back home, away from all this corrosive salt water.  And a new GoPro, as an overworked PA forgot to close the underwater housing before it went under.  Oops.

Epilogue

Fast forward to today, and that machine is still working.  It runs perhaps a little hotter than before its surgery, but its days are numbered due to the speed of Thunderbolt and USB 3 rather than any performance failures on its part.  It's a shame, because it has been a workhorse.  Even missing that one screw.