How To Load and Operate a Wilart, Part 1: Backstory

Me using the Wilart on a job in 2018.

I have had a Wilart 35mm hand crank camera for a few years now, given to me by the former dean of the film school (thanks Glen!) where I learned to make my way in this business. It even came with a manual of sorts, but as was the practice in those days, it was more like a sales catalogue, telling you how wonderful the camera was without actually going through the details of operation. I muddled through and, with only a single picture of it threaded to go on, managed to get it loaded and ran some film through it to prove it still worked and had no light leaks. I would then occasionally practice, do some research on the camera, and trot it out once a semester for students of my own to show them that a 90 year old camera can still produce decent images, which is more than anyone will say for 90 year old digital cameras (once they reach that age!).

Anyway, I thought I had learned the basics of its operation and loading, and had even made some improvements, like 3D printing adapters that allow modern cores to be loaded on the older style spindles. I thought I more or less had it down. Then an actual job came up where they wanted a hand crank camera in the mix. Great! Finally, a chance to use it for real, as it were. But like many production jobs it came up suddenly, I had to travel back into town for it, and we were essentially relying on whatever short ends I had in the fridge, as it was too short notice to get a few fresh loads from Kodak.

Well, let me tell you, it is a far different thing to load such a camera in a well lit space with no pressure versus in a live concert environment, where it is dark and loud, and you have time sensitive material to shoot, with no re-takes, and all you have are short ends and precious few mags to put those short ends in.

All in all it went well, at least when it absolutely had to; at non-critical moments I had some jams and other issues, but no show-stopping problems. Anyway, I thought, if I want this to happen again, I should formalize some notes to myself on how to load the damn thing for the next time, so I can move faster. Then I thought perhaps I could also share them, because, you never know, they could be useful to someone else, or at least interesting.

But first, some Cinema History, and where the Wilart fits in it:

(note: this is by no means a comprehensive list of hand crank cameras and events during this era, just the ones I find most interesting)

1889: George Eastman invents flexible celluloid film, as opposed to glass plates, paving the way for the development of motion picture cameras.

Charles Kayser of the Thomas Edison laboratory with an early version of the Kinetograph. (Photo from National Park Service/Wikipedia)

1892: Edison’s Kinetograph. Around 1892 Thomas Edison invents the Kinetograph, a camera created to make content for his “peephole” viewing device called the Kinetoscope. Both devices used “4 perf” film, or four perforations per image, on both sides of the image area, and with a few minor tweaks regarding sprocket hole shape, that is essentially the standard gauge and sprocket placement we use today. But Edison didn’t get everything right. His camera was driven by DC power (another Edison venture) and as such was more stationary than later cameras. He even built a studio for it called the “Black Maria,” a structure with blackout walls and window and roof bits that could open to let light in. The whole thing was on a turntable so he could rotate it to the best position relative to the sun. As such, anything that was to be filmed had to come to Edison. And when done, the footage was put in a Kinetoscope, and was only viewable by one person at a time.


Illustration of Kinetoscope, circa 1894.   Note there is a viewport on top for only one person at a time.  (Originally published as an illustration to “Le Kinétoscope d’Edison” by Gaston Tissandier in La Nature: Revue des sciences et de leurs applications aux arts et à l’industrie, October 1894)

Edison’s studio, the “Black Maria.” (Note the curved bit in the ground: that was the turntable used to turn the whole building to match the sun’s location.)

1895: Lumière Cinematographe. About the same time Edison was doing this, the Lumière brothers were working on their own camera, the “Cinematographe.” They patented it in 1895. Theirs also used 35mm film but only one perf per image area (that is, one on each side), and the perfs were rounded. There is a small possibility that, despite the different hole shape, modern film stock might work in it, just a bit chattery and uncooperative, as the perforations appear to sit in approximately the same place relative to the width of the film.

Front view of the Lumière Cinematographe. Note the small 60′ feed mag on top. (Photographed by Richard Edlund, ASC and collaborator Dave Inglish, ASC collection)

Side view of the Lumière Cinematographe. Note the brass top part used for holding already developed film for projecting. (Photographed by Richard Edlund, ASC and collaborator Dave Inglish, ASC collection)

Back of the Cinematographe. (Photographed by Richard Edlund, ASC and collaborator Dave Inglish, ASC collection)

Nevertheless, the Lumière brothers made several important contributions. Their little Cinematographe (essentially a wooden box with a lens and a crank) could shoot about 50-60′ of film, and was untethered from the requirements of DC power, as it was hand cranked. Additionally, after the film was developed, the camera could be loaded with the developed negative and unexposed stock together to make a contact positive print. Develop that and you had a print you could show audiences. And conveniently, get a big light source, shine it into the back of the little camera and voilà! it was now a projector! Suddenly, instead of only one person viewing at a time, as with Edison’s Kinetoscope, you could hang a sheet up in a venue and everyone in the room could view it. (Discounts were given for those who had to view the image through the sheet, as opposed to on it, due to their less than ideal seats in the venue.) Now rather than bringing the subjects to the camera, as Edison did, the camera came to the subjects. Lumière cinematographers could come into town, film local events, and screen them the same evening at a local venue.

Despite its versatility, the Cinematographe was a pretty simple camera. It had no viewfinder. It was a wooden box with removable doors front and back and a smaller wooden box that sat on top, which served as the feed magazine. The crank was in the back. Take-up occurred inside the camera body. Max load was about 60′. The only way to frame something was to open up the back and, either with a ground glass placed in the image plane or using the film itself, view a dim, upside down and backwards image. Focus using that, then close up the camera and crank away, hopefully without shaking the camera too much or being uneven in your speed. Under these conditions, no panning was going to happen, as the operator would be just guessing what he was pointed at.

 

1902: Pathé Professionnelle. While this was going on, four French brothers (Charles, Émile, Théophile and Jacques) tried to get in on the action. Charles Pathé had seen the Kinetoscope, and presumably the Lumières’ work, and got patent rights for Eastman Kodak stock in Europe. By December 1897, Société Pathé Frères was formed, and they got heavily into production, lab work, and distribution of film. Initially they used cameras derived from Lumière patents.

Pathé Professionnelle (photographed by Richard Edlund, ASC and collaborator Dave Inglish, ASC collection)

Pathé obtained the rights to the Lumière patents and set about designing their own “studio” camera, expanding upon those patented designs. By 1903 they had the “Pathé Studio” or “Pathé Professionnelle” camera. (Some sources indicate 1907-1908, but this seems late.)

Back of the Pathé (photographed by Richard Edlund, ASC and collaborator Dave Inglish, ASC collection)

By the 1910s, all major Hollywood studios were using Pathé Professional cameras. They improved upon the Cinematographe design with lots of extra features: 400′ capacity magazines, an actual viewfinder (albeit a parallax one), a footage counter, a focus adjustment knob in back, an iris control knob, and an adjustable shutter. Later models had a fade up & fade down capability using that adjustable shutter. (Film lab work was not that sophisticated, and the more effects you could do in the camera the better.) They even had a single frame capability. In addition to the parallax viewer, there was a peephole that was light tight to the film gate, and you could look through that between takes to precisely frame up your subject, using the film stock as a ground glass. It was dim, and upside down and backwards, but it would show you exactly what the camera was seeing, including focus. (Unfortunately, if you try this with modern color stock, it will not work, as the anti-halation backing makes the stock base too dense to see through.)

That said, the camera was not without its problems. Despite all these improvements, most cameras came with only a 50mm lens, and the external focus knob was calibrated to that alone. The body was made of leather-covered wood, as were the magazines. The non-conductive properties of wood, combined with fast moving celluloid nitrate film in dry environments, could cause static discharges that would silently ruin takes, only to be discovered later when the film was developed. As this was a new industry, professionals formed informal groups to share information, troubleshoot, and tell stories. In California, that group was called the “Static Club,” presumably after their most vexing problem. It is worth noting that in 1919 the Static Club (based in LA) joined with the Cinema Camera Club (in NY, formed by Edison cameramen) to form the American Society of Cinematographers (ASC), which is still very much active today.

One working solution was to put a damp sponge inside the camera body to help alleviate the static buildup. Perhaps due to the Pathé Professional’s heavy use in the 1910s, or the fact that wet sponges were being put inside the wooden cabinets that were the camera bodies, the cameras developed a reputation for always needing repair and additional light-proofing with electrical tape, presumably as the wood joinery started to come apart. It didn’t help that the body was called a “crackerbox,” due to either its shape or its lack of durability.

If you are interested in more, check out the ASC post about the camera owned by the cinematographer who photographed “The Perils of Pauline”: Arthur C. Miller, ASC’s Pathé, donated to the ASC.

 

1908: Debrie Parvo. This was the beginning of the evolution away from the Pathé Professional as the gold standard for working cameras. André Debrie, previously a manufacturer of film perforation machines in France, finished work on the Debrie Parvo camera.

An early Debrie Parvo made of wood. Note one of the round 400′ mags, which almost looks like a film can.

It was his attempt to make a more portable, compact, versatile camera. The design was a compact wooden box with internal 400′ metal magazines. The Parvo also had the crank mounted on the side, instead of the back. It was an improvement over the Pathé Professional in that it was more compact and had a better viewfinder system. You could look through the viewfinder in the back at the image from the lens to determine framing and critical focus. But for framing while cranking, you would have to use the side parallax finder, or have your eye pressed firmly against the very, very dim image in the eyepiece. At least it showed the exact frame as it was being exposed, as you were looking through the film as it went through the camera, while rolling. Of course, any light leaks from the eyepiece would ruin the film. And now, modern film stocks with their remjet backing are too dense to view through via this method.

Also, it was not uncommon to have a selection of wide angle and telephoto lenses for the Parvo. Since focus was done through the lens, there was no single calibrated scale on the side dialed in for only one focal length, which made it less cumbersome to change lenses.

A later Debrie Parvo, possibly an “L” model, made of metal. (Photo: Fletcher6, own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=22433099)

There were several Parvo models over the years, and by the 1920s they had switched to metal for the body, as many other manufacturers had, along with various other improvements. One was an ingenious method of viewing through the taking lens by way of a sort of swing-away gate: the gate with the loaded film could be pivoted away and an identical “gate” consisting of a ground glass swung into its place. This could be done without opening the camera or molesting the loaded film. It meant that, at least between takes, and even when using more modern opaque film stocks, you could look through the lens to check focus and framing.

Another addition was an optional DC motor. This and other innovations kept Debrie making Parvos well past the silent era, where it remained an excellent MOS camera.

Frame from “Man With A Movie Camera” (1929), directed by Dziga Vertov. A cameraman operates a Parvo from a precarious position, while himself being filmed by another cameraman, presumably from a similarly precarious position, while both are underway. Don’t try this at home.

It was the first European camera that was noticeably better than the Pathé, and as such was adopted by such filmmakers as Sergei Eisenstein, Dziga Vertov and Leni Riefenstahl.

 

1909: A formal standard for 35mm motion picture film. Edison formalized the standard of 35mm motion picture film. He formed a trust, the Motion Picture Patents Company, which agreed in 1909 to what would become the standard: 35mm gauge, with Edison perforations and a 1.33:1 (4:3) aspect ratio. The only difference from the Lumière standard was the perfs. It is worth noting that before this, most people bought film un-perforated and perforated it themselves to whatever standard their camera needed.

It’s worth noting that Edison, while being a prolific inventor and a businessman, was a bit of an asshole.  After forming the Motion Picture Patents Trust, he felt that meant anyone using a camera with 35mm film owed him some cash.  He tried various ways to enforce this on the east coast, even resorting to thugs to disrupt independent filmmakers and even smash their cameras.  At least one filmmaker in Philadelphia resorted to sending out “decoy” crews to distract the thugs while the real crew worked unmolested.  Rumor has it that this was one motivation for Hollywood becoming a location for film making, as Edison’s east coast goons were far far away.  But that’s another story.

In the adoption of standardization, Donald Bell and Albert Howell came out the winners in the film perforation business. Bell had been a movie projectionist, and Howell was a machinist; together they initially got into the business of repairing and improving cinema equipment. They designed and manufactured many of the perforators that others, including George Eastman, used to perforate film from 1909 on.

1912: The Bell and Howell Standard Cinematograph 2709 camera.  Just as the Debrie Parvo was the European improvement on the venerable Pathé, the Bell and Howell 2709 was the American answer for a better camera.

Bell and Howell 2709 camera. Note in the left image that both a hand crank and an electric motor are installed. Also, it appears that the feed side of the mag is loaded with emulsion-out film, which is not how modern camera film stocks come. But since the feed side is freewheeling you could load emulsion-in film stock, just mounted “9” vs “p” in the mag. In both cases, though, the take-up is emulsion-out, as it is driven by a pulley. (Detailed photos from Adam Wilt of a 2012 production that used a 2709 show Art Adams, acting as AC, loading it emulsion-in with no problem.) Above photo from Chicagology.

Initially Bell and Howell made cameras the way most others were made: of wood. But after one of their cameras suffered mildew and termite damage on an African safari, they decided to go with a cast aluminum body. This added durability and reduced static discharge, although it added cost. The new camera also had several other vast improvements: a 4 lens turret, a rack-over system for better framing and focusing with those four lenses, and registration pins to better hold the film still while it was exposed. The hand crank was on the side. They named it the “Standard Cinematograph Type 2709.” It was vastly superior to the Pathé Professional, but it was also very expensive, costing over four times as much as a Pathé. Initially only movie studios could afford to buy them. It took a while for their popularity to take off, but by 1919 all major studios owned them. Even Charlie Chaplin bought one, in 1918, for about $2,000, which is about $32,000 in today’s money. To this day, when people do the “universal sign language” of movie-making by peering with one eye and making a cranking motion with their right hand by their head, or drawing a silhouette of a camera with a “mickey mouse ears” magazine on top, they are mimicking a 2709.

Charlie Chaplin with a Bell and Howell 2709 circa 1925

The 2709 had an advanced movement with registration pins as well as pull-down claws, which made for rock steady transport of the film. But the camera was not perfect. The Bell & Howell finder still showed images upside down. As a consequence, by the 1920s many replaced it with a Mitchell finder that righted the image. The same went for the Bell & Howell matte box, as well as the tripods, according to Richard Edlund, ASC. Even after silent films were no more, Bell and Howell 2709s managed to survive as MOS or title sequence cameras well into the sound era.

If you want even more info on the 2709 operation, head over to Adam Wilt’s experience working with a 2709 in 2012.

1914-1918: “The Great War” / World War I. Initially a European war, it eventually directly involved the United States in 1917. This mattered in film-making because the Pathé and Debrie were French designs, and during the war, getting additional supplies from overseas was difficult, incentivizing US manufacturers to make their own models stateside.

 

Akeley camera.  Photo from ReelChicago.com 

Akeley #265 from the side. From samdodge.com

Akeley opened up. Inside is a rather conventional looking 200′ magazine. Photo from samdodge.com

1915: Akeley “Pancake” Motion Picture Camera. Carl Akeley was not a professional filmmaker. He was actually a taxidermist by trade. But that is kind of like saying Indiana Jones was a college professor. When Akeley felt something could be better, more often than not he ended up revolutionizing whatever he tried to improve. Before Akeley, taxidermy mainly consisted of stuffing a skin with sawdust and sewing it up, often by people who had never seen the animal alive. This seemed foolhardy to Akeley, and in 1896 (as the Field Museum’s Chief Taxidermist) he took the first of his five safaris to Africa to collect specimens and to see the animals alive and in the wild. His work and approach revolutionized taxidermy. In 1909, in order to better study lions, he brought along a British Urban camera (Urban Bioscope / Charles Urban Trading Co.), and although the details are sparse, it was probably a wooden box affair and, like many cameras of that era, not very easy to use. In any case, the native hunters cornered and killed the lion before Akeley could get his camera pointed, leveled, focused and framed properly. He swore he could do better, and that he would design a “naturalist’s camera” that would fare better against fast moving action under difficult circumstances.

In 1911 he formed the Akeley Camera Company. By 1915 he had patented the “Akeley Motion Picture Camera.” It was unlike any other camera of its time. The tripod head, which on every other camera was a separate part, was integral to the design of the camera body: the round “pancake” design was both the camera and the head. This meant that with a simple pan handle on the back you could drive the camera position on what was almost a nodal head. The viewfinder was articulated and, while still a parallax finder, it had lenses matched to the taking lens, it showed the image right side up, and when you adjusted focus on the viewfinder lens, gears adjusted the taking lens focus. The viewfinder also could remain stationary while the camera was tilted, a huge improvement if you were following action. The lens pairs were very quickly interchangeable, and kits often included telephoto lenses, due to the nature of what the cameras were asked to film. He appears to have been the first to invent the ball leveling head as well, which as any cameraman knows is essential to quickly leveling a camera on uneven terrain. (Next time you level your ball head, thank a taxidermist!) The shutter on the camera was 230 degrees rather than the standard 180, which let more light in (good in challenging lighting conditions), and the shutter itself was an innovative spinning cloth arrangement that traveled the inside of the round drum of the camera body. Like the Bell and Howell 2709, its body was all metal.

An Akeley 200′ magazine, serial #203E to be exact. Akeley cameras are rare and expensive, but apparently their magazines are not; this one I got off eBay for pretty cheap. That’s just a dummy bit of film. The central roller has sprockets on it, so the loop size is set in the bag.

The only conventional element in appearance was the magazine, which held 200′ and went inside the round drum of the body. The loop was pre-set in the mag, so re-loading was fairly quick in the field, provided you had a spare loaded mag waiting. It truly was a camera for quick moving action, as its designer intended. The camera was quickly adopted in Hollywood as a specialty camera for filming action sequences. It would be called for specifically in shooting scripts (“Akeley shot”), and a director would say “Get me an Akeley man!” if he had an action sequence that needed filming. It also was quite popular with documentarians. Robert Flaherty used two Akeley cameras when filming “Nanook of the North.” Akeleys were also used on “Wings” as well as the chariot sequences of “Ben Hur,” to name a few examples.

Akeley with his camera. Photo: American Museum of Natural History, photo 260071

Not only did the Akeley camera make art, sometimes it was the subject of art itself.  The machining and build quality of the camera was such that Paul Strand, days after buying one, took stills of various parts of the camera and those photos are now considered art and are part of the Metropolitan Museum of Art collection.  An original print of one of the interior photos can go for $40,000.

The Field Museum, as of this writing, has an exhibit of the camera itself on display which runs until March 2019.

The Akeley camera in use during the Field Museum’s 1928–29 Crane Pacific expedition. Note the ball level put to good use on uneven terrain, and the integrated head/camera design. Photo from the Field Museum.

If you want detailed info on the operation of an Akeley Pancake camera, check out Sam Dodge’s detailed walk through an Akeley.

Carl Akeley: 1, Leopard: 0. Circa 1896.

Carl Akeley’s life seems fantastical at times. He was the “Father of Modern Taxidermy.” He invented the most innovative action camera of its time, and even invented sprayable concrete after seeing the facade of one of the museums he worked for fall into disrepair. He killed a leopard with his bare hands (partially because he was a bad shot), survived getting trampled and left for dead by an elephant, hung out with Teddy Roosevelt in Africa, and his wife left him because of a monkey. He was a big game hunter, but also a conservationist; he is responsible for the biggest gorilla preserve in Africa. He even wrote a book about some of his adventures, “In Brightest Africa.” He died in 1926 in the Democratic Republic of the Congo of a hemorrhagic fever, shortly after taking George Eastman on safari.

Modern photo of the Wilart. Photo: Nate Clapp, circa 2014.

1919: The Wilart (Whew!). The Wilart Instrument Company in New Rochelle, NY started making what was essentially a clone of the Pathé Professional, except the body was made of metal, like the Bell and Howell 2709 and pretty much every camera model after the 2709. Pictures of the interior film transport/gate area are indistinguishable from the Pathé’s.

Presumably the metal body eliminated any static discharge, but around this time film stocks were adding an anti-static backing as an option, which also alleviated the problem. That backing, however, was too dense to use with the peephole option for framing through the back of the film between takes, which many early cameras (like the Pathé and Wilart) relied on.

 

An early version of the Wilart. Note the parallax viewfinder integrated with the body, and the iris indicator on the front panel.

It is unclear to me if the Wilart Instrument Company licensed the Pathé design or just out-and-out stole it. Curiously, they did not reference the similarities to the Pathé in their advertising, which suggests they perhaps didn’t ask permission to copy it.

In any case, Wilart hoped to cash in on an affordable American made camera in the post-war boom. Granted, its technology was from 1903-1907 (an evolution of the Pathé features) and as such was 15 years old, but it was an affordable, proven design, now in metal. It ended up being used more on industrials and second tier productions, as by this point the cutting-edge cameras in Hollywood were the Bell and Howell 2709 and the Akeley Pancake. Nevertheless, the Wilart Instrument Company seemed to achieve some success, working on further designs and even planning a large film storage facility in Baltimore. But by 1926 the “talkies” came and the need for additional hand cranked cameras fell through the floor. The Wilart company seems not to have weathered this storm and disappeared without a trace.

Pathé insides. Notice a similarity? Photo: ASC collection

Threading diagram for the Wilart, from the Wilart manual.

OK! That was the evolution of how we got to the Wilart. Next, let’s learn how to load it.

“How to Load and Operate a Wilart Part 2: Loading” coming shortly……

 

24 fps: Where Does It Come From?

Back in the day (the turn of the last century) there was no such thing as camera batteries or sound men. Men were men and cameras were hand cranked. As they had evolved from still cameras, they were sort of still-camera Gatling guns, capturing still frames as fast as you cared to crank. Somewhere past 14fps something magical happened: persistence of vision started to fuse the images, so rather than a fast slide show it started to look like motion. So cameras were built to move one linear foot of film per two cranks, which meant if you cranked at “coffee grinder” speed you hit 60 feet per minute, which comes out to 16fps, just north of that 14fps effect. Cranking faster improved the persistence of vision thing, but producers didn’t like you blowing through all that expensive film, and besides, there was really only one stock available and it was slow, about 24 ASA. Cranking faster meant less light per frame. Sometimes you cranked even slower than 14fps to squeeze a bit more exposure out of it.
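If you want to check that arithmetic, here is a minimal sketch (assuming 35mm 4-perf film, which runs 16 frames per linear foot; the function name is just for illustration):

```python
# Feet-per-minute to frames-per-second, assuming 35mm 4-perf film (16 frames per foot).
FRAMES_PER_FOOT = 16

def feet_per_minute_to_fps(feet_per_minute):
    """Convert film transport speed in feet per minute to frames per second."""
    return feet_per_minute * FRAMES_PER_FOOT / 60  # 60 seconds per minute

# "Coffee grinder" speed: about two cranks per second, at one foot per two cranks = 60 ft/min.
print(feet_per_minute_to_fps(60))  # 16.0 fps, just north of the ~14 fps fusion threshold
```

The same conversion gives the projection and sound speeds that come up below: 80, 85, and 90 feet per minute work out to 21.33, 22.67, and 24 fps respectively.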

This is where it gets a little weird. Projectionists, who were also often hand cranking their projectors, had a habit of cranking faster. Faster meant faster turnaround in seating, which meant more money, and even better persistence of vision without annoying flicker. Sure, the action was sped up, but the whole thing was new and no one seemed to complain. In fact, by 1925 the Society of Motion Picture Engineers (now known as SMPTE) had codified it, recommending 60 feet per minute (16fps) for camera speed and 80 feet per minute (21.3fps) for projection. It seems weird now to pick a different speed for display than for capture, but to review: faster cameras cost more money, faster projectors made money, and after all, producers were paying for everything.

Proposed standard cranking and projection speeds, circa 1927, from the SMPE.

Anyway, someone decided it would be a great idea to add sound. How hard could it be? In fact, several companies tried to be the first to bring sound to the movies, hoping to capture the market. The funny thing is, they all insisted on capturing at the same frame rate they displayed at. If you didn’t, the pitch would be all wrong and everybody would sound silly. And forget about music. Some picked 80 feet per minute (the already established speed for projection), some picked 85 feet per minute, and some picked 90 feet per minute. The first to get a working system was Warner Brothers’ Vitaphone. It was used in 1927’s “The Jazz Singer,” which was the first feature length film with sync dialogue and is considered the official start of the “talkies.”

Western Electric’s Bell Telephone Laboratories Vitaphone system, along with other systems, listed with their taking and projection speeds (SMPE, 1927).

 

The Vitaphone engineers had picked 90 feet per minute, or 24fps, as their capture and projection speed. If one of the others had been first, we could easily be shooting 21.33fps or 22.67fps as a standard today. So sometimes you get lucky.
Except the Vitaphone system was terrible. It sounded good, but that’s all that could be said about it. The sound was recorded on 16″ disk records, separate from the film. They could only be played 20-30 times before they were no good, and they could break, so you had to send lots of duplicate disks with each roll of film to the projectionist. A disk only covered one reel, so at every reel change you had to cue up another record. And synchronizing the needle with the head of the roll was a pain in the ass. And if you broke the film for some reason and spliced it back together, everything past that point was out of sync. During recording, the camera had to be motor powered from the mains, and the disks had to be made in a recording booth adjacent to the set. In fact, it was such a bad system that it was abandoned 5 years after it was implemented. And it only lasted that long because all the theaters that wanted sound had bought into that technology and had these crazy phonograph contraptions connected to their projectors, and weren’t eager to throw them away just after having bought them. Movietone, which used technology that put the audio as an optical track on the film, had many advantages, but it was a little late out of the gate. Because Vitaphone was first, the engineers of Movietone decided to match the Vitaphone frame rate.

“Originally we recorded at a film speed of 85 feet per minute. After Affiliation with the Western Electric Company, this was changed to 90 feet per minute in order to use the controlled motors already worked out and used in the Vitaphone system.  There are a large number of both Vitaphone and Movietone installations scheduled and in operation, and sufficient apparatus is involved to make it impractical to change the present practice of sound reproducing.  In connection with the Society’s standard, I have been unable to find any New York theater which is running film at 85 feet a minute; the present normal speed is 105 feet and on Sundays often 120 feet per minute is used in order to get in an extra show”

Earl I. Sponable, Technical Director, Fox-Case Corporation, New York City (“Some Technical Aspects of the Movietone,” S.M.P.E. #31, September 1927, page 458)

Soon enough Movietone lost ground as well, as technology changed, but all subsequent sound systems stuck with the now-established 24fps. So blame a sound man. Or thank him. Your choice.

Vitaphone

One of the first sound men checking a Vitaphone recording with a microscope while recording. Sort of a human playback head. (page 308 from Transactions of S.M.P.E. August 1927)  It turns out this man is George Groves.

 

Postscript: Now of course, we often mean 23.976 fps when we say 24 fps.  This one we can’t blame on sound.  23.976 fps as a camera frame rate can be blamed on the introduction of color to standard-def television broadcasts in the 1950’s, and the death of film as a capture medium, and by extension the death of telecine as a post process.

When TV started, it did not match the 24 fps established by film. This is because engineers wanted to use the 60Hz cycle of our 110V 60Hz household power to drive the frame rate. 60Hz meant 60 fields, or 30 frames per second, and was pretty easy to implement. Once color came along in the 1950s, they wanted a standard that would be backwards compatible with black and white TVs. Engineers could no longer use the 60Hz rate of household electricity to drive the frame rate and still keep the color and luminance signals playing nicely together, so they settled on a very close rate of 59.94Hz. This resulted in a frame rate of 29.97 fps, down from the previous 30 fps, something black and white receivers would still work with.

Telecine: in order to get film onto TV you had to do a step called telecine, in which the film was played back and captured essentially by a video camera. Getting 24 fps to fit into 30 fps was done via a clever bit of math called 3:2 pulldown. There are two fields to a standard-def frame, and thus 60 fields per second. 3:2 pulldown uses one film frame to make three fields (1.5 frames) of video; the second film frame makes two fields (1 frame) of video, the third frame makes three fields again, and so on. Doing this, 24 fps fits quite nicely into a 30 fps broadcast. And anything shot at 24 fps but shown on the 29.97 fps system would look like it had been shot at 23.976 fps, even though the camera had been running at 24 fps, because anything that ran through the telecine went through a 0.1% slowdown to conform to the 29.97 fps broadcast standard. Somewhere in the transition to High Definition, 23.976 became codified as a standard, not only for broadcast but as a capture speed. As cameras were more and more digital rather than film, they would use 23.976 as the actual camera frame rate, rather than running at 24fps and expecting the 0.1% slowdown to happen upon transfer from film to video, as had happened in telecine rooms. No telecine? No slowdown, which meant it had to be implemented in the actual camera speed.
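If the bookkeeping in that paragraph is easier to follow as numbers, here is a minimal sketch of the arithmetic (just the standard NTSC 1000/1001 factor and the 3:2 field pattern; nothing here is specific to any particular telecine):

```python
# Color NTSC runs 0.1% slower than the nominal 60/30 rates (the 1000/1001 factor).
ntsc_field_rate = 60 * 1000 / 1001     # ~59.94 fields per second
ntsc_frame_rate = ntsc_field_rate / 2  # ~29.97 frames per second

# 3:2 pulldown: film frames alternately become 3 fields, then 2 fields.
# Every 4 film frames -> 3 + 2 + 3 + 2 = 10 fields = 5 video frames,
# which is exactly how 24 film frames fill 30 video frames each second.
fields_per_four_film_frames = 3 + 2 + 3 + 2                           # 10 fields
video_frames_per_four_film_frames = fields_per_four_film_frames / 2   # 5 frames

# The telecine's 0.1% slowdown means 24 fps material effectively plays at:
effective_film_rate = 24 * 1000 / 1001  # ~23.976 fps

print(round(ntsc_frame_rate, 3), round(effective_film_rate, 3))  # 29.97 23.976
```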

So, hate 23.976 fps? Blame a sound man, color TV, the death of film, and the whole accidental way we pick our standards.

 

For those interested in reading more, I highly recommend the online records of the Journal of the Society of Motion Picture Engineers, made available by the Media History Project. http://mediahistoryproject.org/technical/

 

 

Why I hate UAV copters

Drones. UAVs. Octocopters. Call them what you want. They are the new disruptive technology in a lot of applications, but I am specifically going to talk about them as they apply to my industry, as a camera platform for dramatic, narrative, or commercial work. You can see the allure- it lets you get shots that otherwise would be difficult or in some cases impossible via traditional methods. And camera movement is the best way to add production value to your shoot.
And I hate them. I hate them like I hate Steadicam. What’s that you say? Hate Steadicam? What kind of Luddite or backwards filmmaker are you? Let me explain. I do not hate the Steadicam device per se, and I completely agree that Steadicam allows for shots that could not be obtained any other way. I might even be convinced to use one some day. Here is what I hate about Steadicam: people act like it is the solution to everything and will make everything awesome. A Steadicam is not awesome sauce you get to spread all over your shoot. It has weaknesses, just like any camera platform. Let’s review. A Steadicam cannot provide a stable horizon on a static shot, especially after it has been moving, which is why you use Steadicam in the first place. There are operators who can mitigate this, but it is inherently difficult on this system, yet directors insist on designing shots completely blind to the weaknesses of the platform, and Steadicam operators struggle to make the shot work.
Another misconception is that Steadicam systems are fast: tracks don’t need to be laid, it has the freedom of handheld, you can just go. The fact is often quite the opposite. Steadicam can cause the shoot to slow way down. First of all, the whole system requires that the rig be balanced. This means a lens change, the addition of a filter, or adding a timecode box all require time out to re-balance the rig. If you are dealing with a shoot with only one camera body, going from tripod to Steadicam can be a very involved process, and it ties up the camera during that process.
Once you have the camera balanced and on the rig, another thing to keep in mind is that a Steadicam rig with a camera on it is quite heavy. Between the camera, the post, the counterweight, wireless transmitters, arm and vest, it can tax the best operator. This means the operator needs to park it on a stand or docking station when not actually executing the shot. This makes blocking and lighting the shot a bit more difficult as it is best done while the operator is wearing the sled, which you want to keep to a minimum to keep him or her fresh.
Also, as the camera has potentially 360 degrees of movement, lighting can be a challenge. Nowhere is safe, and lights need to either be rigged directly to the ceiling if possible, be hidden somehow, or travel with the camera. Again, all this can be done, but none of it falls in the category of “fast.”
So, let’s review: I hate Steadicam because people think it is secret sauce that will make their shoot better but are completely ignorant of its weaknesses. There is one other thing I don’t like about Steadicam, and it occurs even when people understand its weaknesses: the urge to do the “trick shot,” which is an exercise in “look what I can do” rather than filmmaking that drives the story. Sometimes you can do a trick shot and move the story at the same time, and people like me can enjoy both aspects of it. But showing off that you know how to use a tool doesn’t mean you have made a great story.
Flying camera platforms may not be Steadicams, but they might as well be. They do and will give you shots that otherwise would have been at the very least difficult, or possibly impossible, before. And those shots have the potential to be amazing. And just like with Steadicam, people will misunderstand and assume that as long as you use a drone the shot will therefore automatically be amazing. Misunderstanding the tool you are using will result in wasted time, a frustrated crew, and mediocre filmmaking, just as it always has. But with drones there are two new aspects. One is something that is happening with gear across the industry, something that is optimistically called the “democratization of filmmaking” but in practical terms means that good working gear can be purchased for prices closer to a car than a house. There is good and bad in this, but one side effect is that there are a lot more players in the market. Generally this shakes out as it always has: those who have skill, or the potential to learn it adeptly, end up on top, but without money being the gatekeeper it once was, the entry level of the market is crowded, like the beginning of a marathon.
Drones especially seem to fit this category. A few years ago the technology just wasn’t there to make a working drone at any cost. Now the parts and information are out there such that a professional rig can be built from parts ordered online at a very reasonable cost. In fact, turnkey solutions exist for under $800. Back in the 90’s a Steadicam probably cost upwards of $40,000-60,000, and that didn’t include the camera, just the platform. So a lot more people are getting into drones than were ever into Steadicam. Drones are so new that there are no “old hands” at it. Everyone is at the start of the marathon and it’s crowded.
The other new aspect to drones as a camera platform is the safety issue. This is what really makes me dislike them. Back in the day, a careless Steadicam operator could possibly hurt themselves, damage their rig and the camera, and possibly the nearest person, be that an assistant or actor, although this was quite rare. I know of no stories of this happening directly, although I always think of the emergency ripcord on the Steadicam vests of guys I would assist for, which when pulled would cause the vest to split open and fall away, allowing the operator to shed the rig in seconds in case of a catastrophic event like falling into a large body of water with 80lbs of gear strapped to them. Again, I never heard of anyone having to exercise that option, but it was there.
Drones, on the other hand, are often 20-40 pounds of flying danger, often with eight very high speed, sharp rotors driven by high energy, high capacity lightweight batteries, all under wireless control. Often they are built from scratch by the operator. Some of them have fail-safes, where if wireless control is lost they will return to the original launch site and descend. That’s great, but only if those automated systems are solid. Again, many of these things are being built from scratch, and the code is being written, or at least tweaked, by the builder. If the drone loses flight stability, be it from a large gust of wind, operator error, or a hardware or software malfunction, you have a potentially lethal falling object that can kill you and others: plain blunt trauma from 20lbs falling on your head, getting cut open by the eight high velocity Ginsu knives it uses to fly, or burns when one of the high capacity batteries ruptures and spews a jet of flame and energy. Look on YouTube and you will find several UAV/drone failures, often triggered by a gust of wind, and possibly complicated by navigational hazards like nearby buildings the drone can hit on its way down, so that its structural integrity is compromised well before it hits you. Now imagine that the price of entry is so low that people with only a passing interest get into it. Before you know it the sky is dark with flying lawn mowers driven by mediocre do-it-yourselfers who think they have the secret sauce to awesome filmmaking.

This is an evolving topic, and the good news is that there has been some attempt to regulate them in a way I approve of. Up until recently there was a big question mark over whether all kinds of drones were illegal, and where the FAA stood on it. It was like the Wild West. Before the rules got codified, it was an “anything goes” approach, which seemed very dangerous to me.
Making them illegal seemed untenable. They were so cheap and offered such allure to so many people that enforcement seemed almost impossible. Also, if they were illegal, there would be no regulatory control over them. Just this month the FAA has been authorizing individual companies to be certified for flight, exempting them from normally required regulations as long as they fit a certain category of flight, including flying only over a “sterile” environment, i.e. the controlled set. Licenses, permits and special rules are the way to go. And prosecution of those who refuse to play by the rules. Individual drone operators need to apply for “certification” in order to be legal. This is because the technology is cheap, readily available, and dangerous.
Drone camera platforms need to be safe, legal, and somewhat rare. I don’t hate drones as much as I hate the idea of people flying homemade, unregulated rigs over my head because that will somehow make the shot “cool.” By making them sensibly regulated, they will (in most cases) be operated by sensible, trained operators, and only when they are the appropriate tool for the job.

P.S. don’t get me started on Movis or other gimbal handheld systems.

How to Quiet a Noisy Dragon

My last post about my quick test with the new Dragon sensor had a bit of a surprise, with the Dragon footage looking noisy, especially compared to the previous non-Dragon MX chip. RED suggested that a fix was on the way, and soon. They hinted at around a week. That was June 21st. My rule of thumb for RED target dates for delivering a product is: take the stated time period, double it, and add two months. So in this case that would mean somewhere around the first week of September. Well, RED beat the odds, while still completely missing their target of a week, and on August 6, 2014, a mere 7 weeks later, they released the fix. The fix is a different way to debayer the RAW footage, selectable in their new beta release of REDCINE. The feature is called “DEB,” or “Dragon Enhanced Blacks,” although it could easily be called the “Anti Red Speckle Filter,” as it gets rid of the red noise. It is a checkbox you select right below the Gamma settings in REDCINE.

DEB checkbox just below Gamma Settings in REDCINE.  Here it is not selected, the current default.

The good news is it is retroactive and can be applied to footage you have already shot. This is great for my purposes, as I don’t have to re-do the test. Here are some quick screen grabs.

Same shot from the prior test, with DEB applied and not.

RED Epic MX on left; lower right is Dragon with DEB applied.

So it definitely improves things. Further testing is warranted, though. I hope to test it against a different camera, like an Arri Alexa or Amira, in the future. RED is talking about making user swappable OLPFs, but there’s no time frame on that yet. Once they do put a date on it, don’t forget to double it and add two months!

Epic MX vs Epic Dragon.

I just recently got my RED Epic back after RED installed the new “Dragon” chip.  I borrowed an Epic that still had the MX chip and shot side by side tests to see how much better the Dragon chip actually was.  I found a few surprises.

First off, I updated both cameras to the current release build, 5.1.51, and black shaded both of them after the cameras had reached operating temp, set to adaptive at 65°C.

The weather was a bit unsettled (as it always seems to be when I have time to do these tests), so I decided to put a 35mm RED Pro Prime on the Epic MX and a 50mm RPP on the Dragon, and move the cameras a bit to mimic the same frame size in the midground, so I could roll both at the same time and the exposure would be identical. I set them to the same stop, which if I recall was something like f/8 1/3. According to the false color overlay, the Dragon had more info in the highlights before clipping. In fact, a small patch of white sky which clipped on the MX was apparently not clipping on the Dragon. I shot myself in my garage workspace, which has diffused top light and a nice window for testing overexposure. If I shoot in the afternoon, there is a piece of an apartment building across the street that gets hit with sun, providing an excellent detail/overexposure test element. People have complained in the past about RED’s rendition of skin tones. I think this is wildly overblown. I have never had an issue with it, at least under daylight color temps, which is how I shoot the majority of my stuff. The Dragon is supposed to be much better. I did not notice much of a difference, nor did I see any problems with the old one.

One note is that with the new chip come some new settings in REDCINE, the app that lets you “develop” the RAW footage from the camera. Before Dragon there was “REDcolor 3” for color rendition and “REDgamma 3” for gamma. With Dragon there now are “DragonColor” and “REDgamma 4.” While I think you should use DragonColor for rendition off a Dragon chip, don’t be fooled into thinking REDgamma 4 is automatically better. It tends to be a bit crunchier than REDgamma 3, which under normal circumstances helps make a punchier image, but when testing over/under exposure range, may fool you into thinking the Dragon chip has even less range than the regular Epic. Of course, professional graders would probably just use REDlogfilm (which is flat and holds onto the most range) and make their own curve based on the scene, but in this case I wanted to do a relatively unbiased side by side test.

Screenshot from REDCINE.

Below are some interesting screen grabs:

COLOR RENDITION:

Side by side: DragonColor vs REDcolor3.

Dragon on the right, Epic MX on the left. Dragon set to DragonColor, Epic MX set to REDcolor3. More detail in the Dragon, and slightly less red tint to the skin.

HIGHLIGHTS:

Now here is a detail of the highlights: Dragon on the right, Epic on the left, both set to REDgamma3. Slightly more highlight detail on the Dragon chip, as expected.

RG3: Dragon vs Epic MX.

Don’t be fooled into thinking REDgamma4 is automatically better. Here is the same footage (Dragon RG4 vs Epic RG3), but with the Dragon chip rendered at REDgamma4. It is very hard to tell which chip has the advantage in this scenario. The brick looks about the same, maybe a slight advantage to the Dragon, but the leaves look hotter on the Dragon.

Now here is the same thing in REDlogfilm. You will see the Epic MX chip clips to magenta, showing the white is clipping. Not so on the Dragon, as you might expect.

Epic-MX vs Dragon in REDlogfilm

BIG SURPRISE: NOISE LEVEL:

This is the Epic MX on the left and the Dragon on the right, both at 5600K, 800 ISO, 8:1 compression. Dragon at 6K HD and Epic MX at 5K HD.

 

This was the big surprise. The Dragon Epic was sharper, as was to be expected since its max resolution is higher than the MX’s (6K HD in this case vs 5K HD), but there was more noise and pattern in the Dragon footage. This was alarming, as a year ago the Dragon chip had been advertised as being as quiet at 2000 ISO as the MX chip was at 800 ISO. What the hell? I paid over 8 grand to get MORE noise?

I went over to REDUSER.net and found someone else had done essentially the same test as I had and gotten the same results. If you are into flogging yourself, here is the link:

http://www.reduser.net/forum/showthread.php?117701-DRAGON-vs-EPIC-MX-Noise-NOT-GOOD

CONCLUSION:

Anyway, the long and short of it was this: it is the new OLPF, along with black shading, that is to blame. When Dragons first started shipping to anybody (maybe December 2013?) they had what I will call version 1 of the OLPF (Optical Low Pass Filter), which is essentially a piece of custom glass/filter in front of the sensor proper to improve performance, reject IR, and limit moiré. All cameras have some kind of OLPF. After logging some hours with it in the hands of users, the following characteristics were found. It did in fact seem pretty clean at 2000 ISO. But there were weird magenta flares, which were very visible in low light situations. A new OLPF was designed that got rid of the magenta flare, improved highlight performance by at least a stop (I have no idea how they did that, possibly in conjunction with the new black shading algorithm?), and had even better IR performance. The downside? Now 2000 ISO is noisy. And the whole thing blew up on REDUSER once everyone was getting new Dragons in numbers with the new OLPF.

The good news is that RED will, if you wish, put the old OLPF back in your camera. And they claim to have a firmware build in the works (don’t they always?) that will tweak the black shading implementation to address this problem, and that it will come out soon, possibly in a week. But knowing RED, that could mean a month. They claim that the new OLPF brings so much to the table that they think the future is in the new OLPF with software tweaks. There has even been talk of user replaceable OLPFs in the future. I can confirm that IR contamination with this new arrangement is well controlled. I had a shoot with an ND 1.2 and a pola (6 stops of ND essentially) and a black Marine dress uniform in full sun, and the uniform stayed black. Unfortunately, I had shot it at 2000 ISO because I had believed that 2000 was the new 800, and that shooting at a higher ISO protected your highlights more. This had been true of the RED One, but is no longer true of the RED Dragon.

Also, I rendered both the Dragon and the MX footage to 1080 ProRes files. The noise difference there was indiscernible, which was a pleasant surprise. I did not use any special noise reducing options on either. So for the moment I am content to wait for the new firmware and black shading, even if it makes all the testing above obsolete. But as always: test, test, test. Sometimes you find a surprise.

 

ProRes Camera Test: BMPCC, F3, F5, and RED Epic MX

Recently I decided to test some of the cameras I often use or have access to: the Blackmagic Pocket Cinema Camera, a Sony F3, a Sony F5, and a RED Epic MX. Obviously these cameras record natively in different codecs and resolutions, but I decided to even the playing field somewhat by using an external recorder (a Convergent Design Odyssey 7Q) so that everything was recorded in ProRes HQ, except for the Pocket Cinema Camera, which records ProRes HQ natively and also only outputs via HDMI (Highly Dodgy Media Interface), which I hate and prefer not to use. In all the tests I used the same lens, a RED 18-50mm short zoom, which was PL mount so it fit on all the cameras, which all had PL mounts. Only for the Pocket Cinema Camera did that significantly change the field of view, as it has more of a Super 16mm sensor size compared to the 35mm sensor size of the others, which I compensated for by zooming out to approximate the same field of view. The RED Epic can of course natively record up to 5K, but in this case I was just recording the 1080 output from a 4KHD recording, as 4KHD most closely matches the 35mm format size. Also, I did not record LOG on the F3 or F5, or output REDLog on the Epic. I did do both “Film” and “Video” modes on the Pocket Cinema Camera, but I did not run them through Resolve and color correct them as I might if I were actually using the camera. I wanted to look at the footage without doing any alterations other than what I might do in camera. For the F3 and F5, which have scene files that affect the look of the camera, I arbitrarily picked Abel Cine’s JR45Cine scene file for both, anticipating that those would be the scene files I would use were I to shoot with these cameras. For the Epic, I usually shoot with saturation set to 1.2 rather than the default 1.0, and that is what I did here. White balance was preset to 5600 for all cameras. It was a somewhat unsettled day weather-wise, so take any ability to hold detail outside the window with a grain of salt, as these shots were not all done simultaneously, since I was using one lens for all the cameras and had only two tripods in play.

Click on any below for full frame.

No surprises here. The Epic and F5 look the best. The F5 skin tones look a bit too red, but the whites look a little truer, I think. The Blackmagic Pocket Cinema Camera looks like it needs to be graded some whether you shoot Video or Film mode, which makes me think that if I am going to have to grade it anyway to make it look its best, I would shoot in “Film” mode in the future and take it through Resolve.

Below are enlargements of the focus target. Again, no surprises. The Sony F5 and RED Epic are the best for resolving power and lack of moiré. I was slightly surprised at the F3 moiré, and not surprised but slightly disappointed in the Blackmagic Pocket Cinema Camera’s sharpness performance. Again, this is off a screen grab from the ProRes HQ 1080 recording.

If I were to pursue this test further, I would explore scene files on the F3 and F5 to see if I could get truer skin tones out of them while maintaining their range. On the RED I might play with the white balance a little, maybe set it a little lower to get rid of some of the tones that seem a bit too warm. The next test I had planned, though, was to test my RED Epic after it had its sensor upgraded to the “Dragon” sensor.

Dispatches from the field; or how I baked my MBP motherboard in a ship’s galley.

Sometimes I work as a DIT (Digital Imaging Technician) and a Camera Assistant.  In the summer of 2012 I worked on a Discovery Shark Week commercial in the Bahamas.

RED at Shark Week

Sounds like fun, right?  Anyone who has done anything like this knows the clear-eyed view of it: you are taking a bunch of sensitive electronics near large bodies of corrosive salt water full of bloody chum and hungry sharks, off the coast of a foreign country, with little or no support or even cell service.  It could easily get ugly.

On top of that, my MacBook Pro had started to act up about a week before departure.  It was intermittent, and I was not ready to pull the trigger on buying a new machine for the job.  There were going to be several other laptops on the job that I could press into service, so I would not be completely SOL if it failed.  Of course, my laptop was the one dialed in with all the drivers and verification software that the others might not have.  I did a little research, and it appeared to be a problem with the motherboard/video processor inside the MBP, which had finally been determined to be a design flaw, which meant that even out of warranty they would replace the whole board for $300.  Great.  Only they needed 5-7 business days to turn it around, and I didn't have 5 business days before my flight.  So I did some more digging and found the following link, which was very informative.

http://russell.heistuman.com/2010/04/27/cooking-the-books-or-baking-my-macbook-pro-logic-board/

Anyway, the gist of it was that the solder on the motherboard had micro-fractures and needed to be reflowed.  How do you reflow a motherboard?  Well, in the field, you remove the motherboard from the computer, place it on tin foil balls on a cookie tray, and bake it for eight minutes in an oven at 375 degrees.  You read that right: put your motherboard in an oven, turn it on, and leave it there for eight minutes.  Just like baking cookies.  OK.  Got it.  Armed with this, and the fact that my computer had begun to NOT act up, I felt reasonably confident going to sea.  I ordered some thermal paste from Amazon to have just in case; it came two days later, well before my flight.
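For context, and assuming (as I did) that the oven dial and the article both mean Fahrenheit, 375 degrees works out to just over 190 Celsius.  A quick sanity check, using general solder figures that are not specific to this particular board:

def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

print(round(fahrenheit_to_celsius(375), 1))   # ~190.6 C

# For reference: old leaded (Sn63/Pb37) solder melts around 183 C, while the
# lead-free alloys on more recent boards need roughly 217-220 C, so a 375 F
# bake is closer to "softening the joints" than a full industrial reflow.

Either way, it was the only option that fit my timeline.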

On the high seas


Volkswagen shark cage loaded onto the dive boat

Day 1 at sea: weather is bad (choppy, high seas) but the laptop works flawlessly, despite the less-than-ideal conditions of a constantly rocking boat overfilled with our film crew and gear, an underwater dive crew and their gear, the on-camera dive talent and their dive gear, and the ship's crew.  Plus cases and cases of stuff.  I had to stick the laptop under a bunk full of gear to shelter it from potentially falling cases, or from falling itself in the high seas.  There was nothing I could do about the diesel fumes rolling in from the open hatch behind me, though.  Bonine was doing a good job of suppressing my urge to puke all over everything, although it felt like Bonine was just holding the door shut against ugly, ugly illness: you know it is there trying to get in, but for the moment I was OK.  So far so good.

Day 2: Seas are rougher.  There is a hurricane off of Florida and we are getting the edge of it.  Mostly no rain, but lots of high seas.  And the laptop starts locking up.  I have to give up and press a PA's laptop into service.  We make it through the day, which gets called early because the seas are predicted to be even worse tonight.  We steam back to port.

At port and into the oven


MBP motherboard pre-baking

Day 3: We are in port and standing down for the day.  Time to fix my laptop.  Without the seas lurching me 30 degrees in each direction, I now feel ready to take out the dozens of tiny screws and whatnot that hold my laptop together.  I mean, it's already dead, right?  Besides, I had read all about this.  There is wifi in the harbor (miracles!), so one last look at the interwebs before I crack open the machine.

I get out the screws of varying sizes and head shapes, then remove the motherboard and all the various pins and ribbon cables.  I tell the ship's cook to preheat the oven in the galley to 375 and ask if she has a cookie sheet and some tinfoil.  She asks me twice what we are cooking, mainly because she thinks she misheard.  Had I done this before, she asks.  No, but I read about it on the internet.

Good news: the ship's cook advises me that since we are at port we are on shore power, which is steadier, so we are more likely to actually hit and hold 375 degrees.  Good.  Except that 375 is sort of a seat-of-the-pants guess anyway.  At least we aren't heaving back and forth like a drunken amusement park ride like yesterday.


375 degrees for eight minutes.

OK, into the oven.  After eight minutes we take it out and let it "rest" for 20 minutes.  Once it is no longer hot to the touch, I grab it and try to remember where all those little ribbon cables and screws go.  Turns out there are two plastic bits I should have removed before the board went in: they are receivers for some of the screws that hold the board to the shell, and they melted slightly.  One is still functional, but the other no longer works.  No matter, there are lots of other screws holding it on; I think I will be OK minus one screw.  Trying to remember how to apply the thermal paste to the processors, I am generous.  I want the processors to conduct heat effectively, as the inability to dissipate heat is probably what got the board into trouble in the first place.

I get the thing put back together and, with not a little bit of fear, boot it up.  It boots!  But one of the memory modules isn't showing up.  No problem: power down and re-seat the memory.  Reboot.  OK, looks good… no wait, no wifi.  I missed a ribbon cable.  Take the whole thing apart again, find the missing cable, attach it, reassemble, reboot.  It lives!  It works.  But running Temperature Monitor I see the machine is running hot.  Now that I can get on the internet, I look up how to properly apply thermal paste.  Oops.  It looks like you want the thinnest layer possible, spread with a razor blade.  OK, back into the machine: take the whole thing apart, pull the motherboard again, scrape off the thermal paste, and re-apply.  Reassemble everything.  I am beginning to get very familiar with all the bits.  Reboot again, and I think I am back up.  No overheating, no malfunctions, and only one screw left over.

I spend the rest of the day sitting on the stern (where the wifi is strong) ordering hundreds of dollars' worth of lubes, cleaners, and miscellaneous stuff for cleaning all my other gear once I get back home, away from all this corrosive salt water.  And a new GoPro, as an overworked PA forgot to close the underwater housing before it went underwater.  Oops.

Epilogue

Fast forward to today, and that machine is still working.  It runs perhaps a little hotter than before its surgery, but its days are numbered due to the speed of Thunderbolt and USB 3 rather than any performance failures on its part.  It's a shame, because it has been a workhorse.  Even missing that one screw.