Epic MX vs Epic Dragon.

I just recently got my RED Epic back after RED installed the new “Dragon” chip.  I borrowed an Epic that still had the MX chip and shot side by side tests to see how much better the Dragon chip actually was.  I found a few surprises.

First off, I updated both cameras to the current release build 5.1.51 and black shaded both of them after the cameras had reached operating temperature (adaptive fan mode at 65°C).

The weather was a bit unsettled (as it always seems to be when I have time for these tests), so I decided to put a 35mm RED Pro Prime on the Epic MX and a 50mm RPP on the Dragon and offset the cameras a bit to mimic the same frame size in the midground. That way I could roll both at the same time so exposure would be identical. I set them for the same stop, which if I recall was something like f/8 and a third. According to the false color overlay, the Dragon had more info in the highlights before clipping. In fact, a small patch of white sky which clipped on the MX was apparently not clipping on the Dragon.

I shot myself in my garage workspace, which has diffused top light and a nice window for testing overexposure. If I shoot in the afternoon, there is a piece of an apartment building across the street that gets hit with sun, providing an excellent detail/overexposure test element.

People have complained in the past about RED’s rendition of skin tones. I think this is wildly overblown. I have never had an issue with it, at least under daylight color temps, which is how I shoot the majority of my stuff. The Dragon is supposed to be much better. I did not notice much of a difference, nor did I see any problems with the old chip.

One note: with the new chip come some new settings in REDCine, the app that lets you “develop” the raw footage from the camera. Before Dragon there was “REDcolor3” for color rendition and “REDgamma3” for gamma. With Dragon there are now “DragonColor” and “REDgamma4.” While I think you should use DragonColor for rendition off a Dragon chip, don’t be fooled into thinking REDgamma4 is automatically better. It tends to be a bit crunchier than REDgamma3, which under normal circumstances makes for a punchier image, but when testing over/underexposure range it may fool you into thinking the Dragon chip has even less range than the regular Epic.
Of course, professional graders would probably just use REDlogfilm (which is flat and holds onto the most range) and make their own curve based on the scene, but in this case I wanted to do a relatively unbiased side-by-side test.
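To see why a crunchy gamma can mislead you about sensor range, here is a toy sketch in Python. These are stand-in curves of my own, not RED's actual REDgamma or REDlogfilm math: a contrasty curve with a hard shoulder maps two different highlight values to the same clipped white, while a flat log-style curve keeps them apart.

```python
import math

def crunchy(x):
    # contrasty "punchy" curve with a hard shoulder -- a stand-in for a
    # crunchier gamma, NOT RED's actual REDgamma4 math
    y = x ** (1 / 2.2)
    return min(1.0, 1.25 * y - 0.05)

def log_encode(x, a=50.0):
    # generic flat log curve -- again a stand-in, NOT actual REDlogfilm
    return math.log1p(a * x) / math.log1p(a)

# two different highlight levels in linear light: 80% and 100% of clip
for x in (0.8, 1.0):
    print(f"{x}: crunchy={crunchy(x):.3f}  log={log_encode(x):.3f}")
# the crunchy curve maps both to 1.0, so they look equally clipped;
# the log curve still separates them, so the detail survives the grade
```

The moral matches the test above: judging range through a punchy display curve can hide highlight detail the chip actually recorded.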

Screenshot from REDcine.

Below are some interesting screen grabs:

COLOR RENDITION:

Side by Side DragonColor vs REDcolor3

Dragon on the right, Epic-MX on the left. Dragon set to DragonColor, Epic MX set to REDcolor3. More detail in the Dragon, and slightly less red tint to the skin.

HIGHLIGHTS:

Now here is a detail of the highlights: Dragon on the right, Epic on the left, both set to REDgamma3. Slightly more highlight detail on the Dragon chip, as expected.

RG3 Dragon vs MX
Don’t be fooled into using REDgamma4 just because it is newer.

Dragon RG4 vs Epic RG3

Here is the same footage, but with the Dragon chip rendered at REDgamma4. It is very hard to tell which chip has the advantage in this scenario. The brick looks about the same, maybe a slight advantage to the Dragon, but the leaves look hotter on the Dragon.

Now here is the same thing in REDlogfilm. You will see the Epic MX chip clip to magenta, showing the whites are clipping. Not so on the Dragon, as you might expect.

Epic-MX vs Dragon in REDlogfilm

BIG SURPRISE: NOISE LEVEL:

This is the Epic MX on the left and the Dragon on the right, both at 5600K, 800 ISO, 8:1 compression. Dragon at 6K HD and Epic-MX at 5K HD.

 

This was the big surprise. The Dragon Epic was sharper, as was to be expected since its max resolution is higher than the MX’s (6K HD in this case vs 5K HD), but there was more noise and pattern in the Dragon footage. This was alarming, as a year ago the Dragon chip had been advertised as being as quiet at 2000 ISO as the MX chip at 800 ISO. What the hell? I paid over 8 grand to get MORE noise?

I went over to REDUSER.net and found someone else had done essentially the same test I had and got the same results. If you are into flogging yourself, here is the link:

http://www.reduser.net/forum/showthread.php?117701-DRAGON-vs-EPIC-MX-Noise-NOT-GOOD

CONCLUSION:

Anyway, the long and short of it was this: it is the new OLPF, along with black shading, that is to blame. When Dragons first started shipping to anybody (maybe December 2013?) they had what I will call version 1 of the OLPF (Optical Low Pass Filter), which is essentially a piece of custom glass/filter in front of the sensor proper to improve performance, reject IR, and limit moiré. All cameras have some kind of OLPF. After logging some hours with it in the hands of users, the following characteristics were found: it did in fact seem pretty clean at 2000 ISO, but there were weird magenta flares, which were very visible in low-light situations. A new OLPF was designed that got rid of the magenta flare, improved highlight performance by at least a stop (I have no idea how they did that; possibly in conjunction with a new black shading algorithm?) and had even better IR performance. Downside? Now 2000 ISO is noisy. And the whole thing blew up on REDUSER once everyone was getting new Dragons in numbers with the new OLPF.

The good news is that RED will, if you wish, put the old OLPF back in your camera. And they claim to have a firmware build in the works (don’t they always?) that will tweak the black shading implementation to address this problem, and that it will come out soon, possibly in a week. But knowing RED, that could mean a month. They claim the new OLPF brings so much to the table that they think the future is in the new OLPF with software tweaks. There has even been talk of user-replaceable OLPFs in the future. I can confirm that the IR contamination with this new arrangement is quite good. I had a shoot with an ND 1.2 and a pola (essentially 6 stops of ND) and a black Marine dress uniform in full sun, and the uniform stayed black. Unfortunately, I had shot it at 2000 ISO because I had believed that 2000 was the new 800, and that shooting at a higher ISO protected your highlights more. This had been true of the RED One, but is no longer true of the RED Dragon.
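The "higher ISO protects your highlights" idea is worth unpacking, since raw cameras treat ISO largely as metadata. A rough sketch of the stop arithmetic (illustrative numbers of my own, not RED's actual calibration): the sensor's clip point is fixed, and a higher ISO rating just places middle gray further below it, so more apparent stops of headroom sit above mid gray.

```python
import math

def headroom_stops(mid_gray_fraction):
    # stops between middle gray and the fixed sensor clip point (1.0)
    return math.log2(1.0 / mid_gray_fraction)

# illustrative: suppose an 800 ISO rating places middle gray at 18% of
# clip; rating the same sensor at 2000 ISO scales that down by 800/2000
mid_800 = 0.18
mid_2000 = 0.18 * (800 / 2000)

print(f"800 ISO:  {headroom_stops(mid_800):.2f} stops above mid gray")
print(f"2000 ISO: {headroom_stops(mid_2000):.2f} stops above mid gray")
# the difference is exactly log2(2000/800), about 1.3 stops
```

Whether that extra rated headroom is usable depends on the noise floor, which is exactly where the new OLPF changed the equation.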

Also, I rendered both the Dragon and the MX footage to a 1080 ProRes file. The noise difference was indiscernible, which was a pleasant surprise. I did not use any special noise-reducing options on either. So for the moment I am content to wait for the new firmware and black shading, even if it makes all the testing above obsolete. But as always: test, test, test. Sometimes you find a surprise.
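One reason the downscale evens things out: scaling to 1080 averages neighboring photosites, and averaging n independent noise samples cuts the noise standard deviation by roughly the square root of n; the 6K image also gets averaged over more pixels than the 5K one on the way down. A toy simulation (my own sketch, ignoring real noise spectra, debayering, and scaler filters):

```python
import random
import statistics

random.seed(42)

def binned_noise(sigma, block, n_blocks=20_000):
    # simulate a flat gray patch with Gaussian noise, then average
    # 'block' pixels per output pixel (a crude stand-in for downscaling)
    out = []
    for _ in range(n_blocks):
        out.append(sum(0.5 + random.gauss(0.0, sigma) for _ in range(block)) / block)
    return statistics.pstdev(out)

# hypothetical numbers: a noisier "6K" source is averaged ~10:1 to reach
# 1080, a cleaner "5K" source ~7:1
noisy_6k_at_1080 = binned_noise(sigma=0.040, block=10)
clean_5k_at_1080 = binned_noise(sigma=0.020, block=7)
print(noisy_6k_at_1080, clean_5k_at_1080)  # both drop well below their source noise
```

In this sketch both outputs land far below their original noise levels and much closer to each other, which is consistent with the 1080 renders looking alike.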

 

Color Charts, Calibration Targets and Mars

All cameras lie. At least a little bit. For that matter, so do our eyes. In most cases we want a camera to perform with the same characteristics as our eye, although even that can be subjective.

Back when I worked as a camera assistant, we would often shoot a Macbeth ColorChecker at the head of each scene, and sometimes at the head of each roll of film. It acted as a known “control” that the post-production people could use to tweak color and contrast so that the chart looked like it did to our eye. This was important because color negative film meant a positive had to be made, and at that step changes could be introduced; the goal, of course, was not to introduce any unwanted changes. Essentially the film was being “re-exposed,” and the chart gave the film lab something to go by on what the cinematographer wanted. Sometimes these charts were shot under “white” light (i.e., light color-balanced to the film stock), and only after the chart was photographed were gels applied to change the color of the light for the scene. The goal here was to communicate with the lab: “Just because I put blue gel on the lights doesn’t mean it is a mistake I need you to fix. I want you to ‘time’ your color to the light I shot the chart under, so that my desired color cast is achieved,” resulting in this case in a blue cast to the scene.

As the shift to video happened, Macbeth charts began to lose ground to charts like DSC Labs’ Chroma Du Monde, which is more useful when used with a waveform monitor and vectorscope, video-engineering devices not used in film but prevalent in video production. According to the DSC Labs website, these charts were originally designed by the “US Space Program,” which made me think of a color chart I had recently seen photographed in a fairly remote location: “Bradbury Landing,” to be exact. No, this isn’t a BBC sequel to “Downton Abbey”; it is the landing site of the Mars rover Curiosity, or more accurately the Mars Science Laboratory (MSL). It has 17 cameras onboard, some for hazard navigation, some for scientific research. The HazCams are black and white, so color reproduction is irrelevant. But some of the cameras shoot in color. We all know Mars is the “Red Planet,” but what if you want to correct out that color cast? Well, you shoot a color chart of known colors. NASA calls this a Calibration Target, as it does more than just color. Nevertheless, it is a pretty simple device. It has six samples: red, green, and blue (the three primary colors), 40% gray and 60% gray, and a fluorescent pigment that glows red when hit with ultraviolet light. Pretty simple, especially when you look at the complex charts that DSC produces. I imagine the heavy engineering of the camera’s performance was done here on Earth, and this simple chart is just to analyze color cast on Mars. The descending bar graphic is adapted from the US Air Force charts for judging camera resolution.

MAHLI Calibration Target

And below that is a 1909 VDB penny. What is a coin doing on the chart, you might ask? This chart is mainly for the MAHLI camera, which is for close-up work: essentially a geologist’s eyes. The penny is a nod to the common practice of a geologist placing a known object within the frame to show the scale of the object being examined. Rulers work, and are perhaps more scientific, but in choosing a penny NASA is showing a bit of whimsy, something not that bad for a big governmental science and engineering branch to have. Perhaps something we should all keep in mind.

Why a 1909 VDB penny? The first year the “Lincoln head” penny was produced was 1909, and 2009 was originally the launch date for the rover, so the 100th anniversary apparently made it a good choice. Ultimately the rover’s launch got delayed to 2011, but by then the decision had been made. “VDB” at the bottom of the coin are the initials of its designer, Victor D. Brenner.

I find this especially interesting, as when I was a kid I dabbled in numismatics, or coin collecting. I got started with some silver quarters my parents gave me, but I remember vividly looking through all the pennies I got over the course of however many months and finding three or four 1909 VDB pennies myself. At the time they were valued at about $2. Now they run about $15 on eBay in average condition. There is also a 1909-S VDB penny, which was minted at the San Francisco mint and has the same “wheat” back. Those are quite rare and are worth at least a thousand dollars today. Who knows, you might have one in your pocket right now. And you thought pennies were worthless.

 

MastCam

MastCam with fixed 34mm f/8.0 lens. Notice the Swiss Army knife for perspective, much like putting a penny next to the object for scale.

Now, it is not clear to me whether this Calibration Target is available to the other cameras on the rover, but I think so. The other main camera on the rover is the MastCam, which provides a human-height perspective from Mars and can even capture footage in stereo. The MastCam uses the same sensors as the MAHLI does.
The sensors are 1600×1200-pixel (2-megapixel) Bayer-pattern sensors. The MastCam is actually two cameras: a “wide angle” (15-degree field of view) 34mm f/8 lens with a minimum focus of 2.1 meters, and a “telephoto” (5.2-degree field of view) 100mm f/10 lens with the same minimum focus. Together they can shoot stereo, although the mismatch in focal lengths means this is more a bonus feature than a primary function. Each camera can do 720p video at about 10fps. And for us camera nerds, it has ND filtration as well as IR cut filters (for the study of specific wavelengths more than for IR contamination, I suspect).
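Those field-of-view numbers check out against the basic pinhole formula, fov = 2·atan(sensor / 2f), if you assume the sensor's short side is roughly 8.9 mm (1200 pixels at about a 7.4 µm pitch — my assumption, not a figure from this article):

```python
import math

def fov_degrees(sensor_mm, focal_mm):
    # angular field of view across one sensor dimension (pinhole model)
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

SENSOR_SHORT_SIDE_MM = 8.9  # assumed: 1200 px at ~7.4 um pitch

print(f"34mm:  {fov_degrees(SENSOR_SHORT_SIDE_MM, 34):.1f} deg")   # ~15, the quoted wide FOV
print(f"100mm: {fov_degrees(SENSOR_SHORT_SIDE_MM, 100):.1f} deg")  # ~5.1, near the quoted 5.2
```

So the quoted 15- and 5.2-degree figures are consistent with each other given one sensor size and the two focal lengths.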

One thing that instantly occurred to me when I first was listening to news reports of Curiosity’s landing, was how the hell do they keep dust of the lenses? Dust has an affinity for front elements here on Earth and I imagine it is only worse on Mars. I don’t know if Curiosity has the ability to blow off dust of its lenses, but I did find out they have an ingenious design to their lens caps. They are transparent. This means that they can take pictures through them if conditions are unfavorable. The optical quality suffers, but the lens is protected. When conditions are clear, they can remove the lens caps for clearer pictures. And like any good photographer, the Rover probably keeps her lenses capped when not in use.

What does this have to do with Calibration Targets and color charts? Well, it helps us get images like the two below. One is un-white-balanced; the other is white balanced and represents what the surface would look like under Earth lighting. By shooting the Calibration Target first, the engineers can “dial out” any color cast created by the Martian atmosphere. The uncorrected shot, if it had a calibration chart in the frame, would have colors that would not look correct. In the white-balanced shot, all the colors on the target should look true and accurate, the same as they did back at NASA before launch.
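"Dialing out" a cast with a known gray patch boils down to measuring the patch's RGB in the shot, computing per-channel gains that make it neutral, and applying those gains to the whole image. Here is a minimal white-patch balancing sketch; NASA's actual pipeline is of course far more involved, and the patch values below are made up:

```python
def white_balance_gains(patch_rgb):
    # per-channel gains that neutralize a patch known to be gray,
    # normalized to the green channel (the usual reference)
    r, g, b = patch_rgb
    return (g / r, 1.0, g / b)

def apply_gains(pixel, gains):
    # scale each channel, clipping at the white point
    return tuple(min(1.0, c * k) for c, k in zip(pixel, gains))

# a gray patch photographed under a reddish "Martian" cast (made-up values)
measured_gray = (0.60, 0.40, 0.30)
gains = white_balance_gains(measured_gray)
print(apply_gains(measured_gray, gains))  # each channel comes back ~0.4: neutral
```

The same gains applied to every pixel in the frame pull the whole scene toward how it would look under neutral light, which is exactly the corrected panorama below.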

Uncorrected “RAW” panorama of Mars

Corrected “White Balanced” version of the same shot

If these pictures are too small for your liking, here is the link to NASA’s page, which has links to some super-high-res versions. Now, I don’t know about you, but I find high-res panoramic photos of a foreign planet pretty cool.

So what does this all mean? Well, for one, I plan to tape a 1909 VDB penny to my Chroma Du Monde chart. Why? There are several reasons. I want to be reminded, whenever I pull my color chart out, that somewhere on a different planet millions of miles away there is a similar chart being used to aid in photography. The penny also reminds me that in the most ordinary, mundane things, like a penny, there can be surprises if you just look carefully enough. A good thing to keep in mind in life, as well as photography. Who would have thought that the lowly penny would be the first currency to arrive at another planet? And lastly, no matter how big or important the job is, there is always room for some whimsy.

I feel that getting a copy of the only coin (of any currency) that is on another planet for $15 on eBay is a deal if ever I saw one. And in the meantime I am going to start checking my pockets more often. Who knows, there could be a 1909-S VDB in there with all that loose change.