Daniel Phelps


An Ode to the “Professional” Editor #FCPX

This post has been cross-posted by me from the York Comm Tech Blog at yorkcommtech.net.

On April 12, 2011, Final Cut Pro saw its first major redesign and update since 2001… even earlier if you remember the program as Key Grip (the name of the program that Apple purchased as the foundation for FCP in 1999).

As I tuned into the Twitter feeds of #FCP last night to experience the cumulative reaction of the 1,700 people at the FCPUG SuperMeet at NAB, I found myself outside of the “Reality Distortion Field” normally found in Apple keynote addresses. You see, unlike the next iPhone or the next iPad, Final Cut Pro has been the program that I have made my living with for the past 10 years. It has been my lifeblood, a passion, and the one piece of software that I can truly say I’m an expert with. What makes me so comfortable with this program is that I know how to fix this bugger when it breaks. I know how to avoid problems with this clunker because of known bugs, and more importantly I can navigate this program with an efficiency that takes years to develop. It is another thing altogether to create something wonderful with this tool.

But as I utter these words, I realize that there are many other programs I rely on to do my job. Microsoft Word, any e-mail program, any operating system… These are all just cogs in a greater skill-set to do my job effectively and efficiently. By owning Microsoft Word, I don’t call myself a professional writer. By knowing how to navigate WordPress, I’m not a professional blogger. And owning a hammer and knowing that I can swing it does not make me a carpenter. Anyone can give themselves a creative title until they have to build “it”. Only then can someone be called a “Professional”, whether it be a writer, a blogger, or Jesus.

With that said: if Jesus had been an editor, he would have used Final Cut Pro X.


In my analysis of this announcement, I will look at the changes to FCP from two different perspectives: one from an editor’s perspective, and the other from the role of an Educator and Multimedia Systems Administrator. Additionally, because this announcement was a “sneak peek,” I will not recap the feature set. For a complete list of changes to the program thus far, or to watch the keynote address, please visit the following links:

Feature List: http://aol.it/gHtcHz
Keynote Video: http://bit.ly/i9AFJx

From an Editor’s Perspective:

Forgoing the easy analysis of Final Cut Pro X as “iMovie on Steroids”, I truly believe Apple is trying to accomplish many goals. One of the obvious goals in the demo was efficiency disguised as making things “easier” for the editor. Every single new feature that was demoed is intended to make finding your media faster, and implementing decisions quicker.

As the demo for the Magnetic Timeline and Compound Clips was happening, I was counting the steps or “clicks” that I would no longer have to do for insert editing or choosing b-roll/environmental shots. Those who perform many repetitive actions on a nested sequence of over 700 clips know what I’m talking about. Those micro-tasks add up quickly, and after seeing FCPX “editing during ingest” of H.264, I was sold. Those features alone can save hours.

The Twitterverse might ask, “So what about the UI?” Well, to that I say, “Meh.” UIs change. They become more efficient, especially in Apple’s world. So what if it looks like iMovie? Spend a week with the program and learn how to be more efficient by starting over. You’re a professional, aren’t you? You’ve done it before with almost every other program you have. Grow up.

There is no way that this upgrade will make FCP less powerful. It provides the underpinnings for an exciting, powerful future where any format will “just work.” Isn’t that what we want all of our software to do… just work with what we want? Apple is leveraging its core technologies (OpenCL, Grand Central Dispatch, and Core Animation) to make things in FCP “just work.” From DSLRs, to Flip cams, to legacy codecs (I’m looking at you, DV), these decisions [by Apple] will make editing easier for everyone because they strip the high level of understanding away and make the technology invisible.

Knowing how to fix compression problems with MPEG-2 will always come in handy, but if those problems are not there in the first place because the software “took care of it,” who’s to know, and who’s to care? Producers only care about the final product. As an editor/producer, I look forward to a more efficient and seamless FCP experience so that I can concentrate on creativity and story, not codecs and metadata. Jesus can do my offline.


From a Systems Administrator/Educator Perspective:

If this version of Final Cut is not adopted by the professional community, FCP is dead as a Pro App. Lots of people will use it, but a toy it will be.

The decision to continue with Final Cut in the classroom depends on whether it is taken seriously in the post-production world. Will the new price ($299.00) scare post-production houses away? Or will producers ask for FCPX by name, to keep costs down? The assumption 10 years ago was that because FCP was cheap and accessible, it would mean lower post-production costs and “less-skilled” editors. Producers quickly realized that only the former turned out to be true. But the “non-professionals” still called themselves shooters and editors… with their DVX-100s and XL-1s, they pointed their QuickTime exports to YouTube, starting MANY careers in the process. Consumers became producers with technology that was now more accessible.

I owe some of my career to my basement and my XL-1. My early knowledge and adoption gave me a leg up at the first of many small studios where I have worked. This (DV+FCP) was technology that caught salty Media 100 and Fast VM/linear “Professional Editors” off-guard, and changed an industry in 8 years. I don’t believe this version (of FCP) will make waves like it did in the 2000s… but you will see a new generation of filmmakers with DSLRs and MacBooks creating stories with a different (read: more advanced) aesthetic than the one DV and our G4s hurled at the industry 10 years ago.

So will I install FCPX in the classroom? Only if it benefits the students professionally, and that depends ultimately on how you define a “professional.” This will take time. But given the current economic condition of education right now… there may be a clear choice. Adobe, and especially Avid, are more nimble and affordable than ever, but a 4G modem, a MacBook preloaded with FCPX, tied to a Canon T2i, for under $2K? Well, that sounds like anyone, anywhere, can tell a professional story.

I say, as an editor, use it immediately. If you like it, add it to your toolset and move on. If it’s more efficient, tell producers that you can get it done faster in FCPX. Charge less, make more.

As a systems administrator, wait a year. Look at the details. Call production houses and media outlets to see what their plans are. They will tell you their direction. The popular vote wins. (Hint: most provide multiple NLEs.)

If you have any questions or comments, please feel free to let me know. You can also find me on the twits @danielphelps, or email me at dphelps{at}york.cuny.edu



Because I Row

I was lucky enough to DP this commercial for the most excellent Row New York. The commercial debuted on Universal Sports in 2011.


[two_third_last]Because I row I have confidence…
Determination …

I know no limits…

Because I row I will achieve great things.

Because I row, I’m strong…
I have muscles…
I’m mentally tough.

I’m hungry all the time.

Because I row, I know teamwork…

I trust…
I believe…
We are all here for one another…

Because I row, I am never alone…

I laugh all the time…

Because I row, we are family…

We are Row New York.[/two_third_last]

The iPhone 3D Viewer (2011)

The significance of this attempt is most critical when judged against my professional growth. The iPhonoscope was the start of creating and employing new creative tools in all of my work.
This project was founded on the idea that we would use our phones as 3D viewers and augmented displays. The patent was filed well before Google Cardboard and allowed me to explore the trials and tribulations of the patent process. Ultimately, I abandoned the Provisional I was granted due to prior art that preceded this submission by 60 days. I would later stress that all of my work produced would be either open source or granted a CC International 4.0 license with attribution.
There is freedom in free.

Phelps_Provisional_Patent PDF LINK


Description of Project in 2011: Project was abandoned shortly after Provisional Patent was awarded.

I have built three cheap and easy devices to produce, and display, 3D stereoscopic moving images. All of these devices use 19th-century techniques (stereoscopy) in conjunction with ubiquitous 21st-century consumer hardware (Apple Inc.’s iPhone).

At this stage, I intend to develop and produce:

  1. An economical paper version of my iPhone Stereoscope for distribution, to view my works on the iPhone. (Relationships with an established 3D paper-glasses company have been established: American Paper Optics, LLC, www.3dglassesonline.com)
  2. Create 3D works related to the field of nature conservation. Due to the proliferation of the iPhone in the urban environment, I feel that the nature of 3D would benefit those lacking awareness of the world outside the city.
  3. Distribute these 3D works via a custom-made iPhone application built with the iPhone SDK, and via the iPhone’s own built-in YouTube application.
  4. Encourage the creation of an online community for user submission of works online, to be viewed on my simple device.


My device and distribution method uses networked portable devices as a playback medium. Although any handheld device capable of producing a moving image would be able to reproduce a stereoscopic effect, I have chosen the iPhone as the playback device to mock up for this project because of its network connectivity, popularity among tech enthusiasts, and ease of use by the consumer. As the delivery medium changes for the consumer, the basic idea of stereoscopy remains the same as it has for 120 years… look at disparate images through a viewer to create an added dimension and a heightened reality.

To better develop this idea of creating modern stereograph equipment for enthusiasts, I have broken this project down into three parts:

Recording: I have built two devices capable of producing a 3D image using standard consumer video cameras.

The large device was produced to test the use of my mirror concept in creating the two disparate images seen in stereoscopic photography. In addition to saving money by using only one video camera to produce a 3D image, this device provides a challenge to the “do it yourself” enthusiast. This device requires a little editing of the recorded image and is more difficult to build. Material costs are a little over $40.00, and it can use any camera that is capable of producing a 16×9 (widescreen) image, has manual focus, and can record at a progressive (30p) frame rate. In real-world testing, the 3D quality of this device is acceptable.


The smaller and more simplistic device uses two cameras to record a stereoscopic image. This design incorporates the more traditional approach to stereo photography and differs from the above device due to its lack of mirrors and build complexity. The device is cheaper and simpler to build but more expensive in its total cost due to its use of dual cameras. In addition, because two separate sources (cameras) are used, as opposed to one with the larger device, more editing time is needed to prepare the images for playback on a portable device. In real-world testing, 3D images from this device are excellent.
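For the curious, the editing step for the two-camera device boils down to pairing each left-eye frame with its right-eye frame in a single side-by-side image. This is a hypothetical sketch of that pairing (using Python and the Pillow imaging library, which is my illustration and not part of the original workflow):

```python
from PIL import Image

def side_by_side(left, right):
    """Combine a left-eye and right-eye frame into one side-by-side stereo frame."""
    w, h = left.size
    out = Image.new("RGB", (w * 2, h))
    out.paste(left, (0, 0))    # left-eye image on the left half
    out.paste(right, (w, 0))   # right-eye image on the right half
    return out

# Synthetic frames standing in for the two camera feeds.
left = Image.new("RGB", (320, 240), "red")
right = Image.new("RGB", (320, 240), "blue")
frame = side_by_side(left, right)
frame.save("stereo_frame.jpg")
```

Viewed through the stereoscope, the left eye sees the left half and the right eye the right half, and the brain fuses the two disparate images into one image with depth.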

Both devices produce stereoscopic images that can be played back on a portable device. Although they produce the images with a varying degree of accuracy and affordability, the devices are useless without a popular playback device. I believe that with technological advancements, and the popularity of 3D in the future, recording of digital stereo images will become easier to produce and playback.


Playback: I have used a simple pre-fabricated “Pocket Stereoscope” attached to the iPhone. This device provides the eyes, and the brain, with the coherence needed to produce the stereoscopic effect required for adding dimension to a 2D image.

This stereoscope provides the correct focal length for the iPhone and does not require adjustment for those who don’t use glasses. To accommodate the different focal lengths for people that do require glasses, a traditional focal length adjuster is needed for the next version of the playback device.

The intent is to create a cheap (read: 10-cent) version of this viewer, similar to the way traditional 3D glasses are made today.

The iPhone is not the only device that can potentially play back a stereograph. These same consumer lenses could be applied to other mobile phones, iPods, Portable DVD Players, PSPs, and any other device with a moderate pixel density that is capable of displaying full-motion video.

Distribution: The iPhone was chosen for this project because of its integration with YouTube. With the built-in iPhone application, users can share 3D videos and experience videos recorded by other users around the world. YouTube also eases the burden of creating videos for any device. The stereoscope user has access to a diverse collection of videos without actually owning a 3D camera of their own. This networked use is the heart of the system and provides something that Holmes’ original stereoscope system did not: near-instant 3D viewing of anything on the planet, nearly anywhere on the planet.


A device with network connectivity has a big advantage over one that uses a physical format (i.e. DVD, Flash media, Hard Disks) because of the amount of video that can be made available to the user. Distribution outlets will grow as all information devices are networked.


Other uses for these designs: To encourage popularity of these devices one could distribute designs under a creative commons license. This method could spur the “DIY” community to embrace and improve on the technology. Both design concepts (camera and playback) could be improved upon and applied to current and new digital technology. I also believe because of the ease of networked distribution, users would create groups or “pools” of videos for their community, and the world, to enjoy and experiment with.


Instructions for use:

Located within the link below, you will find sample footage shot with the 3D devices in addition to images of the devices being created.

The video is formatted for use on the iPhone or any QuickTime-capable device. It can be uploaded to the iPhone through iTunes and used with stereoscopic glasses, or a pocket stereoscope (not provided), at the correct focal length.

You can also view sample video created by the recording devices directly from the iPhone YouTube application located on the Home screen. If you have access to an iPhone, please do a search for “Phelps iPhone 3D Test”. Or go to the link below:

* Please note that streaming the “Phelps iPhone 3D Test” sample video will require a WiFi connection to be viewed at full quality. If viewed while connected to the cellular network, the video will be degraded immensely. Although… the effect of highly compressed 3D footage is quite spectacular. Various degrees of compression have been applied to the test footage as a way to measure its effects on the perception of depth.

Uncle Sam (Remixed 2009)

If you’ve ever read Uncle Sam (The Illustrated Novel) you’ll appreciate the re-appropriation of this one. Starring 2008 political characters.

Scroll through the book or view as full screen.