As I linked to in my last post, a Twitter user today leaked a number of screenshots from the forthcoming Final Cut Pro X and the as-yet-unannounced Motion 5. The photos have been removed from TwitPic, and the Twitter account that shared them has now been taken offline. Luckily, blogs like AppleInsider and MacRumors saved copies before they disappeared. Odds are these are legit. Apple previewed FCP X back in April, and the one screenshot that comes from it looks pretty in line with what they teased at NAB.
Apple promised to ship FCP X for $299 in the Mac App Store sometime in June. These screenshots are the first we are seeing of Motion 5. Let’s poke around the images and see what we see.
FCP X Import
- Organizing
    - Copy files to Final Cut Events folder
    - Import folders as Keyword Collections
- Transcoding
    - Create optimized media
    - Create proxy media
- Video
    - Analyze for stabilization and rolling shutter
    - Analyze for balance color
    - Find people
    - Consolidate find people results
    - Create Smart Collections after analysis
- Audio
    - Analyze and fix audio problems
    - Separate mono and group stereo audio
    - Remove silent channels
### Organizing and Transcoding ###
If these really are all of the settings editors are given for importing tapeless formats, there are going to be a lot of confused customers. My main confusion is under the “Transcoding” menu. We’ve grown so accustomed to the growing number of ProRes formats; are we now to understand that we get merely two options, “optimized” or “proxy” media?
Moreover, what does a transcode even mean here? The first check box implies that FCP X will simply move your tapeless media into the “Final Cut Events folder”. What’s most interesting, for better or worse, is that that is even an option at all. Will this folder operate somewhat like the iTunes Music folder, keeping media consolidated if you want it to? Or will you be allowed to roll your own file system structure? Since FCP X is supposed to leverage every processor in your machine to let you cut raw media, what are those transcodes for? Why can you have both “optimized” and “proxy” media? I’m not too worried about these options being laid out the way they are, but I can say that, if this is real, they raise a lot of questions about what an FCP X workflow actually looks like.
Most of the video settings we see here are exactly what was teased at NAB, and they are a very good sign. All of the seemingly superfluous, automated doodads (which may actually turn out to be life savers) that Apple is pushing hard can be turned off. Many editors, after the original demo, were worried that FCP X was going to start automating the entire cutting process, coloring video you’d rather leave raw or getting rid of rolling shutter even if you want to keep it. This screengrab implies that those processes won’t happen on ingest. Instead, FCP X has the ability to “analyze” the media so that you can flip a switch later to implement any quick changes.
Analyzing your footage for rolling shutter, making it easier to correct later, can be a great feature, but it can also be a huge strain on your processor. Though Apple says it will happen in the background, it’s likely that many of these processes will slow down your cutting while the media is ingesting. So it’s a very good thing you can turn off color and “people” analysis; especially “people” analysis. I turned on face detection for four years’ worth of photos when I first got Aperture 3. Not only did it take the better part of a day to complete the process, but it ground Aperture to a halt while it chewed through the photos. Thank goodness this is optional.
I’m actually most excited about the check boxes listed here. The only one that is scary is “Analyze and fix audio problems”. That’s like telling the computer to “make me dinner”: even if it pops out something edible, it probably won’t be what you wanted. This kind of vague automation will prove itself either useful or useless over time. Perhaps there’s another dialog in which we can pick which fixes we would like to apply.
Still, that “Separate mono and group stereo” looks pretty great. I wish there were a way to automate this into tape-based workflows. I’ve seen a lot of mistakes made because people just don’t know how the tracks were recorded. Let the computer sort that out. And that “Remove silent channels”? Bye bye 16 extra channels of DVCProHD.
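Apple hasn’t said how “Remove silent channels” decides a channel is actually silent, but the naive version is just a per-channel amplitude check. Here’s a rough sketch of that idea in Python; the function names and the threshold value are mine, not Apple’s, and a real implementation would surely be smarter about noise floors:

```python
import numpy as np

def find_silent_channels(samples, threshold=1e-4):
    """Return indices of channels whose peak amplitude never exceeds
    `threshold` -- i.e. channels that are effectively silent.

    samples: float array of shape (channels, num_samples)."""
    peaks = np.max(np.abs(samples), axis=1)
    return [i for i, peak in enumerate(peaks) if peak < threshold]

def remove_silent_channels(samples, threshold=1e-4):
    """Drop the silent channels and keep the rest, in order."""
    silent = set(find_silent_channels(samples, threshold))
    keep = [i for i in range(samples.shape[0]) if i not in silent]
    return samples[keep]

# Hypothetical 4-channel clip where channels 2 and 3 were never recorded.
clip = np.zeros((4, 48000))
clip[0] = 0.5 * np.sin(np.linspace(0, 100, 48000))
clip[1] = 0.3 * np.sin(np.linspace(0, 200, 48000))
print(find_silent_channels(clip))          # → [2, 3]
print(remove_silent_channels(clip).shape)  # → (2, 48000)
```

Something this simple would already catch those empty DVCProHD tracks; the interesting question is whether Apple’s analysis also distinguishes “silent” from “quiet room tone”.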
I realize I’ve already been long-winded about a single supposed Final Cut Pro screenshot. The good news is I have way fewer opinions about a new rev of Motion. Let’s start with the best one:
Here are the options users are presented when creating new Motion projects:
- Motion Project
- Final Cut Effect
- Final Cut Generator
- Final Cut Transition
- Final Cut Title
Now this could get fun. It’s starting to look like there is some deep integration between Final Cut Pro and Motion. Any effect generated in Final Cut can now be created by you and your team. The sky could be the limit, but it’s unclear how exactly this would work. For example, a “Final Cut Effect” would presumably be a color or stylized type of overlay for video clips. Do you have to create these for a single frame size and frame rate? Will HD effects work on SD footage and vice versa? The same goes for transitions.
If this screenshot and its implications turn out to be true, then perhaps Motion really has swallowed up Color. A Final Cut Effect could potentially act as a sort of color grade or LUT that could be dropped on whole scenes. Or it could be something we haven’t even thought of. Speaking of Color…
This image is also supposedly from the new Motion. It would seem to be pretty good proof that Color is now a feature within Motion 5. While there is certainly a need for a waveform monitor in a motion graphics application, this one looks fairly complex, as does the waveform of whatever image was loaded into the app when this was taken. Even if this waveform monitor is only in Motion, it would seem to make the functionality of the two apps redundant. Why keep them separate? Better yet, why not put Color right into Final Cut?
I love Color in its current form. Not only is it incredibly powerful, but it also promotes discipline among cutters. I know it’s not the best argument for it, but by keeping Color and Soundtrack Pro as separate apps, cutters have been forced to follow a traditional editorial model and actually lock picture for good. Non-destructive editing systems have made tweaking so easy that an edit could never end. Technically, you could always send a project to Color, do some work, then bring it back, cut some more, then send it back out, but it was a pain in the ass. It would have been the same pain in the ass to go to a supervised color correct in telecine more than once. So we had to learn discipline.
If Color has now been integrated into another app, that will make it even more efficient and allow for quicker turnarounds. Maybe it’s just me who will be tempted to tweak color, then tweak the edit, then tweak the color again… In fact, that doesn’t sound too bad. I hope we get to see the real thing soon so I can stop with this wild speculation.