After Effects Javascripting Project – Mocap?

I have always been fascinated by the art of motion capture, as seen in films and video games, and have always longed to try it for myself. Unfortunately, motion capture setups don’t come cheap, so I’ve always been on the lookout for homemade rigs that might let me experiment. Precious few exist.

Last year I acquired a copy of Adobe After Effects CS5, which I have been busily exploiting (see this page for a sample). After Effects has a tool called “motion tracking” that lets the user track moving objects in two dimensions. The first application that popped into my head when I saw it was motion capture. If you’ve ever used After Effects, you will know just how impractical this idea truly is; I am now intimately aware of that fact. But the idea has never quite left me. A few weeks ago I was delighted to discover a scripting editor in After Effects that uses JavaScript, and I immediately thought it might be the key to extending AE’s motion tracking feature to its full potential.

For the purposes of mocap, manipulating keyframe data inside AE is useless. I needed a way to export the information to a 3D editing application in which I could animate armatures and meshes. Blender was my first choice, as it’s the one most familiar to me.

Once I got the basics of AE’s JavaScript, I was able to write my code. When executed, it collects keyframe data from the specified layers, then records the coordinates on each frame and stores them in Python lists. Depending on the length of the source video, these lists can be quite long, and the resulting block of thousands of numbers can look daunting, but the exported script is not really meant to be seen or edited. It need only be opened in Blender’s text editor and executed.
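To give a sense of what comes out the other end, here is a rough sketch of the shape of the generated file. The layer name, composition settings, and coordinate values below are made up for illustration; the real numbers are baked in automatically by the AE-side script.

```python
# Sketch of the data section of an exported script (illustrative values only).
# The real file contains one list per tracked layer, with thousands of
# entries for a long clip.

comp_width, comp_height = 1280, 720   # source composition size (assumed)
frame_rate = 29.97                    # source composition frame rate (assumed)

# [x, y] pixel coordinates, one entry per frame of the source video.
tracker_01 = [
    [642.0, 358.5],
    [643.2, 357.9],
    [644.8, 356.4],
    # ...and so on, one pair per frame...
]
```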

Ta-dah! In Blender, empty objects are spawned with keyframes applied (one empty per exported layer). Each empty receives location, rotation, and scale keyframes from AE. A camera is also created with the same resolution as the source AE composition. The empties are scaled and moved to the proper distance from the camera so that they line up with the video from which they were originally generated. That last step isn’t quite perfect: I had to estimate a little in my focal-length calculations. It would seem that Blender’s camera settings are not, and never have been, based on real-world cameras. I believe the main issue is that Blender’s camera has an aperture of zero (I think), which is impossible in a real camera. In any case, I had to calculate by hand, and it still didn’t quite match up.
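For anyone curious about what the Blender side roughly does, here is a minimal sketch using Blender’s Python API (bpy). It is written against the current bpy API rather than the one I originally targeted, and the composition size, frame rate, focal length, and tracker values are placeholder assumptions, not real exporter output.

```python
import bpy

# Minimal sketch of the Blender side (assumptions: modern bpy API, placeholder
# composition size, frame rate, focal length and tracker data).

comp_width, comp_height = 1280, 720   # AE composition size (assumed)
fps = 30                              # AE composition frame rate (assumed)
focal_length_mm = 35.0                # rough estimate; see the focal-length caveat
sensor_width_mm = 32.0                # Blender's default sensor width

# Example track: [x, y] pixel coordinates per frame (made up here).
tracker_01 = [[642.0, 358.5], [643.2, 357.9], [644.8, 356.4]]

scene = bpy.context.scene
scene.render.resolution_x = comp_width
scene.render.resolution_y = comp_height
scene.render.fps = fps

# Camera at the origin; a default, unrotated camera looks down the -Z axis.
cam_data = bpy.data.cameras.new("AE_Camera")
cam_data.lens = focal_length_mm
cam_data.sensor_width = sensor_width_mm
camera = bpy.data.objects.new("AE_Camera", cam_data)
scene.collection.objects.link(camera)
scene.camera = camera

# Distance at which the full composition width fills the camera's view,
# treating one AE pixel as one Blender unit:
#   visible_width = distance * sensor_width / focal_length
distance = comp_width * focal_length_mm / sensor_width_mm

# One empty per exported layer, keyframed from the AE track.
empty = bpy.data.objects.new("Track_01", None)
scene.collection.objects.link(empty)

for frame, (x, y) in enumerate(tracker_01, start=1):
    # AE's origin is the top-left corner with Y pointing down;
    # re-centre on the composition and flip Y for Blender.
    empty.location = (x - comp_width / 2.0,
                      comp_height / 2.0 - y,
                      -distance)
    empty.keyframe_insert(data_path="location", frame=frame)
```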

Thus, with the help of this export script, I have my own little motion capture rig (at least a two-dimensional one). I can now attach armatures and meshes to the resulting keyframed points and composite those digital elements over the original video. The ability to apply all that keyframe data to armatures in Blender is very exciting (to me); it’s something I’d never be able to do in After Effects, not even with plugins.
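As one example of the “attach armatures” step, a Copy Location constraint can pin a bone to one of the tracked empties. The object and bone names here are hypothetical, and this is just one of several ways to hook a rig up to the keyframed points.

```python
import bpy

# Pin a bone to a tracked empty with a Copy Location constraint.
# "Armature", "hand.L" and "Track_01" are hypothetical names.
arm = bpy.data.objects["Armature"]
empty = bpy.data.objects["Track_01"]   # empty generated by the export script

bone = arm.pose.bones["hand.L"]
con = bone.constraints.new(type='COPY_LOCATION')
con.target = empty
```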

NOTE: This project may also develop into a kind of simple match-moving tool in the future. Stay tuned!

  1. February 2, 2012 at 3:13 pm

    This is very interesting. I’ve been looking for a script to export AE camera tracking data to Blender, but all I can seem to find is stuff about Blender to AE, and very little the other way around (AE>Blender).
