Creating Animation In Real-Time

The Future of Animation?

In 2016, The Simpsons used animation software from Adobe to incorporate a live segment into an episode. During the segment, fans of the show called in and asked unscripted questions to Homer, who answered, while animated, in real time.

This was done using a piece of still-in-development software called Adobe Character Animator. Character Animator is included with current versions of After Effects, but launches as a standalone program. It uses webcam point tracking to map facial movements in real time. Mouth movements are translated to pre-built shapes that change and move with the actor's performance. Other gestures, such as blinking or body movements, can either be tracked live or mapped to buttons; when a button is pressed, the corresponding gesture is performed.
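The button-to-gesture idea is simple enough to sketch in a few lines. This is a toy illustration of the concept, not Adobe's actual implementation; the bindings and gesture names below are invented for the example.

```python
# Toy sketch of keyboard-triggered gestures, in the spirit of Character
# Animator's button mapping. The key bindings and gesture names here are
# hypothetical examples, not Adobe's API.

GESTURE_BINDINGS = {
    "b": "blink",
    "s": "shrug",
    "w": "wave",
}

def trigger_gesture(key, bindings=GESTURE_BINDINGS):
    """Return the gesture bound to a pressed key, or None if the key is unbound."""
    return bindings.get(key)
```

In a real puppet rig, each trigger would swap in or animate a pre-built artwork layer; here the lookup simply names the gesture that would be performed.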

For the segment on The Simpsons, Homer's voice actor (Dan Castellaneta) responded to questions while a camera tracked his facial movements. Other gestures were controlled by the episode director using a keyboard.

You can read more about The Simpsons live animation segment over at Cartoon Brew.