A Video-based Interface for
Hand-Driven Stop Motion Animation Production

IEEE Computer Graphics and Applications

  Xiaoguang Han1    Hongbo Fu1    Hanlin Zheng2    Ligang Liu2    Jue Wang3  

1City University of Hong Kong        2USTC           3Adobe Research


Our video-based interface enables easy creation of stop motion animations through direct manipulation by hand; the hands are semi-automatically removed through a novel two-phase keyframe-based capturing and processing workflow. Our tool is complementary to, and can be used together with, traditional stop motion production (e.g., for the rotation of individual faces here).

Stop motion is a well-established animation technique, but its production is often laborious and requires craft skills. We present a new video-based interface that can animate the vast majority of everyday objects in stop motion style in a more flexible and intuitive way. It allows animators to perform and capture motions continuously, instead of breaking them into small increments and shooting one still picture per increment. More importantly, it permits direct hand manipulation without resorting to rigs, achieving more natural object control for beginners. The key component of our system is a two-phase keyframe-based capturing and processing workflow, assisted by computer vision techniques. We demonstrate that our system enables even amateur animators to efficiently generate high-quality stop motion animations of a wide variety of objects.
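To give a flavor of the kind of computer vision assistance involved, the toy sketch below removes hand pixels from a captured frame by combining two classic building blocks: skin-color thresholding in YCrCb space to detect the hand, and compositing the detected region from a hand-free reference frame of the same pose (e.g., a captured keyframe). This is only an illustrative sketch under those assumptions, not the paper's actual algorithm, and the function names and thresholds are our own.

```python
import numpy as np

def skin_mask(frame):
    """Flag likely hand pixels via YCrCb skin-color thresholding
    (a common heuristic, not the paper's segmentation method)."""
    f = frame.astype(np.float64)
    r, g, b = f[..., 0], f[..., 1], f[..., 2]
    # RGB -> Cr, Cb per ITU-R BT.601
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    return (cr > 133) & (cr < 173) & (cb > 77) & (cb < 127)

def fill_from_clean(frame, clean_frame):
    """Replace detected hand pixels with pixels from a hand-free
    reference frame showing the same object pose."""
    mask = skin_mask(frame)
    out = frame.copy()
    out[mask] = clean_frame[mask]
    return out

# Tiny synthetic example: a gray scene with a skin-colored patch.
clean = np.full((10, 10, 3), 100, dtype=np.uint8)
frame = clean.copy()
frame[2:5, 2:5] = (200, 140, 110)   # "hand" occluding the scene
restored = fill_from_clean(frame, clean)
```

In practice a real pipeline would refine the raw color mask (morphology, temporal coherence across frames) before compositing; the sketch keeps only the core detect-and-fill step.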

Download the video (.mov; 38 MB)

@article{Han2013StopMotion,
  author  = {Xiaoguang Han and Hongbo Fu and Hanlin Zheng and Ligang Liu and Jue Wang},
  title   = {A Video-based Interface for Hand-Driven Stop Motion Animation Production},
  journal = {IEEE Computer Graphics and Applications},
  year    = {2013},
  note    = {Accepted for publication}
}

We thank the reviewers for their constructive comments, Michael Brown for video narration, Lok Man Fung and Wai Yue Pang for experimenting with traditional stop motion production, and Tamas Pal Waliczky and Hiu Ming Eddie Leung for their professional comments. This work was partially supported by grants from the Research Grants Council of HKSAR (No. CityU113610, No. CityU113513), the City University of Hong Kong (No. 7002925 and No. 7002776), the National Natural Science Foundation of China (No. 61222206), and the National Basic Research Program of China (No. 2011CB302400).