Multitouch Gestures for Constrained Transformation of 3D Objects


3D transformation widgets allow constrained manipulation of 3D objects and are commonly used in many 3D applications for fine-grained control. Since traditional transformation widgets were mainly designed for mouse-based systems, they are not user friendly on multitouch screens. There is little research on how to use the extra input bandwidth of multitouch screens to ease constrained transformation of 3D objects. This paper presents a small set of multitouch gestures that offers seamless control of manipulation constraints (i.e., axis or plane) and modes (i.e., translation, rotation, or scaling). Our technique does not require any complex manipulation widgets, only candidate axes, which serve for visualization rather than direct manipulation. This design not only minimizes visual clutter but also tolerates imprecise touch-based input. To further expand our axis-based interaction vocabulary, we introduce intuitive touch gestures for relative manipulations, including snapping and borrowing the axes of another object. A preliminary evaluation shows that our technique is more effective than a direct adaptation of standard transformation widgets to the tactile paradigm.
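To illustrate the kind of axis-constrained manipulation the abstract describes, the sketch below shows one common way to map a 2D touch drag to a translation along a 3D constraint axis: project the axis into screen space, measure the drag along that projected direction, and move the object along the 3D axis by the corresponding amount. This is a generic illustration, not the paper's actual gesture set; the function name and the `scale` pixels-to-world factor are assumptions for the example.

```python
import numpy as np

def axis_constrained_translation(drag_2d, axis_3d, axis_screen_2d, scale=1.0):
    """Translate along a constraint axis from a 2D touch drag (illustrative sketch).

    drag_2d        -- 2D touch movement in pixels, e.g. np.array([dx, dy])
    axis_3d        -- constraint axis in world space
    axis_screen_2d -- the same axis projected into screen space (2D)
    scale          -- pixels-to-world conversion factor (assumed for this example)
    """
    # Unit direction of the axis as seen on screen.
    a2 = axis_screen_2d / np.linalg.norm(axis_screen_2d)
    # Signed length of the drag along the projected axis (in pixels).
    t = float(np.dot(drag_2d, a2))
    # Displacement along the 3D axis by the corresponding world-space amount.
    return scale * t * (axis_3d / np.linalg.norm(axis_3d))
```

For example, a 10-pixel drag aligned with an x-axis that projects to screen direction (1, 0), with `scale=0.01`, yields a world-space displacement of (0.1, 0, 0). Drag components perpendicular to the projected axis are discarded, which is what makes the manipulation tolerant of imprecise touch input.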

Paper - pdf (2.9 MB)
Video - mp4 (45.7 MB) - YouTube

@article{Au2012Multitouch,
  author  = {Oscar Kin-Chung Au and Chiew-Lan Tai and Hongbo Fu},
  title   = {Multitouch Gestures for Constrained Transformation of 3D Objects},
  journal = {Computer Graphics Forum (In Proc. of Eurographics 2012)},
  pages   = {651--660},
  year    = {2012},
}

The authors would like to thank Lu (Arthur) Chen for his help with implementation and video editing, and Michael Brown for the video narration. We appreciate the helpful comments from the anonymous reviewers. This work was supported by a grant from the Innovation & Technology Fund of Hong Kong (project ITS/117/09). Oscar Au was supported in part by a grant from CityU (No. StUp7200266). Hongbo Fu was supported in part by grants from CityU (No. SRG7002533) and the HKSAR Research Grants Council (No. 9041562).