Journal Article

Performance-Based Control Interface for Character Animation

Abstract

Most game interfaces today are largely symbolic, translating simplified input such as keystrokes into the choreography of full-body character movement. In this paper, we describe a system that directly uses human motion performance to provide a radically different and much more expressive interface for controlling virtual characters. Our system takes a data feed from a motion capture system as input and, in real time, translates the performance into corresponding actions in a virtual world. The difficulty with such an approach arises from the need to manage the discrepancy between the real and virtual worlds, leading to two important subproblems: 1) recognizing the user's intention, and 2) simulating the appropriate action based on that intention and the virtual context. We solve this problem by first enabling the virtual world's designer to specify possible activities in terms of prominent features of the world, along with associated motion clips depicting interactions. We then integrate the pre-recorded motions with the online performance and dynamic simulation to synthesize seamless interaction of the virtual character in the simulated virtual world. The result is a flexible interface through which a user can make freeform control choices while the resulting character motion maintains both physical realism and the user's personal style.
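The pipeline the abstract describes can be summarized as a per-frame loop: read a pose from the mocap feed, recognize the performer's intended activity against the designer-specified world features, then synthesize the character's pose by combining the live performance with the associated example clip. The sketch below is a minimal Python illustration of that loop; the Activity, recognize_intention, blend, and control_loop names are hypothetical, the nearest-feature heuristic stands in for the paper's actual intention-recognition method, and the simple linear blend omits the dynamic simulation the system uses to maintain physical realism.

```python
import numpy as np

# Illustrative sketch only: names and the nearest-feature heuristic are
# assumptions, not the authors' actual API or algorithm.

class Activity:
    """A designer-specified activity: a prominent world feature plus an
    example motion clip depicting the interaction with it."""
    def __init__(self, name, feature_position, clip):
        self.name = name
        self.feature_position = np.asarray(feature_position, dtype=float)
        self.clip = np.asarray(clip, dtype=float)  # shape: (frames, dofs)

def recognize_intention(user_pose, activities):
    """Guess the user's intention by picking the activity whose world
    feature is nearest the performer's root position (a crude stand-in
    for the paper's feature-based recognition)."""
    root = user_pose[:3]  # assume the first three DOFs are the root position
    return min(activities,
               key=lambda a: np.linalg.norm(root - a.feature_position))

def blend(user_pose, clip_pose, w):
    """Linearly blend the live performance with the example clip; the
    real system additionally applies dynamic simulation for realism."""
    return (1.0 - w) * user_pose + w * clip_pose

def control_loop(mocap_stream, activities, w=0.5):
    """Per-frame loop: recognize the intended activity for the current
    mocap pose and emit a blended character pose."""
    for t, user_pose in enumerate(mocap_stream):
        user_pose = np.asarray(user_pose, dtype=float)
        activity = recognize_intention(user_pose, activities)
        clip_pose = activity.clip[t % len(activity.clip)]
        yield blend(user_pose, clip_pose, w)
```

As a purely illustrative usage, a "sit" activity might pair a chair's position with a sitting clip: as the performer approaches that feature, the loop shifts the character toward the pre-recorded interaction while the live feed preserves the user's personal style.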

Paper    Video

Author(s)
Satoru Ishigaki
Timothy White
Victor Zordan
C. Karen Liu
Journal Name
ACM Transactions on Graphics (SIGGRAPH), 2009
Publication Date
July 2009