Conference Paper

Task-Aware Variations in Robot Motion


Social robots can benefit from motion variance because non-repetitive gestures appear more natural and intuitive to human partners. We introduce a new approach for synthesizing variance, both with and without constraints, using a stochastic process. Based on optimal control theory and operational space control, our method can generate an infinite number of variations in real time that resemble the kinematic and dynamic characteristics of a single input motion sequence. We also introduce a stochastic method to generate smooth but nondeterministic transitions between arbitrary motion variants. Furthermore, we quantitatively evaluate task-aware variance against random white torque noise, operational space control, style-based inverse kinematics, and retargeted human motion to show that task-aware variance generates human-like motion. Finally, we demonstrate the ability of task-aware variance to maintain velocity- and time-dependent features present in the input motion.
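The full algorithm is specified in the paper; as a rough illustration of the operational-space-control ingredient described above, the sketch below (our own, not the authors' code) projects random torque perturbations into the null space of a task Jacobian so that variation appears in redundant joint motion while the task itself is approximately preserved. The function name `task_aware_noise`, the inputs `tau_nominal`, `J`, `M`, and the parameter `noise_scale` are illustrative placeholders, not names from the paper.

```python
# Minimal sketch, assuming a robot model that supplies the task Jacobian J,
# joint-space inertia M, and nominal torques from the input motion.
import numpy as np

def task_aware_noise(tau_nominal, J, M, noise_scale=0.5, rng=None):
    """Add random torque variation without disturbing the operational-space task.

    tau_nominal : (n,) nominal joint torques reproducing the input motion
    J           : (m, n) task Jacobian at the current configuration
    M           : (n, n) joint-space inertia matrix
    """
    rng = np.random.default_rng() if rng is None else rng
    n = tau_nominal.shape[0]

    # Dynamically consistent pseudo-inverse (Khatib-style operational space control).
    M_inv = np.linalg.inv(M)
    Lambda = np.linalg.inv(J @ M_inv @ J.T)   # task-space inertia
    J_bar = M_inv @ J.T @ Lambda              # dynamically consistent inverse
    N = np.eye(n) - J.T @ J_bar.T             # null-space projector

    # Sample white torque noise and project it into the task null space,
    # so the perturbation only excites motion redundant to the task.
    noise = noise_scale * rng.standard_normal(n)
    return tau_nominal + N @ noise
```

In this simplified view, the projected noise plays the role of the stochastic process: each call yields a different torque, hence a different variation, while the end-effector task encoded by J is left approximately unaffected.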

Paper

Author(s)
Michael J. Gielniak
C. Karen Liu
Andrea L. Thomaz
Conference Name
IEEE International Conference on Robotics and Automation (ICRA), 2011
Publication Date
May 2011