Journal Article

Animating Human Dressing


Dressing is one of the most common activities in human society. Perfecting the skill of dressing can take an average child three to four years of daily practice. The challenge is primarily due to the combined difficulty of coordinating different body parts and manipulating soft and deformable objects (clothes). We present a technique to synthesize human dressing by controlling a human character to put on an article of simulated clothing. We identify a set of primitive actions which account for the vast majority of motions observed in human dressing. These primitive actions can be assembled into a variety of motion sequences for dressing different garments with different styles. Exploiting both feed-forward and feedback control mechanisms, we develop a dressing controller to handle each of the primitive actions. The controller plans a path to achieve the action goal while making constant adjustments locally based on the current state of the simulated cloth when necessary. We demonstrate that our framework is versatile and able to animate dressing with different clothing types including a jacket, a pair of shorts, a robe, and a vest. Our controller is also robust to different cloth mesh resolutions which can cause the cloth simulator to generate significantly different cloth motions. In addition, we show that the same controller can be extended to assistive dressing.
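The abstract describes a controller that sequences primitive actions, following a planned (feed-forward) path while applying local feedback corrections from the simulated cloth state. The following is a minimal illustrative sketch of that control structure only; all names (`PrimitiveAction`, `cloth_drift`, the specific feedback rules) are hypothetical and not the authors' API or implementation.

```python
# Hypothetical sketch: a dressing task as a sequence of primitive actions,
# each with a feed-forward path and a feedback correction driven by the
# current (here, mocked) cloth state. Illustrative only.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class PrimitiveAction:
    name: str
    path: List[float]                   # feed-forward waypoints (1-D for brevity)
    feedback: Callable[[float], float]  # local correction from cloth state


def execute(actions: List[PrimitiveAction], cloth_drift: float) -> List[float]:
    """Follow each action's planned path, nudging every waypoint by a
    feedback term computed from the (mock) cloth state."""
    trajectory = []
    for action in actions:
        for waypoint in action.path:
            trajectory.append(waypoint + action.feedback(cloth_drift))
    return trajectory


# Example: a "grip" then "tug" sequence whose feedback rule cancels half
# of the cloth's drift at each waypoint.
actions = [
    PrimitiveAction("grip", [0.0, 0.1], lambda drift: -0.5 * drift),
    PrimitiveAction("tug",  [0.2, 0.4], lambda drift: -0.5 * drift),
]
print(execute(actions, cloth_drift=0.2))
```

In the paper's setting the feedback step would query the cloth simulator rather than a scalar drift, but the separation between a planned path and per-step local adjustment is the point of the sketch.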

Paper   Video

Author(s)
Alexander Clegg
Jie Tan
Greg Turk
C. Karen Liu
Journal Name
ACM Transactions on Graphics (SIGGRAPH), 2015
Publication Date
August 2015