Conference Paper

What Does the Person Feel? Learning to Infer Applied Forces During Robot-Assisted Dressing


During robot-assisted dressing, a robot manipulates a garment in contact with a person’s body. Inferring the forces applied to the person’s body by the garment might enable a robot to provide more effective assistance and give the robot insight into what the person feels. However, complex mechanics govern the relationship between the robot’s end effector and these forces. Using a physics-based simulation and data-driven methods, we demonstrate the feasibility of inferring forces across a person’s body using only end effector measurements. Specifically, we present a long short-term memory (LSTM) network that at each time step takes a 9-dimensional input vector of force, torque, and velocity measurements from the robot’s end effector and outputs a force map consisting of hundreds of inferred force magnitudes across the person’s body. We trained and evaluated LSTMs on two tasks: pulling a hospital gown onto an arm and pulling shorts onto a leg. For both tasks, the LSTMs produced force maps that were similar to ground truth when visualized as heat maps across the limbs. We also evaluated their performance in terms of root-mean-square error. Performance degraded when the end effector velocity was increased outside the training range, but the networks generalized well to limb rotations. Overall, our results suggest that robots could learn to infer the forces people feel during robot-assisted dressing, although the extent to which this will generalize to the real world remains an open question.
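The abstract specifies the network's interface: a 9-dimensional per-timestep input (end effector force, torque, and velocity) mapped to hundreds of inferred force magnitudes across the body. The minimal sketch below illustrates that sequence-to-sequence structure; the framework (PyTorch), hidden size (128), and map resolution (300 points) are illustrative assumptions, not details reported in the paper.

```python
import torch
import torch.nn as nn

class ForceMapLSTM(nn.Module):
    """Sketch of an LSTM mapping end effector measurements to a force map.

    Per time step:
      input : 9-D vector (3-D force, 3-D torque, 3-D velocity)
      output: one inferred force magnitude per body point
              (300 points is an assumed resolution for illustration)
    """

    def __init__(self, n_map_points: int = 300, hidden_size: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(input_size=9, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, n_map_points)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, 9) sequence of end effector measurements
        h, _ = self.lstm(x)   # (batch, time, hidden_size)
        return self.head(h)   # (batch, time, n_map_points) force map per step

# Toy usage: 4 sequences of 100 time steps each.
model = ForceMapLSTM()
measurements = torch.randn(4, 100, 9)
force_maps = model(measurements)                     # (4, 100, 300)
# Evaluation in the paper is reported as root-mean-square error:
rmse = nn.functional.mse_loss(force_maps,
                              torch.zeros_like(force_maps)).sqrt()
```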


Author(s)
Zackory Erickson
Alexander Clegg
Wenhao Yu
Greg Turk
C. Karen Liu
Charles C. Kemp
Conference Name
IEEE International Conference on Robotics and Automation (ICRA), 2017
Publication Date
May, 2017