Welcome to The Movement Lab!
The goal of our lab is to create coordinated, functional, and efficient whole-body movements that allow digital agents and real robots to interact with the world. We focus on holistic motor behaviors that fuse multiple modalities of perception to produce intelligent and natural movements. Our lab is unique in that we study “motion intelligence” in the context of complex ecological environments, involving both high-level decision making and low-level physical execution. We develop computational approaches to modeling realistic human movements for Computer Graphics and Biomechanics applications, to learning complex control policies for humanoids and assistive robots, and to advancing fundamental numerical simulation and optimal control algorithms. The Movement Lab is directed by Professor Karen Liu.

Learnable Physics Simulators for Humans, Robots and the World
Physics simulation is increasingly relied upon to predict the outcomes of real-world phenomena. The rise of deep learning has further elevated its importance, as intelligent robots and embodied AI agents are trained in safe, accelerated simulated environments. Our lab has created a number of physics simulation tools and algorithms that leverage both differential equations and measured data to build accurate simulation models of humans, robots, and the world they interact with.
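As a rough illustration of that idea (a minimal sketch, not a description of any specific tool from the lab), the code below combines an idealized equation of motion with a residual correction fitted to "measured" data by least squares. The pendulum dynamics, friction term, and feature choices are assumptions made purely for illustration.

```python
# Illustrative sketch: combine an analytic differential-equation model with a
# correction term fitted to measured data. All constants below are assumptions.
import numpy as np

DT, G, L = 0.01, 9.81, 1.0           # time step, gravity, pendulum length
FRICTION = 0.35                       # "real-world" damping the analytic model omits


def analytic_accel(theta, omega):
    """Acceleration predicted by the idealized equation of motion."""
    return -(G / L) * np.sin(theta)


def true_accel(theta, omega):
    """Stand-in for reality: the same physics plus unmodeled joint friction."""
    return analytic_accel(theta, omega) - FRICTION * omega


def rollout(accel_fn, theta0, omega0, steps):
    """Semi-implicit Euler rollout of a single pendulum."""
    theta, omega, thetas = theta0, omega0, []
    for _ in range(steps):
        omega += DT * accel_fn(theta, omega)
        theta += DT * omega
        thetas.append(theta)
    return np.array(thetas)


# 1) "Measure" data from the true system (in practice: motion capture, sensors).
rng = np.random.default_rng(0)
states = rng.uniform([-1.0, -2.0], [1.0, 2.0], size=(500, 2))
residuals = np.array([true_accel(t, w) - analytic_accel(t, w) for t, w in states])

# 2) Fit a simple residual model on top of the analytic equations (least squares).
features = np.column_stack([states[:, 0], states[:, 1], np.ones(len(states))])
coeffs, *_ = np.linalg.lstsq(features, residuals, rcond=None)


def hybrid_accel(theta, omega):
    """Analytic physics plus the data-driven correction."""
    return analytic_accel(theta, omega) + np.dot(coeffs, [theta, omega, 1.0])


# 3) Compare long-horizon predictions against the "measured" trajectory.
truth = rollout(true_accel, 0.8, 0.0, 600)
for name, fn in [("analytic only", analytic_accel), ("analytic + learned", hybrid_accel)]:
    err = np.abs(rollout(fn, 0.8, 0.0, 600) - truth).mean()
    print(f"{name:20s} mean |error| = {err:.4f} rad")
```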

Personalized Predictive Human Models
Developing personalized predictive human models to simulate human movements has a wide range of applications, from accelerating sports medicine to designing exoskeletons to creating VR avatars. We aim to develop generalizable and computationally tractable models of natural motion with minimal engineering effort. Our predictive human models can simulate a wide range of scenarios, while offering the option to be personalized to specific individuals using only a moderate amount of data.
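As a toy example of this kind of personalization (again a sketch, not the lab's actual models), the code below fits one person-specific parameter, a hypothetical effective limb length, to a small set of synthetic swing trajectories by minimizing reconstruction error; the pendulum model, true length, and noise level are all assumptions.

```python
# Illustrative sketch: personalize a generic limb model to an individual by
# fitting one physical parameter to a small amount of recorded motion data.
import numpy as np
from scipy.optimize import minimize_scalar

DT, G, STEPS = 0.01, 9.81, 300


def simulate_swing(length, theta0=0.6):
    """Passive swing of a limb modeled as a simple pendulum of the given length."""
    theta, omega, traj = theta0, 0.0, []
    for _ in range(STEPS):
        omega += DT * (-(G / length) * np.sin(theta))
        theta += DT * omega
        traj.append(theta)
    return np.array(traj)


# A small amount of per-person "measured" data (here: synthetic, with noise).
rng = np.random.default_rng(1)
TRUE_LENGTH = 0.93                    # this individual's (assumed) effective limb length
measured = simulate_swing(TRUE_LENGTH) + rng.normal(0.0, 0.01, STEPS)

# Personalize the generic model by minimizing trajectory-reconstruction error.
result = minimize_scalar(
    lambda length: np.mean((simulate_swing(length) - measured) ** 2),
    bounds=(0.5, 1.5),
    method="bounded",
)
print(f"personalized limb length: {result.x:.3f} m (true: {TRUE_LENGTH} m)")
```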

Physical Human-Robot Interaction
AI-enabled robots have the potential to provide physical assistance that involves applying forces to human bodies. This capability could transform healthcare in our aging society and allow older adults to remain independent, stay in their homes longer, and enjoy a better quality of life. We develop simulation tools and control algorithms to facilitate research in physical human-robot interaction (pHRI). We aspire to build intelligent, safe, and ethical machines that enhance our sensing and actuating capabilities, but never take away our autonomy to make decisions.

Dexterous Manipulation in Ecological Environments
Dexterous manipulation is a hallmark capability that enables humans to carry out day-to-day tasks in complex ecological environments, such as using a screwdriver, opening a water bottle, or folding bed sheets. To highlight the advantages of humanlike hands, we are interested in manipulation tasks that can only be achieved by exploiting multiple contacts, both intrinsic and extrinsic, with real-world objects, both near-rigid and deformable. We develop control policies that utilize non-trivial contact strategies for embodied AI agents and real robots.
Lab News
- TML introduces AddBiomechanics, a new open-source software tool
Upload mocap files and get an optimally scaled OpenSim model and IK back in minutes. Share your data with the community. Browse and download biomechanics data.
September 14, 2022
- Zhaoming Xie
We welcome Dr. Zhaoming Xie, who joins TML as a postdoctoral fellow!
March 23, 2022
- Tom Van Wouwe
We welcome Dr. Tom Van Wouwe, who joins TML as a postdoctoral fellow!
March 23, 2022
- New Postdoc Opportunity
TML is looking for outstanding individuals to join the team and collaborate on a number of research directions.
December 15, 2021
- TML introduces NimblePhysics, a new open-source differentiable physics engine
A fast and feature-complete differentiable physics engine for articulated rigid bodies with contact.
May 01, 2021
- New Course: Computer Graphics in the Era of AI
CS348I is a new course that reinvents how we teach computer graphics in the era of machine learning and AI. It is co-taught by Prof. Karen Liu and Prof. Jiajun Wu.
September 07, 2020