Journal Article

DART: Dynamic Animation and Robotics Toolkit

DART (Dynamic Animation and Robotics Toolkit) is a collaborative, cross-platform, open source library created by the Graphics Lab and Humanoid Robotics Lab at the Georgia Institute of Technology, with ongoing contributions from the Personal Robotics Lab at the University of Washington and the Open Source Robotics Foundation. The library provides data structures and algorithms for kinematic and dynamic applications in robotics and computer animation. DART is distinguished by its accuracy and stability, owing to its use of generalized coordinates to represent articulated rigid-body systems in geometric notation and of Featherstone's Articulated Body Algorithm, formulated with Lie groups, to compute forward dynamics and hybrid dynamics. For developers, in contrast to many popular physics engines that treat the simulator as a black box, DART gives full access to internal kinematic and dynamic quantities, such as the mass matrix, Coriolis and centrifugal forces, and transformation matrices and their derivatives. DART also provides efficient computation of Jacobian matrices for arbitrary body points and coordinate frames. The frame semantics of DART allows users to define arbitrary reference frames (both inertial and non-inertial) and use those frames to specify or request data. For air-tight code safety, forward kinematics and dynamics values are updated automatically through lazy evaluation, making DART suitable for real-time controllers. In addition, DART provides the flexibility to extend the API for embedding user-provided classes into DART data structures. Contacts and collisions are handled using an implicit time-stepping, velocity-based LCP (linear complementarity problem) formulation that guarantees non-penetration, directional friction, and approximated Coulomb friction cone conditions. DART has applications in robotics and computer animation because it features a multibody dynamic simulator and various kinematic tools for control and motion planning.
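To make the "full access to internal quantities" and frame semantics concrete, the following is a minimal C++ sketch (not taken from the paper) using names from the DART documentation, such as Skeleton::getMassMatrix, Skeleton::getCoriolisForces, BodyNode::getWorldJacobian, and SimpleFrame. Exact signatures and the umbrella header may vary across DART versions, so treat it as an illustrative sketch rather than a definitive example.

// Sketch: querying DART's internal dynamics quantities and frame semantics.
// Assumes a DART 6.x-style API; method names follow the DART documentation.
#include <dart/dart.hpp>
#include <Eigen/Dense>
#include <iostream>
#include <memory>

int main()
{
  using namespace dart::dynamics;
  using namespace dart::simulation;

  // Build a minimal two-link pendulum skeleton.
  SkeletonPtr pendulum = Skeleton::create("pendulum");
  auto pair1 = pendulum->createJointAndBodyNodePair<RevoluteJoint>();
  auto pair2 = pendulum->createJointAndBodyNodePair<RevoluteJoint>(pair1.second);
  BodyNode* tip = pair2.second;

  WorldPtr world = World::create();
  world->setGravity(Eigen::Vector3d(0.0, 0.0, -9.81));
  world->addSkeleton(pendulum);

  // Internal dynamic quantities are exposed directly rather than hidden
  // inside a black-box stepper.
  Eigen::MatrixXd M = pendulum->getMassMatrix();      // mass matrix
  Eigen::VectorXd C = pendulum->getCoriolisForces();  // Coriolis/centrifugal forces
  Eigen::VectorXd g = pendulum->getGravityForces();   // gravity forces

  // Jacobian of an arbitrary body (here the tip body), in world coordinates.
  dart::math::Jacobian J = tip->getWorldJacobian();

  // Frame semantics: define a custom reference frame and request data
  // expressed relative to it.
  SimpleFramePtr customFrame =
      std::make_shared<SimpleFrame>(Frame::World(), "custom_frame");
  Eigen::Isometry3d tipInCustom = tip->getTransform(customFrame.get());

  world->step();  // lazy evaluation keeps the quantities above up to date

  std::cout << "DOFs: " << pendulum->getNumDofs()
            << ", M is " << M.rows() << "x" << M.cols()
            << ", C has " << C.size() << " entries"
            << ", J has " << J.cols() << " columns"
            << ", g norm " << g.norm() << "\n"
            << "tip position in custom_frame: "
            << tipInCustom.translation().transpose() << std::endl;
  return 0;
}

Because the quantities above are recomputed lazily only when requested, a controller can read M, C, and J at every control tick without paying for updates it does not need.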

Paper    Code

Author(s)
Jeongseok Lee
Michael X. Grey
Sehoon Ha
Tobias Kunz
Sumit Jain
Yuting Ye
Siddhartha S. Srinivasa
Mike Stilman
C. Karen Liu
Publication Date
January 2018