V.S.Raghavan, D.Kanoulas, C.Zhou, D.Caldwell, and N.Tsagarakis, "A Study on Low-Drift State Estimation for Humanoid Locomotion, using LiDAR and Kinematic-Inertial Data Fusion", IEEE-RAS International Conference on Humanoid Robots, November 2018

Humanoid robots will need to navigate unsafe and unstructured environments, such as those left after a disaster, to provide human assistance and support. To achieve this, humanoids must construct accurate maps of the environment in real time and localize within them by estimating their base/pelvis state without drift, using computationally efficient mapping and state estimation algorithms. While a multitude of Simultaneous Localization and Mapping (SLAM) algorithms exist, their localization relies on the existence of repeatable landmarks, which might not always be available in unstructured environments. Several studies also use stop-and-map procedures to map the environment before traversal, but this is not ideal for scenarios where the robot needs to keep moving continuously, for instance to keep the task completion time short. In this paper, we present a novel combination of state-of-the-art odometry and mapping based on LiDAR data with state estimation based on the kinematic-inertial data of the humanoid. We present an experimental evaluation of the introduced state estimation on the full-size humanoid robot WALK-MAN while it performs locomotion tasks. Through this combination, we show that it is possible to obtain low-error, high-frequency estimates of the state of the robot while moving and mapping the environment on the go.
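The fusion idea in the abstract — high-rate but drifting kinematic-inertial odometry corrected by low-rate LiDAR registration poses — can be sketched in a few lines. This is a hypothetical illustration of the general scheme, not the paper's actual algorithm; the class name, the constant odometry bias, and the correction gain `alpha` are all assumptions made for the example.

```python
import numpy as np

class DriftCorrectedEstimator:
    """Illustrative sketch: fuse high-rate odometry with low-rate LiDAR poses.

    Kinematic-inertial odometry provides high-frequency position
    increments that accumulate drift; sparse LiDAR registration poses
    are used to pull the estimate back toward the drift-free value.
    (Simplified to 3D position only; a real estimator would also
    handle orientation and covariances.)
    """

    def __init__(self):
        self.position = np.zeros(3)  # current base/pelvis position estimate

    def odometry_update(self, delta):
        # High-rate update: integrate one kinematic-inertial increment.
        self.position += np.asarray(delta, dtype=float)

    def lidar_update(self, lidar_pose, alpha=0.5):
        # Low-rate update: blend toward the LiDAR-derived pose.
        # alpha = 1.0 trusts LiDAR fully; smaller values smooth the jump.
        error = np.asarray(lidar_pose, dtype=float) - self.position
        self.position += alpha * error
```

For example, integrating 100 slightly biased forward steps of 0.011 m (true step 0.01 m) accumulates 0.1 m of drift, which a single LiDAR correction with `alpha=1.0` removes entirely.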