We are actively developing and addressing issues with the estimation framework. Contributions, feedback, or suggestions are highly appreciated as we continue to enhance this project!
SEROW (State Estimation RObot Walking) facilitates legged robot state estimation. Designed as a versatile tool, SEROW offers a generalized estimation solution applicable to legged robots with N limbs, accommodating both point and flat feet configurations. Notably, the framework's codebase is openly accessible under the GNU GPLv3 License.
A legged robot (humanoid, quadruped, or centaur) walks on the ground. Unlike a wheeled robot, which can use wheel encoders to track its motion, a legged robot's feet repeatedly lift off and touch down. The robot needs to answer a few fundamental questions in real time:
- Where am I? (base position and orientation in the world)
- How fast am I moving? (base linear/angular velocity)
- Where is my Center of Mass (CoM)? (critical for balance)
- How fast is my CoM moving? (critical for balance)
- Are there any external forces other than the Ground Reaction Forces (GRFs) acting on me? (critical for manipulation)
The robot's on-board sensors -- an IMU (accelerometer + gyroscope), joint encoders, and foot force/torque sensors -- are all noisy and individually incomplete. No single sensor can answer all of these questions. SEROW fuses them with principled probabilistic estimation to produce a coherent, low-drift state estimate in real time.
**IMU (accelerometer + gyroscope):**
- Mounted on the robot's torso (the "base" frame).
- Measures linear acceleration (including gravity) and angular velocity.
- Very fast (~100-2000 Hz) but drifts over time -- integrating acceleration to get velocity accumulates error quickly.
- SEROW models and continuously estimates the gyroscope bias and accelerometer bias to compensate for drift.
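To see why bias estimation matters, consider a minimal back-of-the-envelope sketch (illustrative Python, not SEROW code): double-integrating even a tiny constant accelerometer bias produces position drift that grows quadratically with time.

```python
def drift_from_bias(bias_mps2: float, t: float) -> float:
    """Position error from double-integrating a constant accelerometer
    bias (m/s^2) over t seconds: 0.5 * b * t^2."""
    return 0.5 * bias_mps2 * t ** 2

# A small 0.05 m/s^2 bias already causes 2.5 m of drift after only 10 s,
# which is why the filter must estimate and subtract the bias online.
print(drift_from_bias(0.05, 10.0))  # 2.5
```

This is the core reason the IMU alone cannot localize the base and must be fused with kinematics and contact information.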
**Joint encoders:**
- Each motor joint reports its angular position (in radians).
- Combined with a kinematic model (URDF/MJCF), tells us where each foot is relative to the base frame.
- Does NOT directly tell us where the base is in the world.
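As a toy illustration of how joint angles plus a kinematic model yield the foot pose relative to the base, here is a planar two-link leg forward-kinematics sketch (hypothetical link lengths; real robots use the full URDF/MJCF model via a kinematics library):

```python
import math

def foot_in_base(hip: float, knee: float, l1: float = 0.3, l2: float = 0.3):
    """Planar 2-link leg forward kinematics: returns the foot position
    (x, z) expressed in the base frame, given hip and knee angles (rad)."""
    x = l1 * math.sin(hip) + l2 * math.sin(hip + knee)
    z = -l1 * math.cos(hip) - l2 * math.cos(hip + knee)
    return x, z

# With the leg fully extended straight down, the foot is 0.6 m below the base:
print(foot_in_base(0.0, 0.0))  # (0.0, -0.6)
```

When a foot is known to be in stationary contact, this base-relative foot position becomes a drift-free velocity/position constraint for the base estimator.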
**Foot force/torque (F/T) sensors:**
- Mounted at each foot (or ankle).
- Measure the Ground Reaction Force (GRF) and optionally torque.
- Used for two things: (a) detecting whether a foot is in contact with the ground, and (b) computing the Center of Pressure (COP).
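Both uses can be sketched in a few lines (a simplified illustration, not SEROW's actual contact classifier): contact detection by thresholding the vertical GRF, and the flat-foot CoP computed from the vertical force and the ankle torques.

```python
def in_contact(fz: float, threshold: float = 20.0) -> bool:
    """Simplest possible contact check: the foot is loaded if the
    vertical ground reaction force exceeds a threshold (N)."""
    return fz > threshold

def cop(fz: float, tau_x: float, tau_y: float):
    """Center of Pressure (x, y) in the foot frame for a flat foot,
    from the vertical force fz (N) and ankle torques (N*m)."""
    if fz <= 0.0:
        return None  # foot unloaded: CoP undefined
    return (-tau_y / fz, tau_x / fz)

print(in_contact(50.0))        # True
print(cop(100.0, 2.0, -3.0))   # (0.03, 0.02)
```

The threshold value here is illustrative; in practice contact detection must be robust to sensor noise and impact transients.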
**Exteroceptive odometry (optional):**
- A camera-based (Visual Odometry) or LiDAR-based (LiDAR Odometry) system providing base position/orientation measurements.
- Used as an additional correction in the EKF if available.
- Can be subject to outlier measurements in degraded environments; SEROW automatically detects and rejects such outliers.
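A common way to reject such outliers is an innovation gate; the scalar sketch below (illustrative only, a simplification of the robust scheme in the IROS 2019 paper above) discards measurements whose squared Mahalanobis distance exceeds a chi-square-style threshold:

```python
def mahalanobis_sq(residual: float, variance: float) -> float:
    """Squared Mahalanobis distance of a scalar innovation."""
    return residual * residual / variance

def accept(residual: float, variance: float, gate: float = 9.0) -> bool:
    """Accept a measurement only if its innovation lies within a
    roughly 3-sigma gate (gate = 3^2 for the scalar case)."""
    return mahalanobis_sq(residual, variance) <= gate

print(accept(0.1, 0.01))  # True  (1-sigma innovation)
print(accept(1.0, 0.01))  # False (10-sigma outlier, rejected)
```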
SEROW is structured as a pipeline of estimators that run sequentially each time a new set of sensor readings arrives.
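The pipeline idea can be sketched as follows (hypothetical names, not SEROW's actual C++ API): each cycle, the estimators run in order, with each stage refining a shared state that later stages consume.

```python
def run_cycle(measurements: dict, stages: list) -> dict:
    """Run the estimation pipeline once on a fresh set of sensor
    readings, passing the evolving state through each stage in order."""
    state = dict(measurements)
    for stage in stages:
        state = stage(state)
    return state

# Two toy stages: contact detection runs first, and its output
# gates the (here trivialized) base update.
stages = [
    lambda s: {**s, "contact": s["fz"] > 20.0},     # contact estimator
    lambda s: {**s, "base_updated": s["contact"]},  # base filter uses contact
]
result = run_cycle({"fz": 50.0}, stages)
print(result["base_updated"])  # True
```

The ordering matters: contact status must be decided before the base filter can know which legs provide valid kinematic constraints.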
*(Demo videos: Cogimon and SEROW | Centauro and SEROW)*
- Non-linear ZMP based State Estimation for Humanoid Robot Locomotion, https://ieeexplore.ieee.org/document/7803278 (Humanoids 2016 - nominated for the best interactive paper award)
- Nonlinear State Estimation for Humanoid Robot Walking, https://ieeexplore.ieee.org/document/8403285 (RA-L + IROS 2018)
- Outlier-Robust State Estimation for Humanoid Robots, https://ieeexplore.ieee.org/document/8968152 (IROS 2019)
- https://www.youtube.com/watch?v=nkzqNhf3_F4
- https://www.youtube.com/watch?v=9OvIBg8tn54
- https://www.youtube.com/watch?v=ojogeY3xSsw
These instructions will get you a copy of the project up and running on your local machine for testing purposes.
Define the following environment variable in your `.bashrc` file:

```shell
export SEROW_PATH=<path-to-serow-package>
```
- Eigen 3.4.0 or later
- Pinocchio 3.0.0 or later
- json
- flatbuffers
- CMake 3.16.3 or later
- GCC 9.4.0 or later
```shell
mkdir build && cd build
cmake .. && make -j4
sudo make install
```
```shell
cd test && mkdir build && cd build
cmake .. && make -j4
./nao_test
```
- Logs are saved under `/tmp`
- Run Foxglove
- Load the data
- Import `foxglove_layout.json`
If you use SEROW in an academic work, kindly cite:
```bibtex
@ARTICLE{PiperakisRAL18,
  author={S. {Piperakis} and M. {Koskinopoulou} and P. {Trahanias}},
  journal={IEEE Robotics and Automation Letters},
  title={{Nonlinear State Estimation for Humanoid Robot Walking}},
  year={2018},
  volume={3},
  number={4},
  pages={3347-3354},
  doi={10.1109/LRA.2018.2852788},
  month={Oct},
}
```

