rtylecek
Thu 17 Jan 2019, 12:45 - 14:00
IF, 4.31/33

If you have a question about this talk, please contact: Jodie Cameron (jcamero9)

Multiple Sensor Fusion for Robust Visual-Inertial Odometry

While pose estimation with visual SLAM can be highly accurate, it is not guaranteed to provide the smooth pose estimates that navigation algorithms expect. For this reason it is desirable to include a filter that uses other sensors mounted on the vehicle, together with a motion model, to constrain the estimated trajectory of the vehicle to be smooth.

The platform for autonomous navigation in gardens developed for the TrimBot project has 10 cameras, which provide input to generalised camera SLAM, and is additionally equipped with wheel rotation encoders and an IMU.

We will show how these three sensor types can be fused using an Extended Kalman Filter (EKF) approach, with a simple but efficient estimation of the SLAM measurement covariance.
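To illustrate the kind of fusion described above, the following is a minimal planar sketch of an EKF that uses wheel-encoder speed and IMU yaw rate as the motion model and fuses absolute SLAM poses as measurements. It is not the TrimBot implementation; the names (PoseEKF, slam_covariance) and the inlier-based covariance heuristic are illustrative assumptions only.

```python
# Minimal planar EKF sketch: state is [x, y, yaw].
# Prediction uses wheel-encoder speed v and IMU yaw rate omega;
# the update step fuses an absolute SLAM pose measurement.
import numpy as np

def wrap_angle(a):
    """Wrap an angle to [-pi, pi)."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def slam_covariance(n_inliers, base_sigma=0.05):
    """Hypothetical heuristic: measurement noise shrinks as SLAM inlier count grows."""
    sigma = base_sigma * max(1.0, 100.0 / max(n_inliers, 1))
    return np.diag([sigma**2, sigma**2, (0.5 * sigma)**2])

class PoseEKF:
    def __init__(self, x0, P0, Q):
        self.x = np.asarray(x0, dtype=float)   # state [x, y, yaw]
        self.P = np.asarray(P0, dtype=float)   # state covariance
        self.Q = np.asarray(Q, dtype=float)    # process noise of the motion model

    def predict(self, v, omega, dt):
        """Propagate the pose with wheel speed v and yaw rate omega over dt seconds."""
        x, y, th = self.x
        self.x = np.array([x + v * dt * np.cos(th),
                           y + v * dt * np.sin(th),
                           wrap_angle(th + omega * dt)])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q * dt

    def update_slam(self, z, R):
        """Fuse a SLAM pose measurement z = [x, y, yaw] with covariance R."""
        H = np.eye(3)                      # SLAM observes the full pose directly
        innov = z - H @ self.x
        innov[2] = wrap_angle(innov[2])
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innov
        self.x[2] = wrap_angle(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P

# Example usage with made-up readings:
ekf = PoseEKF(x0=[0, 0, 0], P0=np.eye(3) * 0.1, Q=np.diag([0.01, 0.01, 0.005]))
ekf.predict(v=0.3, omega=0.05, dt=0.1)                  # wheel + IMU step
ekf.update_slam(np.array([0.03, 0.0, 0.01]),            # SLAM pose
                slam_covariance(n_inliers=150))
```

The point of the sketch is the division of labour: the smooth, high-rate motion model (encoders and IMU) drives the prediction, while the accurate but potentially jumpy SLAM pose corrects drift, weighted by a cheaply estimated covariance.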

The results will be demonstrated on challenging outdoor sequences and put in the wider context of the 3D reconstruction pipeline integrated on the robot platform.