Multi-modal sensor fusion for indoor mobile robot pose estimation

Dobrev Y, Flores SA, Vossiek M (2016)


Publication Language: English

Publication Status: Published

Publication Type: Conference Contribution

Publication year: 2016

Publisher: Institute of Electrical and Electronics Engineers Inc.

Pages Range: 553-556

Article Number: 7479745

ISBN: 9781509020423

DOI: 10.1109/PLANS.2016.7479745

Abstract

While global navigation satellite systems (GNSS) are the state of the art for localization, they are in general unable to operate inside buildings, and there is currently no well-established solution for indoor localization. In this paper we propose a 3D mobile robot pose (2D position and 1D orientation) estimation system for indoor applications. The system is based on the cooperative sensor fusion of radar, ultrasonic and odometry data using an extended Kalman filter (EKF). A prerequisite for the EKF is an occupancy grid map of the scenario as well as the pose of the reference radar node inside the map. Our system can handle even the kidnapped-robot case, as the radar provides absolute localization. We conducted a series of measurements in an office building corridor. We determined the typical position root-mean-square error (RMSE) to be less than 15 cm.
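The fusion scheme summarized in the abstract (an odometry-driven prediction step combined with absolute position fixes in an EKF) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the unicycle motion model, the noise covariances, and the assumption that the radar yields a direct 2D position measurement are all assumptions made here for the example.

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Prediction step driven by odometry (assumed unicycle model).
    x = [x_pos, y_pos, theta], v = linear speed, w = angular rate."""
    theta = x[2]
    x_pred = x + np.array([v * dt * np.cos(theta),
                           v * dt * np.sin(theta),
                           w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                  [0.0, 1.0,  v * dt * np.cos(theta)],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x, P, z, R):
    """Update step, assuming the radar provides an absolute 2D
    position fix z = (x_pos, y_pos) in the map frame."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])     # linear measurement model
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Example: one metre of straight-line motion, then a radar fix.
x, P = np.zeros(3), np.eye(3)
Q = np.diag([0.01, 0.01, 0.01])          # assumed process noise
R = np.diag([0.02, 0.02])                # assumed radar noise
x, P = ekf_predict(x, P, v=1.0, w=0.0, dt=1.0, Q=Q)
x, P = ekf_update(x, P, z=np.array([1.0, 0.0]), R=R)
```

Because the radar measurement is absolute, repeated updates re-anchor the estimate to the map even after an arbitrary displacement, which is why such a filter can recover from the kidnapped-robot case mentioned in the abstract.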


How to cite

APA:

Dobrev, Y., Flores, S.A., & Vossiek, M. (2016). Multi-modal sensor fusion for indoor mobile robot pose estimation. In Proceedings of the IEEE/ION Position, Location and Navigation Symposium, PLANS 2016 (pp. 553-556). Institute of Electrical and Electronics Engineers Inc.

MLA:

Dobrev, Yassen, Sergio Alberto Flores, and Martin Vossiek. "Multi-modal sensor fusion for indoor mobile robot pose estimation." Proceedings of the IEEE/ION Position, Location and Navigation Symposium, PLANS 2016. Institute of Electrical and Electronics Engineers Inc., 2016. 553-556.
