Sensor data fusion of LIDAR with stereo RGB-D camera for object tracking

Dieterle T, Particke F, Patino-Studencki L, Thielecke J (2017)


Publication Type: Conference contribution

Publication year: 2017

Pages Range: 1-3

Conference Proceedings Title: 2017 IEEE SENSORS

Event location: Glasgow, GB

DOI: 10.1109/ICSENS.2017.8234267

Abstract

In Industry 4.0 scenarios, autonomously navigating robots will have to perform dedicated tasks in controlled environments such as production halls or storage facilities. In the presence of pedestrians and other dynamic objects, robust collision detection is imperative to avoid harm to humans and material. Supplementary sensors installed as part of the infrastructure can provide an additional real-time overview. In this paper, a concept for dynamic object tracking by sensor data fusion, using a stationary stereo camera and a laser range finder on a mobile platform, is presented and analyzed. The proposed approach consists of two modules covering frame-to-frame detection of targets as well as subsequent data association, fusion, and tracking. Object detection is carried out by 3D processing techniques on point clouds. Data association for multi-target tracking is achieved using the Joint Probabilistic Data Association Filter (JPDAF). Sensor information is combined by a hierarchical data fusion approach. Experiments show that this significantly improves robustness to occlusions and sensor failures.
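The hierarchical fusion step described in the abstract can be sketched compactly. The following is a minimal, hypothetical illustration (not the authors' implementation): each sensor runs its own Kalman measurement update on a shared 2D position state, and a fusion node combines the two local estimates in information form, assuming independent estimation errors. All variable names and noise values are illustrative assumptions.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Sensor-level Kalman measurement update (one local filter per sensor)."""
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ (z - H @ x)       # corrected state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

def fuse_tracks(x1, P1, x2, P2):
    """Information-form track-to-track fusion of two local estimates.

    Cross-covariances between the local filters are ignored here; this is
    a common simplification, not necessarily the paper's exact scheme.
    """
    P = np.linalg.inv(np.linalg.inv(P1) + np.linalg.inv(P2))
    x = P @ (np.linalg.inv(P1) @ x1 + np.linalg.inv(P2) @ x2)
    return x, P

# Illustrative numbers: both sensors observe the same 2D object position.
H = np.eye(2)
x0, P0 = np.zeros(2), np.eye(2) * 10.0  # vague common prior
x_lidar, P_lidar = kalman_update(x0, P0, np.array([1.2, 0.9]), H, np.eye(2) * 0.05)  # LIDAR: low noise
x_cam, P_cam = kalman_update(x0, P0, np.array([1.0, 1.1]), H, np.eye(2) * 0.5)       # stereo camera: higher noise
x_fused, P_fused = fuse_tracks(x_lidar, P_lidar, x_cam, P_cam)
print("fused position estimate:", x_fused)
```

In such a hierarchy, an occluded or failed sensor simply drops out of the fusion step and tracking continues on the remaining local estimate, which is the robustness property the abstract highlights.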

How to cite

APA:

Dieterle, T., Particke, F., Patino-Studencki, L., & Thielecke, J. (2017). Sensor data fusion of LIDAR with stereo RGB-D camera for object tracking. In 2017 IEEE SENSORS (pp. 1-3). Glasgow, GB. https://doi.org/10.1109/ICSENS.2017.8234267

MLA:

Dieterle, Thomas, et al. "Sensor Data Fusion of LIDAR with Stereo RGB-D Camera for Object Tracking." 2017 IEEE SENSORS, Glasgow, 2017, pp. 1-3.
