Heterogeneous multisensor fusion for mapping dynamic environments
In this paper, we propose a heterogeneous multisensor fusion algorithm for mapping in dynamic environments. The algorithm synergistically integrates information obtained from an uncalibrated camera and sonar sensors to facilitate mapping and tracking. The sonar data are mainly used
to build a weighted line-based map via fuzzy clustering. The weight of each line, reflecting the confidence that it corresponds to a moving object, is determined from both sonar and vision data. Motion tracking is accomplished primarily with vision data using particle filtering,
while sonar vectors originating from moving objects are used to modulate the sample weights. A fuzzy system is implemented to fuse the features of the two sensor modalities. Additionally, in order to build a consistent global map and maintain reliable tracking of moving objects, the well-known extended Kalman filter
is applied to estimate the robot pose and map-feature states. Thus, more robust performance in both mapping and tracking is achieved. Experiments carried out on a Pioneer 2DX mobile robot demonstrate that the proposed algorithm outperforms methods using a homogeneous
sensor in both mapping and tracking.
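The sonar-modulated particle weighting can be sketched as follows. This is a minimal illustrative example, not the paper's exact formulation: the Gaussian likelihood shapes, noise scales, and the multiplicative blend standing in for the fuzzy fusion rule are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def vision_likelihood(particles, detection, sigma=0.2):
    # Hypothetical Gaussian likelihood of each particle state
    # given a vision-based detection of the moving object.
    d2 = np.sum((particles - detection) ** 2, axis=1)
    return np.exp(-0.5 * d2 / sigma**2)

def sonar_modulation(particles, sonar_vec, strength=0.5, sigma=0.5):
    # Hypothetical modulation term: particles consistent with a sonar
    # vector originating from the moving object are boosted. The blend
    # keeps every factor in [1 - strength, 1], so sonar can only
    # re-weight samples, never veto them outright.
    d2 = np.sum((particles - sonar_vec) ** 2, axis=1)
    boost = np.exp(-0.5 * d2 / sigma**2)
    return (1.0 - strength) + strength * boost

def update_weights(particles, weights, detection, sonar_vec):
    # Vision drives the likelihood; sonar modulates it; then normalize.
    w = weights * vision_likelihood(particles, detection)
    w = w * sonar_modulation(particles, sonar_vec)
    return w / w.sum()

# Toy example: 500 particles spread around the origin, with the
# moving object detected near (1, 1) by vision and near (1.1, 0.9)
# by a sonar return.
particles = rng.normal(0.0, 1.0, size=(500, 2))
weights = np.full(500, 1.0 / 500)
weights = update_weights(particles, weights,
                         detection=np.array([1.0, 1.0]),
                         sonar_vec=np.array([1.1, 0.9]))
estimate = weights @ particles  # weighted-mean state estimate
```

Because the sonar factor is bounded below, a spurious sonar return degrades the vision-driven posterior only gracefully, which is one plausible reading of using sonar to "modulate" rather than define the sample weights.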