Simultaneous localization and mapping for unmanned aerial vehicles
Most mobile robot applications require the robot to localize itself in an unknown environment without prior information so that it can navigate and accomplish tasks. The robot must build a map of the unknown environment while simultaneously localizing itself within it. Simultaneous Localization and Mapping (SLAM) is the formulation of this problem, which has drawn considerable interest in robotics research over the past two decades. This work focuses on the SLAM problem for single and multiple agents equipped with vision sensors. We develop a vision-based 2-D SLAM algorithm for single and multiple Unmanned Aerial Vehicles (UAVs) flying at constant altitude. Using features of images obtained from an on-board camera to identify landmarks, we apply different approaches to the SLAM problem based on the Extended Kalman Filter (EKF), the Information Filter (IF), and the Particle Filter (PF). We present simulation results and compare the different implementations. We find that the Particle Filter implementations produce better estimates than the EKF and IF; however, the EKF and IF yield more consistent results.
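To make the EKF-based formulation concrete, the following is a minimal sketch of one planar EKF-SLAM predict/update cycle with a range-bearing landmark measurement. It assumes known data association and a simple unicycle motion model; all function names, the state layout (robot pose followed by 2-D landmark positions), and the noise parameters are illustrative assumptions, not the exact implementation described in this work.

```python
import numpy as np

def ekf_predict(mu, Sigma, u, dt, Q):
    """EKF prediction: apply a velocity command u = (v, w) to the robot pose.

    State layout (assumed): mu = [x, y, theta, lx1, ly1, lx2, ly2, ...].
    """
    x, y, th = mu[0], mu[1], mu[2]
    v, w = u
    mu = mu.copy()
    mu[0] += v * dt * np.cos(th)   # unicycle motion model
    mu[1] += v * dt * np.sin(th)
    mu[2] += w * dt
    n = len(mu)
    # Jacobian of the motion model w.r.t. the full state
    # (identity except for the robot-pose block)
    G = np.eye(n)
    G[0, 2] = -v * dt * np.sin(th)
    G[1, 2] = v * dt * np.cos(th)
    Sigma = G @ Sigma @ G.T
    Sigma[:3, :3] += Q             # process noise acts on the robot pose only
    return mu, Sigma

def ekf_update(mu, Sigma, z, j, R):
    """EKF update with a range-bearing measurement z = (r, phi) of landmark j."""
    lx, ly = mu[3 + 2 * j], mu[4 + 2 * j]
    dx, dy = lx - mu[0], ly - mu[1]
    q = dx * dx + dy * dy
    r = np.sqrt(q)
    z_hat = np.array([r, np.arctan2(dy, dx) - mu[2]])
    n = len(mu)
    # Measurement Jacobian: nonzero only in the robot and landmark-j columns
    H = np.zeros((2, n))
    H[0, 0], H[0, 1] = -dx / r, -dy / r
    H[1, 0], H[1, 1], H[1, 2] = dy / q, -dx / q, -1.0
    H[0, 3 + 2 * j], H[0, 4 + 2 * j] = dx / r, dy / r
    H[1, 3 + 2 * j], H[1, 4 + 2 * j] = -dy / q, dx / q
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)
    innov = z - z_hat
    innov[1] = np.arctan2(np.sin(innov[1]), np.cos(innov[1]))  # wrap bearing
    mu = mu + K @ innov
    Sigma = (np.eye(n) - K @ H) @ Sigma
    return mu, Sigma
```

The IF variant maintains the same Gaussian in information form (the inverse covariance and the information vector), which makes the update step additive, while the PF variant replaces the Gaussian with a weighted particle set over robot trajectories.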