Browsing by Subject "Localization theory."
Item (Open Access): Improving visual SLAM by filtering outliers with the aid of optical flow (2011)
Özaslan, Tolga

Simultaneous Localization and Mapping (SLAM) for mobile robots has been one of the challenging problems for the robotics community. Extensive study of the problem in recent years has largely consolidated its theoretical and practical background. Within the last few years, research on SLAM has shifted towards Visual SLAM (VSLAM), in which a camera is used as the primary sensor. Unlike many SLAM applications restricted to planar robots, VSLAM allows us to estimate a 3D model of the environment and the 6-DOF pose of the robot. Having been applied to robotics only recently, VSLAM still has considerable room for improvement. In particular, a common issue in both conventional and Visual SLAM algorithms is data association: wrong data associations either disturb the stability of the filter or cause the SLAM process to diverge. In this study, we propose two outlier elimination methods, one based on the predicted feature location error and one based on the optical flow field. The former requires the estimated projection of a landmark and its measured image location to be close. The latter takes the optical flow field as a reference, compares the displacement vector formed by matched feature locations in consecutive frames against the local optical flow, and eliminates matches that contradict it. We show that these two methods prevent VSLAM from diverging and improve its overall performance. We also describe our new modular SLAM library, SLAM++.

Item (Open Access): Using shape information from natural tree landmarks for improving SLAM performance (2012)
Turan, Bilal

Localization and mapping are crucial components for robotic autonomy. However, such robots must often function in remote, outdoor areas with no a priori knowledge of the environment. Consequently, it becomes necessary for field robots to be able to construct their own maps based on exteroceptive sensor readings. To this end, visual sensing and mapping through naturally occurring landmarks have distinct advantages. With the high-bandwidth data provided by visual sensors, meaningful and uniquely identifiable objects can be detected. This improves the construction of maps consisting of natural landmarks that are meaningful for human readers as well. In this thesis, we focus on the use of trees in an outdoor environment as a suitable set of landmarks for Simultaneous Localization and Mapping (SLAM). Trees have a relatively simple, near-vertical structure which makes them easily and consistently detectable. Furthermore, the thickness of a tree can be accurately determined from different viewpoints. Our primary contribution is the use of the width of a tree trunk as an additional sensory reading, allowing us to include the radius of tree trunks on the map. To this end, we introduce a new sensor model that relates the width of a tree landmark on the image plane to the radius of its trunk. We provide a mathematical formulation of this model, derive the associated Jacobians, and incorporate our sensor model into a working EKF SLAM implementation. Through simulations we show that the use of this new sensory reading improves the accuracy of both the map and the trajectory estimates without additional sensor hardware other than a monocular camera.
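The optical-flow test described in the first item above can be pictured with a minimal sketch: a feature match between consecutive frames is kept only if its displacement agrees with the local optical flow vector. The function below is illustrative only; the function name, the max_deviation threshold, and the use of a dense flow field (for example, one produced by OpenCV's cv2.calcOpticalFlowFarneback) are assumptions for this sketch, not the thesis's actual implementation.

    import numpy as np

    def reject_flow_inconsistent_matches(prev_pts, curr_pts, flow, max_deviation=3.0):
        """Keep only matches whose displacement agrees with the local optical flow.

        prev_pts, curr_pts : (N, 2) arrays of matched feature locations (x, y)
                             in two consecutive frames.
        flow               : (H, W, 2) dense optical flow field, flow[y, x] = (dx, dy)
                             (e.g. from cv2.calcOpticalFlowFarneback).
        max_deviation      : maximum allowed distance (pixels) between the match
                             displacement and the local flow vector.
        Returns the indices of the matches that pass the consistency check.
        """
        keep = []
        for i, (p, q) in enumerate(zip(prev_pts, curr_pts)):
            x, y = int(round(p[0])), int(round(p[1]))
            if not (0 <= y < flow.shape[0] and 0 <= x < flow.shape[1]):
                continue  # feature falls outside the flow field
            match_disp = q - p          # displacement implied by the feature match
            local_flow = flow[y, x]     # reference displacement from the flow field
            if np.linalg.norm(match_disp - local_flow) <= max_deviation:
                keep.append(i)
        return np.asarray(keep, dtype=int)

Matches that survive this filter would then be passed on to the SLAM update step, while the rejected ones are treated as data-association outliers.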
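For the second item, the relation between a trunk's physical radius and its apparent width in the image can be sketched under a simple pinhole-camera assumption: the tangent rays to a circle of radius r whose centre lies at distance d from the camera subtend a half-angle asin(r/d), giving a projected width of roughly 2*f*r/sqrt(d^2 - r^2) for focal length f in pixels. The functions and symbol names below are illustrative; the thesis's actual sensor model and Jacobians may differ in detail.

    import numpy as np

    def trunk_width_measurement(r, d, f):
        """Predicted image-plane width (pixels) of a trunk of radius r (m)
        whose axis is at distance d (m) from the camera centre, for a pinhole
        camera with focal length f (pixels)."""
        return 2.0 * f * r / np.sqrt(d**2 - r**2)

    def trunk_width_jacobian(r, d, f):
        """Partial derivatives of the predicted width with respect to the trunk
        radius and the camera-to-trunk distance, as an EKF measurement update
        would require."""
        s = (d**2 - r**2) ** 1.5
        dw_dr = 2.0 * f * d**2 / s
        dw_dd = -2.0 * f * r * d / s
        return dw_dr, dw_dd

For d much larger than r this reduces to the familiar w ≈ 2*f*r/d, which is why the trunk width carries useful depth and radius information from a single monocular view.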