Date of Award
Summer 8-2016
Access Type
Thesis - Open Access
Degree Name
Master of Science in Unmanned and Autonomous Systems Engineering
Department
Electrical, Computer, Software, and Systems Engineering
Committee Chair
Richard J. Prazenica
First Committee Member
Hever Moncayo
Second Committee Member
Troy Henderson
Abstract
This thesis discusses the evaluation, implementation, and testing of several navigation and feature extraction algorithms using an inertial measurement unit (IMU) and an image capture device (camera) mounted on a ground robot and a quadrotor UAV. The vision-aided navigation algorithms are implemented on data collected from sensors on an unmanned ground vehicle and a quadrotor, and the results are validated by comparison with GPS data. The thesis investigates sensor fusion techniques for integrating measured IMU data with information extracted from image processing algorithms in order to provide accurate vehicle state estimation. This image-based information takes the form of features, such as corners, that are tracked over multiple image frames. An extended Kalman filter (EKF) is implemented to fuse the vision and IMU data. The main goal of the work is to provide navigation for mobile robots in GPS-denied environments such as indoor environments, cluttered urban environments, or space environments such as asteroids, other planets, or the Moon. The experimental results show that combining pose information extracted from IMU readings with pose information extracted from a vision-based algorithm resolves both the drift problem that arises from using the IMU alone and the scale ambiguity that arises from using a monocular vision-based algorithm alone.
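The fusion scheme the abstract describes can be illustrated with a minimal sketch: an EKF whose prediction step integrates an IMU acceleration reading and whose update step corrects the resulting drift with a vision-derived position fix. This is a simplified one-axis illustration of the general idea, not the thesis's implementation; the time step, noise covariances, and constant-acceleration scenario are illustrative assumptions.

```python
import numpy as np

# One-axis sketch of vision/IMU fusion with a (linear) Kalman filter:
# the IMU drives the prediction, a vision-based position measurement
# drives the correction. All constants below are assumed for
# illustration, not taken from the thesis.

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [pos; vel]
B = np.array([[0.5 * dt**2], [dt]])     # control input: IMU acceleration
H = np.array([[1.0, 0.0]])              # vision measures position only
Q = 1e-4 * np.eye(2)                    # process noise (IMU errors)
R = np.array([[1e-2]])                  # measurement noise (vision)

def predict(x, P, a_imu):
    """Propagate the state with an IMU acceleration reading."""
    x = F @ x + B * a_imu
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z_vision):
    """Correct the predicted state with a vision-based position fix."""
    y = z_vision - H @ x                 # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate constant acceleration a = 1 m/s^2 from rest, so the true
# position at time t is 0.5 * a * t^2.
x = np.zeros((2, 1))                     # estimated [position; velocity]
P = np.eye(2)
a = 1.0
for k in range(1, 101):
    t = k * dt
    x, P = predict(x, P, a)
    x, P = update(x, P, np.array([[0.5 * a * t**2]]))
```

In the full vision-aided system, the "vision-based position fix" would come from features (e.g., corners) tracked across image frames, and the state would include attitude and biases rather than a single axis.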
Scholarly Commons Citation
Sayem, Ahmed Saber Soliman, "Vision-Aided Navigation for Autonomous Vehicles Using Tracked Feature Points" (2016). Doctoral Dissertations and Master's Theses. 240.
https://commons.erau.edu/edt/240