TY - GEN
T1 - Vision-Based Localization and Autonomous Homing for UAVs
AU - Perumalla, Aniruddha
AU - Khamvilai, Thanakorn
AU - Johnson, Eric
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - In GPS-denied environments, using vision-based techniques to perform simultaneous localization and mapping (SLAM) can be beneficial, even in cases where maneuvers to precise locations are still necessary. In this work, we explore monocular vision-based SLAM as the basis for a guidance method for an unmanned aerial vehicle (UAV) to perform a 'homing' maneuver towards location(s) in the flight environment that may be decided 'on-the-go' during flight. The estimation uses a Harris corner algorithm that generates 'feature points' in images from a monocular camera. An Extended Kalman Filter (EKF) fuses these feature points at each instant with measurements from an IMU. This fusion is used for SLAM, i.e., to both estimate the state of the ownship and maintain a database of estimated 'world points' (features) in the flight environment. The results of this localization are used for the homing guidance framework. The vision-based estimation framework was implemented first in simulation. The full vision-based estimation and homing framework was then validated in an indoor flight test using a small UAV with an onboard monocular camera. This flight test demonstrates that the vision-based framework can guide a UAV and operator to complete a semi-automated homing maneuver towards a selected object in an unknown environment.
AB - In GPS-denied environments, using vision-based techniques to perform simultaneous localization and mapping (SLAM) can be beneficial, even in cases where maneuvers to precise locations are still necessary. In this work, we explore monocular vision-based SLAM as the basis for a guidance method for an unmanned aerial vehicle (UAV) to perform a 'homing' maneuver towards location(s) in the flight environment that may be decided 'on-the-go' during flight. The estimation uses a Harris corner algorithm that generates 'feature points' in images from a monocular camera. An Extended Kalman Filter (EKF) fuses these feature points at each instant with measurements from an IMU. This fusion is used for SLAM, i.e., to both estimate the state of the ownship and maintain a database of estimated 'world points' (features) in the flight environment. The results of this localization are used for the homing guidance framework. The vision-based estimation framework was implemented first in simulation. The full vision-based estimation and homing framework was then validated in an indoor flight test using a small UAV with an onboard monocular camera. This flight test demonstrates that the vision-based framework can guide a UAV and operator to complete a semi-automated homing maneuver towards a selected object in an unknown environment.
UR - http://www.scopus.com/inward/record.url?scp=85211233106&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85211233106&partnerID=8YFLogxK
U2 - 10.1109/DASC62030.2024.10749484
DO - 10.1109/DASC62030.2024.10749484
M3 - Conference contribution
AN - SCOPUS:85211233106
T3 - AIAA/IEEE Digital Avionics Systems Conference - Proceedings
BT - DASC 2024 - Digital Avionics Systems Conference, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 43rd AIAA DATC/IEEE Digital Avionics Systems Conference, DASC 2024
Y2 - 29 September 2024 through 3 October 2024
ER -