Direct feature correspondence in vision-aided inertial navigation for unmanned aerial vehicles

Federico Paredes Vallés, Daniel P. Magree, Eric Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper proposes a novel method for corresponding visual measurements to map points in a visual-inertial navigation system. The algorithm minimizes the photometric error at sparse locations in the image, and gains robustness by eliminating the need for feature extraction during correspondence. The system is compared to a standard feature-extraction-based approach within a visual-inertial EKF formulation. High-fidelity simulation results show that the proposed method reduces horizontal RMS error by increasing the number of features successfully corresponded by the algorithm.
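To illustrate the core idea of direct correspondence, the sketch below aligns an image to a reference by minimizing photometric error at a sparse set of pixel locations via Gauss-Newton, in the style of Lucas-Kanade. This is a hedged, minimal illustration restricted to a 2D translation with nearest-neighbor sampling; the paper's actual formulation (coupling the correspondence search with map points and the EKF state) is not reproduced here, and all function and variable names are the author's own inventions for this example.

```python
import numpy as np

def photometric_align(ref, img, pts, d0=(0.0, 0.0), iters=20):
    """Estimate a 2D translation d = (dx, dy) that aligns img to ref by
    minimizing the photometric error at sparse pixel locations pts.

    Illustrative sketch only: single translation model, nearest-neighbor
    sampling, Gauss-Newton iterations. Not the paper's full method.
    """
    gy, gx = np.gradient(img)          # image gradients along y and x
    d = np.array(d0, dtype=float)
    for _ in range(iters):
        J, r = [], []
        for (x, y) in pts:
            xi = int(round(x + d[0]))  # nearest-neighbor sample location
            yi = int(round(y + d[1]))
            if not (0 <= xi < img.shape[1] and 0 <= yi < img.shape[0]):
                continue
            r.append(img[yi, xi] - ref[y, x])   # photometric residual
            J.append([gx[yi, xi], gy[yi, xi]])  # residual Jacobian wrt d
        J, r = np.asarray(J), np.asarray(r)
        step = np.linalg.lstsq(J, -r, rcond=None)[0]  # Gauss-Newton step
        d += step
        if np.linalg.norm(step) < 1e-3:
            break
    return d

# Usage: recover a known shift between a smooth image and its rolled copy.
H, W = 64, 64
ys, xs = np.mgrid[0:H, 0:W]
ref = np.sin(xs / 5.0) * np.cos(ys / 7.0)
img = np.roll(ref, shift=(3, 2), axis=(0, 1))   # true (dx, dy) = (2, 3)
pts = [(x, y) for x in range(16, 48, 4) for y in range(16, 48, 4)]
d = photometric_align(ref, img, pts)
```

Because the error is evaluated only at the sparse points `pts`, no descriptor extraction or matching step is needed; correspondence falls out of the intensity alignment itself, which is the robustness argument the abstract makes.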

Original language: English (US)
Title of host publication: 2017 International Conference on Unmanned Aircraft Systems, ICUAS 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 221-229
Number of pages: 9
ISBN (Electronic): 9781509044948
DOIs
State: Published - Jul 25 2017
Event: 2017 International Conference on Unmanned Aircraft Systems, ICUAS 2017 - Miami, United States
Duration: Jun 13 2017 - Jun 16 2017

Publication series

Name: 2017 International Conference on Unmanned Aircraft Systems, ICUAS 2017

Other

Other: 2017 International Conference on Unmanned Aircraft Systems, ICUAS 2017
Country/Territory: United States
City: Miami
Period: 6/13/17 - 6/16/17

All Science Journal Classification (ASJC) codes

  • Aerospace Engineering
  • Control and Optimization
