Multiple-hypothesis vision-based landing autonomy

Takuma Nakamura, Eric N. Johnson

Research output: Contribution to journal › Article › peer-review

Abstract

This paper presents a novel state estimation system for unmanned aerial vehicle landing. A novel vision algorithm that detects a portion of the marker is developed; this algorithm extends the detectable range of the vision system for any known marker. A vision-aided navigation algorithm is derived within extended Kalman particle filter and Rao–Blackwellized particle filter frameworks, in addition to a standard extended Kalman filter framework. These multiple-hypothesis approaches not only handle the highly nonlinear and non-Gaussian distribution of the vision measurement errors but also yield numerically stable filters. Their computational cost is lower than that of a naive particle filter implementation, and the algorithms run in real time. The system is validated through numerical simulation, image-in-the-loop simulation, and flight tests.
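For readers unfamiliar with the Rao–Blackwellized particle filtering idea mentioned in the abstract, the sketch below illustrates the general technique on a toy landing-like problem; it is a generic example and not the paper's algorithm. The state split (a particle-sampled range to the marker, plus a per-particle Kalman filter on a conditionally linear measurement bias), the elevation-angle measurement model, and all variable names and noise values are assumptions made purely for illustration.

```python
# Minimal, generic Rao-Blackwellized particle filter (RBPF) sketch.
# NOT the paper's formulation: the toy model below is assumed for illustration.
# Particles carry a nonlinear state p (range to a landing marker); each particle
# runs a 1-D Kalman filter on a conditionally linear measurement bias b.
import numpy as np

rng = np.random.default_rng(0)

N = 500                        # number of particles (hypotheses)
dt, vel = 0.1, -1.0            # time step and known closing velocity (assumed)
Qp, Qb, R = 0.05, 1e-4, 0.01   # process/measurement noise variances (assumed)
marker_h = 2.0                 # marker height above the camera axis (assumed)

def h(p):
    """Nonlinear vision-like measurement: elevation angle to the marker."""
    return np.arctan2(marker_h, np.maximum(p, 1e-3))

# Particle set: nonlinear state p, per-particle Kalman mean/variance for bias b.
p = rng.normal(30.0, 5.0, N)
b_mean = np.zeros(N)
b_var = np.full(N, 0.1)
w = np.full(N, 1.0 / N)

def rbpf_step(z):
    """One predict/update cycle for a scalar measurement z."""
    global p, b_mean, b_var, w
    # 1) Propagate the nonlinear state by sampling the motion model.
    p += vel * dt + rng.normal(0.0, np.sqrt(Qp), N)
    # 2) Kalman time update for the linear bias state (random walk).
    b_var += Qb
    # 3) Weight update: conditioned on p, the measurement is linear in b.
    innov = z - h(p) - b_mean
    S = b_var + R
    w *= np.exp(-0.5 * innov**2 / S) / np.sqrt(2.0 * np.pi * S)
    w /= w.sum()
    # 4) Per-particle Kalman measurement update of the bias.
    K = b_var / S
    b_mean += K * innov
    b_var *= (1.0 - K)
    # 5) Resample when the effective sample size collapses.
    if 1.0 / np.sum(w**2) < N / 2:
        idx = rng.choice(N, N, p=w)
        p, b_mean, b_var = p[idx], b_mean[idx], b_var[idx]
        w = np.full(N, 1.0 / N)
    return np.sum(w * p), np.sum(w * b_mean)   # weighted state estimate

# Example usage with a synthetic measurement stream (assumed truth, demo only).
true_p = 30.0
for _ in range(100):
    true_p += vel * dt
    z = h(true_p) + rng.normal(0.0, np.sqrt(R))
    est_p, est_b = rbpf_step(z)
```

Marginalizing the conditionally linear bias analytically, as in this sketch, keeps the particle dimension low, which is the usual motivation for Rao–Blackwellization over a naive particle filter on the full state.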

Original language: English (US)
Pages (from-to): 528-545
Number of pages: 18
Journal: Journal of Aerospace Information Systems
Volume: 17
Issue number: 9
DOIs
State: Published - 2020

All Science Journal Classification (ASJC) codes

  • Aerospace Engineering
  • Computer Science Applications
  • Electrical and Electronic Engineering
