Estimation Techniques in Robust Vision-Based Landing of Aerial Vehicles

Takuma Nakamura, Daniel Magree, Eric N. Johnson

Research output: Contribution to journal › Article › peer-review



This paper describes recent advances in autonomous visual landing of aircraft in three operational scenarios. The first proposes a robust visual target and an algorithm that tracks it. The second explores the case when a prepared visual target cannot be used and the vehicle must land on an arbitrary target. Both the first and second methods are evaluated with a mobile target. The third addresses the problem of landing on an unprepared static target in GPS-denied environments. A key thread throughout all approaches is the estimation not only of system states, but also of the error covariance of the target and vehicle. The error covariance may then be used to determine the status of the estimation during the approach and to engage a contingency maneuver if necessary. The approaches are validated in high-fidelity simulation and in flight testing. Landing pad tracking is shown to be accurate and robust to viewpoint and distance. GPS-denied landing is found to have low error and to be robust to landing zone appearance.
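The covariance-monitoring idea in the abstract can be illustrated with a minimal sketch. The function names, the two-state static-target model, and the abort threshold below are assumptions for illustration, not the paper's actual filter: a standard Kalman covariance recursion grows the error covariance during prediction and shrinks it on measurement updates, and a simple trace test decides whether the estimate is trustworthy enough to continue the approach.

```python
import numpy as np

def covariance_ok(P, threshold):
    """Continue the approach only while total position uncertainty is small.
    Using the trace of P as a scalar uncertainty summary is an assumption."""
    return float(np.trace(P)) < threshold

def predict(P, Q):
    """Covariance prediction for a static-target model (F = I): P <- P + Q."""
    return P + Q

def update(P, R):
    """Covariance update for a direct position measurement (H = I)."""
    K = P @ np.linalg.inv(P + R)          # Kalman gain
    return (np.eye(P.shape[0]) - K) @ P   # posterior covariance

# Hypothetical scenario: the vision system loses the landing pad, so only
# predictions run and uncertainty accumulates until an abort is triggered.
P = np.eye(2) * 0.1   # initial (x, y) target-position covariance
Q = np.eye(2) * 0.05  # process noise per step
R = np.eye(2) * 0.2   # measurement noise
for _ in range(20):   # twenty steps without a measurement
    P = predict(P, Q)
if not covariance_ok(P, threshold=1.0):
    print("abort: engage contingency maneuver")
```

A single successful measurement update would shrink `P` again and could re-enable the approach, which mirrors the abstract's use of covariance as an estimation-health signal.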

Original language: English (US)
Pages (from-to): 11664-11669
Number of pages: 6
Journal: 20th IFAC World Congress
Issue number: 1
State: Published - Jul 2017

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering


