Vision and inertial sensor fusion for terrain relative navigation

Andrew Verras, Roshan T. Eapen, Andrew B. Simon, Manoranjan Majji, Ramchander Rao Bhaskara, Carolina I. Restrepo, Ronney Lovelace

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations


The mathematics and methods of integrating camera measurements with inertial sensors for terrain relative navigation of a space vehicle are discussed. A pinhole camera model of the vision sensors, in conjunction with measurement models of typical inertial sensors, is used to derive a position and attitude fix for the navigation state of the space vehicle. An ancillary frame initialization process is derived that exploits the three-dimensional translational motion geometry of the space vehicle to obtain uncertain estimates of the feature locations. Linear covariance analysis is carried out to derive the conditional state uncertainties of the feature locations, which are utilized by the filter in a second pass. Approaches for state estimation are tested using data obtained from a high-fidelity rendering engine developed by the team. Experimental data obtained from a medium-fidelity terrain relative navigation emulation test-bed, the Navigation, Estimation, and Sensing Testbed (NEST), are utilized to demonstrate the utility of the filter formulations developed herein.
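The two vision-side building blocks the abstract names, a pinhole projection model and feature-location initialization from translational motion, can be sketched as follows. This is a minimal illustration, not the authors' filter: the intrinsics matrix, poses, and the linear (DLT-style) two-view triangulation used here are generic textbook choices assumed for the example.

```python
import numpy as np

def pinhole_project(K, R, t, p_world):
    """Project a 3-D feature into pixel coordinates with a pinhole model.

    K       -- 3x3 camera intrinsics matrix
    R, t    -- world-to-camera rotation and translation
    p_world -- feature position in the world frame (3-vector)
    """
    p_cam = R @ p_world + t      # feature expressed in the camera frame
    uvw = K @ p_cam              # homogeneous pixel coordinates
    return uvw[:2] / uvw[2]      # perspective division

def triangulate_two_views(K, R1, t1, R2, t2, px1, px2):
    """Linear triangulation of one feature from two camera poses.

    Uses the standard DLT construction: each pixel observation
    contributes two rows to a homogeneous system A X = 0, solved
    by SVD. The translational baseline between the two poses is
    what makes the feature depth observable.
    """
    P1 = K @ np.hstack([R1, t1.reshape(3, 1)])   # 3x4 projection matrices
    P2 = K @ np.hstack([R2, t2.reshape(3, 1)])
    A = np.vstack([
        px1[0] * P1[2] - P1[0],
        px1[1] * P1[2] - P1[1],
        px2[0] * P2[2] - P2[0],
        px2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                   # null-space direction of A
    return X[:3] / X[3]          # dehomogenize to a 3-D point
```

In a full pipeline of the kind described, the triangulated point would seed the filter's feature state, with a linear covariance analysis propagating pixel-noise uncertainty into the estimate.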

Original language: English (US)
Title of host publication: AIAA Scitech 2021 Forum
Publisher: American Institute of Aeronautics and Astronautics Inc, AIAA
Number of pages: 21
ISBN (Print): 9781624106095
State: Published - 2021
Event: AIAA Science and Technology Forum and Exposition, AIAA SciTech Forum 2021 - Virtual, Online
Duration: Jan 11 2021 - Jan 15 2021

Publication series

Name: AIAA Scitech 2021 Forum

Conference: AIAA Science and Technology Forum and Exposition, AIAA SciTech Forum 2021
City: Virtual, Online

All Science Journal Classification (ASJC) codes

  • Aerospace Engineering


