Vehicle state estimation using vision and inertial measurements

Vishisht Gupta, Sean Brennan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

A novel method for estimating vehicle roll, pitch and yaw using machine vision and inertial sensors is presented that is based on matching images captured from an on-vehicle camera to a rendered representation of the surrounding terrain obtained from an on-board map database. United States Geological Survey Digital Elevation Maps (DEMs) were used to create a 3D topology map of the geography surrounding the vehicle, and it is assumed in this work that large segments of the surrounding terrain are visible, particularly the horizon lines. The horizon lines seen in the captured video from the vehicle are compared to the horizon lines obtained from the rendered geography, allowing absolute comparisons between the rendered and actual scenes in roll, pitch and yaw. A kinematic Kalman filter modeling an inertial navigation system then uses the scene matching to generate filtered estimates of orientation. Experiments using an instrumented vehicle operating at the test track of the Pennsylvania Transportation Institute were performed to check the validity of the method, and the results reveal a very close match between the vision-based estimates of orientation and those from a high-quality GPS/INS system.
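The abstract describes fusing gyro-driven orientation prediction with absolute roll/pitch/yaw measurements obtained from horizon matching. The sketch below illustrates one way such a kinematic Kalman filter could be structured; the class name, noise values, and small-angle independent-axis assumption are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class OrientationKF:
    """Minimal kinematic orientation filter: gyro rates drive the prediction,
    and horizon matching supplies an absolute roll/pitch/yaw measurement.
    All parameters here are assumed for illustration only."""

    def __init__(self, q=1e-4, r=1e-2):
        self.x = np.zeros(3)            # state: roll, pitch, yaw [rad]
        self.P = np.eye(3) * 1e-2       # state covariance
        self.Q = np.eye(3) * q          # process noise (gyro integration)
        self.R = np.eye(3) * r          # measurement noise (vision match)

    def predict(self, gyro_rates, dt):
        # Kinematic prediction: integrate measured angular rates over dt.
        self.x = self.x + gyro_rates * dt
        self.P = self.P + self.Q

    def update(self, vision_angles):
        # Absolute angles from matching camera horizon to rendered horizon.
        H = np.eye(3)                   # measurement observes the full state
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (vision_angles - self.x)
        self.P = (np.eye(3) - K @ H) @ self.P
```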

Original language: English (US)
Title of host publication: 5th IFAC Symposium on Advances in Automotive Control, AAC 2007
Publisher: IFAC Secretariat
Pages: 63-70
Number of pages: 8
Edition: PART 1
ISBN (Print): 9783902661265
DOIs
State: Published - 2007

Publication series

Name: IFAC Proceedings Volumes (IFAC-PapersOnline)
Number: PART 1
Volume: 5
ISSN (Print): 1474-6670

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
