Dynamically sized occupancy grids for obstacle avoidance

Sean Quinn Marlow, Jack W. Langelaan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Scopus citations

Abstract

This paper presents a method for navigation of a small unmanned rotorcraft through an unsurveyed environment using a single camera and an inertial measurement unit corrected by GPS. Current missions for small unmanned aerial vehicles involve low-altitude flights in complex environments (e.g. urban canyons and forests) in close proximity to obstacles. Successful navigation with no a priori knowledge can be accomplished if obstacle locations can be estimated. The algorithm presented here uses measurements of pixel location and optical flow to compute estimates of obstacle location. These estimates are used to populate a local occupancy grid fixed to the vehicle; however, the grid cells map the time to impact of an obstacle rather than physical distance. This time-based mapping allows high spatial resolution during slow flight (e.g. during approach and landing). The local occupancy grid facilitates the modeling of complex environments and is suitable for use by generic trajectory planners. Results of both two-dimensional and three-dimensional simulations are presented using a potential field method for obstacle avoidance and navigation.
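A minimal Python sketch of the time-based mapping described in the abstract: radial grid cells bin time to impact rather than metric distance, so each ring's metric width shrinks as the vehicle slows. The class name, bin counts, update constant, and log-odds bookkeeping below are illustrative assumptions, not the authors' implementation (which estimates obstacle positions from pixel location and optical flow before populating the grid).

import numpy as np

class TimeToImpactGrid:
    """Vehicle-fixed polar grid whose radial bins index time to impact.

    Because each ring spans a fixed interval of flight time, the metric
    width of a ring shrinks as the vehicle slows, which is what gives
    the grid high spatial resolution during approach and landing.
    """

    def __init__(self, n_rings=20, n_sectors=36, dt_cell=0.25):
        self.n_rings = n_rings        # time-to-impact bins (assumed count)
        self.n_sectors = n_sectors    # bearing bins around the vehicle
        self.dt_cell = dt_cell        # seconds of flight time per ring
        self.log_odds = np.zeros((n_rings, n_sectors))

    def insert_obstacle(self, rel_pos, speed, l_occ=0.85):
        """Fold one obstacle estimate (vehicle frame, metres) into the grid."""
        rng = np.hypot(rel_pos[0], rel_pos[1])
        bearing = np.arctan2(rel_pos[1], rel_pos[0])
        tti = rng / max(speed, 1e-3)          # time to impact at current speed
        i = int(tti / self.dt_cell)           # radial (time) bin
        j = int((bearing + np.pi) / (2.0 * np.pi) * self.n_sectors) % self.n_sectors
        if i < self.n_rings:
            self.log_odds[i, j] += l_occ      # standard log-odds occupancy update

    def occupied(self, p_min=0.5):
        """Indices of cells whose occupancy probability exceeds p_min."""
        p = 1.0 / (1.0 + np.exp(-self.log_odds))   # log-odds -> probability
        return np.argwhere(p > p_min)

# With dt_cell = 0.25 s the grid spans 5 s of flight time: 50 m of space
# at 10 m/s but only 10 m at 2 m/s, so nearby obstacles are resolved
# more finely during slow flight.
grid = TimeToImpactGrid()
grid.insert_obstacle(rel_pos=(20.0, 0.0), speed=10.0)   # 2 s ahead -> ring 8
print(grid.occupied())

In this sketch a speed-independent obstacle 20 m ahead lands in ring 8 at 10 m/s but beyond the grid horizon at 2 m/s, matching the abstract's point that the same grid trades far-field coverage for near-field resolution as the vehicle slows.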

Original language: English (US)
Title of host publication: AIAA Guidance, Navigation, and Control Conference
DOIs
State: Published - 2010
Event: AIAA Guidance, Navigation, and Control Conference - Toronto, ON, Canada
Duration: Aug 2, 2010 – Aug 5, 2010

Publication series

Name: AIAA Guidance, Navigation, and Control Conference

Other

Other: AIAA Guidance, Navigation, and Control Conference
Country/Territory: Canada
City: Toronto, ON
Period: 8/2/10 – 8/5/10

All Science Journal Classification (ASJC) codes

  • Aerospace Engineering
  • Control and Systems Engineering
