RI: Small: Collaborative Research: Vision-guided Control of Robust Perching: From Biological to Robotic Flyers

Project: Research project

Project Details

Description

In the blink of an eye, a fly can perch upside-down on a ceiling by executing a sequence of well-coordinated maneuvers triggered and controlled by a brain no bigger than a pinhead. Through an integrated biological and robotic investigation, this project aims to unravel how the robust intelligence of this process emerges from a synergistic combination of computational and mechanical processes, and how that intelligence can be translated to robotic flyers. Small robotic flyers suffer universally from high energy consumption and low aerodynamic efficiency. Robust perching will significantly extend small robotic flyers' operating time and expand their applications in areas such as environmental and disaster monitoring, aerial surveillance, and search and rescue. The project will also introduce biological and robotic locomotion principles to students and the public through visually appealing insect-flight and robotic-perching experiments, creating an ideal educational framework for STEM students of all ages.

For small flyers, animals and robots alike, orchestrating a successful perching maneuver in dynamic and uncertain environments is one of the most challenging aerobatic feats. To land safely, a flyer has to solve a distance-velocity-attitude sensing and control problem with limited computational resources and under stringent time constraints. This project will first use targeted insect experiments with artificially manipulated visual cues to reveal the computational processes that coordinate the different maneuvers of perching in blue bottle flies, and how this computation accounts for dynamic physical environments and mechanical processes. Guided by the results from biological perching, the project will then accomplish robust robotic perching on stationary or moving surfaces using a centimeter-scale quadcopter. This will be achieved by developing computationally efficient vision algorithms that extract the essential visual information needed to control the robot's velocity and orientation prior to touchdown, so that the robot can firmly attach to the desired surface with biomimetic compliant legs.
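The abstract does not specify which visual cues the controller extracts, so the sketch below is purely illustrative: it shows one approach commonly used to model insect landing and borrowed for robotic perching, namely estimating time-to-contact (tau) from the apparent expansion of the target on the image plane and regulating the approach so that tau stays near a constant reference. All function names, gains, and values here are hypothetical assumptions, not the project's algorithm.

```python
# Illustrative sketch (not the project's actual method): a constant
# time-to-contact (tau) approach law, a visual cue often used to explain
# insect landing and adapted for robotic perching controllers.
# All names, gains, and numbers are hypothetical.

def time_to_contact(angular_size: float, expansion_rate: float) -> float:
    """Tau = theta / theta_dot: apparent target size over its growth rate."""
    if expansion_rate <= 0.0:
        return float("inf")  # target is not looming; no imminent contact
    return angular_size / expansion_rate


def approach_accel_cmd(tau: float, tau_ref: float = 0.6, gain: float = 2.0,
                       closing_speed: float = 0.0) -> float:
    """Acceleration command that drives tau toward a constant tau_ref.

    Holding tau constant yields a smooth, ever-slowing approach whose
    speed tends to zero at touchdown. Positive output allows speeding up
    (tau comfortably large); negative output demands deceleration.
    """
    if tau == float("inf"):
        return 0.0
    # Simple proportional law on the tau error, scaled by closing speed.
    return gain * (tau - tau_ref) * closing_speed / max(tau, 1e-3)


if __name__ == "__main__":
    # Toy loop: the target appears to grow as the flyer closes in.
    theta, theta_dot, speed = 0.10, 0.05, 1.5   # rad, rad/s, m/s
    for step in range(5):
        tau = time_to_contact(theta, theta_dot)
        accel = approach_accel_cmd(tau, closing_speed=speed)
        print(f"step {step}: tau={tau:.2f} s, accel_cmd={accel:+.2f} m/s^2")
        theta += theta_dot * 0.1                 # crude image-growth update
```

In a real system the angular size and expansion rate would come from onboard optic-flow or feature-tracking estimates, and the acceleration command would feed the quadcopter's attitude/thrust loop; those stages are omitted here.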

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Status: Finished
Effective start/end date: 8/15/18 – 7/31/21

Funding

  • National Science Foundation: $250,000.00
