TY - GEN
T1 - Relative motion estimation for vision-based formation flight using unscented Kalman filter
AU - Oh, Seung Min
AU - Johnson, Eric N.
PY - 2007
Y1 - 2007
AB - This paper describes a vision-based relative motion estimator for the formation flight of two unmanned aerial vehicles (UAVs). The navigation of a follower aircraft relative to a target (or leader) aircraft is performed using vision-only information from a single camera fixed to the follower aircraft. The images of the target aircraft projected onto the camera image plane of the follower aircraft are captured and processed into vision information. The vision information for the relative motion estimator in this work is composed of three target angles: an azimuth angle, an elevation angle, and a subtended angle. Using these vision measurements, the follower aircraft estimates the target relative position, the target relative velocity, the target size, and the target acceleration components in the framework of an unscented Kalman filter (UKF). The UKF is applied to the relative motion estimator due to the highly nonlinear characteristics of the problem at hand. In order to evaluate the performance of the vision-based navigation filter, vision information obtained through a real-time flight test is post-processed by the filter. The real-time vision information, obtained by a geometric active contour method on an onboard computer, is composed of three points of the target aircraft on the camera image plane of the follower aircraft: the center point, the left wingtip point, and the right wingtip point. The target center point on the image plane provides information about the azimuth and elevation angles of the target in the camera frame, and the two wingtip points provide information about the subtended angle of the target, which ultimately yields the target size. The vision-based estimation results for the target-follower relative motion and the target characteristics are compared to actual data independently obtained from the onboard integrated navigation systems of both aircraft during the flight test. Each integrated navigation system is composed of an inertial measurement unit (IMU), a global positioning system (GPS) receiver, and a magnetometer. The comparisons indicate that the vision-based estimation filter produces satisfactory estimation results and thus successfully handles the highly nonlinear system characteristics through the UKF framework.
UR - http://www.scopus.com/inward/record.url?scp=37249059798&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=37249059798&partnerID=8YFLogxK
U2 - 10.2514/6.2007-6866
DO - 10.2514/6.2007-6866
M3 - Conference contribution
AN - SCOPUS:37249059798
SN - 1563479044
SN - 9781563479045
T3 - Collection of Technical Papers - AIAA Guidance, Navigation, and Control Conference 2007
SP - 5365
EP - 5381
BT - Collection of Technical Papers - AIAA Guidance, Navigation, and Control Conference 2007
PB - American Institute of Aeronautics and Astronautics Inc.
T2 - AIAA Guidance, Navigation, and Control Conference 2007
Y2 - 20 August 2007 through 23 August 2007
ER -