Image-based estimation, planning, and control for high-speed flying through multiple openings

Bibliographic Details
Published in: The International Journal of Robotics Research, 2020-08, Vol. 39(9), pp. 1122-1137
Main Authors: Guo, Dejun; Leang, Kam K.
Format: Article
Language:English
Summary: This article focuses on enabling an aerial robot to fly through multiple openings at high speed using image-based estimation, planning, and control. State-of-the-art approaches assume that the robot's global translational variables (e.g., position and velocity) can either be measured directly with external localization sensors or estimated onboard. Unfortunately, estimating the translational variables may be impractical because modeling errors and sensor noise can lead to poor performance. Furthermore, monocular-camera-based pose estimation techniques typically require a model of the gap (window) in order to handle the unknown scale. Herein, a new scheme for image-based estimation, aggressive-maneuvering trajectory generation, and motion control is developed for multi-rotor aerial robots. The approach described does not rely on measurement of the translational variables and does not require a model of the gap or window. First, the robot dynamics are expressed in terms of image features that are invariant to rotation (invariant features). This step decouples the robot's attitude and keeps the invariant features in the flat output space of the differentially flat system. Second, an optimal trajectory is efficiently generated in real time to obtain a dynamically feasible trajectory for the invariant features. Finally, a controller is designed to enable real-time, image-based tracking of the trajectory. The performance of the estimation, planning, and control scheme is validated in simulations and through 80 successful experimental trials. Results show the ability to successfully fly through two narrow openings, where the estimation and planning computation and the motion control from one opening to the next are performed in real time onboard the robot.
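The trajectory-generation step described in the summary (planning in the flat output space of a differentially flat system) can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the article's actual method: it plans a single seventh-order polynomial segment for one flat-output coordinate with rest-to-rest boundary conditions on position, velocity, acceleration, and jerk, which makes the coefficient system square and uniquely solvable. The function names (`flat_output_segment`, `poly_eval`) and the rest-to-rest assumption are illustrative; the article instead optimizes trajectories for the rotation-invariant image features.

```python
# Illustrative sketch: one polynomial segment for a single flat-output
# coordinate of a differentially flat system (e.g., a multi-rotor).
# Assumption: rest-to-rest motion, so velocity/acceleration/jerk are
# zero at both endpoints. This is NOT the article's optimization over
# invariant image features; it is a minimal stand-in.
from math import factorial
import numpy as np

def flat_output_segment(x0, xT, T):
    """Solve for coefficients c[0..7] of a 7th-order polynomial
    x(t) = sum_k c[k] t^k satisfying 8 boundary conditions:
    derivatives 0..3 at t=0 and at t=T (rest-to-rest)."""
    A = np.zeros((8, 8))
    b = np.zeros(8)
    for d in range(4):  # derivative order 0..3
        # Constraint row at t = 0: only the d-th coefficient survives.
        A[d, d] = factorial(d)
        # Constraint row at t = T: full derivative of the polynomial.
        for k in range(d, 8):
            A[4 + d, k] = factorial(k) / factorial(k - d) * T ** (k - d)
    b[0] = x0  # position at t = 0
    b[4] = xT  # position at t = T; all other derivatives are zero
    return np.linalg.solve(A, b)

def poly_eval(c, t, d=0):
    """Evaluate the d-th derivative of the polynomial at time t."""
    return sum(factorial(k) / factorial(k - d) * c[k] * t ** (k - d)
               for k in range(d, len(c)))

if __name__ == "__main__":
    c = flat_output_segment(0.0, 2.0, 1.5)  # move 2 m in 1.5 s
    print(poly_eval(c, 1.5))        # endpoint position
    print(poly_eval(c, 1.5, d=1))   # endpoint velocity (should be ~0)
```

Because the eight boundary conditions fully determine the eight coefficients, no explicit cost is minimized here; the article's real-time planner additionally enforces dynamic feasibility on the invariant-feature trajectory.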
ISSN: 0278-3649, 1741-3176
DOI: 10.1177/0278364920921943