
GateNet: An Efficient Deep Neural Network Architecture for Gate Perception Using Fish-Eye Camera in Autonomous Drone Racing

Bibliographic Details
Main Authors: Pham, Huy Xuan, Bozcan, Ilker, Sarabakha, Andriy, Haddadin, Sami, Kayacan, Erdal
Format: Conference Proceeding
Language: English
Subjects:
Description
Summary: Fast and robust gate perception is of great importance in autonomous drone racing. We propose a convolutional neural network-based gate detector (GateNet) that concurrently detects the gate's center, distance, and orientation with respect to the drone using only images from a single fish-eye RGB camera. GateNet achieves a high inference rate (up to 60 Hz) on an onboard processor (Jetson TX2). Moreover, GateNet is robust to gate pose changes and background disturbances. The proposed perception pipeline leverages a fish-eye lens with a wide field-of-view and thus can detect multiple gates at close range, allowing a longer planning horizon even in tight environments. For benchmarking, we propose a comprehensive dataset (AU-DR) that focuses on gate perception. Throughout the experiments, GateNet shows its superiority when compared to similar methods while being efficient for onboard computers in autonomous drone racing. The effectiveness of the proposed framework is tested on a fully-autonomous drone that flies on a previously-unknown track with tight turns and varying gate positions and orientations in each lap.
ISSN: 2153-0866
DOI: 10.1109/IROS51168.2021.9636207
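
Illustrative sketch: the summary above describes a multi-output network that, from one fish-eye RGB frame, jointly predicts a gate's image center, its distance, and its orientation relative to the drone. The code below is only a minimal reconstruction of that idea in PyTorch, not the authors' implementation; the class name GateNetSketch, the layer sizes, and the input resolution are placeholders chosen for illustration.

# Minimal sketch (assumption, not the paper's released code): a shared CNN
# backbone with three regression heads for gate center, distance, and
# orientation, in the spirit of the abstract.
import torch
import torch.nn as nn

class GateNetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Small convolutional backbone; channel counts are illustrative only.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Separate heads for the three quantities listed in the abstract.
        self.center = nn.Linear(64, 2)       # (u, v) gate center in the image
        self.distance = nn.Linear(64, 1)     # distance to the gate
        self.orientation = nn.Linear(64, 1)  # gate yaw with respect to the drone

    def forward(self, x):
        features = self.backbone(x)
        return self.center(features), self.distance(features), self.orientation(features)

if __name__ == "__main__":
    # One dummy fish-eye RGB frame; the resolution is a placeholder.
    frame = torch.rand(1, 3, 240, 320)
    center, dist, yaw = GateNetSketch()(frame)
    print(center.shape, dist.shape, yaw.shape)  # [1, 2], [1, 1], [1, 1]

Keeping a single shared backbone with lightweight heads is one plausible way to meet the onboard-efficiency target the abstract reports (up to 60 Hz on a Jetson TX2), though the actual GateNet architecture may differ substantially.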