A biologically inspired navigation concept based on the Landmark-Tree map for efficient long-distance robot navigation

Bibliographic Details
Published in: Advanced Robotics, 2014-03, Vol. 28 (5), pp. 289-302
Main Authors: Mair, Elmar, Augustine, Marcus, Jäger, Bastian, Stelzer, Annett, Brand, Christoph, Burschka, Darius, Suppa, Michael
Format: Article
Language: English
Description
Summary: Map-based navigation is a crucial task for any mobile robot. In an unknown environment, this problem is usually addressed by applying Simultaneous Localization and Mapping (SLAM) based on metric grid maps. However, such maps are in general computationally expensive and do not scale well. Insects are able to cover large distances and reliably find their way back to their nests, although their resources are quite limited. Inspired by theories on insect navigation, we developed a data structure that is highly scalable and efficiently adapts to the available memory at run-time. Positions in space are memorized as snapshots, i.e. unique configurations of landmarks. Unlike conventional snapshot or visual-map approaches, we do not simply store the landmarks as a set; instead, we arrange them in a tree-like structure according to the relevance of their information. The resulting navigation relies solely on direction measurements to arbitrary landmarks. In this work, we present the concept of the Landmark-Tree (LT) map and apply it to a mobile platform equipped with an omnidirectional camera. We verify the reliability and robustness of the LT-map concept in simulations as well as in experiments with the robotic platform.
ISSN: 0169-1864, 1568-5535
DOI: 10.1080/01691864.2013.871770
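
The abstract describes the LT-map only at a conceptual level: snapshots of landmark bearings, arranged in a tree ordered by the relevance of each landmark's information, with navigation driven purely by direction measurements. The sketch below is a minimal, illustrative Python reading of that idea under stated assumptions; the class and function names, the scalar relevance weighting, the heap-like tree layout, and the bearing-difference homing rule are hypothetical choices for illustration and are not the authors' published algorithm.

```python
import math
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class LandmarkNode:
    """One landmark observation stored in a snapshot (illustrative structure)."""
    landmark_id: int
    bearing: float    # direction to the landmark from the snapshot position, radians
    relevance: float  # assumed scalar weight: higher = more informative
    children: List["LandmarkNode"] = field(default_factory=list)

def build_landmark_tree(measurements: List[Tuple[int, float, float]],
                        branching: int = 2) -> Optional[LandmarkNode]:
    """Arrange (id, bearing, relevance) triples into a tree with the most
    relevant landmarks near the root, so the structure can be truncated
    from the leaves when memory is scarce (an assumption about the LT-map idea)."""
    ordered = sorted(measurements, key=lambda m: -m[2])
    nodes = [LandmarkNode(i, b, r) for i, b, r in ordered]
    for idx in range(1, len(nodes)):
        nodes[(idx - 1) // branching].children.append(nodes[idx])
    return nodes[0] if nodes else None

def homing_direction(snapshot: LandmarkNode,
                     current_bearings: Dict[int, float]) -> Optional[float]:
    """Toy homing rule: average the wrapped bearing differences of landmarks
    visible both in the stored snapshot and in the current view."""
    diffs, stack = [], [snapshot]
    while stack:
        node = stack.pop()
        if node.landmark_id in current_bearings:
            d = current_bearings[node.landmark_id] - node.bearing
            diffs.append(math.atan2(math.sin(d), math.cos(d)))  # wrap to [-pi, pi]
        stack.extend(node.children)
    return sum(diffs) / len(diffs) if diffs else None

if __name__ == "__main__":
    snap = build_landmark_tree([(1, 0.1, 0.9), (2, 1.2, 0.7), (3, -2.0, 0.4)])
    print(homing_direction(snap, {1: 0.3, 3: -1.8}))
```

The point of the relevance-ordered tree in this sketch is graceful degradation: dropping subtrees farthest from the root discards the least informative landmarks first, which is one way the memory-adaptive behaviour mentioned in the abstract could be realized.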