WeCo-SLAM: Wearable Cooperative SLAM System for Real-Time Indoor Localization Under Challenging Conditions

Bibliographic Details
Published in: IEEE Sensors Journal, 2022-03, Vol. 22 (6), p. 5122-5132
Main Authors: Kachurka, Viachaslau, Rault, Bastien, Ireta Munoz, Fernando I., Roussel, David, Bonardi, Fabien, Didier, Jean-Yves, Hadj-Abdelkader, Hicham, Bouchafa, Samia, Alliez, Pierre, Robin, Maxime
Format: Article
Language: English
Description: Real-time, globally consistent GPS tracking is critical for accurate localization and is crucial for applications such as autonomous navigation or multi-robot mapping. However, under challenging environmental conditions such as indoor/outdoor transitions, GPS signals are partially available or not consistent over time. In this paper, a real-time tracking system for continuously locating emergency-response agents in challenging conditions is presented. A cooperative localization method based on Laser-Visual-Inertial (LVI) and GPS sensors is achieved by communicating optimization events between a LiDAR-Inertial SLAM (LI-SLAM) and a Visual-Inertial SLAM (VI-SLAM) that operate simultaneously. Pose estimation assisted by multiple SLAM approaches provides the GPS localization of the agent when stand-alone GPS fails. The system has been tested under the terms of the MALIN Challenge, which aims to globally localize agents across outdoor and indoor environments under challenging conditions (such as smoke-filled rooms, stairs, indoor/outdoor transitions, repetitive patterns, and extreme lighting changes), where it is well known that stand-alone SLAM is not enough to maintain localization. The system achieved an Absolute Trajectory Error of 0.48%, with a pose update rate between 15 and 20 Hz. Furthermore, the system builds a globally consistent 3D LiDAR map that is post-processed to create a 3D reconstruction at different levels of detail.
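The abstract reports an Absolute Trajectory Error (ATE) of 0.48%. As context for that metric, here is a minimal sketch (not the authors' code) of how ATE is commonly computed for SLAM evaluation: rigidly align the estimated trajectory to ground truth with Horn's closed-form method, then take the RMSE of the residual position errors. The function name and the assumption of point-to-point correspondences are illustrative.

```python
import numpy as np

def absolute_trajectory_error(gt, est):
    """RMSE of translational error after optimal rigid alignment.

    gt, est: (N, 3) arrays of corresponding ground-truth and estimated
    positions. Returns the aligned RMSE in the units of the input.
    """
    # Center both trajectories about their means.
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    gt_c, est_c = gt - mu_gt, est - mu_est
    # Optimal rotation from the SVD of the cross-covariance (Horn's method);
    # the sign correction guards against reflections.
    H = est_c.T @ gt_c
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_gt - R @ mu_est
    aligned = est @ R.T + t
    # RMSE over per-pose Euclidean position errors.
    return float(np.sqrt(np.mean(np.sum((gt - aligned) ** 2, axis=1))))
```

A percentage figure such as 0.48% is typically this error normalized by trajectory length; the paper should be consulted for the exact normalization used.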
DOI: 10.1109/JSEN.2021.3101121
ISSN: 1530-437X
EISSN: 1558-1748
Source: IEEE Electronic Library (IEL) Journals
Subjects: Automatic
Autonomous navigation
Communication
Computer Science
Computer vision
Computer Vision and Pattern Recognition
embedded software
Emergency response
Engineering Sciences
Global positioning systems
GPS
Indoor environments
indoor navigation
Inertial sensing devices
Laser radar
Lidar
Localization
Localization method
Location awareness
Multiple robots
Optimization
Real time
Real-time systems
Sensor fusion
Sensors
Signal and Image processing
Simultaneous localization and mapping
terrain mapping
Three-dimensional displays
Tracking systems