
A Joint Extrinsic Calibration Tool for Radar, Camera and Lidar

We address joint extrinsic calibration of lidar, camera and radar sensors. To simplify calibration, we propose a single calibration target design for all three modalities, and implement our approach in an open-source tool with bindings to the Robot Operating System (ROS). Our tool features three optimization configurations: using error terms for a minimal number of sensor pairs, using terms for all sensor pairs in combination with loop-closure constraints, or adding terms for structure estimation in a probabilistic model. Apart from relative calibration, where relative transformations between sensors are computed, our work also addresses absolute calibration, which includes calibration with respect to the mobile robot's body. Two methods are compared to estimate the body reference frame using an external laser scanner, one based on markers and the other on manual annotation of the laser scan. In the experiments, we evaluate the three configurations for relative calibration. Our results show that using terms for all sensor pairs is most robust, especially for lidar to radar, when a minimum of five board locations is used. For absolute calibration, the median rotation error around the vertical axis reduces from 1° before calibration to 0.33° using the markers and 0.02° with manual annotations.
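The pairwise error terms and loop-closure constraint mentioned in the abstract can be sketched with a toy example. This is an illustrative reconstruction, not the authors' implementation: the variable names and simulated board positions are hypothetical, and the alignment uses the standard Kabsch algorithm on matched calibration-board detections. Chaining the lidar→camera and camera→radar estimates should reproduce the direct lidar→radar estimate — the consistency that a loop-closure term enforces:

```python
# Illustrative sketch (not the paper's implementation): estimate rigid
# transforms between sensor pairs from matched board detections via the
# Kabsch algorithm, then check loop-closure consistency across the chain.
import numpy as np

def kabsch(src, dst):
    """Rigid transform (R, t) with dst_i ≈ R @ src_i + t, points as Nx3 rows."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical board-centre detections at five board locations (lidar frame).
rng = np.random.default_rng(0)
p_lidar = rng.uniform(-2.0, 2.0, size=(5, 3))

# Simulated ground-truth extrinsics: lidar->camera and camera->radar.
R_cl, t_cl = rot_z(0.10), np.array([0.2, 0.0, -0.1])
R_rc, t_rc = rot_z(-0.05), np.array([0.0, 0.3, 0.0])

p_cam = p_lidar @ R_cl.T + t_cl
p_radar = p_cam @ R_rc.T + t_rc

# Pairwise estimates for all three sensor pairs.
R1, t1 = kabsch(p_lidar, p_cam)    # lidar -> camera
R2, t2 = kabsch(p_cam, p_radar)    # camera -> radar
R3, t3 = kabsch(p_lidar, p_radar)  # lidar -> radar (direct)

# Loop closure: chaining lidar->camera->radar should match lidar->radar.
R_chain = R2 @ R1
t_chain = R2 @ t1 + t2
print(np.allclose(R_chain, R3), np.allclose(t_chain, t3))
```

With noiseless simulated detections the chained and direct estimates agree exactly; with real, noisy detections they disagree slightly, and the all-pairs configuration turns that disagreement into additional error terms instead of discarding it.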

Bibliographic Details
Published in: IEEE Transactions on Intelligent Vehicles, 2021-09, Vol. 6 (3), p. 571-582
Main Authors: Domhof, Joris, Kooij, Julian F. P., Gavrila, Dariu M.
Format: Article
Language:English
DOI: 10.1109/TIV.2021.3065208
ISSN: 2379-8858
EISSN: 2379-8904
Source: IEEE Electronic Library (IEL) Journals
Subjects:
Annotations
Calibration
camera
Cameras
Configurations
intelligent vehicles
Laser radar
Lidar
Markers
Optimization
Probabilistic models
radar
Robot sensing systems
Robot vision systems
Robots
ROS
Sensors
Source code