
Including sensor bias in shape from motion calibration and sensor fusion

Saved in:
Bibliographic Details
Main Authors: Voyles, R.M., Merrow, J.D., Khosla, P.K.
Format: Conference Proceeding
Language:English
Subjects:
Online Access: Request full text
cited_by
cites
container_end_page 99
container_issue
container_start_page 93
container_title
container_volume
creator Voyles, R.M.
Merrow, J.D.
Khosla, P.K.
description Shape from motion data fusion brings a greater degree of autonomy and sensor integration to intelligent systems in which fusion by constant linear transformations is appropriate. To illustrate this, we apply shape from motion techniques to applications involving both similar and disparate sensory information vectors. First, nearly autonomous force/torque sensor calibration is demonstrated through fusion of the individual channels of raw strain gauge data. Gathering only the raw sensor signals, we simultaneously extract the motion of the force vector (the "motion") and the calibration matrix (the "shape") by singular value decomposition. This calibration example is provided simply to explain the mathematics. Disparate sensory information is fused in a "primordial learning" mobile robot through a similar eigenspace representation. This paper summarizes these shape from motion applications and presents an extension for simultaneously extracting sensor bias.
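
For readers unfamiliar with the technique the abstract describes, the following is a minimal, hypothetical sketch (not the authors' code) of how a low-rank "shape" subspace and a per-channel bias might be recovered from raw multi-channel sensor data with a single singular value decomposition. The data are synthetic and all names and dimensions (n_samples, n_channels, rank) are illustrative assumptions.

import numpy as np

# Synthetic stand-in for raw strain-gauge data: readings = motion @ shape + bias.
rng = np.random.default_rng(0)
n_samples, n_channels, rank = 500, 8, 3        # e.g. 8 gauges observing a 3-D force
motion = rng.normal(size=(n_samples, rank))    # unknown force trajectory (the "motion")
shape = rng.normal(size=(rank, n_channels))    # unknown linear mixing (the "shape")
bias = rng.normal(size=n_channels)             # per-channel sensor bias
data = motion @ shape + bias                   # raw sensor signals

# Treat the bias as an extra constant "motion" component: subtracting the column
# means absorbs it, and the SVD of the centered data recovers the rank-r row
# space spanned by the shape matrix. (bias_est matches the true bias exactly
# only if the force trajectory is zero-mean; the subspace recovery holds anyway.)
bias_est = data.mean(axis=0)
U, s, Vt = np.linalg.svd(data - bias_est, full_matrices=False)
shape_basis = Vt[:rank]                        # orthonormal basis; the shape is fixed
                                               # only up to an invertible r-by-r factor

# Check: the true shape rows lie (numerically) in the recovered subspace.
residual = shape - shape @ shape_basis.T @ shape_basis
print("max subspace residual:", np.abs(residual).max())

This sketch stops at subspace recovery; the paper's calibration procedure presumably resolves the remaining r-by-r ambiguity with additional physical constraints, and its sensor-bias extension is more general than the simple mean-subtraction used here.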
doi_str_mv 10.1109/MFI.1996.568505
format conference_proceeding
identifier ISBN: 078033700X; ISBN: 9780780337008
ispartof 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems (Cat. No.96TH8242), 1996, p.93-99
issn
language eng
recordid cdi_ieee_primary_568505
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Calibration
Data mining
Force sensors
Intelligent sensors
Intelligent systems
Sensor fusion
Sensor systems
Shape
Torque
Vectors
title Including sensor bias in shape from motion calibration and sensor fusion