
Underwater Spherical Shell Classification and Parameter Estimation Based on Acoustic Backscattering Characteristics

Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 162756-162764
Main Authors: Xu, Tianyang; Li, Xiukun; Jia, Hongjian
Format: Article
Language:English
Description
Summary: Underwater quiet-object detection and recognition by the target-echo method relies on the prediction and understanding of acoustic scattering characteristics. The spherical shell is a common type of underwater quiet object whose scattering characteristics vary with material, radius, and shell thickness. Based on acoustic scattering theory, we analyze the backscattering characteristics of vacuum-filled spherical shells under different parameters and propose a method based on a deep convolutional neural network, trained on the backscattering morphological (form) function, to classify the objects and estimate their parameters. For shell-material estimation, we compare the proposed deep learning method with a traditional classification method based on feature engineering, and the proposed method performs better. For objects of different geometric scales, the estimated outer radius and shell thickness conform to a fitting formula based on the mid-frequency enhancement effect. By establishing a stable object feature library covering a wide range of parameters, the deep learning classification method based on the acoustic scattering morphological function achieves accurate classification of underwater spherical-shell objects.
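The abstract does not specify the network architecture. As a rough illustration of the kind of model described, the sketch below (PyTorch; the layer sizes, input bin count, and number of material classes are all assumptions, and `ShellNet` is a hypothetical name) takes a sampled backscattering form-function magnitude as a 1-D input and jointly predicts a material class and the two geometric parameters (outer radius and shell thickness).

```python
import torch
import torch.nn as nn

class ShellNet(nn.Module):
    """Hypothetical 1-D CNN over a sampled backscattering form function |f(ka)|."""

    def __init__(self, n_materials: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),            # global pooling -> fixed-size feature
        )
        self.classify = nn.Linear(64, n_materials)  # shell-material logits
        self.regress = nn.Linear(64, 2)             # outer radius, shell thickness

    def forward(self, x):
        # x: (batch, n_bins) form-function samples -> add a channel axis
        h = self.features(x.unsqueeze(1)).squeeze(-1)
        return self.classify(h), self.regress(h)

# Toy usage with random stand-in spectra; real inputs would come from the
# partial-wave scattering solution for a vacuum-filled elastic shell.
spectra = torch.randn(4, 512)
logits, params = ShellNet()(spectra)
print(logits.shape, params.shape)  # torch.Size([4, 3]) torch.Size([4, 2])
```

The joint classification-plus-regression head mirrors the paper's two tasks (material classification and radius/thickness estimation); whether the authors use one network or separate ones is not stated in the abstract.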
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3046364