JOURNAL ARTICLE

Underwater Acoustic Target Recognition Combining Multi-scale Features and Attention Mechanism

Abstract

As countries continue to advance and deepen their ocean strategies, underwater acoustics has gradually become a research hotspot. Underwater acoustic target recognition is both the research focus and the technical difficulty in this field. Because traditional underwater acoustic target recognition methods extract features insufficiently, this paper uses a feature-fusion method combined with deep learning to extract features more effectively. The fused feature is obtained by splicing the improved logarithmic Mel-spectrum difference parameters with the original signal parameters; the deep learning method includes three parts: multi-scale convolutional feature extraction, a self-attention module, and a center loss function. The multi-scale convolution part performs finer-grained feature extraction; the self-attention module improves the feature-expression ability of key channels; the center loss function addresses the problem that features within a class are not compact. The network uses a CNN for underwater acoustic target recognition. Ablation experiments show that the recognition accuracy on the ShipsEar dataset reaches 98.6%.
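The center loss mentioned in the abstract penalizes the distance between each sample's deep feature and the center of its class, pulling same-class features together. A minimal NumPy sketch of the standard formulation (Wen et al.'s center loss) is below; the function names are illustrative, and the paper's exact variant may differ:

```python
import numpy as np

def center_loss(features, labels, centers):
    # L_c = 1/2 * sum_i ||x_i - c_{y_i}||^2 : squared distance of each
    # deep feature x_i to the center c_{y_i} of its ground-truth class.
    diffs = features - centers[labels]
    return 0.5 * np.sum(diffs ** 2)

def update_centers(features, labels, centers, alpha=0.5):
    # Per-class center update from the standard center-loss formulation:
    #   delta_c_j = sum_{i : y_i = j} (c_j - x_i) / (1 + n_j)
    #   c_j <- c_j - alpha * delta_c_j
    # where n_j counts the mini-batch samples of class j.
    new_centers = centers.copy()
    for j in range(centers.shape[0]):
        mask = labels == j
        n_j = int(mask.sum())
        delta = np.sum(centers[j] - features[mask], axis=0) / (1.0 + n_j)
        new_centers[j] = centers[j] - alpha * delta
    return new_centers
```

In training, this term is typically added to the softmax cross-entropy loss with a small weighting factor, so the classifier stays discriminative between classes while the centers make each class compact.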

Keywords:
Underwater, Feature extraction, Computer science, Artificial intelligence, Convolution (computer science), Pattern recognition (psychology), Feature (linguistics), Logarithm, Speech recognition, Artificial neural network, Mathematics, Geology

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 1.47
References: 14
Citation Normalized Percentile: 0.81

Topics

Underwater Acoustics Research (Physical Sciences → Earth and Planetary Sciences → Oceanography)
Marine animal studies overview (Physical Sciences → Environmental Science → Ecology)
Speech and Audio Processing (Physical Sciences → Computer Science → Signal Processing)