Volume 46, Issue 1, January 2024
XU Yuanchao, CAI Zhiming, KONG Xiaopeng, HUANG Yan. Visualization Analysis and Kernel Pruning of Convolutional Neural Network for Ship-Radiated Noise Classification[J]. Journal of Electronics & Information Technology, 2024, 46(1): 74-82. doi: 10.11999/JEIT230149

Visualization Analysis and Kernel Pruning of Convolutional Neural Network for Ship-Radiated Noise Classification

doi: 10.11999/JEIT230149
  • Received Date: 2023-03-13
  • Revised Date: 2023-06-12
  • Available Online: 2023-06-19
  • Publish Date: 2024-01-17
  • Current research on ship-radiated noise classification with deep neural networks focuses mainly on classification performance and largely ignores model interpretation. To address this issue, guided backpropagation and input-space optimization are applied to a Convolutional Neural Network (CNN) that takes a logarithmic-scale spectrum as input and is trained on the DeepShip dataset, yielding a visualization method for ship-radiated noise classification. Results show that a multi-frame feature alignment algorithm improves the visualization, and that deep convolutional kernels detect two types of features: line spectra and background. The line spectrum is identified as a reliable feature for ship classification, and a convolutional kernel pruning method is therefore proposed. Pruning not only improves CNN classification performance but also stabilizes the training process. Guided backpropagation visualization further indicates that the pruned CNN attends more strongly to line spectrum information.
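The visualization described in the abstract builds on guided backpropagation. As an illustration only, the following is a minimal PyTorch-style sketch of how a guided-backpropagation saliency map could be computed for a CNN that classifies log-scale spectrum frames. The function name guided_backprop_saliency, the variables model, spectrum, and target_class, the input shape (1, 1, frames, bins), and the choice of PyTorch are all assumptions for illustration, not the authors' released code.

    # Minimal sketch (assumption: PyTorch; not the authors' implementation).
    # Computes a guided-backpropagation saliency map for one log-spectrum frame.
    # Assumes the model's ReLU modules are not in-place.
    import torch
    import torch.nn as nn

    def guided_backprop_saliency(model: nn.Module,
                                 spectrum: torch.Tensor,
                                 target_class: int) -> torch.Tensor:
        """spectrum: (1, 1, frames, bins) log-scale spectrum; returns a saliency map."""
        model.eval()
        handles = []

        # Guided-backpropagation rule for ReLU: autograd already zeroes gradients
        # where the forward activation was negative; additionally clamp away
        # negative gradients so only positive evidence flows back to the input.
        def relu_backward_hook(module, grad_input, grad_output):
            return (torch.clamp(grad_input[0], min=0.0),)

        for m in model.modules():
            if isinstance(m, nn.ReLU):
                handles.append(m.register_full_backward_hook(relu_backward_hook))

        x = spectrum.clone().requires_grad_(True)
        score = model(x)[0, target_class]   # class score before softmax
        model.zero_grad()
        score.backward()                    # gradients flow under the guided rule

        for h in handles:                   # restore normal backprop behaviour
            h.remove()
        return x.grad.detach().squeeze()    # saliency map over (frames, bins)

The paper's kernel-pruning step would then rank convolutional kernels by whether they respond to line-spectrum or background structure; that criterion is specific to the article and is not reproduced here.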
References

[1] SHEN Sheng, YANG Honghui, LI Junhao, et al. Auditory inspired convolutional neural networks for ship type classification with raw hydrophone data[J]. Entropy, 2018, 20(12): 990. doi: 10.3390/e20120990
[2] HU Gang, WANG Kejun, PENG Yuan, et al. Deep learning methods for underwater target feature extraction and recognition[J]. Computational Intelligence and Neuroscience, 2018, 2018: 1214301. doi: 10.1155/2018/1214301
[3] LI Junhao and YANG Honghui. The underwater acoustic target timbre perception and recognition based on the auditory inspired deep convolutional neural network[J]. Applied Acoustics, 2021, 182: 108210. doi: 10.1016/j.apacoust.2021.108210
[4] CHEN Yuechao and SHANG Jintao. Underwater target recognition method based on convolution autoencoder[C]. Proceedings of 2019 IEEE International Conference on Signal, Information and Data Processing, Chongqing, China, 2019: 1–5.
[5] CHEN Jie, HAN Bing, MA Xufeng, et al. Underwater target recognition based on multi-decision LOFAR spectrum enhancement: A deep-learning approach[J]. Future Internet, 2021, 13(10): 265. doi: 10.3390/FI13100265
[6] ZHANG Qi, DA Lianglong, ZHANG Yanhou, et al. Integrated neural networks based on feature fusion for underwater target recognition[J]. Applied Acoustics, 2021, 182: 108261. doi: 10.1016/J.APACOUST.2021.108261
[7] GOODFELLOW I, BENGIO Y, and COURVILLE A. Deep Learning[M]. ZHAO Shenjian, LI Yujun, FU Tianfan, et al. trans. Beijing: Posts & Telecom Press, 2017: 224–225. (in Chinese)
[8] ZEILER M D and FERGUS R. Visualizing and understanding convolutional networks[C]. Proceedings of the 13th European Conference on Computer Vision, Zurich, Switzerland, 2014: 818–833.
[9] SPRINGENBERG J T, DOSOVITSKIY A, BROX T, et al. Striving for simplicity: The all convolutional net[EB/OL]. http://arxiv.org/abs/1412.6806, 2015.
[10] SIMONYAN K, VEDALDI A, and ZISSERMAN A. Deep inside convolutional networks: Visualising image classification models and saliency maps[EB/OL]. http://arxiv.org/abs/1312.6034, 2014.
[11] YOSINSKI J, CLUNE J, NGUYEN A, et al. Understanding neural networks through deep visualization[EB/OL]. http://arxiv.org/abs/1506.06579, 2015.
[12] XU Yuanchao, CAI Zhiming, and KONG Xiaopeng. Classification of ship radiated noise based on bi-logarithmic scale spectrum and convolutional network[J]. Journal of Electronics & Information Technology, 2022, 44(6): 1947–1955. doi: 10.11999/JEIT211407 (in Chinese)
[13] IRFAN M, ZHENG Jiangbin, ALI S, et al. DeepShip: An underwater acoustic benchmark dataset and a separable convolution based autoencoder for classification[J]. Expert Systems with Applications, 2021, 183: 115270. doi: 10.1016/J.ESWA.2021.115270
[14] LAI Yejing, HAO Shanfeng, and HUANG Dingjiang. Methods and progress in deep neural network model compression[J]. Journal of East China Normal University (Natural Science), 2020(5): 68–82. doi: 10.3969/j.issn.1000-5641.202091001 (in Chinese)
[15] JIANG Xiaoyong, LI Zhongyi, HUANG Langyue, et al. Review of neural network pruning techniques[J]. Journal of Applied Sciences, 2022, 40(5): 838–849. doi: 10.3969/j.issn.0255-8297.2022.05.013 (in Chinese)
[16] HU Hengyuan, PENG Rui, TAI Y W, et al. Network trimming: A data-driven neuron pruning approach towards efficient deep architectures[EB/OL]. http://arxiv.org/abs/1607.03250, 2016.
[17] CHENG Yusheng, LI Zhizhong, and QIU Jiaxing. Underwater Acoustic Target Recognition[M]. Beijing: Science Press, 2018: 8–9. (in Chinese)
[18] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: A simple way to prevent neural networks from overfitting[J]. Journal of Machine Learning Research, 2014, 15(56): 1929–1958.
[19] BAO Xueshan. The method research of passive target detection using DEMON spectrum and the project design of processing system[D]. [Master dissertation], Harbin Engineering University, 2005. (in Chinese)