CHANG Huaizhao, GU Yingyan, HAN Yunzhi, JIN Benzhou. Convolutional Mixed Multi-Attention Encoder-Decoder Network for Radar Signal Sorting[J]. Journal of Electronics & Information Technology. doi: 10.11999/JEIT251031

Convolutional Mixed Multi-Attention Encoder-Decoder Network for Radar Signal Sorting

doi: 10.11999/JEIT251031 cstr: 32379.14.JEIT251031
Funds:  The National Natural Science Foundation of China (62371230)
  • Received Date: 2025-09-30
  • Accepted Date: 2025-12-29
  • Rev Recd Date: 2025-12-26
  • Available Online: 2026-01-05
Objective  Radar signal sorting is a fundamental technology for electromagnetic environment awareness and electronic warfare systems. The objective of this study is to develop an effective radar signal sorting method that accurately separates intercepted pulse sequences and assigns them to different radiation sources in complex electromagnetic environments. With the increasing complexity of modern radar systems, intercepted pulse sequences are severely affected by pulse overlap, pulse loss, false pulses, and pulse arrival time measurement errors, which substantially reduce the performance of conventional sorting approaches. Therefore, a robust signal sorting framework that maintains high accuracy under non-ideal conditions is required.

Methods  Radar signal sorting in complex electromagnetic environments is formulated as a pulse-level time-series semantic segmentation problem, where each pulse is treated as the minimum processing unit and classified in an end-to-end manner. Under this formulation, sorting is achieved through unified sequence modeling and label prediction without explicit pulse subsequence extraction or iterative stripping procedures, which reduces error accumulation. To address this task, a convolutional mixed multi-attention encoder-decoder network is proposed (Fig. 1). The network consists of an encoder-decoder backbone, a local attention module, and a feature selection module. The encoder-decoder backbone adopts a symmetric structure with progressive downsampling and upsampling to aggregate contextual information while restoring pulse-level temporal resolution. Its core component is a dual-branch dilated bottleneck module (Fig. 2), in which a 1×1 temporal convolution is applied for channel projection.
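The multi-scale receptive-field idea behind the dual-branch bottleneck can be illustrated with a minimal NumPy sketch. The kernel taps, dilation rates, and toy sequence below are illustrative assumptions for exposition, not the network's actual configuration:

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """'Same'-padded 1-D convolution of sequence x with kernel w
    at the given dilation rate."""
    k = len(w)
    pad = dilation * (k - 1) // 2
    xp = np.pad(x, pad)
    span = dilation * (k - 1) + 1
    return np.array([np.dot(xp[i:i + span:dilation], w)
                     for i in range(len(x))])

# Two parallel branches with different dilation rates, as in the
# dual-branch bottleneck: a small dilation captures short-term local
# variations, a large one spans long-term inter-pulse modulation.
x = np.arange(16, dtype=float)             # toy pulse-feature sequence
w = np.array([1.0, 1.0, 1.0])              # illustrative 3-tap kernel
short = dilated_conv1d(x, w, dilation=1)   # receptive field: 3 pulses
long_ = dilated_conv1d(x, w, dilation=4)   # receptive field: 9 pulses
fused = np.concatenate([short[None], long_[None]], axis=0)  # channel concat
```

In the actual module the two branch outputs would pass through a fusion convolution and a residual connection; here the concatenation merely shows how both temporal scales are made available to later layers at once.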
Two parallel dilated convolution branches with different dilation rates are then employed to construct multi-scale receptive fields, enabling simultaneous modeling of short-term local variations and long-term modulation patterns across multiple pulses and ensuring robust temporal representation under pulse time shifts and missing pulses. To extend long-range dependency modeling beyond convolutional operations, a local Transformer module is inserted between the encoder and the decoder. By applying local self-attention to temporally downsampled feature maps, temporal dependencies among pulses are captured with reduced computational complexity, while the influence of false and missing pulses is suppressed during feature aggregation. In addition, a feature selection module is integrated into the skip connections to reduce feature redundancy and interference (Fig. 3). Through hybrid attention across the temporal and channel dimensions, multi-level features are adaptively filtered and fused to emphasize information discriminative for radiation source identification. During training, focal loss is applied to alleviate class imbalance and improve the discrimination of difficult and boundary pulses.

Results and Discussions  Experimental results demonstrate that the proposed network achieves pulse-level fine-grained classification for radar signal sorting and outperforms mainstream baseline methods across various complex scenarios. Compared with existing approaches, an average sorting-accuracy improvement of more than 6% is obtained under moderate interference conditions. In Multi-Function Radar (MFR) overlapping scenarios, recall rates of 88.30%, 85.48%, 86.89%, and 86.48% are achieved for four different MFRs, with an overall average accuracy of 86.82%. For different pulse repetition interval modulation types, recall rates exceed 90% for fixed patterns and remain above 85% for jittered, staggered, and group-varying modes.
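The local self-attention applied to the downsampled feature maps can be sketched as follows. This is a simplified single-head version without learned query/key/value projections, and the window size is an assumed illustrative value:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def local_self_attention(x, window):
    """Self-attention restricted to non-overlapping temporal windows.

    x: (T, d) downsampled pulse features. Attention is computed only
    within each block of `window` steps, reducing the cost of full
    self-attention from O(T^2) to O(T * window).
    """
    T, d = x.shape
    out = np.empty_like(x)
    for s in range(0, T, window):
        blk = x[s:s + window]                      # (w, d) local block
        scores = blk @ blk.T / np.sqrt(d)          # queries/keys = blk itself
        out[s:s + window] = softmax(scores) @ blk  # convex mix within window
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))
y = local_self_attention(x, window=4)
```

A side effect of the windowing is that a corrupted pulse only perturbs attention weights inside its own window, which is one intuition for why false pulses have limited influence during feature aggregation.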
In the staggered and group-varying cases, performance improvements exceeding 3.5% relative to baseline methods are observed. Generalization experiments indicate that high accuracy is maintained under parameter distribution shifts of 5% and 15%, demonstrating strong robustness to distribution perturbations (Fig. 8). Ablation studies confirm the effectiveness of each proposed module in improving overall performance (Table 7).

Conclusions  A convolutional mixed multi-attention encoder-decoder network is proposed for radar signal sorting in complex electromagnetic environments. By modeling radar signal sorting as a pulse-level time-series semantic segmentation task and integrating multi-scale dilated convolutions, local attention modeling, and adaptive feature selection, high sorting accuracy, robustness, and generalization are achieved under severe interference conditions. The experimental results indicate that the proposed approach provides an effective and practical solution for radar signal sorting in complex electromagnetic environments.
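The focal loss used during training, described in the Methods above, can be sketched as the standard formulation; the γ and α values shown are the common defaults, not necessarily those tuned for this network:

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, alpha=0.25):
    """Standard multi-class focal loss: the (1 - p_t)**gamma factor
    down-weights easy, well-classified pulses so the gradient focuses
    on difficult and boundary pulses."""
    pt = probs[np.arange(len(labels)), labels]  # predicted prob of true class
    return float(-np.mean(alpha * (1.0 - pt) ** gamma * np.log(pt)))

# An easy pulse (p_t = 0.9) contributes far less loss than a hard,
# wrong-leaning one (p_t = 0.3) under the same alpha weighting.
probs = np.array([[0.9, 0.1],
                  [0.3, 0.7]])
easy = focal_loss(probs[:1], np.array([0]))
hard = focal_loss(probs[1:], np.array([0]))
```

With γ = 0 and α = 1 the expression reduces to plain cross-entropy, which makes the down-weighting role of the modulating factor easy to verify.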
  • [1]
    WANG Shiqiang, HU Guoping, ZHANG Qiliang, et al. The background and significance of radar signal sorting research in modern warfare[J]. Procedia Computer Science, 2019, 154: 519–523. doi: 10.1016/j.procs.2019.06.080.
    [2]
    HAIGH K and ANDRUSENKO J. Cognitive Electronic Warfare: An Artificial Intelligence Approach[M]. Norwood: Artech House, 2021: 239.
    [3]
    LANG Ping, FU Xiongjun, DONG Jian, et al. A novel radar signals sorting method via residual graph convolutional network[J]. IEEE Signal Processing Letters, 2023, 30: 753–757. doi: 10.1109/LSP.2023.3287404.
    [4]
    WAN Liangtian, LIU Rong, SUN Lu, et al. UAV Swarm based radar signal sorting via multi-source data fusion: A deep transfer learning framework[J]. Information Fusion, 2022, 78: 90–101. doi: 10.1016/j.inffus.2021.09.007.
    [5]
    ZHOU Zixiang, FU Xiongjun, DONG Jian, et al. Radar signal sorting with multiple self-attention coupling mechanism based transformer network[J]. IEEE Signal Processing Letters, 2024, 31: 1765–1769. doi: 10.1109/LSP.2024.3421948.
    [6]
    CAO Sheng, WANG Shucheng, and ZHANG Yan. Density-based fuzzy C-means multi-center re-clustering radar signal sorting algorithm[C].17th IEEE International Conference on Machine Learning and Applications, Orlando, USA, 2018: 891–896. doi: 10.1109/ICMLA.2018.00144.
    [7]
    SU Yuhang, CHEN Zhao, GONG Linfu, et al. An improved adaptive radar signal sorting algorithm based on DBSCAN by a novel CVI[J]. IEEE Access, 2024, 12: 43139–43154. doi: 10.1109/ACCESS.2024.3361221.
    [8]
    AHMED M G S and TANG B. New FCM's validity index for sorting radar signal[C]. IEEE 17th International Conference on Computational Science and Engineering, Chengdu, China, 2014: 127–131. doi: 10.1109/CSE.2014.55.
    [9]
    ZHU Mengtao, LI Yunjie, and WANG Shafei. Model-based time series clustering and interpulse modulation parameter estimation of multifunction radar pulse sequences[J]. IEEE Transactions on Aerospace and Electronic Systems, 2021, 57(6): 3673–3690. doi: 10.1109/TAES.2021.3082660.
    [10]
    WEI Xiuxi, PENG Maosong, HUANG Huajuan, et al. An overview on density peaks clustering[J]. Neurocomputing, 2023, 554: 126633. doi: 10.1016/j.neucom.2023.126633.
    [11]
    LIU Zhangmeng and YU P S. Classification, denoising, and deinterleaving of pulse streams with recurrent neural networks[J]. IEEE Transactions on Aerospace and Electronic Systems, 2019, 55(4): 1624–1639. doi: 10.1109/TAES.2018.2874139.
    [12]
    NOTARO P, PASCHALI M, HOPKE C, et al. Radar emitter classification with attribute-specific recurrent neural networks[J/OL]. arXiv preprint arXiv: 1911.07683, 2019. doi: 10.48550/arXiv.1911.07683.
    [13]
    ZHANG Jiaxiang, WANG Bo, HAN Xinrui, et al. A multi-radar emitter sorting and recognition method based on hierarchical clustering and TFCN[J]. Digital Signal Processing, 2025, 160: 105005. doi: 10.1016/j.dsp.2025.105005.
    [14]
    WANG Haojian, TAO Zhenji, HE Jin, et al. Deinterleaving of intercepted radar pulse streams via temporal convolutional attention network[J]. IEEE Transactions on Aerospace and Electronic Systems, 2025, 61(4): 9327–9343. doi: 10.1109/TAES.2025.3555244.
    [15]
    AL-MALAHI A, FARHAN A, FENG Hancong, et al. An intelligent radar signal classification and deinterleaving method with unified residual recurrent neural network[J]. IET Radar, Sonar & Navigation, 2023, 17(8): 1259–1276. doi: 10.1049/rsn2.12417.
    [16]
    HE Kaiming, ZHANG Xiangyu, REN Shaoqing, et al. Deep residual learning for image recognition[C]. The IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, USA, 2016: 770–778. doi: 10.1109/CVPR.2016.90.
    [17]
    WANG Qilong, WU Banggu, ZHU Pengfei, et al. ECA-Net: Efficient channel attention for deep convolutional neural networks[C]. The IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, USA, 2020: 11531–11539. doi: 10.1109/CVPR42600.2020.01155.
    [18]
    HU Jie, SHEN Li, and SUN Gang. Squeeze-and-excitation networks[C]. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, USA, 2018: 7132–7141. doi: 10.1109/CVPR.2018.00745.
    [19]
    LIN T Y, GOYAL P, GIRSHICK R, et al. Focal loss for dense object detection[C]. IEEE International Conference on Computer Vision, Venice, Italy, 2017: 2999–3007. doi: 10.1109/ICCV.2017.324.
    [20]
    ELSAYED N, MAIDA A S, and BAYOUMI M. Deep gated recurrent and convolutional network hybrid model for univariate time series classification[J/OL]. arXiv preprint arXiv: 1812.07683, 2018. doi: 10.48550/arXiv.1812.07683.
    [21]
    CHEN Hongyu, FENG Kangan, KONG Yukai, et al. Multi-function radar work mode recognition based on encoder-decoder model[C]. IGARSS 2022—2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 2022: 1189–1192. doi: 10.1109/IGARSS46834.2022.9884556.

    Figures(8)  / Tables(7)
