An EEG Emotion Recognition Model Integrating Memory and Self-Attention Mechanisms

LIU Shanrui, BI Yingzhou, HUO Leigang, GAN Qiujing, ZHOU Shuheng

Citation: LIU Shanrui, BI Yingzhou, HUO Leigang, GAN Qiujing, ZHOU Shuheng. An EEG Emotion Recognition Model Integrating Memory and Self-Attention Mechanisms[J]. Journal of Electronics & Information Technology. doi: 10.11999/JEIT250737

doi: 10.11999/JEIT250737 cstr: 32379.14.JEIT250737
Funds: National Natural Science Foundation of China (62067007); Innovation Project of Guangxi Graduate Education (JGY2023236)
Author information:

    LIU Shanrui: Male, Master's student. Research interests: artificial intelligence, electroencephalography (EEG)

    BI Yingzhou: Male, Ph.D., Professor. Research interests: intelligent computing, artificial intelligence

    HUO Leigang: Male, Ph.D., Associate Professor. Research interests: image processing, artificial intelligence

    GAN Qiujing: Female, Master's student. Research interests: artificial intelligence, image processing

    ZHOU Shuheng: Female, Master's student. Research interests: artificial intelligence, image processing

    Corresponding author: BI Yingzhou, byzhou@163.com

  • CLC number: TP391.7; R604

  • Abstract: Electroencephalography (EEG) is a non-invasive means of acquiring neural signals. It carries rich affective and cognitive information and is widely used in brain science and emotion recognition. Although Transformer models provide strong global modeling capability for EEG emotion recognition, their multi-head self-attention considers only the strength of correlation between the current time step and other time steps. It ignores the forgetting effect inherent in the brain activity that generates EEG: the state at the current moment progressively forgets states produced at earlier moments. This omission limits the Transformer's effectiveness in EEG emotion recognition. To account for both correlation strength and forgetting, this paper proposes an EEG emotion recognition model that fuses a memory-forgetting mechanism with self-attention, termed Memory Self-Attention (MSA). MSA injects a forgetting mechanism consistent with human memory into self-attention while preserving correlation modeling, improving performance with almost no additional parameters or computation. The model first aggregates the spatio-temporal features of each channel with an Aggregated Convolutional Neural Network (ACNN), then models global dependencies and memory relations with the MSA structure, and finally produces predictions through a classification head. On the DEAP binary classification tasks, MSA achieves accuracies of 98.87% and 98.30% on the valence and arousal dimensions, respectively; it reaches 97.64% on the SEED three-class task and 95.90% on the SEED-IV four-class task, outperforming existing mainstream methods. The experimental results verify the effectiveness and robustness of the proposed model in multi-class emotion recognition tasks.
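The abstract describes the mechanism only at a high level; Table 1 indicates that MSA adds 4 learnable parameters per attention head on top of standard self-attention. A minimal single-head sketch of the general idea, assuming the forgetting effect is injected as a learnable distance-based decay bias on the attention logits (the function name and the single `decay` scalar are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # numerically stable softmax
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def memory_self_attention(x, Wq, Wk, Wv, decay=0.1):
    """Single-head self-attention with a distance-based forgetting bias.

    x: (N, d) sequence of per-window EEG features.
    decay: learnable non-negative scalar; larger values down-weight
    (forget) distant time steps faster. This is an illustrative
    stand-in for the paper's memory mechanism, which uses 4
    parameters per head rather than one.
    """
    N, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)                 # standard scaled dot-product
    dist = np.abs(np.arange(N)[:, None] - np.arange(N)[None, :])
    scores = scores - decay * dist                # forgetting bias: one O(N^2) broadcast add
    return softmax(scores) @ v                    # weighted sum of values
```

With `decay = 0` this reduces exactly to standard self-attention, which matches the table's claim that MSA only adds a cheap additive bias on top of MHSA.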
  • Figure 1  Model concept diagram

    Figure 2  Overall framework of the MSA model

    Figure 3  Training curves and confusion matrix on the DEAP dataset

    Figure 4  Training curves and confusion matrix on the SEED dataset

    Figure 5  Training curves and confusion matrix on the SEED-IV dataset

    Figure 6  Classification accuracy of the original Transformer versus the MSA model with the memory-forgetting mechanism at different depths

    Figure 7  Five-fold cross-validation comparison results

    Table 1  Parameter and computation comparison between standard multi-head self-attention (MHSA) and MSA

    Item | MHSA | MSA | Increment analysis
    Parameters | $3d_{\text{emb}}^2 + d_{\text{emb}}^2$ | $3d_{\text{emb}}^2 + d_{\text{emb}}^2 + 4H$ | Only $4H$ extra learnable parameters (linear in the number of heads: 4 per head)
    Computation | $O(HN^2 d_k)$ | $O(HN^2 d_k) + O(HN^2)$ | One extra broadcast addition; negligible overhead
    Parameter overhead | - | $\dfrac{4H}{3d_{\text{emb}}^2 + d_{\text{emb}}^2} = \dfrac{H}{d_{\text{emb}}^2} \approx 0$ | Negligible relative to the embedding dimension
    Computation overhead | - | $\dfrac{O(HN^2)}{O(HN^2 d_k)} = \dfrac{1}{d_k} = \dfrac{H}{d_{\text{emb}}} \approx 0$ | Negligible relative to the embedding dimension
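The increments claimed in the table are easy to sanity-check numerically. A short sketch, assuming illustrative (not the paper's) sizes $d_{\text{emb}} = 64$ and $H = 8$ heads, so $d_k = d_{\text{emb}}/H = 8$:

```python
# Sanity check of the table's overhead claims for hypothetical sizes.
d_emb, H = 64, 8
d_k = d_emb // H                         # per-head key/query dimension

mhsa_params = 3 * d_emb**2 + d_emb**2    # Q/K/V projections + output projection
msa_params = mhsa_params + 4 * H         # MSA adds 4 learnable scalars per head

param_ratio = 4 * H / mhsa_params        # simplifies to H / d_emb**2
flop_ratio = 1 / d_k                     # extra O(HN^2) vs base O(HN^2 d_k)

print(msa_params - mhsa_params)  # 32 extra parameters in total
print(param_ratio)               # ~0.002, negligible next to the projections
print(flop_ratio)                # 0.125 = H / d_emb
```

Both ratios shrink as the embedding dimension grows, which is why the table treats the overhead as effectively zero for realistic model widths.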

    Table 3  Experimental results on the DEAP dataset

    Year | Method | Valence accuracy (%) | Arousal accuracy (%)
    2023 | SSTD[28] | 68.28 | 71.48
    2023 | MTLFuseNet[29] | 71.33 | 73.28
    2023 | Caps-EEGNet[30] | 96.67 | 96.75
    2024 | DSE-Mixer[31] | 95.19 | 95.22
    2024 | MES-CTNet[20] | 98.31 | 98.28
    2025 | 3DCNN-CapsNet[32] | 97.40 | 97.20
    - | Transformer | 96.58 | 97.07
    - | MSA | 98.87 | 98.30

    Table 4  Experimental results on the SEED dataset

    Year | Method | Accuracy (%)
    2022 | RGNN[33] | 94.24
    2022 | Conformer[19] | 95.30
    2024 | FTCN[34] | 89.16
    2025 | EmoSTT[35] | 92.67
    - | Transformer | 96.16
    - | MSA | 97.64

    Table 5  Experimental results on the SEED-IV dataset

    Year | Method | Accuracy (%)
    2022 | RGNN[33] | 79.37
    2022 | 4D-aNN[36] | 86.77
    2022 | EeT[37] | 83.27
    2023 | MFFNN[38] | 87.32
    2024 | 3D-CTM[21] | 87.44
    2024 | BF-GCN[39] | 82.03
    - | Transformer | 92.49
    - | MSA | 95.90
  • [1] DOLAN R J. Emotion, cognition, and behavior[J]. Science, 2002, 298(5596): 1191–1194. doi: 10.1126/science.1076358.
    [2] FRANTZIDIS C A, BRATSAS C, PAPADELIS C L, et al. Toward emotion aware computing: An integrated approach using multichannel neurophysiological recordings and affective visual stimuli[J]. IEEE Transactions on Information Technology in Biomedicine, 2010, 14(3): 589–597. doi: 10.1109/TITB.2010.2041553.
    [3] ZHANG Tong, ZHENG Wenming, CUI Zhen, et al. A deep neural network-driven feature learning method for multi-view facial expression recognition[J]. IEEE Transactions on Multimedia, 2016, 18(12): 2528–2536. doi: 10.1109/TMM.2016.2598092.
    [4] LIU Zhentao, XIE Qiao, WU Min, et al. Speech emotion recognition based on an improved brain emotion learning model[J]. Neurocomputing, 2018, 309: 145–156. doi: 10.1016/j.neucom.2018.05.005.
    [5] BRITTON J C, PHAN K L, TAYLOR S F, et al. Neural correlates of social and nonsocial emotions: An fMRI study[J]. NeuroImage, 2006, 31(1): 397–409. doi: 10.1016/j.neuroimage.2005.11.027.
    [6] ZHANG Tong, WANG Xuehan, XU Xiangmin, et al. GCB-Net: Graph convolutional broad network and its application in emotion recognition[J]. IEEE Transactions on Affective Computing, 2022, 13(1): 379–388. doi: 10.1109/TAFFC.2019.2937768.
    [7] ZHENG Weilong, ZHU Jiayi, and LU Baoliang. Identifying stable patterns over time for emotion recognition from EEG[J]. IEEE Transactions on Affective Computing, 2019, 10(3): 417–429. doi: 10.1109/TAFFC.2017.2712143.
    [8] ALARCÃO S M and FONSECA M J. Emotions recognition using EEG signals: A survey[J]. IEEE Transactions on Affective Computing, 2019, 10(3): 374–393. doi: 10.1109/TAFFC.2017.2714671.
    [9] LI Xiang, SONG Dawei, ZHANG Peng, et al. Emotion recognition from multi-channel EEG data through convolutional recurrent neural network[C]. Proceedings of 2016 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), Shenzhen, China, 2016: 352–359. doi: 10.1109/BIBM.2016.7822545.
    [10] CHEN Jingxia, JIANG D M, and ZHANG Y N. A hierarchical bidirectional GRU model with attention for EEG-based emotion classification[J]. IEEE Access, 2019, 7: 118530–118540. doi: 10.1109/ACCESS.2019.2936817.
    [11] DU Xiaobing, MA Cuixia, ZHANG Guanhua, et al. An efficient LSTM network for emotion recognition from multichannel EEG signals[J]. IEEE Transactions on Affective Computing, 2022, 13(3): 1528–1540. doi: 10.1109/TAFFC.2020.3013711.
    [12] PARK H J and FRISTON K. Structural and functional brain networks: From connections to cognition[J]. Science, 2013, 342(6158): 1238411. doi: 10.1126/science.1238411.
    [13] SONG Tengfei, ZHENG Wenming, SONG Peng, et al. EEG emotion recognition using dynamical graph convolutional neural networks[J]. IEEE Transactions on Affective Computing, 2020, 11(3): 532–541. doi: 10.1109/TAFFC.2018.2817622.
    [14] YE Weishan, ZHANG Zhiguo, TENG Fei, et al. Semi-supervised dual-stream self-attentive adversarial graph contrastive learning for cross-subject EEG-based emotion recognition[J]. IEEE Transactions on Affective Computing, 2025, 16(1): 290–305. doi: 10.1109/TAFFC.2024.3433470.
    [15] LI Yang, ZHENG Wenming, CUI Zhen, et al. A novel neural network model based on cerebral hemispheric asymmetry for EEG emotion recognition[C]. Proceedings of the 27th International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 2018: 1561–1567. doi: 10.24963/ijcai.2018/216.
    [16] LI Yang, WANG Lei, ZHENG Wenming, et al. A novel Bi-hemispheric discrepancy model for EEG emotion recognition[J]. IEEE Transactions on Cognitive and Developmental Systems, 2021, 13(2): 354–367. doi: 10.1109/TCDS.2020.2999337.
    [17] SONG Tengfei, ZHENG Wenming, LIU Suyuan, et al. Graph-embedded convolutional neural network for image-based EEG emotion recognition[J]. IEEE Transactions on Emerging Topics in Computing, 2022, 10(3): 1399–1413. doi: 10.1109/TETC.2021.3087174.
    [18] LIU Shuaiqi, ZHAO Yingying, AN Yanling, et al. GLFANet: A global to local feature aggregation network for EEG emotion recognition[J]. Biomedical Signal Processing and Control, 2023, 85: 104799. doi: 10.1016/j.bspc.2023.104799.
    [19] SONG Yonghao, ZHENG Qingqing, LIU Bingchuan, et al. EEG conformer: Convolutional transformer for EEG decoding and visualization[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2023, 31: 710–719. doi: 10.1109/TNSRE.2022.3230250.
    [20] DU Yuxiao, DING Han, WU Min, et al. MES-CTNet: A novel capsule transformer network base on a multi-domain feature map for electroencephalogram-based emotion recognition[J]. Brain Sciences, 2024, 14(4): 344. doi: 10.3390/brainsci14040344.
    [21] ZHANG Xuejun, WANG Tianchen, and WANG Zetian. Convolutional Transformer EEG emotion recognition model based on multi-domain information fusion[J]. Journal of Data Acquisition and Processing, 2024, 39(6): 1543–1552. doi: 10.16337/j.1004-9037.2024.06.021.
    [22] ZHANG Yong, JI Xiaomin, and ZHANG Suhua. An approach to EEG-based emotion recognition using combined feature extraction method[J]. Neuroscience Letters, 2016, 633: 152–157. doi: 10.1016/j.neulet.2016.09.037.
    [23] JENKE R, PEER A, and BUSS M. Feature extraction and selection for emotion recognition from EEG[J]. IEEE Transactions on Affective Computing, 2014, 5(3): 327–339. doi: 10.1109/TAFFC.2014.2339834.
    [24] CHEN Kun, CHAI Shulong, CAI Mincheng, et al. A novel 3D feature fusion network for EEG emotion recognition[J]. Biomedical Signal Processing and Control, 2025, 102: 107347. doi: 10.1016/j.bspc.2024.107347.
    [25] KOELSTRA S, MUHL C, SOLEYMANI M, et al. DEAP: A database for emotion analysis; using physiological signals[J]. IEEE Transactions on Affective Computing, 2012, 3(1): 18–31. doi: 10.1109/T-AFFC.2011.15.
    [26] ZHENG Weilong and LU Baoliang. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks[J]. IEEE Transactions on Autonomous Mental Development, 2015, 7(3): 162–175. doi: 10.1109/TAMD.2015.2431497.
    [27] ZHENG Weilong, LIU Wei, LU Yifei, et al. EmotionMeter: A multimodal framework for recognizing human emotions[J]. IEEE Transactions on Cybernetics, 2019, 49(3): 1110–1122. doi: 10.1109/TCYB.2018.2797176.
    [28] LI Rui, REN Chao, LI Chen, et al. SSTD: A novel spatio-temporal demographic network for EEG-based emotion recognition[J]. IEEE Transactions on Computational Social Systems, 2023, 10(1): 376–387. doi: 10.1109/TCSS.2022.3188891.
    [29] LI Rui, REN Chao, GE Yiqing, et al. MTLFuseNet: A novel emotion recognition model based on deep latent feature fusion of EEG signals and multi-task learning[J]. Knowledge-Based Systems, 2023, 276: 110756. doi: 10.1016/j.knosys.2023.110756.
    [30] CHEN Kun, JING Huchuan, LIU Quan, et al. A novel caps-EEGNet combined with channel selection for EEG-based emotion recognition[J]. Biomedical Signal Processing and Control, 2023, 86: 105312. doi: 10.1016/j.bspc.2023.105312.
    [31] LIN Kai, ZHANG Linhang, CAI Jing, et al. DSE-Mixer: A pure multilayer perceptron network for emotion recognition from EEG feature maps[J]. Journal of Neuroscience Methods, 2024, 401: 110008. doi: 10.1016/j.jneumeth.2023.110008.
    [32] CHEN Kun, RUAN Wenhao, LIU Quan, et al. A novel deep learning model combining 3DCNN-CapsNet and hierarchical attention mechanism for EEG emotion recognition[J]. Neural Networks, 2025, 186: 107267. doi: 10.1016/j.neunet.2025.107267.
    [33] ZHONG Peixiang, WANG Di, MIAO Chunyan, et al. EEG-based emotion recognition using regularized graph neural networks[J]. IEEE Transactions on Affective Computing, 2022, 13(3): 1290–1301. doi: 10.1109/TAFFC.2020.2994159.
    [34] YANG Lijun, WANG Yixin, OUYANG Rujie, et al. Electroencephalogram-based emotion recognition using factorization temporal separable convolution network[J]. Engineering Applications of Artificial Intelligence, 2024, 133: 108011. doi: 10.1016/j.engappai.2024.108011.
    [35] LI Ming, YU Peng, and SHEN Yang. A spatial and temporal transformer-based EEG emotion recognition in VR environment[J]. Frontiers in Human Neuroscience, 2025, 19: 1517273. doi: 10.3389/fnhum.2025.1517273.
    [36] XIAO Guowen, SHI Meng, YE Mengwen, et al. 4D attention-based neural network for EEG emotion recognition[J]. Cognitive Neurodynamics, 2022, 16(4): 805–818. doi: 10.1007/s11571-021-09751-5.
    [37] LIU Jiyao, WU Hao, ZHANG Li, et al. Spatial-temporal transformers for EEG emotion recognition[C]. Proceedings of the 6th International Conference on Advances in Artificial Intelligence, Birmingham, UK, 2023: 116–120. doi: 10.1145/3571560.3571577.
    [38] LI Menghang, QIU Min, KONG Wanzeng, et al. Fusion graph representation of EEG for emotion recognition[J]. Sensors, 2023, 23(3): 1404. doi: 10.3390/s23031404.
    [39] LI Cunbo, TANG Tian, PAN Yue, et al. An efficient graph learning system for emotion recognition inspired by the cognitive prior graph of EEG brain network[J]. IEEE Transactions on Neural Networks and Learning Systems, 2025, 36(4): 7130–7144. doi: 10.1109/TNNLS.2024.3405663.
Publication history
  • Revised: 2025-12-17
  • Accepted: 2025-12-17
  • Published online: 2025-12-25
