An Interactive Graph Attention Networks Model for Aspect-level Sentiment Analysis

Hu HAN, Yuanhang WU, Xiaoya QIN

Citation: Hu HAN, Yuanhang WU, Xiaoya QIN. An Interactive Graph Attention Networks Model for Aspect-level Sentiment Analysis[J]. Journal of Electronics & Information Technology, 2021, 43(11): 3282-3290. doi: 10.11999/JEIT210036


doi: 10.11999/JEIT210036
Funds: The National Natural Science Foundation of China (62166024), The National Social Science Foundation of China (17BXW071)
Details
    About the authors:

    Hu HAN: male, born in 1977, professor; research interests: neural networks and deep learning, data mining and natural language processing

    Yuanhang WU: male, born in 1997, master's student; research interests: deep learning and natural language processing

    Xiaoya QIN: female, born in 1996, master's student; research interests: deep learning and natural language processing

    Corresponding author:

    Yuanhang WU  1903552800@qq.com

  • CLC number: TN912

  • Abstract: Aspect-level sentiment analysis currently models aspects and their context words mainly by combining attention mechanisms with conventional neural networks. Such methods ignore the syntactic dependency information and position information between aspects and context words in a sentence, which leads to unreasonable attention weight assignment. To address this, this paper proposes an Interactive Graph ATtention networks model (IGATs) for aspect-level sentiment analysis. The model first uses a Bidirectional Long Short-Term Memory network (BiLSTM) to learn semantic feature representations of the sentence and combines them with position information to generate new sentence representations; on these representations it then builds a graph attention network to capture syntactic dependency information, models the semantic relations between aspects and context words with an interactive attention mechanism, and finally produces the classification output through softmax. Experimental results on three public datasets show that, compared with existing models, IGATs achieves significant improvements in both accuracy and macro-averaged F1.
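
    To make the pipeline described in the abstract concrete, below is a minimal sketch of such a forward pass (BiLSTM encoding, position weighting, dependency-guided graph attention, interactive attention, softmax classification), written in PyTorch. All names and shapes, the single attention head, and the one-directional interactive attention are simplifying assumptions for illustration; the paper's exact formulation may differ.

```python
# Minimal, hypothetical sketch of an IGATs-style forward pass (not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class IGATsSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=300, gat_layers=2, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM encoder producing hidden_dim-dimensional contextual features
        self.bilstm = nn.LSTM(emb_dim, hidden_dim // 2, bidirectional=True, batch_first=True)
        # One single-head graph attention transform per layer (multi-head in the paper)
        self.gat_w = nn.ModuleList([nn.Linear(hidden_dim, hidden_dim) for _ in range(gat_layers)])
        self.gat_a = nn.ModuleList([nn.Linear(2 * hidden_dim, 1) for _ in range(gat_layers)])
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def gat_layer(self, h, adj, w, a):
        # h: (n, d) node features; adj: (n, n) 0/1 dependency adjacency (self-loops included)
        n = h.size(0)
        wh = w(h)                                                   # (n, d)
        pair = torch.cat([wh.unsqueeze(1).expand(n, n, -1),
                          wh.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(a(pair).squeeze(-1))                       # (n, n) attention logits
        e = e.masked_fill(adj == 0, float('-inf'))                  # attend only along dependency edges
        return F.relu(torch.softmax(e, dim=-1) @ wh)

    def forward(self, tokens, position_weights, adj, aspect_mask):
        # tokens: (seq,) word ids; position_weights: (seq,) distance-based weights;
        # adj: (seq, seq) dependency adjacency; aspect_mask: (seq,) 1 on aspect words
        h, _ = self.bilstm(self.embed(tokens).unsqueeze(0))
        h = h.squeeze(0) * position_weights.unsqueeze(-1)           # inject position information
        for w, a in zip(self.gat_w, self.gat_a):
            h = self.gat_layer(h, adj, w, a)                        # syntax-aware GAT layers
        aspect = h[aspect_mask.bool()].mean(dim=0)                  # pooled aspect representation
        # Interactive attention, simplified here to the context attending to the aspect
        scores = torch.softmax(h @ aspect, dim=0)
        context = (scores.unsqueeze(-1) * h).sum(dim=0)
        return self.fc(torch.cat([context, aspect], dim=-1))        # logits; softmax applied in the loss
```

    In this sketch the adjacency matrix is assumed to come from a syntactic dependency parse with self-loops added, and the position weights from a distance-based weighting of each word relative to the aspect term.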
  • Fig. 1  Syntactic dependency tree

    Fig. 2  Architecture of the IGATs model

    Fig. 3  Effect of the number of GAT layers L on model performance

    Fig. 4  Effect of the number of GAT attention heads K on model performance

    Table 1  Dataset statistics

    Dataset           Positive  Neutral  Negative
    Twitter-train     1561      3127     1560
    Twitter-test      173       346      173
    Laptop-train      994       464      870
    Laptop-test       341       169      128
    Restaurant-train  2164      637      807
    Restaurant-test   728       196      196

    Table 2  Experimental platform

    Environment       Details
    Operating system  Windows 10 Education
    CPU               Intel(R) Core(TM) i7-7700 CPU @ 3.60 GHz
    RAM               16.0 GB
    GPU               GTX 1080
    GPU memory        8.0 GB

    Table 3  Hyperparameter settings

    Hyperparameter                  Value
    Word embedding dimension        300
    Hidden state dimension          300
    Batch size                      16
    Training epochs                 100
    Optimizer                       Adam
    Learning rate                   0.001
    Dropout rate                    0.3
    L2 regularization coefficient   0.00001
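
    For illustration only, the settings in Table 3 map roughly to the following training configuration (names follow PyTorch conventions; IGATsSketch is the hypothetical model sketch shown after the abstract, and the vocabulary size is an assumption):

```python
# Hedged mapping of Table 3 to a training setup; values mirror the table, names are assumptions.
import torch

model = IGATsSketch(vocab_size=10_000, emb_dim=300, hidden_dim=300)            # 300-d embeddings and hidden states
optimizer = torch.optim.Adam(model.parameters(), lr=0.001, weight_decay=1e-5)  # Adam, lr 0.001, L2 coefficient 1e-5
batch_size, num_epochs, dropout_rate = 16, 100, 0.3                            # dropout applied inside the model
```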

    Table 4  Performance comparison of models (%)

    Model            Twitter             Laptop              Restaurant
                     Acc     Macro-F1    Acc     Macro-F1    Acc     Macro-F1
    SVM              63.40   63.30       70.49   N/A         80.16   N/A
    LSTM             69.56   67.70       69.28   63.09       78.13   67.47
    MemNet           71.48   69.90       70.64   65.17       79.61   69.64
    IAN              72.50   70.81       72.05   67.38       79.26   70.09
    AOA              72.30   70.20       72.62   67.52       79.97   70.42
    AOA-MultiACIA    72.40   69.40       75.27   70.24       82.59   72.13
    ASGCN            72.15   70.40       75.55   71.05       80.77   72.02
    GATs             73.12   71.25       74.61   70.51       80.63   70.41
    IGATs            75.29   73.40       76.02   72.05       82.32   73.99

    Table 5  Number of trainable parameters per model (M)

    Model      Trainable parameters
    SVM        –
    LSTM       0.72
    MemNet     0.36
    IAN        2.17
    AOA        2.10
    ASGCN      2.17
    GATs       1.81
    IGATs      1.81

    Table 6  Ablation study (%)

    Model              Twitter             Laptop              Restaurant
                       Acc     Macro-F1    Acc     Macro-F1    Acc     Macro-F1
    BiLSTM+IAtt        74.13   72.86       75.08   70.82       81.25   72.14
    BiLSTM+GAT+IAtt    74.86   72.98       74.92   71.08       82.05   73.45
    BiLSTM+PE+IAtt     74.42   72.35       76.65   72.75       82.23   74.01
    IGATs              75.29   73.40       76.02   72.05       82.32   73.99
Publication history
  • Received: 2021-01-11
  • Revised: 2021-09-27
  • Published online: 2021-10-09
  • Issue published: 2021-11-23
