Causal Relation Extraction of Uyghur Events Based on Bidirectional Long Short-term Memory Model

TIAN Shengwei, ZHOU Xingfa, YU Long, FENG Guanjun, Aishan WUMAIER, LI Pu

Citation: TIAN Shengwei, ZHOU Xingfa, YU Long, FENG Guanjun, Aishan WUMAIER, LI Pu. Causal Relation Extraction of Uyghur Events Based on Bidirectional Long Short-term Memory Model[J]. Journal of Electronics & Information Technology, 2018, 40(1): 200-208. doi: 10.11999/JEIT170402

doi: 10.11999/JEIT170402

Funds: 

The National Natural Science Foundation of China (61662074, 61563051, 61262064), The Key Project of National Natural Science Foundation of China (61331011), Xinjiang Uygur Autonomous Region Scientific and Technological Personnel Training Project (QN2016YX0051)

  • Abstract: To address the problem that traditional methods cannot effectively extract causal relations between Uyghur events, this paper proposes a Uyghur event causal relation extraction method based on the Bidirectional Long Short-Term Memory (BiLSTM) model. Drawing on a study of the Uyghur language and of the characteristics of event causal relations, 10 features based on the internal structural information of events are extracted. To make full use of event semantic information, word embeddings are introduced as the input to the BiLSTM, which extracts the deep semantic features implicit in event sentences, and the Batch Normalization (BN) algorithm is applied to accelerate the convergence of the BiLSTM. Finally, the two kinds of features are fused and fed into a softmax classifier to complete the extraction of Uyghur event causal relations. Experimental results show that the method achieves a precision of 89.19%, a recall of 83.19%, and an F-measure of 86.09% on Uyghur event causal relation extraction, demonstrating its effectiveness.
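The abstract above outlines a concrete data flow: word embeddings feed a BiLSTM, batch normalization is applied to the resulting sentence representation, that representation is fused with the 10 hand-crafted event-structure features, and a softmax classifier makes the final decision. A minimal sketch of that flow is given below, assuming a PyTorch implementation; the layer sizes, feature dimension, class count, and all names are illustrative assumptions by the editor, not the authors' code.

    import torch
    import torch.nn as nn

    class BiLSTMCausalClassifier(nn.Module):
        # Sketch of the described pipeline: word embeddings -> BiLSTM -> batch
        # normalization, fused with 10 hand-crafted event-structure features,
        # then a softmax classifier over the relation labels.
        def __init__(self, vocab_size, embed_dim=100, hidden_dim=128,
                     n_struct_feats=10, n_classes=2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embed_dim)    # word-embedding lookup
            self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                                  batch_first=True, bidirectional=True)
            self.bn = nn.BatchNorm1d(2 * hidden_dim)                # BN on the sentence vector
            self.fc = nn.Linear(2 * hidden_dim + n_struct_feats, n_classes)

        def forward(self, token_ids, struct_feats):
            # token_ids: (batch, seq_len) word indices
            # struct_feats: (batch, n_struct_feats) hand-crafted features
            emb = self.embedding(token_ids)
            _, (h_n, _) = self.bilstm(emb)                          # h_n: (2, batch, hidden_dim)
            sent_vec = torch.cat([h_n[0], h_n[1]], dim=-1)          # forward + backward final states
            sent_vec = self.bn(sent_vec)
            fused = torch.cat([sent_vec, struct_feats], dim=-1)     # feature fusion
            return torch.softmax(self.fc(fused), dim=-1)            # class probabilities

The only design point this sketch commits to is the fusion step: the BiLSTM sentence vector and the structural feature vector are concatenated before the final linear layer, mirroring the feature fusion described in the abstract. The feature definitions, pre-trained embeddings, and training details are given in the paper itself.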
Metrics
  • Article views:  2184
  • Full-text HTML views:  316
  • PDF downloads:  258
  • Citations: 0
Publication History
  • Received:  2017-05-02
  • Revised:  2017-07-19
  • Published:  2018-01-19
