Volume 44 Issue 6
Jun. 2022
Citation: WU Zheng, CHEN Hongchang, ZHANG Jianpeng. Link Prediction in Knowledge Graphs Based on Hyperbolic Graph Attention Networks[J]. Journal of Electronics & Information Technology, 2022, 44(6): 2184-2194. doi: 10.11999/JEIT210321

Link Prediction in Knowledge Graphs Based on Hyperbolic Graph Attention Networks

doi: 10.11999/JEIT210321
Funds:  The National Natural Science Foundation for Young Scholars (62002384), The Collaborative Innovation Program of Zhengzhou (162/32410218), The General Program of China Postdoctoral Science Foundation (47698)
  • Received Date: 2021-04-16
  • Accepted Date: 2022-01-22
  • Rev Recd Date: 2022-02-28
  • Available Online: 2022-02-14
  • Publish Date: 2022-06-21
  • Most existing knowledge representation learning models treat knowledge triples independently and therefore fail to capture and leverage the feature information in a given entity's neighborhood. Moreover, embedding knowledge graphs with a tree-like hierarchical structure in Euclidean space incurs large distortion in the embeddings. To tackle these issues, a Hyperbolic Graph ATtention network based method for Link Prediction in knowledge graphs (HyGAT-LP) is proposed. First, knowledge graphs are embedded in hyperbolic space with constant negative curvature, which is better suited to their tree-like hierarchical structure. Then, the method aggregates the feature information in a given entity's neighborhood with both entity-level and relation-level attention mechanisms, embedding the entity in a low-dimensional hyperbolic space. Finally, a scoring function computes each triple's score, and links are predicted from these scores, which indicate the probability that the corresponding triples hold. Experimental results show that, compared with baseline models, the proposed method significantly improves link prediction performance on knowledge graphs.
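The abstract describes two geometric building blocks: embeddings in a space of constant negative curvature and attention-based neighborhood aggregation. The paper's exact layer and scoring formulas are not reproduced on this page, so the sketch below only illustrates, under standard Poincaré-ball assumptions, the kind of operations such a model combines: Möbius addition, exponential/logarithmic maps at the origin, tangent-space attention over a neighborhood, and a distance-based triple score. All function names, the attention form, and the scoring form are illustrative assumptions, not the HyGAT-LP definitions.

```python
# Illustrative sketch only: standard Poincare-ball operations and a
# distance-based triple score, NOT the HyGAT-LP formulas from the paper.
import numpy as np

C = 1.0  # the ball models a space of constant negative curvature -C


def mobius_add(x, y, c=C):
    """Mobius addition: the hyperbolic analogue of vector addition."""
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c ** 2 * x2 * y2
    return num / den


def expmap0(v, c=C):
    """Map a tangent vector at the origin onto the Poincare ball."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return np.zeros_like(v)
    return np.tanh(np.sqrt(c) * n) * v / (np.sqrt(c) * n)


def logmap0(x, c=C):
    """Map a point of the ball back to the tangent space at the origin."""
    n = np.linalg.norm(x)
    if n < 1e-12:
        return np.zeros_like(x)
    return np.arctanh(np.sqrt(c) * n) * x / (np.sqrt(c) * n)


def poincare_distance(x, y, c=C):
    """Geodesic distance between two points inside the ball."""
    diff = mobius_add(-x, y, c)
    return (2.0 / np.sqrt(c)) * np.arctanh(np.sqrt(c) * np.linalg.norm(diff))


def attention_aggregate(center, neighbors, c=C):
    """Aggregate neighbor embeddings around `center` with softmax attention,
    computed in the tangent space at the origin and mapped back to the ball."""
    center_t = logmap0(center, c)
    neigh_t = np.stack([logmap0(n, c) for n in neighbors])
    scores = neigh_t @ center_t                       # unnormalized attention
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    agg_t = (weights[:, None] * neigh_t).sum(axis=0)  # weighted tangent-space mean
    return expmap0(agg_t, c)


def score_triple(head, rel, tail, c=C):
    """Plausibility score for (head, rel, tail): negative hyperbolic distance
    between the relation-translated head and the tail (higher = more likely)."""
    return -poincare_distance(mobius_add(head, rel, c), tail, c)


# Toy usage: embeddings must stay strictly inside the unit ball (norm < 1/sqrt(c)).
h = np.array([0.10, 0.20, 0.05])
r = np.array([0.05, -0.10, 0.02])
t = np.array([0.16, 0.09, 0.07])
print("triple score:", score_triple(h, r, t))
print("aggregated neighborhood:", attention_aggregate(h, [t, r]))
```

Aggregating in the tangent space is a common way to reuse Euclidean attention machinery while keeping the resulting embeddings on the hyperbolic manifold; hyperbolic graph-attention models of the kind described in the abstract typically refine this pattern with learned projections and separate entity-level and relation-level attention weights.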
