Journal of Computer Applications (official website) ›› 2022, Vol. 42 ›› Issue (7): 1985-1992. DOI: 10.11772/j.issn.1001-9081.2021050764
Special topic: Artificial Intelligence
陈恒1,2, 王思懿1, 李正光1, 李冠宇2, 刘鑫1
About author:
WANG Siyi, born in 1998 in Wafangdian, Liaoning, of Manchu ethnicity, M. S. candidate. Her research interests include machine learning and knowledge graphs.
Heng CHEN1,2, Siyi WANG1, Zhengguang LI1, Guanyu LI2, Xin LIU1
Received:
2021-05-12
Revised:
2021-09-15
Accepted:
2021-09-22
Online:
2021-09-15
Published:
2022-07-10
Contact:
Heng CHEN
About author:
CHEN Heng, born in 1982, Ph. D. candidate, associate professor. His research interests include machine learning and knowledge graph completion.
Abstract:
As a semantic knowledge base, a Knowledge Graph (KG) stores real-world entities and their intrinsic relations in the form of structured triples. To infer the missing true triples in a knowledge graph, a capsule network knowledge graph embedding model based on relational memory was proposed, exploiting the strong triple representation ability of the relational memory network and the powerful feature processing ability of the capsule network. First, an encoded embedding vector was formed by encoding the latent dependencies between entities and relations together with some key information. Then, the embedding vector was convolved with filters to generate different feature maps, which were recomposed into the corresponding capsules. Finally, the connections from the parent capsules to the child capsules were specified by the squash function and dynamic routing, and the plausibility of the current triple was judged by the score of the inner product between the child capsule and a weight vector. Link prediction experiments show that, compared with the CapsE model, the proposed model improves the Mean Reciprocal Rank (MRR) and Hit@10 by 7.95% and 2.2 percentage points respectively on the WN18RR dataset, and by 3.82% and 2 percentage points respectively on the FB15K-237 dataset. The results demonstrate that the proposed model can infer the relation between head and tail entities more accurately.
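To make the scoring pipeline in the abstract concrete, here is a minimal NumPy sketch of a CapsE-style capsule scoring step: the triple embedding matrix is convolved with filters, the feature maps are regrouped into capsules, squashed, routed to an output capsule, and scored by an inner product with a weight vector. All names (`score_triple`, `squash`), dimensions and random parameters are illustrative assumptions rather than the authors' implementation, and the relational-memory encoder that produces the input embeddings is abstracted away as precomputed vectors.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-9):
    """Capsule squash non-linearity: keeps the direction, maps the length into [0, 1)."""
    sq_norm = np.sum(v ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * v / np.sqrt(sq_norm + eps)

def score_triple(h, r, t, filters, W, w_score, routing_iters=3):
    """Score one (h, r, t) triple with a capsule layer.

    h, r, t : (k,) embedding vectors, assumed to already carry the relational-memory
              encoding (the encoder itself is omitted from this sketch).
    filters : (n_filters, 3) convolution filters slid over the rows of [h; r; t],
              giving n_filters feature maps of length k.
    W       : (k, n_filters * d_out) routing weights from the k primary capsules
              (one per row, size n_filters) to a single d_out-dimensional output capsule.
    w_score : (d_out,) weight vector; the final score is its inner product with the
              output capsule, as described in the abstract.
    """
    A = np.stack([h, r, t], axis=1)                   # (k, 3) triple matrix
    feature_maps = A @ filters.T                      # (k, n_filters) feature maps
    u = squash(feature_maps, axis=1)                  # k primary capsules of size n_filters

    k, n_filters = u.shape
    d_out = W.shape[1] // n_filters
    # Prediction vectors sent from each primary capsule to the output capsule.
    u_hat = (u[:, None, :] @ W.reshape(k, n_filters, d_out)).squeeze(1)   # (k, d_out)

    # Dynamic routing by agreement.
    b = np.zeros(k)
    for _ in range(routing_iters):
        c = np.exp(b) / np.exp(b).sum()               # coupling coefficients (softmax)
        v = squash((c[:, None] * u_hat).sum(axis=0))  # candidate output capsule
        b = b + u_hat @ v                             # raise coupling where predictions agree

    return float(v @ w_score)                         # higher score = more plausible triple

# Toy usage with random embeddings (k = 8, 4 filters, 2-D output capsule).
rng = np.random.default_rng(0)
k, n_filters, d_out = 8, 4, 2
h, r, t = (rng.normal(size=k) for _ in range(3))
print(score_triple(h, r, t,
                   filters=rng.normal(size=(n_filters, 3)),
                   W=rng.normal(size=(k, n_filters * d_out)),
                   w_score=rng.normal(size=d_out)))
```

According to the abstract, what distinguishes the proposed model from CapsE is that the embeddings fed into this capsule layer are first produced by a relational memory encoder, which is deliberately left out of the sketch above.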
Heng CHEN, Siyi WANG, Zhengguang LI, Guanyu LI, Xin LIU. Capsule network knowledge graph embedding model based on relational memory[J]. Journal of Computer Applications, 2022, 42(7): 1985-1992.
Tab. 1 Dataset statistics

| Dataset | #Entities | #Relations | #Valid | #Train | #Test |
| --- | --- | --- | --- | --- | --- |
| WN18RR | 40 943 | 11 | 3 034 | 86 835 | 3 134 |
| FB15K-237 | 14 541 | 237 | 17 535 | 272 115 | 20 466 |
| FB13 | 75 043 | 13 | 11 816 | 316 232 | 47 466 |
| WN11 | 38 696 | 11 | 5 218 | 112 581 | 21 088 |
Tab. 2 Link prediction results on datasets WN18RR and FB15K-237 (the first five metric columns report FB15K-237, the last five WN18RR)

| Model | MR | MRR | Hit@1 | Hit@3 | Hit@10 | MR | MRR | Hit@1 | Hit@3 | Hit@10 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| TransE | 357 | 0.294 | — | — | 0.465 | 3 384 | 0.226 | — | — | 0.501 |
| DistMult | 254 | 0.241 | 0.155 | 0.263 | 0.419 | 5 110 | 0.430 | 0.390 | 0.440 | 0.490 |
| ComplEx | 339 | 0.247 | 0.158 | 0.275 | 0.428 | 5 261 | 0.440 | 0.410 | 0.460 | 0.510 |
| ConvE | 244 | 0.325 | 0.237 | 0.356 | 0.501 | 4 187 | 0.430 | 0.400 | 0.440 | 0.520 |
| ConvKB | 254 | 0.418 | — | — | 0.532 | 763 | 0.253 | — | — | 0.567 |
| RotatE[29] | 177 | 0.338 | 0.241 | 0.375 | 0.533 | 3 340 | 0.476 | 0.428 | 0.492 | 0.571 |
| CapsE | 303 | 0.523 | 0.478 | — | 0.593 | 719 | 0.415 | 0.337 | — | 0.560 |
| Proposed model | 324 | 0.543 | 0.436 | 0.385 | 0.613 | 706 | 0.448 | 0.403 | 0.512 | 0.582 |
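The improvements over CapsE quoted in the abstract can be reproduced directly from the CapsE and proposed-model rows of Tab. 2: the MRR figures are relative improvements, while the Hit@10 figures are absolute percentage-point differences. The short Python check below (variable names chosen here for illustration) verifies both.

```python
# Reproduce the improvements over CapsE reported in the abstract from Tab. 2.
results = {
    # dataset: (CapsE MRR, proposed MRR, CapsE Hit@10, proposed Hit@10)
    "WN18RR":    (0.415, 0.448, 0.560, 0.582),
    "FB15K-237": (0.523, 0.543, 0.593, 0.613),
}

for dataset, (mrr_capse, mrr_ours, hit_capse, hit_ours) in results.items():
    rel_mrr = (mrr_ours - mrr_capse) / mrr_capse * 100   # relative improvement, %
    delta_hit = (hit_ours - hit_capse) * 100             # absolute gain, percentage points
    print(f"{dataset}: MRR +{rel_mrr:.2f}%, Hit@10 +{delta_hit:.1f} pp")

# Expected output:
# WN18RR: MRR +7.95%, Hit@10 +2.2 pp
# FB15K-237: MRR +3.82%, Hit@10 +2.0 pp
```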
Tab. 3 Triple classification results on datasets WN11 and FB13 (accuracy, %)

| Model | WN11 | FB13 |
| --- | --- | --- |
| TransE | 89.2 | 88.1 |
| TransH | 78.8 | 83.3 |
| TransR | 85.9 | 82.5 |
| TransD[30] | 86.4 | 89.1 |
| TranSparse-S[31] | 86.4 | 88.2 |
| TranSparse-US[31] | 86.8 | 87.5 |
| TransG[32] | 87.4 | 87.3 |
| ConvKB | 87.6 | 88.8 |
| R-MeN | 90.5 | 88.9 |
| Proposed model | 91.5 | 87.5 |
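The accuracy values in Tab. 3 are typically computed with the standard triple classification protocol introduced with the neural tensor network (reference 9): a triple is judged true when its model score reaches a relation-specific threshold tuned on the validation set, and accuracy is the fraction of correctly classified test triples. The sketch below illustrates only that general protocol; the `classify_triples` helper, its threshold grid and the toy scores are assumptions for illustration, not the paper's code.

```python
from collections import defaultdict

def classify_triples(scored_valid, scored_test):
    """Threshold-based triple classification (protocol of Socher et al., reference 9).

    scored_valid / scored_test: lists of (relation, score, is_true) tuples, where
    `score` is the model's plausibility score for the triple (e.g. the capsule score).
    Returns test accuracy in percent.
    """
    # 1) Pick a per-relation threshold that maximises validation accuracy.
    by_rel = defaultdict(list)
    for rel, score, label in scored_valid:
        by_rel[rel].append((score, label))
    thresholds = {}
    for rel, pairs in by_rel.items():
        candidates = sorted({s for s, _ in pairs})
        thresholds[rel] = max(
            candidates,
            key=lambda th: sum((s >= th) == lbl for s, lbl in pairs),
        )

    # 2) A test triple is classified as true iff its score reaches the threshold.
    correct = sum(
        (score >= thresholds.get(rel, 0.0)) == label
        for rel, score, label in scored_test
    )
    return 100.0 * correct / len(scored_test)

# Tiny toy example with hand-made scores for two relations.
valid = [("r1", 0.9, True), ("r1", 0.2, False), ("r2", 0.7, True), ("r2", 0.6, False)]
test  = [("r1", 0.95, True), ("r1", 0.1, False), ("r2", 0.75, True), ("r2", 0.5, False)]
print(f"toy accuracy: {classify_triples(valid, test):.1f}%")  # -> toy accuracy: 100.0%
```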
1 | LIU Z Y, SUN M S, LIN Y K, et al. Knowledge representation learning: a review[J]. Journal of Computer Research and Development, 2016, 53(2): 247-261. (in Chinese) |
2 | SUCHANEK F M, KASNECI G, WEIKUM G. YAGO: a core of semantic knowledge[C]// Proceedings of the 16th International Conference on World Wide Web. New York: ACM, 2007: 697-706. 10.1145/1242572.1242667 |
3 | BOLLACKER K, EVANS C, PARITOSH P, et al. Freebase: a collaboratively created graph database for structuring human knowledge[C]// Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data. New York: ACM, 2008: 1247-1250. 10.1145/1376616.1376746 |
4 | LEHMANN J, ISELE R, JAKOB M, et al. DBpedia — a large-scale, multilingual knowledge base extracted from Wikipedia[J]. Semantic Web, 2015, 6(2): 167-195. 10.3233/sw-140134 |
5 | ZHANG F Z, YUAN N J, LIAN D F, et al. Collaborative knowledge base embedding for recommender systems[C]// Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2016: 353-362. 10.1145/2939672.2939673 |
6 | HAO Y C, ZHANG Y Z, LIU K, et al. An end-to-end model for question answering over knowledge base with cross-attention combining global knowledge[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA: Association for Computational Linguistics, 2017: 221-231. 10.18653/v1/p17-1021 |
7 | XIONG C Y, POWER R, CALLAN J. Explicit semantic ranking for academic search via knowledge graph embedding[C]// Proceedings of the 26th International Conference on World Wide Web. Republic and Canton of Geneva: International World Wide Web Conferences Steering Committee, 2017: 1271-1279. 10.1145/3038912.3052558 |
8 | YANG B S, MITCHELL T. Leveraging knowledge bases in LSTMs for improving machine reading[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA: Association for Computational Linguistics, 2017: 1436-1446. 10.18653/v1/p17-1132 |
9 | SOCHER R, CHEN D Q, MANNING C D, et al. Reasoning with neural tensor networks for knowledge base completion[C]// Proceedings of the 26th International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2013: 926-934. |
10 | WEST R, GABRILOVICH E, MURPHY K, et al. Knowledge base completion via search-based question answering[C]// Proceedings of the 23rd International Conference on World Wide Web. New York: ACM, 2014: 515-526. 10.1145/2566486.2568032 |
11 | CHEN H, WANG W W, LI G Y, et al. A quaternion-embedded capsule network model for knowledge graph completion[J]. IEEE Access, 2020, 8: 100890-100904. 10.1109/access.2020.2997177 |
12 | ZHANG Z Q, CAI J Y, ZHANG Y D, et al. Learning hierarchy-aware knowledge graph embeddings for link prediction[C]// Proceedings of the 34th AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2020: 3065-3072. 10.1609/aaai.v34i03.5701 |
13 | BORDES A, USUNIER N, GARCIA-DURÁN A, et al. Translating embeddings for modeling multi-relational data[C]// Proceedings of the 26th International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2013: 2787-2795. |
14 | WANG Z, ZHANG J W, FENG J L, et al. Knowledge graph embedding by translating on hyperplanes[C]// Proceedings of the 28th AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2014: 1112-1119. 10.1609/aaai.v28i1.8870 |
15 | LIN Y K, LIU Z Y, SUN M S, et al. Learning entity and relation embeddings for knowledge graph completion[C]// Proceedings of the 29th AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2015: 2181-2187. 10.1609/aaai.v29i1.9491 |
16 | YANG B S, YIH W T, HE X D, et al. Embedding entities and relations for learning and inference in knowledge bases[EB/OL]. (2015-08-29) [2021-09-04]. |
17 | TROUILLON T, WELBL J, RIEDEL S, et al. Complex embeddings for simple link prediction[C]// Proceedings of the 33rd International Conference on Machine Learning. New York: JMLR.org, 2016: 2071-2080. |
18 | DETTMERS T, MINERVINI P, STENETORP P, et al. Convolutional 2D knowledge graph embeddings[C]// Proceedings of the 32nd AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2018: 1811-1818. 10.1609/aaai.v32i1.11573 |
19 | NGUYEN D Q, NGUYEN T D, NGUYEN D Q, et al. A novel embedding model for knowledge base completion based on convolutional neural network[C]// Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers). Stroudsburg, PA: Association for Computational Linguistics, 2018: 327-333. 10.18653/v1/n18-2053 |
20 | NGUYEN D Q, VU T, NGUYEN T D, et al. A capsule network-based embedding model for knowledge graph completion and search personalization[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long and Short Papers). Stroudsburg, PA: Association for Computational Linguistics, 2019: 2180-2189. 10.18653/v1/n19-1226 |
21 | NGUYEN D Q, NGUYEN T D, PHUNG D. A relational memory-based embedding model for triple classification and search personalization[C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA: Association for Computational Linguistics, 2020: 3429-3435. 10.18653/v1/2020.acl-main.313 |
22 | KIM Y, LEE H, JUNG K. AttnConvnet at SemEval-2018 task 1: attention-based convolutional neural networks for multi-label emotion classification[C]// Proceedings of the 12th International Workshop on Semantic Evaluation. Stroudsburg, PA: Association for Computational Linguistics, 2018: 141-145. 10.18653/v1/s18-1019 |
23 | JIA X D, WANG L. Text classification model based on multi-head attention capsule networks[J]. Journal of Tsinghua University (Science and Technology), 2020, 60(5): 415-421. (in Chinese) |
24 | CHEN H, LI G Y, QI R H, et al. Capsule network's application in knowledge graph completion[J]. Computer Engineering and Applications, 2020, 56(8): 110-116. (in Chinese) |
25 | VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2017: 6000-6010. |
26 | SANTORO A, FAULKNER R, RAPOSO D, et al. Relational recurrent neural networks[C]// Proceedings of the 32nd International Conference on Neural Information Processing Systems. Red Hook, NY: Curran Associates Inc., 2018: 7310-7321. |
27 | KINGMA D P, BA J L. Adam: a method for stochastic optimization[EB/OL]. (2017-01-30) [2021-09-04]. |
28 | TOUTANOVA K, CHEN D Q. Observed versus latent features for knowledge base and text inference[C]// Proceedings of the 3rd Workshop on Continuous Vector Space Models and their Compositionality. Stroudsburg, PA: Association for Computational Linguistics, 2015: 57-66. 10.18653/v1/w15-4007 |
29 | SUN Z Q, DENG Z H, NIE J Y, et al. RotatE: knowledge graph embedding by relational rotation in complex space[EB/OL]. (2019-02-26) [2021-09-04]. |
30 | JI G L, HE S Z, XU L H, et al. Knowledge graph embedding via dynamic mapping matrix[C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Stroudsburg, PA: Association for Computational Linguistics, 2015: 687-696. 10.3115/v1/p15-1067 |
31 | JI G L, LIU K, HE S Z, et al. Knowledge graph completion with adaptive sparse transfer matrix[C]// Proceedings of the 30th AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2016: 985-991. 10.1609/aaai.v30i1.10089 |
32 | XIAO H, HUANG M L, ZHU X Y. TransG: a generative mixture model for knowledge graph embedding[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Stroudsburg, PA: Association for Computational Linguistics, 2016: 2316-2325. 10.18653/v1/p16-1219 |