《计算机应用》 2026, Vol. 46, Issue (4): 1096-1103. DOI: 10.11772/j.issn.1001-9081.2025040497

• 人工智能 •

融合双向序列嵌入的复杂查询问答模型

梁豪1, 乔少杰2

  1. 1.宁波工程学院 大数据处,浙江 宁波 315211
    2.成都信息工程大学 软件工程学院,成都 610225
  • 收稿日期:2025-05-04 修回日期:2025-08-22 接受日期:2025-08-28 发布日期:2025-09-01 出版日期:2026-04-10
  • 通讯作者: 乔少杰
  • 作者简介:梁豪(1976—),男,浙江宁波人,讲师,硕士,主要研究方向:大数据、人工智能、计算机网络安全
  • 基金资助:
    国家自然科学基金资助项目(62272066);四川省科技计划项目(2025ZNSFSC0044);四川省科技计划项目(2025YFHZ0194)

Complex query-based question-answering model integrating bidirectional sequence embeddings

Hao LIANG1, Shaojie QIAO2

  1. 1.Office of Big Data, Ningbo University of Technology, Ningbo Zhejiang 315211, China
    2.School of Software Engineering, Chengdu University of Information Technology, Chengdu Sichuan 610225, China
  • Received:2025-05-04 Revised:2025-08-22 Accepted:2025-08-28 Online:2025-09-01 Published:2026-04-10
  • Contact: Shaojie QIAO
  • About author:LIANG Hao, born in 1976, M. S., lecturer. His research interests include big data, artificial intelligence, and computer network security.
  • Supported by:
    National Natural Science Foundation of China(62272066);Sichuan Science and Technology Program(2025ZNSFSC0044);Sichuan Science and Technology Program(2025YFHZ0194)

摘要:

传统知识图谱(KG)嵌入方法主要聚焦于简单三元组的链接预测,它的“头实体-关系-尾实体”的建模范式在处理包含多个未知变量的合取查询时存在显著局限性。针对上述问题,提出融合双向序列嵌入(BSE)的复杂查询问答模型。首先,基于双向Transformer架构构建查询编码器,将查询结构转换为序列化表示;其次,利用位置编码保留图结构信息;再次,通过加法注意力机制(AAM)动态建模查询图中所有元素的深层语义关联;最后,实现跨节点的全局信息交互,克服传统方法在长距离依赖建模方面的缺陷。在不同基准数据集上进行实验,验证BSE模型的性能优势。实验结果表明,在WN18RR-PATHS数据集上,与GQE-DistMult-MP相比,BSE模型的平均倒数排名(MRR)指标提高了53.01%;在EDUKG数据集上,与GQE-Bilinear相比,BSE模型的曲线下面积(AUC)指标提高了6.09%。综上所述,所提模型可用于不同领域的查询问答,并且具有较高扩展性与应用价值。

关键词: 知识图谱, 双向序列, 语义关联, 长距离依赖, GQE-DistMult-MP

Abstract:

Traditional Knowledge Graph (KG) embedding methods mainly focus on link prediction for simple triples, and their modeling paradigm of “head entity-relation-tail entity” has significant limitations in handling conjunctive queries containing multiple unknown variables. To address the above issues, a complex query-based question-answering model integrating Bidirectional Sequence Embedding (BSE) was proposed. Firstly, a query encoder was constructed on the basis of a bidirectional Transformer architecture to convert the query structure into a serialized representation. Secondly, positional encoding was utilized to preserve graph structure information. Thirdly, the deep semantic associations among all elements in the query graph were modeled dynamically through an Additive Attention Mechanism (AAM). Finally, global information interaction across nodes was realized, and the shortcomings of traditional methods in modeling long-distance dependencies were addressed effectively. Experiments were conducted on different benchmark datasets to verify the performance advantages of the BSE model. The experimental results show that on the WN18RR-PATHS dataset, compared with GQE-DistMult-MP, the BSE model achieves a 53.01% improvement in the Mean Reciprocal Rank (MRR) metric; on the EDUKG dataset, the BSE model outperforms GQE-Bilinear with a 6.09% increase in the Area Under the Curve (AUC) metric. To sum up, the proposed model can be applied to query-based question-answering in different fields, and has high scalability and application value.
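To give a rough sense of the additive attention mechanism named in the abstract, the following is a minimal NumPy sketch of Bahdanau-style additive attention over a set of query-graph element embeddings. This is an illustration of the general technique only, not the paper's actual implementation; all dimensions, weight matrices, and the interpretation of `query`/`keys` are made-up placeholders:

```python
import numpy as np

# Hypothetical dimensions: embedding dim 4, attention hidden dim 3.
rng = np.random.default_rng(0)
W_q = rng.normal(size=(3, 4))   # projects the query vector
W_k = rng.normal(size=(3, 4))   # projects each key vector
v = rng.normal(size=3)          # scoring vector

def additive_attention(query, keys):
    """Additive (Bahdanau-style) attention:
    score_i = v . tanh(W_q @ query + W_k @ keys[i])."""
    scores = np.array([v @ np.tanh(W_q @ query + W_k @ k) for k in keys])
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ keys                 # weighted sum of key vectors
    return context, weights

query = rng.normal(size=4)       # e.g. embedding of one query-graph node
keys = rng.normal(size=(5, 4))   # e.g. embeddings of 5 query-graph elements
context, weights = additive_attention(query, keys)
```

Unlike scaled dot-product attention, the additive form scores each query-key pair through a small feed-forward network, which is one common way to let every element of a serialized query attend to every other element regardless of distance.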

Key words: Knowledge Graph (KG), bidirectional sequence, semantic association, long-distance dependency, GQE-DistMult-MP
