Traditional Knowledge Graph (KG) embedding methods mainly focus on link prediction for simple triples, and their "head entity-relation-tail entity" modeling paradigm has significant limitations in handling conjunctive queries that contain multiple unknown variables. To address these issues, a complex query-based question-answering model integrating Bidirectional Sequence Embedding (BSE) was proposed. Firstly, a query encoder was constructed on the basis of a bidirectional Transformer architecture to convert the query structure into a serialized representation. Secondly, positional encoding was utilized to preserve graph structure information. Thirdly, the deep semantic associations among all elements in the query graph were modeled dynamically through an Additive Attention Mechanism (AAM). Finally, global information interaction across nodes was realized, effectively addressing the shortcomings of traditional methods in modeling long-distance dependencies. Experiments were conducted on different benchmark datasets to verify the performance advantages of the BSE model. The experimental results show that on the WN18RR-PATHS dataset, the BSE model achieves a 53.01% improvement in the Mean Reciprocal Rank (MRR) metric compared with GQE-DistMult-MP; on the EDUKG dataset, the BSE model outperforms GQE-Bilinear with a 6.09% increase in the Area Under the Curve (AUC) metric. In summary, the proposed model can be applied to query-based question answering in different fields, and has high scalability and application value.
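The abstract names three generic building blocks: a serialized query representation, positional encoding, and additive attention. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch under stated assumptions: standard sinusoidal positional encoding (Vaswani et al.) for the serialized query tokens, and Bahdanau-style additive attention with hypothetical parameter names `W1`, `W2`, `v`; none of these details are claimed to match the BSE model itself.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Standard Transformer sinusoidal encoding; assumed here as one
    generic way to inject position information into a serialized
    query-graph sequence (the paper's actual scheme may differ)."""
    pos = np.arange(seq_len)[:, None]                   # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]                # (1, d_model/2)
    angles = pos / np.power(10000.0, 2 * i / d_model)   # (seq_len, d_model/2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                        # even dims: sine
    pe[:, 1::2] = np.cos(angles)                        # odd dims: cosine
    return pe

def additive_attention(query, keys, W1, W2, v):
    """Bahdanau-style additive attention:
    score(q, k) = v^T tanh(W1 q + W2 k), softmax over keys."""
    scores = np.tanh(query @ W1.T + keys @ W2.T) @ v    # (m,)
    scores -= scores.max()                              # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()     # softmax
    context = weights @ keys                            # weighted sum of keys
    return weights, context

# Toy usage: 5 serialized query-graph tokens, model dim 8, hidden dim 4
rng = np.random.default_rng(0)
d, h, m = 8, 4, 5
tokens = rng.normal(size=(m, d)) + sinusoidal_positional_encoding(m, d)
W1 = rng.normal(size=(h, d))
W2 = rng.normal(size=(h, d))
v = rng.normal(size=h)
weights, context = additive_attention(tokens[0], tokens, W1, W2, v)
```

In a full bidirectional encoder, each token would attend to every other token in this way, which is what enables the global, long-distance interactions the abstract describes.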