Journal of Computer Applications ›› 2023, Vol. 43 ›› Issue (2): 343-348.DOI: 10.11772/j.issn.1001-9081.2022010024

• Artificial intelligence •

Temporal convolutional knowledge tracing model with attention mechanism

Xiaomeng SHAO, Meng ZHANG

  1. School of Computer Science, Central China Normal University, Wuhan, Hubei 430079, China
  • Received: 2022-01-10 Revised: 2022-03-11 Accepted: 2022-03-14 Online: 2022-03-22 Published: 2023-02-10
  • Contact: Meng ZHANG
  • About author: SHAO Xiaomeng, born in 1997, M. S. candidate, CCF member. Her research interests include machine learning and educational data mining.
  • Supported by:
    Fundamental Research Funds for the Central Universities (CCNU19TS020)



To address the insufficient interpretability and long-sequence dependency problems of deep knowledge tracing models based on Recurrent Neural Networks (RNNs), a model named Temporal Convolutional Knowledge Tracing with Attention mechanism (ATCKT) was proposed. Firstly, embedded representations of students' historical interactions were learned during training. Then, an exercise-based attention mechanism was used to learn a specific weight matrix that identifies and strengthens the influence of each historical interaction on the knowledge state at every moment. Finally, the students' knowledge states were extracted by a Temporal Convolutional Network (TCN), in which dilated convolutions and deep stacking enlarge the receptive field over the sequence and alleviate the long-sequence dependency problem. Experimental results show that, compared with four models including Deep Knowledge Tracing (DKT) and Convolutional Knowledge Tracing (CKT) on four datasets (ASSISTments2009, ASSISTments2015, Statics2011 and Synthetic-5), the ATCKT model achieves significant improvements in Area Under the Curve (AUC) and Accuracy (ACC), especially on the ASSISTments2015 dataset, with increases of 6.83 to 20.14 percentage points and 7.52 to 11.22 percentage points respectively; at the same time, the training time of the proposed model is 26% lower than that of the DKT model. In summary, this model can accurately capture students' knowledge states and efficiently predict their future performance.
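The pipeline the abstract describes (embed historical interactions, weight them with an exercise-based causal attention, then extract knowledge states with dilated causal convolutions) can be sketched as follows. This is a minimal NumPy illustration of the two building blocks, not the authors' actual ATCKT implementation: all dimensions, weight initializations, and function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def exercise_attention(queries, keys, values):
    """Causal attention: the state at step t attends only to interactions <= t."""
    T, d = queries.shape
    scores = queries @ keys.T / np.sqrt(d)            # (T, T) similarity matrix
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf                            # forbid attending to the future
    return softmax(scores, axis=-1) @ values          # (T, d) weighted history

def dilated_causal_conv(x, w, dilation):
    """1-D causal convolution with dilation; x: (T, d_in), w: (k, d_in, d_out)."""
    T, d_in = x.shape
    k = w.shape[0]
    pad = (k - 1) * dilation
    xp = np.vstack([np.zeros((pad, d_in)), x])        # left padding keeps causality
    out = np.zeros((T, w.shape[2]))
    for t in range(T):
        for i in range(k):                            # tap i looks i*dilation steps back
            out[t] += xp[t + pad - i * dilation] @ w[k - 1 - i]
    return np.maximum(out, 0.0)                       # ReLU activation

T, d = 8, 4
# Embedded (exercise, response) interaction sequence for one student (illustrative).
interactions = rng.standard_normal((T, d))
attended = exercise_attention(interactions, interactions, interactions)

# Stacked TCN layers: doubling the dilation doubles the receptive field per layer,
# which is how the TCN covers long sequences without recurrence.
h = attended
for dilation in (1, 2, 4):
    w = rng.standard_normal((2, h.shape[1], d)) * 0.1
    h = dilated_causal_conv(h, w, dilation)

# Per-step probability of answering the next exercise correctly (illustrative head).
p_correct = 1.0 / (1.0 + np.exp(-h @ rng.standard_normal(d)))
print(p_correct.shape)
```

The causal mask in the attention and the left-only padding in the convolution together guarantee that the predicted knowledge state at step t never depends on interactions after t, which is the property a knowledge tracing model must preserve.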

Key words: knowledge tracing, Temporal Convolutional Network (TCN), attention mechanism, sequence modeling, educational data mining

