Journal of Computer Applications, 2023, Vol. 43, Issue (2): 343-348. DOI: 10.11772/j.issn.1001-9081.2022010024

• Artificial Intelligence •

Temporal convolutional knowledge tracing model with attention mechanism

Xiaomeng SHAO, Meng ZHANG

  1. School of Computer Science, Central China Normal University, Wuhan, Hubei 430079, China
  • Received: 2022-01-10 Revised: 2022-03-11 Accepted: 2022-03-14 Online: 2022-03-22 Published: 2023-02-10
  • Contact: Meng ZHANG
  • About author: SHAO Xiaomeng, born in 1997 in Cangzhou, Hebei, M. S. candidate, CCF member. Her research interests include machine learning and educational data mining.
  • Supported by:
    Fundamental Research Funds for the Central Universities (CCNU19TS020)


Abstract:

To address the problems of insufficient interpretability and long-sequence dependency in deep knowledge tracing models based on Recurrent Neural Networks (RNN), a model named Temporal Convolutional Knowledge Tracing with Attention mechanism (ATCKT) was proposed. Firstly, embedded representations of students' historical interactions were learned during training. Then, an exercise-based attention mechanism was used to learn a specific weight matrix that identifies and strengthens the influence of each historical interaction on the knowledge state at each moment. Finally, the students' dynamically changing knowledge states were extracted by a Temporal Convolutional Network (TCN), in which dilated convolutions and deep stacking enlarge the receptive field over the sequence and alleviate the long-sequence dependency problem. Experimental results show that, compared with four models including Deep Knowledge Tracing (DKT) and Convolutional Knowledge Tracing (CKT) on four datasets (ASSISTments2009, ASSISTments2015, Statics2011 and Synthetic-5), the ATCKT model achieves significantly higher Area Under the Curve (AUC) and Accuracy (ACC), performing best on the ASSISTments2015 dataset with increases of 6.83 to 20.14 percentage points and 7.52 to 11.22 percentage points respectively; meanwhile, its training time is 26% shorter than that of the DKT model. In summary, the proposed model can capture students' knowledge states more accurately and predict their future performance more efficiently.

Key words: knowledge tracing, Temporal Convolutional Network (TCN), attention mechanism, sequence modeling, educational data mining
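The pipeline described in the abstract (interaction embeddings, exercise-based attention over past interactions, and TCN-style dilated causal convolutions producing knowledge states) can be sketched roughly as follows. This is a minimal NumPy illustration under assumed dimensions; the similarity-based attention scoring, kernel size, dilation schedule, and output head are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 8  # sequence length and embedding size (assumed)

# 1) Embedded representations of the student's historical interactions
#    and of the exercises themselves (randomly initialized here).
interactions = rng.standard_normal((T, d))
exercises = rng.standard_normal((T, d))

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# 2) Exercise-based attention: at each step t, weight the interactions up to t
#    by their similarity to the current exercise (causal lower-triangular mask),
#    yielding a per-step weight matrix over the history.
scores = exercises @ interactions.T / np.sqrt(d)   # (T, T) similarity scores
mask = np.tril(np.ones((T, T)))                    # step t attends to steps <= t
scores = np.where(mask == 1, scores, -1e9)
weights = softmax(scores)
attended = weights @ interactions                  # (T, d) attended history

# 3) TCN-style dilated causal convolutions (kernel size 2); stacking layers
#    with dilations 1 and 2 widens the receptive field over the sequence.
def dilated_causal_conv(x, w, dilation):
    t, dim = x.shape
    pad = np.vstack([np.zeros((dilation, dim)), x])       # left-pad => causal
    return np.maximum(0, pad[:t] @ w[0] + pad[dilation:dilation + t] @ w[1])

w1 = rng.standard_normal((2, d, d)) * 0.1
w2 = rng.standard_normal((2, d, d)) * 0.1
h = dilated_causal_conv(attended, w1, dilation=1)
h = dilated_causal_conv(h, w2, dilation=2)                # knowledge states (T, d)

# 4) Predict the probability of a correct answer at each step.
w_out = rng.standard_normal(d) * 0.1
p_correct = 1 / (1 + np.exp(-(h @ w_out)))                # (T,)
```

In a trained model the embeddings and weights would be learned end-to-end; the sketch only shows how the causal mask and left-padded dilated convolutions keep each knowledge state dependent on past interactions alone.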

