Quanlong Guan


2025

KVFKT: A New Horizon in Knowledge Tracing with Attention-Based Embedding and Forgetting Curve Integration
Quanlong Guan | Xiuliang Duan | Kaiquan Bian | Guanliang Chen | Jianbo Huang | Zhiguo Gong | Liangda Fang
Proceedings of the 31st International Conference on Computational Linguistics

Deep-learning-based knowledge tracing (KT) models have been shown to outperform traditional knowledge tracing models while eliminating the need for hand-engineered features. However, problems remain, such as insufficient interpretability of the learning and answering processes. To address these issues, we propose KVFKT, a new knowledge tracing approach with attention-based embedding and forgetting-curve integration. First, the embedding representation module embeds the questions and computes the attention vectors over knowledge concepts (KCs) as students answer questions and answer timestamps are collected. Second, the forgetting quantification module updates the student’s knowledge-state matrix before prediction by computing the interval time and the associated forgetting rate of the relevant KCs according to the forgetting curve. Third, the answer prediction module predicts responses from the student’s knowledge state, a guess coefficient, and question difficulty. Finally, the knowledge-state update module further refines the student’s knowledge state according to the answers given and the characteristics of the questions. The model is evaluated on four real-world datasets, and the results show that KVFKT traces students’ knowledge states more accurately and outperforms state-of-the-art models.
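As a rough illustration of the forgetting quantification step described above, the sketch below applies an Ebbinghaus-style exponential decay to a knowledge-state matrix based on the time elapsed since each KC was last practiced. The function name, the per-KC stability parameter, and the exact decay form are assumptions for illustration only; the paper's actual parameterization of the forgetting curve may differ.

```python
import numpy as np

def apply_forgetting(knowledge_state, interval_time, stability):
    """Decay a student's knowledge-state matrix before prediction.

    Hypothetical illustration of forgetting-curve quantification:
    an Ebbinghaus-style retention factor R = exp(-t / S) is computed
    per knowledge concept (KC) and used to scale that KC's row of the
    knowledge-state matrix. Names and formula are assumptions, not
    the paper's exact formulation.
    """
    # retention in (0, 1]: longer gaps or lower stability -> more forgetting
    retention = np.exp(-interval_time / stability)
    return knowledge_state * retention[:, None]

# Toy example: 3 KCs, each with a 4-dimensional value slot in the state matrix.
state = np.ones((3, 4))                   # fully mastered before the gap
gap = np.array([1.0, 12.0, 48.0])         # hours since each KC was last practiced
stability = np.array([24.0, 24.0, 24.0])  # assumed per-KC memory stability
print(apply_forgetting(state, gap, stability))
```

In a full model, the decayed state would then feed the answer prediction module, and the knowledge-state update module would restore or adjust the affected rows after the student's response is observed.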