Co-Attention and Contrastive Learning Driven Knowledge Tracing

Abstract

Knowledge Tracing (KT) aims to trace the dynamic knowledge state of learners during the learning process. It has become a critical task in improving intelligent education system services. However, most existing KT models oversimplify the modeling of the knowledge state, ignoring two practical issues in the learning process: (1) the mutual influence of learning ability and knowledge mastery and (2) the process of knowledge internalization. This paper proposes a novel Co-attention and Contrastive Learning Driven Knowledge Tracing (CCKT) architecture, which models the cognitive process in detail based on pedagogical theories. Specifically, we first propose the Ability and Knowledge Co-attention Layer to simulate the mutual influence between learning ability and knowledge mastery, enabling collaborative attention. Then, we design a Knowledge Internalization Encoder to promote the transformation of understanding-level knowledge mastery into application-level knowledge mastery. Meanwhile, we further design top-down attention signals to guide the knowledge internalization process. Finally, contrastive learning is used to enrich data representations and improve the reliability of knowledge state tracing. Extensive experimental evaluations on four real public benchmark datasets demonstrate the superior performance of our approach compared to several existing methods.
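The abstract does not spell out the co-attention computation, but the general idea behind co-attention layers of this kind is to build a shared affinity matrix between two sets of representations (here, learning-ability and knowledge-mastery vectors) and normalize it in both directions so each side attends over the other. The following is a minimal, hypothetical sketch of that generic mechanism; the function names, toy vectors, and scaled dot-product affinity are illustrative assumptions, not the paper's actual implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def co_attention(ability, knowledge):
    """Toy co-attention sketch (assumed mechanism, not the paper's):
    a shared affinity matrix between ability and knowledge vectors,
    softmax-normalized along rows and columns to yield attention in
    both directions."""
    d = len(ability[0])
    # Affinity matrix of scaled dot products between every pair.
    affinity = [[sum(a * k for a, k in zip(av, kv)) / math.sqrt(d)
                 for kv in knowledge] for av in ability]
    # Ability-to-knowledge attention: softmax over each row.
    a2k = [softmax(row) for row in affinity]
    # Knowledge-to-ability attention: softmax over each column.
    k2a = [softmax(list(col)) for col in zip(*affinity)]
    return a2k, k2a

# Toy example: 2 ability vectors, 3 knowledge vectors.
ability = [[1.0, 0.0], [0.0, 1.0]]
knowledge = [[1.0, 1.0], [0.5, -0.5], [0.0, 2.0]]
a2k, k2a = co_attention(ability, knowledge)
```

Each row of `a2k` (and of `k2a`) is a probability distribution, so the two sides mutually weight each other, mirroring the mutual-influence idea the abstract describes.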

Cite

Text

Zheng and Shan. "Co-Attention and Contrastive Learning Driven Knowledge Tracing." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024. doi:10.1007/978-3-031-70362-1_11

Markdown

[Zheng and Shan. "Co-Attention and Contrastive Learning Driven Knowledge Tracing." European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2024.](https://mlanthology.org/ecmlpkdd/2024/zheng2024ecmlpkdd-coattention/) doi:10.1007/978-3-031-70362-1_11

BibTeX

@inproceedings{zheng2024ecmlpkdd-coattention,
  title     = {{Co-Attention and Contrastive Learning Driven Knowledge Tracing}},
  author    = {Zheng, Ning and Shan, Zhilong},
  booktitle = {European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases},
  year      = {2024},
  pages     = {177--194},
  doi       = {10.1007/978-3-031-70362-1_11},
  url       = {https://mlanthology.org/ecmlpkdd/2024/zheng2024ecmlpkdd-coattention/}
}