Class Incremental Learning for Task-Oriented Dialogue System with Contrastive Distillation on Internal Representations (Student Abstract)

Abstract

The ability to continually learn over time by acquiring new knowledge while retaining previously learned experience is essential for developing an online task-oriented dialogue system (TDS). In this paper, we study the class incremental learning scenario, where the TDS is evaluated without specifying the dialogue domain. We employ contrastive distillation on the intermediate representations of dialogues to learn transferable representations that suffer less from catastrophic forgetting. In addition, we introduce a dynamic update mechanism that explicitly preserves learned experience by updating only the parameters related to the new task while keeping all other parameters fixed. Extensive experiments demonstrate that our method significantly outperforms strong baselines.
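The core idea of contrastive distillation on intermediate representations can be sketched as an InfoNCE-style objective: the new model's representation of a dialogue is pulled toward the frozen old model's representation of the same dialogue and pushed away from those of other dialogues in the batch. The sketch below is a minimal, hypothetical illustration; the function name, temperature value, and tensor shapes are assumptions, not details taken from the paper.

```python
# Hypothetical sketch of contrastive distillation between intermediate
# representations of a frozen "old" model (teacher) and the model being
# trained on a new task (student). Shapes and names are illustrative.
import torch
import torch.nn.functional as F

def contrastive_distillation_loss(student_reps, teacher_reps, temperature=0.1):
    """InfoNCE-style loss: each student representation should be most similar
    to the teacher representation of the *same* dialogue (the positive pair)
    and dissimilar to those of other dialogues in the batch (negatives)."""
    s = F.normalize(student_reps, dim=-1)   # (batch, dim) unit vectors
    t = F.normalize(teacher_reps, dim=-1)   # (batch, dim) unit vectors
    logits = s @ t.T / temperature          # (batch, batch) cosine similarities
    targets = torch.arange(s.size(0))       # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

# Usage: representations from the same intermediate layer for one batch.
batch, dim = 8, 64
student = torch.randn(batch, dim, requires_grad=True)
teacher = torch.randn(batch, dim)           # frozen old-model representations
loss = contrastive_distillation_loss(student, teacher)
loss.backward()                             # gradients flow only into student
```

Because the teacher representations carry no gradient, minimizing this loss regularizes the student toward the old model's representation geometry, which is one plausible way a distillation term can mitigate catastrophic forgetting.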

Cite

Text

Xu et al. "Class Incremental Learning for Task-Oriented Dialogue System with Contrastive Distillation on Internal Representations (Student Abstract)." AAAI Conference on Artificial Intelligence, 2023. doi:10.1609/AAAI.V37I13.27044

Markdown

[Xu et al. "Class Incremental Learning for Task-Oriented Dialogue System with Contrastive Distillation on Internal Representations (Student Abstract)." AAAI Conference on Artificial Intelligence, 2023.](https://mlanthology.org/aaai/2023/xu2023aaai-class/) doi:10.1609/AAAI.V37I13.27044

BibTeX

@inproceedings{xu2023aaai-class,
  title     = {{Class Incremental Learning for Task-Oriented Dialogue System with Contrastive Distillation on Internal Representations (Student Abstract)}},
  author    = {Xu, Qiancheng and Yang, Min and Geng, Binzong},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2023},
  pages     = {16368--16369},
  doi       = {10.1609/AAAI.V37I13.27044},
  url       = {https://mlanthology.org/aaai/2023/xu2023aaai-class/}
}