Continual Relation Extraction via Sequential Multi-Task Learning

Abstract

Building continual relation extraction (CRE) models that can adapt to an ever-growing ontology of relations is a cornerstone information extraction task serving various dynamic real-world domains. To mitigate catastrophic forgetting in CRE, existing state-of-the-art approaches have effectively applied rehearsal techniques from continual learning and achieved remarkable success. However, managing the multiple objectives associated with memory-based rehearsal remains underexplored: the objectives are often combined by simple summation, overlooking their complex trade-offs. In this paper, we propose Continual Relation Extraction via Sequential Multi-task Learning (CREST), a novel CRE approach built upon a multi-task learning framework tailored for continual learning. CREST takes into account the disparity in the magnitudes of the gradient signals of different objectives, thereby effectively handling the inherent difference between multi-task learning and continual learning. Through extensive experiments on multiple datasets, CREST demonstrates significant improvements in CRE performance as well as superiority over other state-of-the-art multi-task learning frameworks, offering a promising solution to the challenges of continual learning in this domain.
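To illustrate the kind of gradient-magnitude disparity the abstract refers to, the sketch below shows why simple summation of objectives can fail and how per-objective gradient rescaling changes the picture. This is a minimal illustrative scheme in plain Python, not CREST's actual update rule; the function name and the unit-norm rescaling choice are assumptions for the example.

```python
import math

def combine_gradients(grads):
    """Combine per-objective gradient vectors (lists of floats).

    Naive summation lets the largest-magnitude objective dominate the
    update. Here each gradient is rescaled to unit norm before summing,
    so every objective contributes equally. This is one simple way to
    address magnitude disparity, not necessarily the method in the paper.
    """
    norms = [math.sqrt(sum(x * x for x in g)) or 1.0 for g in grads]
    dim = len(grads[0])
    return [sum(g[i] / n for g, n in zip(grads, norms)) for i in range(dim)]

# Hypothetical scenario: the new-task loss gradient is 100x larger than
# the memory-replay gradient, so a plain sum would all but ignore replay.
g_new_task = [100.0, 0.0]
g_replay = [0.0, 1.0]
combined = combine_gradients([g_new_task, g_replay])
# combined == [1.0, 1.0]: both objectives now pull with equal strength.
```

After rescaling, the replay objective (which counteracts catastrophic forgetting) is no longer drowned out by the much larger new-task gradient, which is the trade-off the simple-summation baseline overlooks.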

Cite

Text

Le et al. "Continual Relation Extraction via Sequential Multi-Task Learning." AAAI Conference on Artificial Intelligence, 2024. doi:10.1609/AAAI.V38I16.29805

Markdown

[Le et al. "Continual Relation Extraction via Sequential Multi-Task Learning." AAAI Conference on Artificial Intelligence, 2024.](https://mlanthology.org/aaai/2024/le2024aaai-continual/) doi:10.1609/AAAI.V38I16.29805

BibTeX

@inproceedings{le2024aaai-continual,
  title     = {{Continual Relation Extraction via Sequential Multi-Task Learning}},
  author    = {Le, Thanh-Thien and Nguyen, Manh and Nguyen, Tung Thanh and Van Linh, Ngo and Nguyen, Thien Huu},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2024},
  pages     = {18444--18452},
  doi       = {10.1609/AAAI.V38I16.29805},
  url       = {https://mlanthology.org/aaai/2024/le2024aaai-continual/}
}