Decoupling Breaks Data Barriers: A Decoupled Pre-Training Framework for Multi-Intent Spoken Language Understanding
Abstract
Exemplar-free class incremental learning (EF-CIL) is a nontrivial task that requires continuously enriching model capability with new classes while maintaining previously learned knowledge without storing and replaying any old class exemplars. An emerging theory-guided framework for CIL trains task-specific models within a shared network, shifting the pressure of forgetting to task-id prediction. In EF-CIL, task-id prediction is more challenging due to the lack of inter-task interaction (e.g., replays of exemplars). To address this issue, we conduct a theoretical analysis of the importance and feasibility of preserving a discriminative and consistent feature space, upon which we propose a novel method termed DCNet. Concretely, it progressively maps class representations into a hyperspherical space, in which different classes are orthogonally distributed to achieve ample inter-class separation. Meanwhile, it also introduces compensatory training to adaptively adjust supervision intensity, thereby aligning the degree of intra-class aggregation. Extensive experiments and theoretical analysis verify the superiority of DCNet. Code is available at https://github.com/Tianqi-Wang1/DCNet.
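The core idea of the hyperspherical mapping can be illustrated with a minimal sketch (not the authors' implementation; the anchor construction and loss here are simplifying assumptions): features are L2-normalized onto the unit hypersphere and pulled toward fixed, mutually orthogonal class anchors, which guarantees maximal angular separation between classes.

```python
import numpy as np

rng = np.random.default_rng(0)

num_classes, feat_dim = 4, 4
# Orthogonal class anchors; here simply the standard basis vectors
# (a simplifying assumption for illustration).
anchors = np.eye(num_classes, feat_dim)

def to_hypersphere(x):
    """L2-normalize feature vectors onto the unit hypersphere."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Hypothetical backbone features for a mini-batch of 8 samples.
features = to_hypersphere(rng.normal(size=(8, feat_dim)))
labels = rng.integers(0, num_classes, size=8)

# Cosine similarity of each feature to every class anchor; a training
# loss would pull each feature toward its own class anchor, yielding
# orthogonally distributed (well-separated) class representations.
cos_sim = features @ anchors.T
loss = np.mean(1.0 - cos_sim[np.arange(8), labels])
```

Because the anchors are orthonormal, the cosine similarity between any two distinct class targets is exactly zero, so driving this loss to zero separates the classes by 90 degrees on the sphere.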
Cite
Text
Qin et al. "Decoupling Breaks Data Barriers: A Decoupled Pre-Training Framework for Multi-Intent Spoken Language Understanding." International Joint Conference on Artificial Intelligence, 2024. doi:10.24963/ijcai.2024/715
Markdown
[Qin et al. "Decoupling Breaks Data Barriers: A Decoupled Pre-Training Framework for Multi-Intent Spoken Language Understanding." International Joint Conference on Artificial Intelligence, 2024.](https://mlanthology.org/ijcai/2024/qin2024ijcai-decoupling/) doi:10.24963/ijcai.2024/715
BibTeX
@inproceedings{qin2024ijcai-decoupling,
title = {{Decoupling Breaks Data Barriers: A Decoupled Pre-Training Framework for Multi-Intent Spoken Language Understanding}},
author = {Qin, Libo and Chen, Qiguang and Zhou, Jingxuan and Li, Qinzheng and Lu, Chunlin and Che, Wanxiang},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2024},
pages = {6469--6477},
doi = {10.24963/ijcai.2024/715},
url = {https://mlanthology.org/ijcai/2024/qin2024ijcai-decoupling/}
}