An Iterative Multi-Source Mutual Knowledge Transfer Framework for Machine Reading Comprehension

Abstract

The lack of sufficient training data in many domains poses a major challenge to the construction of domain-specific machine reading comprehension (MRC) models with satisfactory performance. In this paper, we propose a novel iterative multi-source mutual knowledge transfer framework for MRC. As an extension of conventional knowledge transfer with one-to-one correspondence, our framework focuses on many-to-many mutual transfer, which involves synchronous executions of multiple many-to-one transfers in an iterative manner. Specifically, to update a target-domain MRC model, we first consider other domain-specific MRC models as individual teachers and employ knowledge distillation to train a multi-domain MRC model, which is differentially required to fit the training data and match the outputs of these individual models according to their domain-level similarities to the target domain. After being initialized by the multi-domain MRC model, the target-domain MRC model is fine-tuned to simultaneously match both its training data and the output of its previous best model via knowledge distillation. Compared with previous approaches, our framework can continuously enhance all domain-specific MRC models by enabling each model to iteratively and differentially absorb the domain-shared knowledge from others. Experimental results and in-depth analyses on several benchmark datasets demonstrate the effectiveness of our framework.
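The abstract describes a distillation objective that both fits the target-domain training data and matches multiple source-domain teachers weighted by domain-level similarity. The following is a minimal PyTorch sketch of such a loss, assuming a standard softened-KL distillation term and a softmax over similarity scores for the teacher weights; the function name, hyperparameters (alpha, temperature), and the exact weighting scheme are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def multi_source_distillation_loss(student_logits, gold_labels, teacher_logits_list,
                                   domain_similarities, alpha=0.5, temperature=2.0):
    """Illustrative multi-teacher distillation loss (not the paper's exact formulation).

    Combines a supervised term on the gold answers with KL terms that pull the
    student toward each source-domain teacher, weighted by domain-level similarity.
    """
    # Supervised term: fit the target-domain training data.
    ce_loss = F.cross_entropy(student_logits, gold_labels)

    # Distillation term: differentially match each source-domain teacher.
    weights = torch.softmax(torch.tensor(domain_similarities), dim=0)
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd_loss = 0.0
    for w, teacher_logits in zip(weights, teacher_logits_list):
        p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        kd_loss = kd_loss + w * F.kl_div(log_p_student, p_teacher,
                                         reduction="batchmean") * temperature ** 2

    return alpha * ce_loss + (1.0 - alpha) * kd_loss
```

The same shape of objective could plausibly be reused for the second step, replacing the multiple teachers with the target model's previous best checkpoint as a single teacher.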

Cite

Text

Liu et al. "An Iterative Multi-Source Mutual Knowledge Transfer Framework for Machine Reading Comprehension." International Joint Conference on Artificial Intelligence, 2020. doi:10.24963/IJCAI.2020/525

Markdown

[Liu et al. "An Iterative Multi-Source Mutual Knowledge Transfer Framework for Machine Reading Comprehension." International Joint Conference on Artificial Intelligence, 2020.](https://mlanthology.org/ijcai/2020/liu2020ijcai-iterative/) doi:10.24963/IJCAI.2020/525

BibTeX

@inproceedings{liu2020ijcai-iterative,
  title     = {{An Iterative Multi-Source Mutual Knowledge Transfer Framework for Machine Reading Comprehension}},
  author    = {Liu, Xin and Liu, Kai and Li, Xiang and Su, Jinsong and Ge, Yubin and Wang, Bin and Luo, Jiebo},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2020},
  pages     = {3794--3800},
  doi       = {10.24963/IJCAI.2020/525},
  url       = {https://mlanthology.org/ijcai/2020/liu2020ijcai-iterative/}
}