Better Fine-Tuning via Instance Weighting for Text Classification
Abstract
Transfer learning for deep neural networks has achieved great success in many text classification applications. A simple yet effective transfer learning method is to fine-tune the pretrained model parameters. Previous fine-tuning work mainly focuses on the pre-training stage, investigating how to pretrain a set of parameters that best benefits the target task. In this paper, we propose an Instance Weighting based Finetuning (IW-Fit) method, which revises the fine-tuning stage to improve the final performance on the target domain. IW-Fit dynamically adjusts instance weights at each fine-tuning epoch to accomplish two goals: 1) effectively identify and learn the knowledge specific to the target domain; 2) preserve the knowledge shared between the source and target domains. The instance-weighting metrics used in IW-Fit are model-agnostic and easy to implement for general DNN-based classifiers. Experimental results show that IW-Fit consistently improves classification accuracy on the target domain.
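To make the core idea concrete, here is a minimal PyTorch-style sketch of fine-tuning with per-instance loss weights recomputed each epoch. The confidence-based weight used below is an illustrative placeholder, not the paper's actual IW-Fit metrics; `model` and `loader` are assumed to be a pretrained classifier and a target-domain data loader.

```python
# Sketch of instance-weighted fine-tuning. The weighting rule here is a
# hypothetical stand-in for the paper's IW-Fit metrics.
import torch
import torch.nn.functional as F

def fine_tune_epoch(model, loader, optimizer, device="cpu"):
    """Run one fine-tuning epoch with per-instance loss weights."""
    model.train()
    for inputs, labels in loader:
        inputs, labels = inputs.to(device), labels.to(device)
        logits = model(inputs)
        # Per-instance cross-entropy, left unreduced so each example
        # can be weighted individually.
        losses = F.cross_entropy(logits, labels, reduction="none")
        # Hypothetical weighting metric: up-weight examples the current
        # model is unsure about (low probability on the true class).
        with torch.no_grad():
            probs = F.softmax(logits, dim=-1)
            true_prob = probs.gather(1, labels.unsqueeze(1)).squeeze(1)
            weights = 1.0 - true_prob                     # in [0, 1]
            weights = weights / (weights.mean() + 1e-8)   # keep mean ~1
        loss = (weights * losses).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Keeping the loss unreduced and applying a normalized weight vector is the model-agnostic part: it works for any DNN classifier that produces logits, which is why the abstract stresses that the metrics are easy to plug into general architectures.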
Cite
Text
Wang et al. "Better Fine-Tuning via Instance Weighting for Text Classification." AAAI Conference on Artificial Intelligence, 2019. doi:10.1609/AAAI.V33I01.33017241
Markdown
[Wang et al. "Better Fine-Tuning via Instance Weighting for Text Classification." AAAI Conference on Artificial Intelligence, 2019.](https://mlanthology.org/aaai/2019/wang2019aaai-better/) doi:10.1609/AAAI.V33I01.33017241
BibTeX
@inproceedings{wang2019aaai-better,
title = {{Better Fine-Tuning via Instance Weighting for Text Classification}},
author = {Wang, Zhi and Bi, Wei and Wang, Yan and Liu, Xiaojiang},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2019},
pages = {7241-7248},
doi = {10.1609/AAAI.V33I01.33017241},
url = {https://mlanthology.org/aaai/2019/wang2019aaai-better/}
}