Does Head Label Help for Long-Tailed Multi-Label Text Classification
Abstract
Multi-label text classification (MLTC) aims to annotate documents with the most relevant labels from a number of candidate labels. In real applications, the distribution of label frequency often exhibits a long tail, i.e., a few labels are associated with a large number of documents (a.k.a. head labels), while a large fraction of labels are associated with a small number of documents (a.k.a. tail labels). To address the challenge of insufficient training data on tail label classification, we propose a Head-to-Tail Network (HTTN) to transfer the meta-knowledge from the data-rich head labels to data-poor tail labels. The meta-knowledge is the mapping from few-shot network parameters to many-shot network parameters, which aims to promote the generalizability of tail classifiers. Extensive experimental results on three benchmark datasets demonstrate that HTTN consistently outperforms the state-of-the-art methods. The code and hyper-parameter settings are released for reproducibility.
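The abstract's core idea is a learned mapping from few-shot classifier parameters to many-shot classifier parameters. Below is a minimal, illustrative sketch of that idea (not the authors' released code); the `MetaMapper` architecture, feature dimensions, and prototype construction are assumptions made only for illustration.

```python
# Illustrative sketch (not the authors' implementation) of the HTTN idea:
# learn a mapping ("meta-knowledge") from few-shot classifier parameters to
# many-shot classifier parameters on head labels, then apply it to tail labels.
# All shapes and the MetaMapper architecture are assumptions for illustration.
import torch
import torch.nn as nn

class MetaMapper(nn.Module):
    """Maps a few-shot classifier weight vector to a many-shot-like weight vector."""
    def __init__(self, dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, w_few: torch.Tensor) -> torch.Tensor:
        return self.net(w_few)

def prototype_weights(features: torch.Tensor) -> torch.Tensor:
    """Build a simple few-shot classifier weight as the mean of document features."""
    return features.mean(dim=0)

if __name__ == "__main__":
    torch.manual_seed(0)
    dim, n_head, k_shot = 128, 50, 5

    # Hypothetical data: for each head label, a many-shot classifier weight
    # (trained on all its documents) and a few-shot prototype built from only
    # k sampled documents of that label.
    many_shot_w = torch.randn(n_head, dim)
    few_shot_w = torch.stack([
        prototype_weights(many_shot_w[i] + 0.1 * torch.randn(k_shot, dim))
        for i in range(n_head)
    ])

    mapper = MetaMapper(dim)
    opt = torch.optim.Adam(mapper.parameters(), lr=1e-3)

    # Train the mapper on head labels: few-shot weights -> many-shot weights.
    for step in range(200):
        opt.zero_grad()
        loss = nn.functional.mse_loss(mapper(few_shot_w), many_shot_w)
        loss.backward()
        opt.step()

    # At inference, a tail label's few-shot prototype is mapped to a
    # "many-shot-like" classifier weight for that tail label.
    tail_prototype = prototype_weights(torch.randn(k_shot, dim))
    tail_classifier_w = mapper(tail_prototype)
    print("Transferred tail classifier weight shape:", tail_classifier_w.shape)
```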
Cite
Text
Xiao et al. "Does Head Label Help for Long-Tailed Multi-Label Text Classification." AAAI Conference on Artificial Intelligence, 2021. doi:10.1609/AAAI.V35I16.17660
Markdown
[Xiao et al. "Does Head Label Help for Long-Tailed Multi-Label Text Classification." AAAI Conference on Artificial Intelligence, 2021.](https://mlanthology.org/aaai/2021/xiao2021aaai-head/) doi:10.1609/AAAI.V35I16.17660
BibTeX
@inproceedings{xiao2021aaai-head,
title = {{Does Head Label Help for Long-Tailed Multi-Label Text Classification}},
author = {Xiao, Lin and Zhang, Xiangliang and Jing, Liping and Huang, Chi and Song, Mingyang},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2021},
pages = {14103--14111},
doi = {10.1609/AAAI.V35I16.17660},
url = {https://mlanthology.org/aaai/2021/xiao2021aaai-head/}
}