Distribution Aware Active Learning via Gaussian Mixtures
Abstract
In this paper, we propose a distribution-aware active learning strategy that captures and mitigates the distribution discrepancy between the labeled and unlabeled sets to cope with overfitting. By taking advantage of Gaussian mixture models (GMMs) and the Wasserstein distance, we first design a distribution-aware training strategy to improve model performance. We then introduce a hybrid informativeness metric for active learning that considers likelihood-based and model-based information simultaneously. Experimental results on four different datasets show the effectiveness of our method against existing active learning baselines.
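The general idea in the abstract can be illustrated with a minimal sketch: fit GMMs on the labeled and unlabeled feature sets, measure their discrepancy with the closed-form 2-Wasserstein distance between Gaussians, and score unlabeled points with a hybrid of predictive entropy (model-based) and negative log-likelihood under the labeled GMM (likelihood-based). All names, the single-component GMMs, the placeholder softmax outputs, and the 0.5 weighting are illustrative assumptions, not the paper's actual method.

```python
import numpy as np
from scipy.linalg import sqrtm
from sklearn.mixture import GaussianMixture

def gaussian_w2(mu1, cov1, mu2, cov2):
    """Closed-form 2-Wasserstein distance between two Gaussians."""
    sqrt_cov2 = sqrtm(cov2)
    cross = sqrtm(sqrt_cov2 @ cov1 @ sqrt_cov2)
    # sqrtm can return tiny imaginary parts from numerical error; drop them
    w2_sq = np.sum((mu1 - mu2) ** 2) + np.trace(cov1 + cov2 - 2 * np.real(cross))
    return float(np.sqrt(max(w2_sq, 0.0)))

rng = np.random.default_rng(0)
labeled = rng.normal(0.0, 1.0, size=(200, 2))    # labeled-pool features (toy data)
unlabeled = rng.normal(1.5, 1.0, size=(500, 2))  # shifted unlabeled pool (toy data)

gmm_l = GaussianMixture(n_components=1, random_state=0).fit(labeled)
gmm_u = GaussianMixture(n_components=1, random_state=0).fit(unlabeled)

# Discrepancy between the two fitted (single-component) Gaussians
disc = gaussian_w2(gmm_l.means_[0], gmm_l.covariances_[0],
                   gmm_u.means_[0], gmm_u.covariances_[0])

# Hybrid acquisition score: high predictive entropy (model-based term, here a
# placeholder uniform softmax) plus low likelihood under the labeled GMM
# (likelihood-based term). Higher score = more informative to label.
probs = np.full((unlabeled.shape[0], 2), 0.5)        # stand-in classifier outputs
entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)
nll = -gmm_l.score_samples(unlabeled)                # negative log-likelihood
scores = entropy + 0.5 * nll                         # assumed weighting
query_idx = np.argsort(scores)[-10:]                 # query the top-10 points
```

With the toy pools above, `disc` is large because the unlabeled mean is shifted away from the labeled one, and the selected points are those the labeled GMM finds least likely.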
Cite
Text
Park et al. "Distribution Aware Active Learning via Gaussian Mixtures." ICLR 2023 Workshops: Trustworthy_ML, 2023.
Markdown
[Park et al. "Distribution Aware Active Learning via Gaussian Mixtures." ICLR 2023 Workshops: Trustworthy_ML, 2023.](https://mlanthology.org/iclrw/2023/park2023iclrw-distribution/)
BibTeX
@inproceedings{park2023iclrw-distribution,
title = {{Distribution Aware Active Learning via Gaussian Mixtures}},
author = {Park, Younghyun and Han, Dong-Jun and Park, Jungwuk and Choi, Wonjeong and Kousar, Humaira and Moon, Jaekyun},
booktitle = {ICLR 2023 Workshops: Trustworthy_ML},
year = {2023},
url = {https://mlanthology.org/iclrw/2023/park2023iclrw-distribution/}
}