ELSA: Efficient Label Shift Adaptation Through the Lens of Semiparametric Models
Abstract
In this work, we study the domain adaptation problem under label shift. In the label shift setting, the marginal distribution of the label varies between the training and testing datasets, while the conditional distribution of the features given the label remains the same. Traditional label shift adaptation methods either suffer from large estimation errors or require cumbersome post-prediction calibrations. To address these issues, we first propose a moment-matching framework for adapting to label shift based on the geometry of the influence function. Within this framework, we propose a novel method named $\underline{\mathrm{E}}$fficient $\underline{\mathrm{L}}$abel $\underline{\mathrm{S}}$hift $\underline{\mathrm{A}}$daptation (ELSA), in which the adaptation weights can be estimated by solving linear systems. Theoretically, the ELSA estimator is $\sqrt{n}$-consistent ($n$ is the sample size of the source data) and asymptotically normal. Empirically, we show that ELSA achieves state-of-the-art estimation performance without post-prediction calibration, thereby gaining computational efficiency.
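For context, the sketch below illustrates the general idea of estimating label-shift adaptation weights (the class-prior ratios between target and source) by solving a linear system. It uses the standard confusion-matrix formulation with a black-box classifier; the function name and arguments are hypothetical, and this is not the ELSA estimator itself, whose moment-matching system is derived from the influence function.

```python
import numpy as np

def estimate_label_shift_weights(y_source, y_source_pred, y_target_pred, n_classes):
    """Estimate importance weights w[k] = p_target(y=k) / p_source(y=k)
    by solving the linear system C w = mu, where C is the joint confusion
    matrix of a black-box classifier on labeled source data and mu is the
    vector of predicted class frequencies on unlabeled target data.

    Generic illustration only; not the ELSA estimator from the paper.
    """
    # Confusion matrix: C[i, j] = P_source(prediction = i, true label = j)
    C = np.zeros((n_classes, n_classes))
    for yp, yt in zip(y_source_pred, y_source):
        C[yp, yt] += 1.0
    C /= len(y_source)

    # Predicted class frequencies on the (unlabeled) target data
    mu = np.bincount(y_target_pred, minlength=n_classes) / len(y_target_pred)

    # Solve C w = mu (least squares for numerical robustness)
    w, *_ = np.linalg.lstsq(C, mu, rcond=None)
    return np.clip(w, 0.0, None)  # prior ratios are non-negative
```

The resulting weights can then be used to reweight source samples when retraining or recalibrating a classifier for the target domain.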
Cite
Text
Tian et al. "ELSA: Efficient Label Shift Adaptation Through the Lens of Semiparametric Models." International Conference on Machine Learning, 2023.
Markdown
[Tian et al. "ELSA: Efficient Label Shift Adaptation Through the Lens of Semiparametric Models." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/tian2023icml-elsa/)
BibTeX
@inproceedings{tian2023icml-elsa,
title = {{ELSA: Efficient Label Shift Adaptation Through the Lens of Semiparametric Models}},
author = {Tian, Qinglong and Zhang, Xin and Zhao, Jiwei},
booktitle = {International Conference on Machine Learning},
year = {2023},
pages = {34120--34142},
volume = {202},
url = {https://mlanthology.org/icml/2023/tian2023icml-elsa/}
}