Graph–Smoothed Bayesian Black-Box Shift Estimator and Its Information Geometry

Abstract

Label shift adaptation aims to recover target class priors when the labelled source distribution $P$ and the unlabelled target distribution $Q$ share $P(X \mid Y) = Q(X \mid Y)$ but $P(Y) \neq Q(Y)$. Classical black-box shift estimators invert an empirical confusion matrix of a frozen classifier, producing a brittle point estimate that ignores sampling noise and similarity among classes. We present Graph-Smoothed Bayesian BBSE (GS-B$^3$SE), a fully probabilistic alternative that places Laplacian–Gaussian priors on both the target log-priors and the confusion-matrix columns, tying them together on a label-similarity graph. The resulting posterior is tractable with HMC or a fast block Newton–CG scheme. We prove identifiability, $N^{-1/2}$ posterior contraction, variance bounds that shrink with the graph's algebraic connectivity, and robustness to Laplacian misspecification. We also reinterpret GS-B$^3$SE through information geometry, showing that it generalizes existing shift estimators.
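
For readers unfamiliar with the classical baseline the abstract refers to, below is a minimal NumPy sketch of black-box shift estimation (BBSE) by confusion-matrix inversion. The function name and interface are illustrative only; it implements the brittle point estimate that GS-B$^3$SE is designed to replace, not the paper's graph-smoothed posterior.

import numpy as np

def bbse_target_priors(src_preds, src_labels, tgt_preds, num_classes):
    """Classical BBSE: estimate target class priors by inverting the
    frozen classifier's confusion matrix (point estimate, no smoothing).
    Inputs are integer class indices: predictions on held-out source data,
    their true labels, and predictions on the unlabelled target data."""
    src_preds = np.asarray(src_preds)
    src_labels = np.asarray(src_labels)
    tgt_preds = np.asarray(tgt_preds)

    # Joint confusion matrix on labelled source data: C[i, j] = P(hat_y = i, y = j).
    C = np.zeros((num_classes, num_classes))
    np.add.at(C, (src_preds, src_labels), 1.0)
    C /= len(src_labels)

    # Distribution of the classifier's predictions on the target: mu[i] = Q(hat_y = i).
    mu = np.bincount(tgt_preds, minlength=num_classes) / len(tgt_preds)

    # Source class priors p(y).
    p = np.bincount(src_labels, minlength=num_classes) / len(src_labels)

    # Under label shift, C w = mu with w[j] = q(y = j) / p(y = j); invert for w.
    w = np.linalg.solve(C, mu)

    # Sampling noise can push weights negative; clip and renormalise the priors.
    q = np.clip(w, 0.0, None) * p
    return q / q.sum()

The clipping and renormalisation at the end is exactly the kind of ad hoc fix for sampling noise that a posterior over priors, such as the one proposed here, avoids.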

Cite

Text

Kimura. "Graph–Smoothed Bayesian Black-Box Shift Estimator and Its Information Geometry." Advances in Neural Information Processing Systems, 2025.

Markdown

[Kimura. "Graph–Smoothed Bayesian Black-Box Shift Estimator and Its Information Geometry." Advances in Neural Information Processing Systems, 2025.](https://mlanthology.org/neurips/2025/kimura2025neurips-graphsmoothed/)

BibTeX

@inproceedings{kimura2025neurips-graphsmoothed,
  title     = {{Graph–Smoothed Bayesian Black-Box Shift Estimator and Its Information Geometry}},
  author    = {Kimura, Masanari},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2025},
  url       = {https://mlanthology.org/neurips/2025/kimura2025neurips-graphsmoothed/}
}