Γ-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator
Abstract
Approximate Bayesian computation (ABC) is a likelihood-free inference method that has been employed in various applications. However, ABC can be sensitive to outliers if the data discrepancy measure is chosen inappropriately. In this paper, we propose to use a nearest-neighbor-based γ-divergence estimator as a data discrepancy measure. We show that our estimator possesses a suitable robustness property called the redescending property. In addition, our estimator enjoys various desirable properties such as high flexibility, asymptotic unbiasedness, almost sure convergence, and linear time complexity. Through experiments, we demonstrate that our method achieves significantly higher robustness than existing discrepancy measures.
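To make the overall scheme concrete, the following is a minimal, hedged sketch of rejection ABC with a nearest-neighbor divergence estimate plugged in as the data discrepancy. It does not reproduce the paper's γ-divergence estimator; as a simpler illustrative stand-in it uses the classical kNN Kullback-Leibler divergence estimator (Wang et al., 2009). All function names and parameters here are hypothetical, chosen only for this sketch.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """kNN estimate of KL(P_x || P_y) from samples x (n, d) and y (m, d)
    (Wang et al., 2009). A stand-in for the paper's gamma-divergence estimator."""
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    # distance to the k-th nearest neighbor within x
    # (query k+1 neighbors because each point is its own nearest neighbor)
    rho = cKDTree(x).query(x, k + 1)[0][:, -1]
    # distance from each point of x to its k-th nearest neighbor in y
    nu = cKDTree(y).query(x, k)[0]
    if k > 1:
        nu = nu[:, -1]
    return (d / n) * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))

def rejection_abc(observed, simulate, prior_sampler,
                  n_iter=500, quantile=0.1, k=1):
    """Rejection ABC: keep the prior draws whose simulated datasets are
    closest to `observed` under the plug-in divergence estimate."""
    thetas, dists = [], []
    for _ in range(n_iter):
        theta = prior_sampler()
        sim = simulate(theta)
        thetas.append(theta)
        dists.append(knn_kl_divergence(observed, sim, k))
    thetas, dists = np.array(thetas), np.array(dists)
    eps = np.quantile(dists, quantile)  # acceptance threshold
    return thetas[dists <= eps]
```

Swapping in a robust divergence estimator only requires replacing `knn_kl_divergence`; the rejection loop itself is unchanged, which is what makes the discrepancy measure the key design choice for outlier robustness.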
Cite
Fujisawa et al. "Γ-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator." Artificial Intelligence and Statistics, 2021.
BibTeX
@inproceedings{fujisawa2021aistats-abc,
title = {{Γ-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator}},
author = {Fujisawa, Masahiro and Teshima, Takeshi and Sato, Issei and Sugiyama, Masashi},
booktitle = {Artificial Intelligence and Statistics},
year = {2021},
pages = {1783--1791},
volume = {130},
url = {https://mlanthology.org/aistats/2021/fujisawa2021aistats-abc/}
}