Centroid Approximation for Bootstrap: Improving Particle Quality at Inference
Abstract
Bootstrap is a principled and powerful frequentist statistical tool for uncertainty quantification. Unfortunately, standard bootstrap methods are computationally intensive due to the need to draw a large i.i.d. bootstrap sample to approximate the ideal bootstrap distribution; this largely hinders their application in large-scale machine learning, especially deep learning problems. In this work, we propose an efficient method to explicitly optimize a small set of high-quality “centroid” points to better approximate the ideal bootstrap distribution. We achieve this by minimizing a simple objective function that is asymptotically equivalent to the Wasserstein distance to the ideal bootstrap distribution. This allows us to provide an accurate estimate of uncertainty with a small number of bootstrap centroids, outperforming the naive i.i.d. sampling approach. Empirically, we show that our method boosts the performance of bootstrap in a variety of applications.
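To illustrate the idea of compressing a bootstrap distribution into a few representative particles, the sketch below quantizes a large i.i.d. bootstrap sample of a statistic into k centroids with 1-D Lloyd (k-means) iterations, whose quantization objective is closely related to the Wasserstein-2 distance to the empirical bootstrap distribution. This is only an illustrative stand-in, not the authors' algorithm: the paper optimizes its own objective, and all names and parameters here (`kmeans_1d`, `B`, `k`) are assumptions for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: bootstrap the sample mean of a skewed dataset.
data = rng.exponential(scale=2.0, size=200)

# Large i.i.d. bootstrap sample of the statistic; this approximates
# the "ideal" bootstrap distribution of the sample mean.
B = 5000
boot = np.array([rng.choice(data, size=data.size, replace=True).mean()
                 for _ in range(B)])

def kmeans_1d(x, k, iters=50):
    """Compress samples x into k centroids via Lloyd iterations.

    The k-means objective is a quantization (Wasserstein-2-type)
    distance to the empirical distribution of x; a stand-in for the
    paper's centroid objective, not its actual method.
    """
    centroids = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread-out init
    for _ in range(iters):
        # Assign each bootstrap sample to its nearest centroid.
        assign = np.abs(x[:, None] - centroids[None, :]).argmin(axis=1)
        # Move each centroid to the mean of its assigned samples.
        for j in range(k):
            if np.any(assign == j):
                centroids[j] = x[assign == j].mean()
    return np.sort(centroids)

centroids = kmeans_1d(boot, k=10)

# Ten centroids now summarize the 5000-sample bootstrap distribution;
# their spread approximates the full bootstrap standard error.
print("bootstrap SE:", boot.std())
print("centroid SE :", centroids.std())
```

In this toy example, 10 optimized particles stand in for 5000 i.i.d. bootstrap replicates, which is the kind of compression the paper targets at inference time.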
Cite
Text
Ye and Liu. "Centroid Approximation for Bootstrap: Improving Particle Quality at Inference." International Conference on Machine Learning, 2022.
Markdown
[Ye and Liu. "Centroid Approximation for Bootstrap: Improving Particle Quality at Inference." International Conference on Machine Learning, 2022.](https://mlanthology.org/icml/2022/ye2022icml-centroid/)
BibTeX
@inproceedings{ye2022icml-centroid,
  title = {{Centroid Approximation for Bootstrap: Improving Particle Quality at Inference}},
  author = {Ye, Mao and Liu, Qiang},
  booktitle = {International Conference on Machine Learning},
  year = {2022},
  pages = {25469--25489},
  volume = {162},
  url = {https://mlanthology.org/icml/2022/ye2022icml-centroid/}
}