Orthogonal Bootstrap: Efficient Simulation of Input Uncertainty
Abstract
Bootstrap is a popular methodology for simulating input uncertainty. However, it can be computationally expensive when the number of samples is large. We propose a new approach called Orthogonal Bootstrap that reduces the number of required Monte Carlo replications. We decompose the target being simulated into two parts: the non-orthogonal part, which has a closed-form result known as the Infinitesimal Jackknife, and the orthogonal part, which is easier to simulate. We theoretically and numerically show that Orthogonal Bootstrap significantly reduces the computational cost of Bootstrap while improving empirical accuracy and maintaining the same width of the constructed interval.
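The decomposition in the abstract can be illustrated with a minimal sketch (not the paper's implementation). Assume a simple plug-in functional, here the squared sample mean, whose influence function is available in closed form. The first-order (Infinitesimal Jackknife) linearization serves as the closed-form part, and Monte Carlo is spent only on the orthogonal residual, whose variance is an order of magnitude smaller in the sample size:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=200)  # toy input data (assumption)
n = len(x)

def T(sample):
    # Plug-in functional: squared mean (chosen for illustration only)
    return sample.mean() ** 2

xbar = x.mean()
# Influence function of T at the empirical distribution: psi(x) = 2*xbar*(x - xbar)
psi = 2 * xbar * (x - xbar)

B = 2000  # bootstrap replications
naive = np.empty(B)   # standard bootstrap draws of T
resid = np.empty(B)   # orthogonal part: residual after removing the linearization
for b in range(B):
    idx = rng.integers(0, n, size=n)        # resample with replacement
    t_star = T(x[idx])                      # bootstrap replicate
    t_lin = T(x) + psi[idx].mean()          # Infinitesimal Jackknife linearization
    naive[b] = t_star
    resid[b] = t_star - t_lin

# The linearized part has closed-form bootstrap mean T(x) (since psi averages
# to zero over the data), so only the low-variance residual needs simulation:
orth_mean = T(x) + resid.mean()
print(naive.var(), resid.var())  # residual variance is far smaller
```

In this toy setting the residual is exactly the second-order term `(x̄* - x̄)²`, so its Monte Carlo variance is O(1/n²) versus O(1/n) for the naive bootstrap draws, which is why far fewer replications suffice for the same accuracy.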
Cite
Text
Liu et al. "Orthogonal Bootstrap: Efficient Simulation of Input Uncertainty." International Conference on Machine Learning, 2024.
Markdown
[Liu et al. "Orthogonal Bootstrap: Efficient Simulation of Input Uncertainty." International Conference on Machine Learning, 2024.](https://mlanthology.org/icml/2024/liu2024icml-orthogonal/)
BibTeX
@inproceedings{liu2024icml-orthogonal,
title = {{Orthogonal Bootstrap: Efficient Simulation of Input Uncertainty}},
author = {Liu, Kaizhao and Blanchet, Jose and Ying, Lexing and Lu, Yiping},
booktitle = {International Conference on Machine Learning},
year = {2024},
pages = {30669--30701},
volume = {235},
url = {https://mlanthology.org/icml/2024/liu2024icml-orthogonal/}
}