Online Variance Reduction with Mixtures
Abstract
Adaptive importance sampling for stochastic optimization is a promising approach that offers improved convergence through variance reduction. In this work, we propose a new framework for variance reduction that enables the use of mixtures over predefined sampling distributions, which can naturally encode prior knowledge about the data. While these sampling distributions are fixed, the mixture weights are adapted during the optimization process. We propose VRM, a novel and efficient adaptive scheme that asymptotically recovers the best mixture weights in hindsight and can also accommodate sampling distributions over sets of points. We empirically demonstrate the versatility of VRM in a range of applications.
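The abstract describes sampling from a mixture of fixed, predefined distributions while adapting only the mixture weights online. As a rough illustration of that setup only, and not of the paper's VRM update itself, the sketch below runs importance-weighted SGD on a least-squares problem, drawing points from a mixture of three fixed distributions and adapting the mixture weights with a generic multiplicative-weights heuristic driven by an estimate of each component's variance proxy. All concrete choices here (the matrix P of component distributions, eta_mix, the per-component loss estimate) are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of mixture-based adaptive importance sampling for SGD.
# The mixture-weight update below is a generic multiplicative-weights
# heuristic, NOT the VRM scheme from the paper.

rng = np.random.default_rng(0)

n, K, d = 1000, 3, 10                       # points, mixture components, dimension
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

# K fixed sampling distributions over the n points (each row sums to 1),
# e.g. uniform, |target|-proportional, and feature-norm-proportional.
P = np.stack([
    np.full(n, 1.0 / n),
    np.abs(y) / np.abs(y).sum(),
    np.linalg.norm(X, axis=1) / np.linalg.norm(X, axis=1).sum(),
])

w = np.full(K, 1.0 / K)                     # mixture weights, adapted online
theta = np.zeros(d)
eta_sgd, eta_mix = 0.01, 0.1

for t in range(2000):
    q = w @ P                               # current mixture distribution over points
    i = rng.choice(n, p=q)                  # sample a point from the mixture
    g = (X[i] @ theta - y[i]) * X[i]        # gradient of the squared loss at point i
    theta -= eta_sgd * g / (n * q[i])       # unbiased importance-weighted SGD step

    # Heuristic per-component loss: an unbiased estimate (over i ~ q) of
    # component k's second-moment proxy sum_j ||g_j||^2 / p_k(j).
    sq_norm = g @ g
    losses = sq_norm / (P[:, i] * q[i] + 1e-12)
    scaled = (losses - losses.min()) / (losses.max() - losses.min() + 1e-12)

    # Multiplicative-weights step: downweight components with a high
    # estimated variance proxy, then renormalize.
    w *= np.exp(-eta_mix * scaled)
    w /= w.sum()
```

The per-component loss `||g_i||^2 / (p_k(i) q(i))` is the kind of feedback an online-learning update over mixture weights can act on, since its expectation under the mixture equals component k's variance proxy; the actual VRM scheme, its efficient implementation, and its regret guarantees are given in the paper.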
Cite
Text
Borsos et al. "Online Variance Reduction with Mixtures." International Conference on Machine Learning, 2019.
Markdown
[Borsos et al. "Online Variance Reduction with Mixtures." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/borsos2019icml-online/)
BibTeX
@inproceedings{borsos2019icml-online,
title = {{Online Variance Reduction with Mixtures}},
author = {Borsos, Zalán and Curi, Sebastian and Levy, Kfir Yehuda and Krause, Andreas},
booktitle = {International Conference on Machine Learning},
year = {2019},
pages = {705--714},
volume = {97},
url = {https://mlanthology.org/icml/2019/borsos2019icml-online/}
}