Herding Dynamical Weights to Learn
Abstract
A new "herding" algorithm is proposed which directly converts observed moments into a sequence of pseudo-samples. The pseudo-samples respect the moment constraints and may be used to estimate (unobserved) quantities of interest. The procedure allows us to sidestep the usual approach of first learning a joint model (which is intractable) and then sampling from that model (which can easily get stuck in a local mode). Moreover, the algorithm is fully deterministic, avoiding random number generation, and does not need expensive operations such as exponentiation.
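The herding update described in the paper alternates a maximization step with a weight update: pick the state whose features best align with the current weights, then nudge the weights toward the target moments and away from the chosen state's features. A minimal sketch, assuming binary states with the identity feature map (a deliberately simple choice for illustration; the target moments below are made up):

```python
import itertools
import numpy as np

def herd(targets, n_steps):
    """Generate deterministic pseudo-samples whose empirical moments
    track the given target moments (herding with identity features)."""
    d = len(targets)
    # Enumerate all binary states; features phi(s) = s for this toy example.
    states = np.array(list(itertools.product([0, 1], repeat=d)), dtype=float)
    w = np.array(targets, dtype=float)  # initialize weights at the moments
    samples = []
    for _ in range(n_steps):
        # Maximization step: state most aligned with the current weights.
        s = states[np.argmax(states @ w)]
        samples.append(s)
        # Weight update: push toward targets, away from the chosen state,
        # so under-represented moments gain weight on future steps.
        w += targets - s
    return np.array(samples)

targets = np.array([0.8, 0.5, 0.2])  # hypothetical observed moments
samples = herd(targets, 2000)
print(samples.mean(axis=0))  # empirical moments approach the targets
```

Note that the loop uses no random numbers and no exponentiation: the pseudo-sample sequence is a deterministic trajectory, and the bounded weights force the running averages toward the moment constraints.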
Cite
Text
Welling. "Herding Dynamical Weights to Learn." International Conference on Machine Learning, 2009. doi:10.1145/1553374.1553517

BibTeX
@inproceedings{welling2009icml-herding,
title = {{Herding Dynamical Weights to Learn}},
author = {Welling, Max},
booktitle = {International Conference on Machine Learning},
year = {2009},
pages = {1121--1128},
doi = {10.1145/1553374.1553517},
url = {https://mlanthology.org/icml/2009/welling2009icml-herding/}
}