Tractable Structured Natural-Gradient Descent Using Local Parameterizations
Abstract
Natural-gradient descent (NGD) on structured parameter spaces (e.g., low-rank covariances) is computationally challenging due to difficult Fisher-matrix computations. We address this issue by using *local-parameter coordinates* to obtain a flexible and efficient NGD method that works well for a wide variety of structured parameterizations. We show four applications where our method (1) generalizes the exponential natural evolution strategy, (2) recovers existing Newton-like algorithms, (3) yields new structured second-order algorithms, and (4) gives new algorithms to learn covariances of Gaussian and Wishart-based distributions. We show results on a range of problems from deep learning, variational inference, and evolution strategies. Our work opens a new direction for scalable structured geometric methods.
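To make the idea concrete, below is a minimal toy sketch of NGD in local coordinates, not the paper's exact algorithm: it minimizes E_q[f(w)] over a one-dimensional Gaussian q = N(m, s²). The local coordinates (d, r), re-centered at every iterate via m' = m + s·d and s' = s·exp(r), make the Fisher matrix the constant diag(1, 2) at (d, r) = (0, 0), so the natural-gradient step is a cheap rescaling. The target f, step size, and sample count are illustrative choices.

```python
# Toy sketch (NOT the paper's exact algorithm): NGD in local
# coordinates for q = N(m, s^2), minimizing E_q[f(w)] in 1-D.
# In the local coordinates (d, r) with m' = m + s*d, s' = s*exp(r),
# the Fisher matrix at (d, r) = (0, 0) is the constant diag(1, 2),
# so no Fisher-matrix inversion is ever needed.
import numpy as np

rng = np.random.default_rng(0)

def f(w):
    # Illustrative target: E_q[f] is minimized as m -> 2 and s -> 0.
    return (w - 2.0) ** 2

m, s = -1.0, 3.0   # global parameters of q = N(m, s^2)
beta = 0.05        # step size (illustrative)
n_mc = 256         # Monte Carlo samples per iteration (illustrative)

for _ in range(500):
    z = rng.standard_normal(n_mc)
    w = m + s * z                      # reparameterized samples from q
    fw = f(w)
    # Score-function estimates of the gradient of E_q[f] with respect
    # to the local coordinates, evaluated at (d, r) = (0, 0):
    g_d = np.mean(fw * z)              # d/dd E[f(m + s*(d + z))]
    g_r = np.mean(fw * (z**2 - 1.0))   # d/dr E[f(m + s*exp(r)*z)]
    # Natural-gradient step: divide by the constant local Fisher
    # diag(1, 2), then map the local step back to (m, s).
    d_step, r_step = -beta * g_d / 1.0, -beta * g_r / 2.0
    m, s = m + s * d_step, s * np.exp(r_step)

print(f"m = {m:.3f}, s = {s:.3f}")     # m approaches 2, s shrinks
```

Because the local Fisher here is constant and diagonal, the natural-gradient update never inverts a matrix; the paper develops this idea in the general multivariate setting, where structured (e.g., low-rank or Wishart-based) parameterizations make the global Fisher hard to work with directly.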
Cite
Text
Lin et al. "Tractable Structured Natural-Gradient Descent Using Local Parameterizations." International Conference on Machine Learning, 2021.

Markdown

[Lin et al. "Tractable Structured Natural-Gradient Descent Using Local Parameterizations." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/lin2021icml-tractable/)

BibTeX
@inproceedings{lin2021icml-tractable,
title = {{Tractable Structured Natural-Gradient Descent Using Local Parameterizations}},
author = {Lin, Wu and Nielsen, Frank and Khan, Mohammad Emtiyaz and Schmidt, Mark},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {6680--6691},
volume = {139},
url = {https://mlanthology.org/icml/2021/lin2021icml-tractable/}
}