Quasi-Newton Methods: A New Direction
Abstract
Four decades after their invention, quasi-Newton methods are still state of the art in unconstrained numerical optimization. Although not usually interpreted thus, these are learning algorithms that fit a local quadratic approximation to the objective function. We show that many quasi-Newton methods, including the most popular ones, can be interpreted as approximations of Bayesian linear regression under varying prior assumptions. This new notion elucidates some shortcomings of classical algorithms, and lights the way to a novel nonparametric quasi-Newton method, which makes more efficient use of the available information at a computational cost similar to that of its predecessors.
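The classical baseline the abstract alludes to is the BFGS update, which fits a quadratic model by forcing the Hessian estimate to reproduce the most recent observed gradient change (the secant condition). The sketch below is a minimal NumPy illustration of that standard update, not the paper's nonparametric method; the test function and variable names are illustrative.

```python
import numpy as np

def bfgs_update(B, s, y):
    """One BFGS update of the Hessian estimate B.

    s = x_{k+1} - x_k (step taken), y = grad_{k+1} - grad_k (observed
    gradient change). The returned matrix satisfies the secant condition
    B_new @ s = y, i.e. the local quadratic model is fit to the data.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# On an exactly quadratic objective f(x) = 0.5 * x^T A x, gradient
# differences satisfy y = A @ s, so each update fits the model to
# noise-free observations of the true Hessian A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.eye(2)                      # initial Hessian estimate
rng = np.random.default_rng(0)
for _ in range(5):
    s = rng.standard_normal(2)     # step direction (illustrative)
    y = A @ s                      # exact gradient change for a quadratic
    B = bfgs_update(B, s, y)
```

After each update the estimate reproduces the latest observation exactly (`B @ s == y`), which is precisely the "learning" step the paper reinterprets as an approximation to Bayesian linear regression.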
Cite
Text
Hennig and Kiefel. "Quasi-Newton Methods: A New Direction." International Conference on Machine Learning, 2012. doi:10.5555/2567709.2502608
Markdown
[Hennig and Kiefel. "Quasi-Newton Methods: A New Direction." International Conference on Machine Learning, 2012.](https://mlanthology.org/icml/2012/hennig2012icml-quasi/) doi:10.5555/2567709.2502608
BibTeX
@inproceedings{hennig2012icml-quasi,
  title     = {{Quasi-Newton Methods: A New Direction}},
  author    = {Hennig, Philipp and Kiefel, Martin},
  booktitle = {International Conference on Machine Learning},
  year      = {2012},
  doi       = {10.5555/2567709.2502608},
  url       = {https://mlanthology.org/icml/2012/hennig2012icml-quasi/}
}