Gaussian Quadrature for Matrix Inverse Forms with Applications
Abstract
We present a framework for accelerating a spectrum of machine learning algorithms that require computation of bilinear inverse forms $u^\top A^{-1} u$, where $A$ is a positive definite matrix and $u$ a given vector. Our framework is built on Gauss-type quadrature and easily scales to large, sparse matrices. Further, it allows retrospective computation of lower and upper bounds on $u^\top A^{-1} u$, which in turn accelerates several algorithms. We prove that these bounds tighten iteratively and converge at a linear (geometric) rate. To our knowledge, ours is the first work to demonstrate these key properties of Gauss-type quadrature, which is a classical and deeply studied topic. We illustrate empirical consequences of our results by using quadrature to accelerate machine learning tasks involving determinantal point processes and submodular optimization, and observe tremendous speedups in several instances.
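The bounds described above rest on the classical Golub-Meurant connection between Gauss-type quadrature and the Lanczos process: after $k$ Lanczos steps on $A$ started from $u$, the Gauss rule evaluated on the resulting tridiagonal matrix lower-bounds $u^\top A^{-1} u$, while a Gauss-Radau rule with one node fixed below the spectrum upper-bounds it. Below is a minimal NumPy sketch of that classical recipe, not the paper's own code; the function name `inverse_form_bounds` and the assumption that a lower bound `lam_min` on the spectrum of `A` is available are ours for illustration.

```python
import numpy as np

def inverse_form_bounds(A, u, k, lam_min):
    """Gauss / Gauss-Radau bounds on u^T A^{-1} u after k Lanczos steps.

    Golub-Meurant style sketch: the Gauss rule gives a lower bound; a
    Gauss-Radau rule with a prescribed node at lam_min (any value with
    0 < lam_min <= smallest eigenvalue of A) gives an upper bound.
    No reorthogonalization or breakdown handling (illustrative only).
    """
    n = u.size
    unorm2 = float(u @ u)
    q_prev, q = np.zeros(n), u / np.sqrt(unorm2)
    alpha, beta = np.zeros(k), np.zeros(k)
    b = 0.0
    for j in range(k):                      # Lanczos three-term recurrence
        w = A @ q - b * q_prev
        alpha[j] = q @ w
        w -= alpha[j] * q
        b = np.linalg.norm(w)
        beta[j] = b
        q_prev, q = q, w / b
    T = np.diag(alpha) + np.diag(beta[:-1], 1) + np.diag(beta[:-1], -1)
    e1 = np.zeros(k); e1[0] = 1.0
    lower = unorm2 * np.linalg.solve(T, e1)[0]      # Gauss rule: lower bound

    # Gauss-Radau: append one row/column to T so that lam_min becomes an
    # eigenvalue (hence a quadrature node) of the extended matrix.
    ek = np.zeros(k); ek[-1] = 1.0
    delta = np.linalg.solve(T - lam_min * np.eye(k), beta[-1] ** 2 * ek)
    T_radau = np.zeros((k + 1, k + 1))
    T_radau[:k, :k] = T
    T_radau[k, k - 1] = T_radau[k - 1, k] = beta[-1]
    T_radau[k, k] = lam_min + delta[-1]
    e1r = np.zeros(k + 1); e1r[0] = 1.0
    upper = unorm2 * np.linalg.solve(T_radau, e1r)[0]  # Radau: upper bound
    return lower, upper

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((200, 200))
    A = M @ M.T + 200 * np.eye(200)     # SPD, smallest eigenvalue >= 200
    u = rng.standard_normal(200)
    exact = u @ np.linalg.solve(A, u)
    for k in (2, 4, 8):
        lo, hi = inverse_form_bounds(A, u, k, lam_min=200.0)
        print(k, lo <= exact <= hi, hi - lo)
```

Under these assumptions the two bounds sandwich the exact value $u^\top A^{-1} u$ at every iteration, and the gap between them contracts at the linear (geometric) rate the paper establishes.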
Cite
Text
Li et al. "Gaussian Quadrature for Matrix Inverse Forms with Applications." International Conference on Machine Learning, 2016.

Markdown

[Li et al. "Gaussian Quadrature for Matrix Inverse Forms with Applications." International Conference on Machine Learning, 2016.](https://mlanthology.org/icml/2016/li2016icml-gaussian/)

BibTeX
@inproceedings{li2016icml-gaussian,
  title     = {{Gaussian Quadrature for Matrix Inverse Forms with Applications}},
  author    = {Li, Chengtao and Sra, Suvrit and Jegelka, Stefanie},
  booktitle = {International Conference on Machine Learning},
  year      = {2016},
  pages     = {1766--1775},
  volume    = {48},
  url       = {https://mlanthology.org/icml/2016/li2016icml-gaussian/}
}