Linear Regression Under Fixed-Rank Constraints: A Riemannian Approach
Abstract
In this paper, we tackle the problem of learning a linear regression model whose parameter is a fixed-rank matrix. We study the Riemannian manifold geometry of the set of fixed-rank matrices and develop efficient line-search algorithms. The proposed algorithms have many applications, scale to high-dimensional problems, enjoy local convergence properties and confer a geometric basis to recent contributions on learning fixed-rank matrices. Numerical experiments on benchmarks suggest that the proposed algorithms compete with the state-of-the-art, and that manifold optimization offers a versatile framework for the design of rank-constrained machine learning algorithms.
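To make the problem setting concrete, here is a minimal sketch (not the authors' Riemannian algorithm) of linear regression with a fixed-rank matrix parameter: the rank constraint is enforced by the factorization W = U Vᵀ, and plain gradient descent is run on the factors. All names, dimensions, and the step size are illustrative assumptions.

```python
import numpy as np

# Hedged sketch, NOT the paper's line-search algorithm: enforce rank(W) <= r
# via W = U @ V.T and run Euclidean gradient descent on the factors for the
# regression model y ~ <W, X> with squared loss.
rng = np.random.default_rng(0)
d, r, n = 8, 2, 200

# Synthetic rank-r ground truth and Gaussian design matrices
W_true = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))
Xs = rng.standard_normal((n, d, d))
ys = np.einsum('nij,ij->n', Xs, W_true)

# Small balanced initialization of the rank-r factors
U = 0.1 * rng.standard_normal((d, r))
V = 0.1 * rng.standard_normal((d, r))
lr = 0.01  # illustrative fixed step size (the paper uses line search)

for _ in range(500):
    W = U @ V.T
    resid = np.einsum('nij,ij->n', Xs, W) - ys      # predictions minus targets
    G = np.einsum('n,nij->ij', resid, Xs) / n       # Euclidean gradient in W
    U, V = U - lr * (G @ V), V - lr * (G.T @ U)     # chain rule through W = U V^T

mse = float(np.mean((np.einsum('nij,ij->n', Xs, U @ V.T) - ys) ** 2))
print(f"final training MSE: {mse:.4g}")
```

The factored parameterization keeps every iterate exactly rank-r (or lower) without projections; the paper's contribution is to replace this naive Euclidean descent with line-search methods that respect the Riemannian geometry of the fixed-rank matrix manifold.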
Cite
Text
Meyer et al. "Linear Regression Under Fixed-Rank Constraints: A Riemannian Approach." International Conference on Machine Learning, 2011.
Markdown
[Meyer et al. "Linear Regression Under Fixed-Rank Constraints: A Riemannian Approach." International Conference on Machine Learning, 2011.](https://mlanthology.org/icml/2011/meyer2011icml-linear/)
BibTeX
@inproceedings{meyer2011icml-linear,
title = {{Linear Regression Under Fixed-Rank Constraints: A Riemannian Approach}},
author = {Meyer, Gilles and Bonnabel, Silvère and Sepulchre, Rodolphe},
booktitle = {International Conference on Machine Learning},
year = {2011},
pages = {545--552},
url = {https://mlanthology.org/icml/2011/meyer2011icml-linear/}
}