A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples
Abstract
We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a non-convex rank surrogate. It combines the favorable data-efficiency of previous IRLS approaches with a scalability improved by several orders of magnitude. We establish the first local convergence guarantee from a minimal number of samples for that class of algorithms, showing that the method attains a local quadratic convergence rate. Furthermore, we show that the linear systems to be solved are well-conditioned even for very ill-conditioned ground truth matrices. We provide extensive experiments, indicating that unlike many state-of-the-art approaches, our method is able to complete very ill-conditioned matrices with a condition number of up to $10^{10}$ from few samples, while being competitive in its scalability.
Cite
Text

Kümmerle and Verdun. "A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples." International Conference on Machine Learning, 2021.

Markdown

[Kümmerle and Verdun. "A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/kummerle2021icml-scalable/)

BibTeX
@inproceedings{kummerle2021icml-scalable,
title = {{A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples}},
author = {Kümmerle, Christian and Verdun, Claudio M.},
booktitle = {International Conference on Machine Learning},
year = {2021},
pages = {5872--5883},
volume = {139},
url = {https://mlanthology.org/icml/2021/kummerle2021icml-scalable/}
}