Nuclear Norm Regularized Least Squares Optimization on Grassmannian Manifolds
Abstract
This paper addresses a class of nuclear norm regularized least squares (NNLS) problems. By exploiting the underlying low-rank matrix manifold structure, the problem with nuclear norm regularization is cast as a Riemannian optimization problem over matrix manifolds. Compared with existing NNLS algorithms that involve singular value decomposition (SVD) of large-scale matrices, our method achieves a significant reduction in computational complexity. Moreover, the uniqueness of the matrix factorization can be guaranteed by our Grassmannian manifold method. In our solution, we first introduce a bilateral factorization into the original NNLS problem and convert it into a Grassmannian optimization problem using a linearization technique. Then a conjugate gradient procedure on the Grassmannian manifold is developed for our method with a guarantee of local convergence. Finally, our method can be extended to address the graph regularized problem. Experimental results verify both the efficiency and effectiveness of our method.
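For context, the NNLS problem the abstract refers to has the form min_X 1/2 ||P_Ω(X − M)||_F² + λ ||X||_*, where ||·||_* is the nuclear norm. A standard baseline solver (not the paper's method) is proximal gradient descent with singular value thresholding, whose per-iteration full SVD is exactly the cost the paper's Grassmannian factorization is designed to avoid. A minimal sketch of that baseline, with hypothetical parameter choices, for matrix completion:

```python
import numpy as np

def svt(Z, tau):
    """Singular value thresholding: the proximal operator of tau * ||.||_*.
    Requires a full SVD each call -- the expensive step for large matrices."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def nnls_prox_grad(M, mask, lam=0.1, step=1.0, iters=200):
    """Baseline proximal-gradient solver (illustrative, not the paper's method) for
    min_X 0.5 * ||mask * (X - M)||_F^2 + lam * ||X||_*."""
    X = np.zeros_like(M)
    for _ in range(iters):
        grad = mask * (X - M)                  # gradient of the least-squares term
        X = svt(X - step * grad, step * lam)   # proximal step on the nuclear norm
    return X

# Tiny demo: recover a rank-1 matrix from ~60% observed entries.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(8), rng.standard_normal(8))
mask = (rng.random(M.shape) < 0.6).astype(float)
X = nnls_prox_grad(M, mask)
```

The mask entries are 0/1, so the least-squares gradient is 1-Lipschitz and a unit step size is safe; the factorized manifold approach replaces the full SVD above with operations on much smaller factor matrices.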
Cite
Text
Liu et al. "Nuclear Norm Regularized Least Squares Optimization on Grassmannian Manifolds." Conference on Uncertainty in Artificial Intelligence, 2014.
Markdown
[Liu et al. "Nuclear Norm Regularized Least Squares Optimization on Grassmannian Manifolds." Conference on Uncertainty in Artificial Intelligence, 2014.](https://mlanthology.org/uai/2014/liu2014uai-nuclear/)
BibTeX
@inproceedings{liu2014uai-nuclear,
title = {{Nuclear Norm Regularized Least Squares Optimization on Grassmannian Manifolds}},
author = {Liu, Yuanyuan and Shang, Fanhua and Cheng, Hong and Cheng, James},
booktitle = {Conference on Uncertainty in Artificial Intelligence},
year = {2014},
pages = {515--524},
url = {https://mlanthology.org/uai/2014/liu2014uai-nuclear/}
}