Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs
Abstract
We propose a low-rank stochastic gradient descent (LR-SGD) method for solving a class of semidefinite programming (SDP) problems. LR-SGD has clear computational advantages over standard SGD methods because its iterative projection step (itself an SDP problem) can be solved efficiently. Specifically, LR-SGD constructs a low-rank stochastic gradient and computes an optimal solution to the projection step by analyzing the low-rank structure of its stochastic gradient. Moreover, our theoretical analysis shows the universal existence of arbitrary low-rank stochastic gradients, which in turn validates the rationale of the LR-SGD method. Since LR-SGD is an SGD-based method, it achieves the optimal convergence rates of standard SGD methods. The presented experimental results demonstrate the efficiency and effectiveness of the LR-SGD method.
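To make the setting concrete, below is a minimal NumPy sketch of projected SGD over the spectrahedron {X PSD, trace(X) <= tau} with a rank-one stochastic gradient. It is illustrative only: the projection here uses a full eigendecomposition, whereas the paper's LR-SGD exploits the gradient's low-rank structure to perform that step much more cheaply, and the quadratic-measurement objective and all names are assumptions, not from the paper.

```python
import numpy as np

def project_spectrahedron(X, tau=1.0):
    """Euclidean projection onto {X PSD, trace(X) <= tau}.

    Uses a full eigendecomposition for clarity; the paper's contribution
    is exploiting the low rank of the stochastic gradient so that this
    projection can be computed far more cheaply.
    """
    w, V = np.linalg.eigh((X + X.T) / 2.0)  # symmetrize for numerical stability
    w = np.maximum(w, 0.0)                  # clip eigenvalues onto the PSD cone
    if w.sum() > tau:                       # enforce the trace budget via a
        u = np.sort(w)[::-1]                # simplex projection of the eigenvalues
        css = np.cumsum(u)
        rho = np.nonzero(u * np.arange(1, len(u) + 1) > css - tau)[0][-1]
        w = np.maximum(w - (css[rho] - tau) / (rho + 1.0), 0.0)
    return (V * w) @ V.T                    # reassemble V diag(w) V^T

rng = np.random.default_rng(0)
n, tau, T = 50, 1.0, 500

# Hypothetical toy objective: recover a PSD matrix X_star from rank-one
# quadratic measurements b = a^T X_star a.  The stochastic gradient of
# the squared loss, 2 (a^T X a - b) a a^T, is rank one by construction.
L = rng.normal(size=(n, 2))
X_star = L @ L.T
X_star *= tau / np.trace(X_star)

X = np.zeros((n, n))
for t in range(T):
    a = rng.normal(size=n)
    b = a @ X_star @ a
    g = 2.0 * (a @ X @ a - b) * np.outer(a, a)  # rank-one stochastic gradient
    X = project_spectrahedron(X - g / (t + 1.0), tau)

print("relative error:", np.linalg.norm(X - X_star) / np.linalg.norm(X_star))
```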
Cite
Chen et al. "Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs." International Conference on Artificial Intelligence and Statistics, 2014. https://mlanthology.org/aistats/2014/chen2014aistats-efficient/

BibTeX
@inproceedings{chen2014aistats-efficient,
title = {{Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs}},
author = {Chen, Jianhui and Yang, Tianbao and Zhu, Shenghuo},
booktitle = {International Conference on Artificial Intelligence and Statistics},
year = {2014},
pages = {122--130},
url = {https://mlanthology.org/aistats/2014/chen2014aistats-efficient/}
}