Efficient Kernel Learning from Side Information Using ADMM

Abstract

Side information is highly useful in the learning of a nonparametric kernel matrix. However, this often leads to an expensive semidefinite program (SDP). In recent years, a number of dedicated solvers have been proposed. Though much better than off-the-shelf SDP solvers, they still cannot scale to large data sets. In this paper, we propose a novel solver based on the alternating direction method of multipliers (ADMM). The key idea is to use a low-rank decomposition of the kernel matrix K = VᵀU, with the constraint that V = U. The resultant optimization problem, though non-convex, has favorable convergence properties and can be efficiently solved without requiring eigen-decomposition in each iteration. Experimental results on a number of real-world data sets demonstrate that the proposed method is as accurate as directly solving the SDP, but can be one to two orders of magnitude faster.
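
The core trick described above, splitting K = VᵀU and enforcing the consensus constraint V = U through ADMM dual updates so that K stays PSD without any eigen-decomposition, can be sketched on a toy objective. This is not the paper's actual side-information loss; the least-squares fitting objective, the penalty ρ, and the function name are illustrative assumptions:

```python
# Hedged sketch: ADMM on a bilinear split K = Vᵀ U with consensus V = U.
# The fitting objective ||Vᵀ U - M||_F² is a stand-in for the paper's
# side-information loss; at a consensus point, K = Uᵀ U is PSD by
# construction, so no eigen-decomposition is ever needed.
import numpy as np


def admm_lowrank_psd(M, r, rho=2.0, iters=2000, seed=0):
    """Approximate a symmetric matrix M by K = Vᵀ U with V = U via ADMM."""
    n = M.shape[0]
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((r, n))
    V = U.copy()
    Lam = np.zeros((r, n))  # scaled dual variable for the constraint U - V = 0
    I = np.eye(r)
    for _ in range(iters):
        # U-step: minimize ||Vᵀ U - M||² + (ρ/2)||U - V + Λ||² in closed form:
        # (2 V Vᵀ + ρ I) U = 2 V M + ρ (V - Λ)
        U = np.linalg.solve(2 * V @ V.T + rho * I, 2 * V @ M + rho * (V - Lam))
        # V-step, symmetrically:
        # (2 U Uᵀ + ρ I) V = 2 U Mᵀ + ρ (U + Λ)
        V = np.linalg.solve(2 * U @ U.T + rho * I, 2 * U @ M.T + rho * (U + Lam))
        # Dual ascent on the consensus constraint.
        Lam += U - V
    return U, V
```

Each iteration costs only two r-by-r linear solves and a few matrix products, which is the source of the speedup over SDP solvers that project onto the PSD cone via eigen-decomposition every iteration.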

Cite

Text

Hu and Kwok. "Efficient Kernel Learning from Side Information Using ADMM." International Joint Conference on Artificial Intelligence, 2013.

Markdown

[Hu and Kwok. "Efficient Kernel Learning from Side Information Using ADMM." International Joint Conference on Artificial Intelligence, 2013.](https://mlanthology.org/ijcai/2013/hu2013ijcai-efficient/)

BibTeX

@inproceedings{hu2013ijcai-efficient,
  title     = {{Efficient Kernel Learning from Side Information Using ADMM}},
  author    = {Hu, En-Liang and Kwok, James T.},
  booktitle = {International Joint Conference on Artificial Intelligence},
  year      = {2013},
  pages     = {1408--1414},
  url       = {https://mlanthology.org/ijcai/2013/hu2013ijcai-efficient/}
}