Training SVM with Indefinite Kernels

Abstract

Similarity matrices generated in many applications may not be positive semidefinite, and hence cannot fit into the kernel machine framework. In this paper, we study the problem of training support vector machines with an indefinite kernel. We consider a regularized SVM formulation in which the indefinite kernel matrix is treated as a noisy observation of some unknown positive semidefinite matrix (the proxy kernel), and the support vectors and the proxy kernel are computed simultaneously. We propose a semi-infinite quadratically constrained linear program formulation for the optimization, which can be solved iteratively to find a globally optimal solution. We further propose an additional pruning strategy, which significantly improves the efficiency of the algorithm while retaining its convergence property. In addition, we show the close relationship between the proposed formulation and multiple kernel learning. Experiments on a collection of benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithm.
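To make the proxy-kernel idea concrete, here is a minimal sketch of the simplest way to obtain a positive semidefinite surrogate from an indefinite similarity matrix: projecting onto the PSD cone by clipping negative eigenvalues (spectral clip). This is only an illustrative baseline, not the paper's method, which instead optimizes the proxy kernel jointly with the support vectors; the matrix `S` below is a made-up toy example.

```python
import numpy as np

def nearest_psd_proxy(S):
    """Project a symmetric similarity matrix onto the PSD cone
    by clipping its negative eigenvalues to zero (spectral clip).
    A crude stand-in for the learned proxy kernel in the paper."""
    S = (S + S.T) / 2.0                  # symmetrize against round-off
    w, V = np.linalg.eigh(S)             # eigendecomposition
    K = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    return (K + K.T) / 2.0               # re-symmetrize

# Toy indefinite similarity matrix (one negative eigenvalue).
S = np.array([[ 1.0, 0.9, -0.4],
              [ 0.9, 1.0,  0.2],
              [-0.4, 0.2,  1.0]])
K = nearest_psd_proxy(S)
print(np.linalg.eigvalsh(K).min())       # smallest eigenvalue, now >= 0 up to round-off
```

The resulting `K` can be fed to any kernel machine that accepts a precomputed Gram matrix; the paper's contribution is that the PSD surrogate is not fixed in advance like this, but found as part of the SVM training problem.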

Cite

Text

Chen and Ye. "Training SVM with Indefinite Kernels." International Conference on Machine Learning, 2008. doi:10.1145/1390156.1390174

Markdown

[Chen and Ye. "Training SVM with Indefinite Kernels." International Conference on Machine Learning, 2008.](https://mlanthology.org/icml/2008/chen2008icml-training/) doi:10.1145/1390156.1390174

BibTeX

@inproceedings{chen2008icml-training,
  title     = {{Training SVM with Indefinite Kernels}},
  author    = {Chen, Jianhui and Ye, Jieping},
  booktitle = {International Conference on Machine Learning},
  year      = {2008},
  pages     = {136--143},
  doi       = {10.1145/1390156.1390174},
  url       = {https://mlanthology.org/icml/2008/chen2008icml-training/}
}