Efficient Convex Relaxation for Transductive Support Vector Machine

Abstract

We consider the problem of Support Vector Machine transduction, which involves a combinatorial problem with exponential computational complexity in the number of unlabeled examples. Although several studies have been devoted to Transductive SVM, they suffer either from high computational complexity or from solutions that are only locally optimal. To address this problem, we propose solving Transductive SVM via a convex relaxation that converts the NP-hard problem into a semi-definite program. Compared with the existing SDP relaxation for Transductive SVM, the proposed algorithm is computationally more efficient, reducing the number of free parameters from O(n²) to O(n), where n is the number of examples. An empirical study on several benchmark data sets shows the promising performance of the proposed algorithm in comparison with other state-of-the-art implementations of Transductive SVM.
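For readers who want to experiment with the general idea, the sketch below is a minimal cvxpy illustration of the kind of SDP relaxation the abstract refers to, namely the earlier O(n²)-parameter formulation that this paper improves upon: the unknown label outer-product matrix M = yy^T is relaxed to a positive semidefinite matrix with unit diagonal, and the worst-case SVM dual objective is bounded through a Schur complement. The toy data, variable names, and solver choice are illustrative assumptions, not the authors' code, and the relaxation shown is the dense one rather than the paper's reduced-parameter version.

```python
import numpy as np
import cvxpy as cp

# Toy data: two well-separated Gaussian clusters, partly labeled (illustrative only).
rng = np.random.default_rng(0)
X_lab = np.vstack([rng.normal(-2.0, 0.5, (4, 2)), rng.normal(2.0, 0.5, (4, 2))])
y_lab = np.array([-1] * 4 + [1] * 4)
X_unl = np.vstack([rng.normal(-2.0, 0.5, (3, 2)), rng.normal(2.0, 0.5, (3, 2))])
X = np.vstack([X_lab, X_unl])
n, n_lab, C = len(X), len(y_lab), 1.0

K = X @ X.T + 1e-6 * np.eye(n)             # linear kernel, slightly regularized

# Relax the label matrix M = y y^T to any PSD matrix with unit diagonal;
# entries between labeled points are fixed to their known products.
M = cp.Variable((n, n), PSD=True)
mu = cp.Variable(n, nonneg=True)           # multipliers for alpha <= C
nu = cp.Variable(n, nonneg=True)           # multipliers for alpha >= 0
t = cp.Variable((1, 1))                    # upper bound on the quadratic term

# Dualizing the inner box-constrained SVM dual gives the value
#   v^T (K o M)^{-1} v + C * sum(mu),  with  v = 1 + (nu - mu) / 2;
# the quadratic term is bounded by t via a Schur-complement condition.
v = cp.reshape(np.ones(n) + (nu - mu) / 2.0, (n, 1))
G = cp.multiply(K, M)                      # elementwise product K o M
S = cp.Variable((n + 1, n + 1), PSD=True)  # enforces [[K o M, v], [v^T, t]] >= 0

constraints = [cp.diag(M) == 1, S == cp.bmat([[G, v], [v.T, t]])]
for i in range(n_lab):
    for j in range(n_lab):
        constraints.append(M[i, j] == y_lab[i] * y_lab[j])

prob = cp.Problem(cp.Minimize(cp.sum(t) + C * cp.sum(mu)), constraints)
prob.solve()                               # needs an SDP-capable solver (e.g. SCS)

# sign(M[i, j]) estimates y_i * y_j, so labels are read off a labeled row.
pred = np.sign(M.value[0, n_lab:]) * y_lab[0]
print("predicted labels for unlabeled points:", pred)
```

With n + 1 by n + 1 and n by n semidefinite blocks, this relaxation carries O(n²) free parameters, which is exactly the cost the paper's O(n) formulation is designed to avoid.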

Cite

Text

Xu et al. "Efficient Convex Relaxation for Transductive Support Vector Machine." Neural Information Processing Systems, 2007.

Markdown

[Xu et al. "Efficient Convex Relaxation for Transductive Support Vector Machine." Neural Information Processing Systems, 2007.](https://mlanthology.org/neurips/2007/xu2007neurips-efficient/)

BibTeX

@inproceedings{xu2007neurips-efficient,
  title     = {{Efficient Convex Relaxation for Transductive Support Vector Machine}},
  author    = {Xu, Zenglin and Jin, Rong and Zhu, Jianke and King, Irwin and Lyu, Michael},
  booktitle = {Neural Information Processing Systems},
  year      = {2007},
  pages     = {1641--1648},
  url       = {https://mlanthology.org/neurips/2007/xu2007neurips-efficient/}
}