Deterministic Annealing for Semi-Supervised Kernel Machines
Abstract
An intuitive approach to utilizing unlabeled data in kernel-based classification algorithms is to simply treat unknown labels as additional optimization variables. For margin-based loss functions, one can view this approach as attempting to learn low-density separators. However, this is a hard optimization problem to solve in typical semi-supervised settings where unlabeled data is abundant. The popular Transductive SVM algorithm is a label-switching-retraining procedure that is known to be susceptible to local minima. In this paper, we present a global optimization framework for semi-supervised kernel machines where an easier problem is parametrically deformed to the original hard problem and minimizers are smoothly tracked. Our approach is motivated by deterministic annealing techniques and involves a sequence of convex optimization problems that are exactly and efficiently solved. We present empirical results on several synthetic and real-world datasets that demonstrate the effectiveness of our approach.
Cite
Text
Sindhwani et al. "Deterministic Annealing for Semi-Supervised Kernel Machines." International Conference on Machine Learning, 2006. doi:10.1145/1143844.1143950
Markdown
[Sindhwani et al. "Deterministic Annealing for Semi-Supervised Kernel Machines." International Conference on Machine Learning, 2006.](https://mlanthology.org/icml/2006/sindhwani2006icml-deterministic/) doi:10.1145/1143844.1143950
BibTeX
@inproceedings{sindhwani2006icml-deterministic,
title = {{Deterministic Annealing for Semi-Supervised Kernel Machines}},
author = {Sindhwani, Vikas and Keerthi, S. Sathiya and Chapelle, Olivier},
booktitle = {International Conference on Machine Learning},
year = {2006},
pages = {841-848},
doi = {10.1145/1143844.1143950},
url = {https://mlanthology.org/icml/2006/sindhwani2006icml-deterministic/}
}