Co-EM Support Vector Learning
Abstract
Multi-view algorithms, such as co-training and co-EM, utilize unlabeled data when the available attributes can be split into independent and compatible subsets. Co-EM outperforms co-training for many problems, but it requires the underlying learner to estimate class probabilities, and to learn from probabilistically labeled data. Therefore, co-EM has so far only been studied with naive Bayesian learners. We cast linear classifiers into a probabilistic framework and develop a co-EM version of the Support Vector Machine. We conduct experiments on text classification problems and compare the family of semi-supervised support vector algorithms under different conditions, including violations of the assumptions underlying multi-view learning. For some problems, such as course web page classification, we observe the most accurate results reported so far.
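The co-EM loop described in the abstract can be sketched as follows. This is a simplified illustration, not the paper's algorithm: where the authors cast an SVM into a probabilistic framework, this sketch substitutes a plain soft-label logistic regression as the probabilistic linear learner. All function names and the toy data layout are assumptions for the example. Each round trains a classifier on one attribute view and uses its class-probability estimates as soft labels for the unlabeled examples before the other view trains.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_soft(X, q, epochs=200, lr=0.5):
    # Stand-in probabilistic linear learner (soft-label logistic regression).
    # q holds probabilistic labels in [0, 1], as co-EM requires.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - q) / len(q)
    return w

def co_em(X1, X2, y_lab, n_lab, iters=10):
    # X1, X2: the two attribute views over all examples (labeled rows first).
    # y_lab: hard 0/1 labels for the first n_lab rows.
    n = len(X1)
    # Unlabeled examples start with uninformative soft labels of 0.5.
    q = np.concatenate([y_lab.astype(float), np.full(n - n_lab, 0.5)])
    views, w = [X1, X2], [None, None]
    for t in range(iters):
        v = t % 2                      # alternate between the two views
        w[v] = fit_soft(views[v], q)
        # This view's probability estimates become the peer view's soft
        # labels for the unlabeled part; true labels stay fixed.
        q_new = sigmoid(views[v] @ w[v])
        q = np.concatenate([y_lab.astype(float), q_new[n_lab:]])
    return w
```

With only a handful of labeled examples per class, the unlabeled pool's soft labels sharpen over the alternating rounds, which is the mechanism that lets co-EM exploit unlabeled data when the views are (roughly) independent and compatible.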
Cite
Text
Brefeld and Scheffer. "Co-EM Support Vector Learning." International Conference on Machine Learning, 2004. doi:10.1145/1015330.1015350
Markdown
[Brefeld and Scheffer. "Co-EM Support Vector Learning." International Conference on Machine Learning, 2004.](https://mlanthology.org/icml/2004/brefeld2004icml-co/) doi:10.1145/1015330.1015350
BibTeX
@inproceedings{brefeld2004icml-co,
title = {{Co-EM Support Vector Learning}},
author = {Brefeld, Ulf and Scheffer, Tobias},
booktitle = {International Conference on Machine Learning},
year = {2004},
doi = {10.1145/1015330.1015350},
url = {https://mlanthology.org/icml/2004/brefeld2004icml-co/}
}