The Set Covering Machine with Data-Dependent Half-Spaces
Abstract
We examine the set covering machine when it uses data-dependent half-spaces for its set of features, and we bound its generalization error in terms of the number of training errors and the number of half-spaces it achieves on the training data. We show that it provides a favorable alternative to data-dependent balls on some natural data sets. Compared to the support vector machine, the set covering machine with data-dependent half-spaces produces substantially sparser classifiers with comparable (and sometimes better) generalization. Furthermore, we show that our bound on the generalization error provides an effective guide for model selection.

Published in the Proceedings of the Twentieth International Conference on Machine Learning (ICML 2003).
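The set covering machine builds a sparse conjunction of Boolean features by greedily selecting, at each step, the feature that covers the most still-uncovered negative examples while sacrificing few positives. The sketch below illustrates that greedy set-covering step on an abstract Boolean feature matrix; it is a simplified illustration under assumed conventions (a 0/1 label vector, a precomputed feature matrix, and a `penalty` trade-off parameter), not the paper's exact procedure, and in particular it omits how the data-dependent half-space features themselves are constructed from pairs of training points.

```python
import numpy as np

def scm_conjunction(features, y, penalty=1.0, max_features=4):
    """Greedy set-covering sketch: pick Boolean features whose
    conjunction classifies y (1 = positive, 0 = negative).

    features: (n_samples, n_features) Boolean matrix; features[i, j]
        is True when feature j outputs 1 on example i.
    A feature "covers" a negative example when it outputs 0 on it.
    Usefulness = negatives newly covered - penalty * positives lost.
    (penalty and max_features are illustrative knobs, not the
    paper's exact parameterization.)
    """
    n, m = features.shape
    uncovered_neg = set(np.flatnonzero(y == 0))  # negatives still to cover
    alive_pos = set(np.flatnonzero(y == 1))      # positives not yet sacrificed
    chosen = []
    while uncovered_neg and len(chosen) < max_features:
        best, best_score = None, -np.inf
        for j in range(m):
            covers = sum(1 for i in uncovered_neg if not features[i, j])
            loses = sum(1 for i in alive_pos if not features[i, j])
            score = covers - penalty * loses
            if score > best_score:
                best, best_score = j, score
        if best is None or best_score <= 0:
            break  # no feature is still useful
        chosen.append(best)
        uncovered_neg = {i for i in uncovered_neg if features[i, best]}
        alive_pos = {i for i in alive_pos if features[i, best]}
    # Predict positive iff every chosen feature outputs 1.
    return chosen
```

Sparsity falls out directly: the loop stops as soon as the negatives are covered, so the final conjunction typically uses far fewer features than a support vector machine uses support vectors.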
Cite

Text:
Marchand et al. "The Set Covering Machine with Data-Dependent Half-Spaces." International Conference on Machine Learning, 2003.

Markdown:
[Marchand et al. "The Set Covering Machine with Data-Dependent Half-Spaces." International Conference on Machine Learning, 2003.](https://mlanthology.org/icml/2003/marchand2003icml-set/)

BibTeX:
@inproceedings{marchand2003icml-set,
title = {{The Set Covering Machine with Data-Dependent Half-Spaces}},
author = {Marchand, Mario and Shah, Mohak and Shawe-Taylor, John and Sokolova, Marina},
booktitle = {International Conference on Machine Learning},
year = {2003},
  pages = {520--527},
url = {https://mlanthology.org/icml/2003/marchand2003icml-set/}
}