Boosting and Other Machine Learning Algorithms
Abstract
On an optical character recognition problem, we compare (as a function of training set size) the performance of three neural-network-based ensemble methods (two versions of boosting and a committee of neural networks trained independently) to that of a single network. In boosting, the patterns actually used for training are a subset of all potential training patterns. Under either a fixed computational cost or a fixed training set size criterion, some version of boosting is best. We also compare (for a fixed training set size) boosting to the following algorithms: optimal margin classifiers, tangent distance, local learning, k-nearest neighbor, and a large weight-sharing network, with the boosting algorithm showing the best performance.
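As a rough illustration of the filtering idea the abstract alludes to (the patterns actually used for training are a selected subset of all potential patterns), here is a toy sketch of boosting by filtering with a three-member ensemble. Decision stumps on synthetic 1-D data stand in for the paper's neural networks on OCR images; all function names, parameters, and the data generator are illustrative assumptions, not taken from the paper.

```python
import random

def train_stump(data):
    """Fit a 1-D decision stump: predict int(x > t), optionally flipped."""
    xs = sorted({x for x, _ in data})
    candidates = [xs[0] - 1.0] + [(a + b) / 2 for a, b in zip(xs, xs[1:])]
    best = (len(data) + 1, 0.0, False)  # (errors, threshold, flip)
    for t in candidates:
        for flip in (False, True):
            err = sum((int(x > t) ^ flip) != y for x, y in data)
            if err < best[0]:
                best = (err, t, flip)
    _, t, flip = best
    return lambda x, t=t, flip=flip: int(x > t) ^ flip

def example_stream(noise, seed):
    """Endless labelled examples: x ~ U[0,1], y = [x > 0.5], with label noise."""
    rng = random.Random(seed)
    while True:
        x = rng.random()
        y = int(x > 0.5)
        if rng.random() < noise:
            y = 1 - y
        yield x, y

def boost_by_filtering(stream, n_per_net, rng):
    # Learner 1: train on the first n examples drawn from the stream.
    d1 = [next(stream) for _ in range(n_per_net)]
    h1 = train_stump(d1)
    # Learner 2: filter the stream so h1 is right on exactly half its data.
    d2 = []
    while len(d2) < n_per_net:
        want_mistake = rng.random() < 0.5
        x, y = next(stream)
        while (h1(x) != y) != want_mistake:
            x, y = next(stream)
        d2.append((x, y))
    h2 = train_stump(d2)
    # Learner 3: train only on examples where h1 and h2 disagree.
    d3 = []
    for _ in range(200_000):  # cap the draws in case disagreement is rare
        x, y = next(stream)
        if h1(x) != h2(x):
            d3.append((x, y))
            if len(d3) == n_per_net:
                break
    h3 = train_stump(d3) if d3 else h1
    # Final output: majority vote of the three learners.
    return lambda x: int(h1(x) + h2(x) + h3(x) >= 2)

stream = example_stream(noise=0.15, seed=1)
vote = boost_by_filtering(stream, n_per_net=40, rng=random.Random(2))
```

The key point of the scheme is in the filtering steps: the second learner sees a distribution on which the first is no better than chance, and the third sees only the cases where the first two conflict, so each member's training patterns are a deliberately chosen subset of the available data.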
Cite

Drucker et al. "Boosting and Other Machine Learning Algorithms." International Conference on Machine Learning, 1994. doi:10.1016/B978-1-55860-335-6.50015-5

BibTeX
@inproceedings{drucker1994icml-boosting,
title = {{Boosting and Other Machine Learning Algorithms}},
author = {Drucker, Harris and Cortes, Corinna and Jackel, Lawrence D. and LeCun, Yann and Vapnik, Vladimir},
booktitle = {International Conference on Machine Learning},
year = {1994},
  pages = {53--61},
doi = {10.1016/B978-1-55860-335-6.50015-5},
url = {https://mlanthology.org/icml/1994/drucker1994icml-boosting/}
}