Minimal Cost Complexity Pruning of Meta-Classifiers
Abstract
Integrating multiple learned classification models (classifiers) computed over large and (physically) distributed data sets has been demonstrated as an effective approach to scaling inductive learning techniques, while also boosting the accuracy of individual classifiers. These gains, however, come at the expense of an increased demand for run-time system resources. The final ensemble meta-classifier may consist of a large collection of base classifiers that require substantial memory and slow classification throughput. To classify unlabeled instances, predictions must be generated from all base classifiers before the meta-classifier can produce its final classification. The throughput (prediction rate) of a meta-classifier is of significant importance in real-time systems,
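The throughput cost the abstract describes can be seen in a minimal sketch of meta-classification (stacking). Everything here is illustrative, not from the paper: the base classifiers are simple threshold functions, and the meta-classifier is a plain majority vote.

```python
# Minimal sketch of meta-classification (stacking). All names here
# (majority_meta, meta_classify, bases) are hypothetical, not from the paper.

def majority_meta(predictions):
    """A trivial meta-classifier: majority vote over base predictions."""
    return max(set(predictions), key=predictions.count)

def meta_classify(instance, base_classifiers, meta=majority_meta):
    # Every base classifier must run before the meta-classifier can answer.
    # This is the throughput cost the abstract refers to: classification
    # time grows with the number of base classifiers retained.
    predictions = [clf(instance) for clf in base_classifiers]
    return meta(predictions)

# Hypothetical base classifiers over a single numeric feature.
bases = [
    lambda x: 1 if x[0] > 0.5 else 0,
    lambda x: 1 if x[0] > 0.3 else 0,
    lambda x: 1 if x[0] > 0.7 else 0,
]

print(meta_classify([0.6], bases))  # two of three base classifiers vote 1
```

Pruning the ensemble, as the paper's title suggests, shrinks `base_classifiers`, directly reducing both memory use and per-instance prediction latency.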
Cite
Text

Prodromidis and Stolfo. "Minimal Cost Complexity Pruning of Meta-Classifiers." AAAI Conference on Artificial Intelligence, 1999.

BibTeX
@inproceedings{prodromidis1999aaai-minimal,
title = {{Minimal Cost Complexity Pruning of Meta-Classifiers}},
author = {Prodromidis, Andreas L. and Stolfo, Salvatore J.},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {1999},
pages = {979},
url = {https://mlanthology.org/aaai/1999/prodromidis1999aaai-minimal/}
}