Feature-Cost Sensitive Learning with Submodular Trees of Classifiers

Abstract

During the past decade, machine learning algorithms have become commonplace in large-scale real-world industrial applications. In these settings, the computation time to train and test machine learning algorithms is a key consideration. At training time, the algorithms must scale to very large data set sizes. At testing time, the cost of feature extraction can dominate the CPU runtime. Recently, a promising method was proposed to account for the feature extraction cost at testing time, called Cost-sensitive Tree of Classifiers (CSTC). Although the CSTC problem is NP-hard, the authors suggest an approximation through a mixed-norm relaxation across many classifiers. This relaxation is slow to train and requires involved tuning of optimization hyperparameters. We propose a different relaxation using approximate submodularity, called Approximately Submodular Tree of Classifiers (ASTC). ASTC is much simpler to implement, yields equivalent results, requires no optimization hyperparameter tuning, and is up to two orders of magnitude faster to train.
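For intuition, the practical payoff of (approximate) submodularity is that a simple greedy procedure comes with approximation guarantees, so training reduces to repeatedly picking the feature with the best marginal gain per unit cost. The sketch below illustrates this generic budgeted greedy idea; it is not the paper's exact ASTC algorithm, and `gain`, `costs`, and `budget` are hypothetical placeholders.

```python
def greedy_budgeted_selection(gain, costs, budget):
    """Greedily pick features maximizing marginal gain per unit cost,
    subject to a total feature-extraction budget.

    gain(S) -> float : value of a feature subset S (a monotone,
                       approximately submodular set function)
    costs            : per-feature extraction costs
    budget           : total feature cost allowed at test time
    """
    selected, spent = set(), 0.0
    current = gain(selected)
    while True:
        best, best_ratio = None, 0.0
        for f in range(len(costs)):
            if f in selected or spent + costs[f] > budget:
                continue
            # Benefit-to-cost ratio of adding feature f.
            ratio = (gain(selected | {f}) - current) / costs[f]
            if ratio > best_ratio:
                best, best_ratio = f, ratio
        if best is None:  # nothing affordable improves the objective
            break
        selected.add(best)
        spent += costs[best]
        current = gain(selected)
    return selected


# Toy usage: gain counts the distinct "informative groups" covered,
# a classic monotone submodular function.
groups = [{0}, {0, 1}, {2}, {1, 2}]
gain = lambda S: len(set().union(*(groups[f] for f in S)))
print(greedy_budgeted_selection(gain, costs=[1.0, 2.0, 1.0, 3.0], budget=3.0))
# -> {0, 2}
```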

Cite

Text

Kusner et al. "Feature-Cost Sensitive Learning with Submodular Trees of Classifiers." AAAI Conference on Artificial Intelligence, 2014. doi:10.1609/AAAI.V28I1.8967

Markdown

[Kusner et al. "Feature-Cost Sensitive Learning with Submodular Trees of Classifiers." AAAI Conference on Artificial Intelligence, 2014.](https://mlanthology.org/aaai/2014/kusner2014aaai-feature/) doi:10.1609/AAAI.V28I1.8967

BibTeX

@inproceedings{kusner2014aaai-feature,
  title     = {{Feature-Cost Sensitive Learning with Submodular Trees of Classifiers}},
  author    = {Kusner, Matt J. and Chen, Wenlin and Zhou, Quan and Xu, Zhixiang Eddie and Weinberger, Kilian Q. and Chen, Yixin},
  booktitle = {AAAI Conference on Artificial Intelligence},
  year      = {2014},
  pages     = {1939--1945},
  doi       = {10.1609/AAAI.V28I1.8967},
  url       = {https://mlanthology.org/aaai/2014/kusner2014aaai-feature/}
}