PAB-Decisions for Boolean and Real-Valued Features
Abstract
In this paper, we investigate the problem of classifying objects that are given by feature vectors with Boolean or real-valued entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions: in the Boolean case we look at the k-th order Bahadur-Lazarsfeld expansions and k-th order Chow expansions, and in the continuous case at the class of normal distributions. In all cases, we obtain polynomial upper bounds for the required sample size. The bounds for the Boolean case improve and extend results from [FPS91].
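As an illustrative sketch (not taken from the paper), the plug-in approach the abstract refers to can be shown for the simplest Boolean setting: estimate the class priors and per-coordinate marginals from labeled samples, and classify by comparing the resulting empirical discriminants. This corresponds to the independence model, i.e. the first-order case of the Bahadur-Lazarsfeld expansion; all function names below are hypothetical.

```python
import math

def fit_plugin_classifier(samples, labels, d):
    """Estimate class priors and per-coordinate marginals P(x_i = 1 | class)
    from labeled Boolean samples (first-order / independence model)."""
    stats = {}
    for c in (0, 1):
        members = [s for s, l in zip(samples, labels) if l == c]
        n = len(members)
        # Laplace smoothing keeps empirical probabilities away from 0 and 1.
        p = [(sum(x[i] for x in members) + 1) / (n + 2) for i in range(d)]
        prior = (n + 1) / (len(samples) + 2)
        stats[c] = (prior, p)
    return stats

def discriminant(x, prior, p):
    """Logarithm of the plug-in estimate of P(class) * P(x | class)."""
    g = math.log(prior)
    for xi, pi in zip(x, p):
        g += math.log(pi if xi else 1.0 - pi)
    return g

def classify(x, stats):
    """Pick the class with the larger empirical discriminant."""
    return max((0, 1), key=lambda c: discriminant(x, *stats[c]))

# Toy data: class 0 is mostly all-zeros vectors, class 1 mostly all-ones.
samples = [(0, 0, 0)] * 40 + [(1, 0, 0)] * 10 + [(1, 1, 1)] * 40 + [(0, 1, 1)] * 10
labels = [0] * 50 + [1] * 50
stats = fit_plugin_classifier(samples, labels, 3)
```

The sample-size bounds discussed in the paper quantify how many such examples suffice for the plug-in discriminants to be probably almost optimal; higher-order expansions add correlation terms to the independence model sketched here.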
Cite
Text
Anoulova et al. "PAB-Decisions for Boolean and Real-Valued Features." Annual Conference on Computational Learning Theory, 1992. doi:10.1145/130385.130425

Markdown
[Anoulova et al. "PAB-Decisions for Boolean and Real-Valued Features." Annual Conference on Computational Learning Theory, 1992.](https://mlanthology.org/colt/1992/anoulova1992colt-pab/) doi:10.1145/130385.130425

BibTeX
@inproceedings{anoulova1992colt-pab,
title = {{PAB-Decisions for Boolean and Real-Valued Features}},
author = {Anoulova, Svetlana and Fischer, Paul and Pölt, Stefan and Simon, Hans Ulrich},
booktitle = {Annual Conference on Computational Learning Theory},
year = {1992},
pages = {353-362},
doi = {10.1145/130385.130425},
url = {https://mlanthology.org/colt/1992/anoulova1992colt-pab/}
}