Local Algorithms for Pattern Recognition and Dependencies Estimation

Abstract

In previous publications (Bottou and Vapnik 1992; Vapnik 1992) we described local learning algorithms, which result in performance improvements for real problems. Here we present the theoretical framework on which these algorithms are based. First, we give a new statement of certain learning problems, namely local risk minimization. We review the basic results of the uniform convergence theory of learning and extend them to local risk minimization. We also extend the structural risk minimization principle to both pattern recognition and regression problems. This extended induction principle is the basis for a new class of algorithms.
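
As a rough guide to what "local risk minimization" refers to, the following sketch states the functional in the form it is usually presented; the notation (loss L, locality kernel K with width beta, point of interest x_0) is assumed here for illustration and is not quoted from the paper itself.

% Local risk at a point of interest x_0: the ordinary risk functional
% re-weighted by a non-negative locality kernel K(x, x_0; \beta),
% normalized so the weights integrate to one under P(x).
R(\alpha, \beta; x_0)
  = \int L\bigl(y, f(x, \alpha)\bigr)\,
        \frac{K(x, x_0; \beta)}{\lVert K(x_0, \beta) \rVert}\, dP(x, y),
\qquad
\lVert K(x_0, \beta) \rVert = \int K(x, x_0; \beta)\, dP(x).

For each point of interest x_0, one then minimizes this weighted risk over the parameters \alpha (and possibly the locality \beta), rather than minimizing a single global risk over the whole input space.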

Cite

Text

Vapnik and Bottou. "Local Algorithms for Pattern Recognition and Dependencies Estimation." Neural Computation, 1993. doi:10.1162/NECO.1993.5.6.893

Markdown

[Vapnik and Bottou. "Local Algorithms for Pattern Recognition and Dependencies Estimation." Neural Computation, 1993.](https://mlanthology.org/neco/1993/vapnik1993neco-local/) doi:10.1162/NECO.1993.5.6.893

BibTeX

@article{vapnik1993neco-local,
  title     = {{Local Algorithms for Pattern Recognition and Dependencies Estimation}},
  author    = {Vapnik, Vladimir and Bottou, Léon},
  journal   = {Neural Computation},
  year      = {1993},
  pages     = {893--909},
  doi       = {10.1162/NECO.1993.5.6.893},
  volume    = {5},
  number    = {6},
  url       = {https://mlanthology.org/neco/1993/vapnik1993neco-local/}
}