Nearly Optimal Classification for Semimetrics

Abstract

We initiate the rigorous study of classification in semimetric spaces, which are point sets with a distance function that is non-negative and symmetric, but need not satisfy the triangle inequality. For metric spaces, the doubling dimension essentially characterizes both the runtime and sample complexity of classification algorithms --- yet we show that this is not the case for semimetrics. Instead, we define the {\em density dimension} and discover that it plays a central role in the statistical and algorithmic feasibility of learning in semimetric spaces. We present nearly optimal sample compression algorithms and use these to obtain generalization guarantees, including fast rates. The latter hold for general sample compression schemes and may be of independent interest.
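
As a quick formal sketch, the axioms the abstract describes can be written out explicitly. The block below restates them in LaTeX; the squared-Euclidean example at the end is a standard illustration of a semimetric that violates the triangle inequality, and is our own addition rather than an example drawn from the paper.

% A semimetric on a set X, as described in the abstract:
\[
\begin{aligned}
& d : X \times X \to [0, \infty), \\
& d(x, y) \ge 0 && \text{(non-negativity)}, \\
& d(x, y) = d(y, x) && \text{(symmetry)}, \\
& d(x, z) \le d(x, y) + d(y, z) && \text{(triangle inequality; \emph{not} required)}.
\end{aligned}
\]

For instance, $d(x, y) = \lVert x - y \rVert^2$ on the real line is non-negative and symmetric, yet $d(0, 2) = 4 > 2 = d(0, 1) + d(1, 2)$, so the triangle inequality fails.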

Cite

Text

Gottlieb et al. "Nearly Optimal Classification for Semimetrics." International Conference on Artificial Intelligence and Statistics, 2016.

Markdown

[Gottlieb et al. "Nearly Optimal Classification for Semimetrics." International Conference on Artificial Intelligence and Statistics, 2016.](https://mlanthology.org/aistats/2016/gottlieb2016aistats-nearly/)

BibTeX

@inproceedings{gottlieb2016aistats-nearly,
  title     = {{Nearly Optimal Classification for Semimetrics}},
  author    = {Gottlieb, Lee-Ad and Kontorovich, Aryeh and Nisnevitch, Pinhas},
  booktitle = {International Conference on Artificial Intelligence and Statistics},
  year      = {2016},
  pages     = {379--388},
  url       = {https://mlanthology.org/aistats/2016/gottlieb2016aistats-nearly/}
}