An Algorithm for L1 Nearest Neighbor Search via Monotonic Embedding

Abstract

Fast algorithms for nearest neighbor (NN) search have in large part focused on L2 distance. Here we develop an approach for L1 distance that begins with an explicit and exact embedding of the points into L2. We show how this embedding can be combined efficiently with random projection methods for L2 NN search, such as locality-sensitive hashing or random projection trees. We rigorously establish the correctness of the methodology and show experimentally that it is competitive in practice with available alternatives.
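The flavor of an exact, monotonic L1-to-L2 embedding can be illustrated with the classical unary (indicator) embedding for nonnegative integer coordinates; this is a minimal sketch of that standard construction, not necessarily the one used in the paper. Mapping each coordinate t ∈ {0, …, M} to a binary vector of t ones followed by M − t zeros makes the squared L2 distance between embedded points equal the L1 distance between the originals, so L2 distance is a monotonic (square-root) function of L1 distance and L1 nearest neighbors are preserved:

```python
import numpy as np

def unary_embed(x, M):
    """Map each nonnegative integer coordinate t <= M to a length-M
    binary vector (t ones, then zeros), concatenated across coordinates."""
    x = np.asarray(x, dtype=int)
    return np.concatenate([(np.arange(M) < t).astype(float) for t in x])

# Squared L2 distance of the embeddings equals L1 distance of the originals:
# the embedded vectors differ in exactly |x_i - y_i| positions per coordinate.
x, y = [3, 7, 0], [5, 2, 4]
ex, ey = unary_embed(x, 8), unary_embed(y, 8)
l1 = sum(abs(a - b) for a, b in zip(x, y))    # 2 + 5 + 4 = 11
l2_sq = float(np.sum((ex - ey) ** 2))         # also 11
```

Note that this explicit embedding blows the dimension up from d to dM, which is presumably why pairing the embedding with random projection methods (LSH, random projection trees) is essential; the sketch only illustrates the exactness and monotonicity properties the abstract refers to.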

Cite

Text

Wang and Dasgupta. "An Algorithm for L1 Nearest Neighbor Search via Monotonic Embedding." Neural Information Processing Systems, 2016.

Markdown

[Wang and Dasgupta. "An Algorithm for L1 Nearest Neighbor Search via Monotonic Embedding." Neural Information Processing Systems, 2016.](https://mlanthology.org/neurips/2016/wang2016neurips-algorithm/)

BibTeX

@inproceedings{wang2016neurips-algorithm,
  title     = {{An Algorithm for L1 Nearest Neighbor Search via Monotonic Embedding}},
  author    = {Wang, Xinan and Dasgupta, Sanjoy},
  booktitle = {Neural Information Processing Systems},
  year      = {2016},
  pages     = {983--991},
  url       = {https://mlanthology.org/neurips/2016/wang2016neurips-algorithm/}
}