Nonlinear Estimators and Tail Bounds for Dimension Reduction in L1 Using Cauchy Random Projections
Abstract
For dimension reduction in the l1 norm, the method of Cauchy random projections multiplies the original data matrix A ∈ ℝn×D with a random matrix R ∈ ℝD×k (k ≪ D) whose entries are i.i.d. samples of the standard Cauchy distribution C(0,1). Because of a known impossibility result for l1, one cannot hope to recover the pairwise l1 distances in A from B = A×R ∈ ℝn×k using linear estimators without incurring large errors. However, nonlinear estimators are still useful for certain applications in data stream computations, information retrieval, learning, and data mining.
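The projection described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's full method: it uses the fact that a Cauchy projection of the difference of two rows is Cauchy-distributed with scale equal to their l1 distance, so the sample median of absolute projected differences (one of the simplest nonlinear estimators) recovers that distance approximately. Dimensions and data here are arbitrary assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
D, k = 10_000, 500          # original and projected dimensions (k << D)
u = rng.standard_normal(D)  # two example data vectors (rows of A)
v = rng.standard_normal(D)

# Cauchy random projection matrix R with i.i.d. C(0,1) entries
R = rng.standard_cauchy((D, k))
bu, bv = u @ R, v @ R       # projected rows of B = A x R

# Each entry of (bu - bv) is Cauchy with scale ||u - v||_1,
# so the median of absolute values is a nonlinear estimator of it.
d_true = np.sum(np.abs(u - v))
d_est = np.median(np.abs(bu - bv))
print(d_true, d_est)
```

With k = 500 projections, the median estimator is typically within a few percent of the true l1 distance; a simple linear estimator (e.g., the sample mean) would fail here because the Cauchy distribution has no finite mean.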
Cite

Text:

Li et al. "Nonlinear Estimators and Tail Bounds for Dimension Reduction in L1 Using Cauchy Random Projections." Journal of Machine Learning Research, 2007.

Markdown:

[Li et al. "Nonlinear Estimators and Tail Bounds for Dimension Reduction in L1 Using Cauchy Random Projections." Journal of Machine Learning Research, 2007.](https://mlanthology.org/jmlr/2007/li2007jmlr-nonlinear/)

BibTeX:
@article{li2007jmlr-nonlinear,
title = {{Nonlinear Estimators and Tail Bounds for Dimension Reduction in L1 Using Cauchy Random Projections}},
author = {Li, Ping and Hastie, Trevor J. and Church, Kenneth W.},
journal = {Journal of Machine Learning Research},
year = {2007},
pages = {2497-2532},
volume = {8},
url = {https://mlanthology.org/jmlr/2007/li2007jmlr-nonlinear/}
}