Optimal Bounds for Johnson-Lindenstrauss Transformations
Abstract
In 1984, Johnson and Lindenstrauss proved that any finite set of data in a high-dimensional space can be projected into a lower-dimensional space while preserving the pairwise Euclidean distances between points up to a bounded relative error. If the desired dimension of the image is too small, however, Kane, Meka, and Nelson (2011) and Jayram and Woodruff (2013) proved that no such projection exists. In this paper, we provide a precise asymptotic threshold for the dimension of the image: above the threshold, a projection preserving pairwise Euclidean distances exists, while below it, no such projection exists.
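The classical result the abstract refers to can be illustrated with a random Gaussian projection. The sketch below is a minimal NumPy example, not the construction from this paper: it uses the standard target dimension k = 4 ln(n) / (ε²/2 − ε³/3) from the original Johnson-Lindenstrauss argument (the paper's contribution is a sharp asymptotic threshold for this dimension), and then measures the worst relative error in pairwise squared distances.

```python
import numpy as np

def jl_project(X, eps, rng):
    """Project the rows of X to k dimensions via a random Gaussian map,
    with k taken from the classical JL bound (an assumption here; the
    paper establishes the optimal asymptotic threshold)."""
    n, d = X.shape
    k = int(np.ceil(4 * np.log(n) / (eps**2 / 2 - eps**3 / 3)))
    # Entries i.i.d. N(0, 1/k), so squared lengths are preserved in expectation.
    P = rng.standard_normal((d, k)) / np.sqrt(k)
    return X @ P

def max_distortion(X, Y):
    """Largest relative error in pairwise squared Euclidean distances
    between the original points X and their projections Y."""
    def sq_dists(A):
        s = (A * A).sum(axis=1)
        return s[:, None] + s[None, :] - 2.0 * (A @ A.T)
    dx, dy = sq_dists(X), sq_dists(Y)
    iu = np.triu_indices(X.shape[0], k=1)  # each unordered pair once
    return float(np.max(np.abs(dy[iu] - dx[iu]) / dx[iu]))

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 1000))   # 50 random points in R^1000
Y = jl_project(X, eps=0.5, rng=rng)   # here k = 188 dimensions
print(Y.shape, max_distortion(X, Y))
```

Note that the guarantee is probabilistic: a single random draw preserves all pairwise distances within the target error only with high probability, so in practice one may redraw the projection if the observed distortion is too large.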
Cite
Burr et al. "Optimal Bounds for Johnson-Lindenstrauss Transformations." Journal of Machine Learning Research, 2018.
@article{burr2018jmlr-optimal,
title = {{Optimal Bounds for Johnson-Lindenstrauss Transformations}},
author = {Burr, Michael and Gao, Shuhong and Knoll, Fiona},
journal = {Journal of Machine Learning Research},
year = {2018},
pages = {1--22},
volume = {19},
url = {https://mlanthology.org/jmlr/2018/burr2018jmlr-optimal/}
}