Non-Oblivious Performance of Random Projections
Abstract
Random projections are a cornerstone of high-dimensional computation. However, their analysis has proven both difficult and inadequate in capturing the empirically observed accuracy. To bridge this gap, this paper studies random projections from a novel perspective, focusing on data-dependent, that is, \emph{non-oblivious}, performance. The key contribution is a precise, data-dependent accuracy analysis of Rademacher random projections, achieved through elegant geometric methods of independent interest based on \emph{Schur-concavity}. The result formally establishes the following property: the less spread-out the data is, the better the accuracy. This yields notable improvements in accuracy guarantees for data characterized by sparsity or distributed with small spread. The key tool is a novel algebraic framework for proving Schur-concavity properties, which offers an alternative to the derivative-based criteria commonly used in related studies. We demonstrate its value by giving an alternative proof of an extension of the celebrated Khintchine inequality.
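The data-dependence the abstract describes can be seen numerically. The following minimal sketch (not from the paper; dimensions, vector choices, and function names are illustrative) projects two unit vectors with a Rademacher matrix, one maximally concentrated (sparse) and one maximally spread-out, and compares the average squared-norm distortion:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, trials = 200, 20, 200  # ambient dimension, target dimension, repetitions

def rademacher_project(x, k, rng):
    """Project x with a k x d matrix of i.i.d. +-1 entries, scaled so that
    E ||Rx||^2 = ||x||^2 (an unbiased estimate of the squared norm)."""
    R = rng.choice([-1.0, 1.0], size=(k, x.shape[0]))
    return (R @ x) / np.sqrt(k)

def mean_sq_norm_error(x, k, trials, rng):
    # Average |  ||Rx||^2 - ||x||^2 | over fresh projections of a unit vector x.
    return np.mean([abs(np.linalg.norm(rademacher_project(x, k, rng)) ** 2 - 1.0)
                    for _ in range(trials)])

sparse = np.zeros(d); sparse[0] = 1.0  # maximally concentrated unit vector
spread = np.ones(d) / np.sqrt(d)       # maximally spread-out unit vector

# A basis vector is preserved exactly (each +-1 entry squares to 1),
# while the spread-out vector incurs strictly positive distortion.
print(mean_sq_norm_error(sparse, k, trials, rng))  # 0.0
print(mean_sq_norm_error(spread, k, trials, rng))
```

The sparse case is error-free because $\|Re_1\|^2 = \frac{1}{k}\sum_j R_{j1}^2 = 1$ deterministically; more generally, the distortion variance shrinks as the mass of the squared coordinates concentrates, which is the direction of the Schur-concavity result.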
Cite
Text
Skorski and Temperoni. "Non-Oblivious Performance of Random Projections." Proceedings of the 16th Asian Conference on Machine Learning, 2024.
Markdown
[Skorski and Temperoni. "Non-Oblivious Performance of Random Projections." Proceedings of the 16th Asian Conference on Machine Learning, 2024.](https://mlanthology.org/acml/2024/skorski2024acml-nonoblivious/)
BibTeX
@inproceedings{skorski2024acml-nonoblivious,
title = {{Non-Oblivious Performance of Random Projections}},
author = {Skorski, Maciej and Temperoni, Alessandro},
booktitle = {Proceedings of the 16th Asian Conference on Machine Learning},
year = {2024},
pages = {1128--1143},
volume = {260},
url = {https://mlanthology.org/acml/2024/skorski2024acml-nonoblivious/}
}