Linear Regression Without Correspondence
Abstract
This article considers algorithmic and statistical aspects of linear regression when the correspondence between the covariates and the responses is unknown. First, a fully polynomial-time approximation scheme is given for the natural least squares optimization problem in any constant dimension. Next, in an average-case and noise-free setting where the responses exactly correspond to a linear function of i.i.d. draws from a standard multivariate normal distribution, an efficient algorithm based on lattice basis reduction is shown to exactly recover the unknown linear function in arbitrary dimension. Finally, lower bounds on the signal-to-noise ratio are established for approximate recovery of the unknown linear function by any estimator.
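To make the problem setup concrete, here is a minimal brute-force sketch (not the paper's algorithm) of the least squares objective when the row correspondence is unknown: one minimizes over all permutations of the responses jointly with the regression weights. This is only feasible for tiny sample sizes; the function and variable names are illustrative, not from the paper.

```python
import itertools
import numpy as np

def unlabeled_least_squares(X, y):
    """Brute-force search: min over permutations pi of min_w ||y - X[pi] w||^2.

    Exponential in n; serves only to illustrate the objective the paper's
    polynomial-time approximation scheme targets.
    """
    n, d = X.shape
    best_err, best_pi, best_w = np.inf, None, None
    for pi in itertools.permutations(range(n)):
        Xp = X[list(pi)]
        # Ordinary least squares for this candidate correspondence.
        w, _, _, _ = np.linalg.lstsq(Xp, y, rcond=None)
        err = float(np.sum((Xp @ w - y) ** 2))
        if err < best_err:
            best_err, best_pi, best_w = err, np.array(pi), w
    return best_err, best_pi, best_w

# Noise-free planted instance, mirroring the paper's average-case setting:
# covariates are i.i.d. standard normal and responses are a shuffled
# linear function of them, so exact recovery is possible.
rng = np.random.default_rng(0)
X = rng.standard_normal((6, 2))
w_true = np.array([1.5, -0.5])
perm = rng.permutation(6)
y = (X @ w_true)[perm]  # responses observed in shuffled order
err, pi_hat, w_hat = unlabeled_least_squares(X, y)
```

In the noise-free case the optimal objective value is zero and the recovered weights match `w_true`; with noisy responses, the signal-to-noise lower bounds in the paper limit when any estimator can approximately recover the weights.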
Cite

Text:
Hsu et al. "Linear Regression Without Correspondence." Neural Information Processing Systems, 2017.

Markdown:
[Hsu et al. "Linear Regression Without Correspondence." Neural Information Processing Systems, 2017.](https://mlanthology.org/neurips/2017/hsu2017neurips-linear/)

BibTeX:
@inproceedings{hsu2017neurips-linear,
title = {{Linear Regression Without Correspondence}},
author = {Hsu, Daniel J. and Shi, Kevin and Sun, Xiaorui},
booktitle = {Neural Information Processing Systems},
year = {2017},
pages = {1531--1540},
url = {https://mlanthology.org/neurips/2017/hsu2017neurips-linear/}
}