Subspace Learning with Partial Information
Abstract
The goal of subspace learning is to find a $k$-dimensional subspace of $\mathbb{R}^d$, such that the expected squared distance between instance vectors and the subspace is as small as possible. In this paper we study subspace learning in a partial information setting, in which the learner can only observe $r \le d$ attributes from each instance vector. We propose several efficient algorithms for this task, and analyze their sample complexity.
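The abstract's objective can be made concrete with a small sketch. The code below simulates the partial-information setting (each instance reveals only $r$ of its $d$ attributes, chosen uniformly at random), forms an unbiased estimate of the covariance matrix by rescaling the masked second moments, and takes its top-$k$ eigenvectors as the learned subspace. This is a standard inverse-probability-weighting construction for illustration, not necessarily the paper's algorithm, and all parameter values are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n, r = 10, 3, 5000, 4   # ambient dim, subspace dim, samples, observed attributes

# Synthetic data concentrated near a random k-dimensional subspace.
basis = np.linalg.qr(rng.standard_normal((d, k)))[0]
X = rng.standard_normal((n, k)) @ basis.T + 0.1 * rng.standard_normal((n, d))

# Partial information: each instance reveals only r of its d attributes,
# chosen uniformly at random; the rest are zeroed out.
masks = np.zeros((n, d))
for i in range(n):
    masks[i, rng.choice(d, size=r, replace=False)] = 1.0
X_obs = X * masks

# Unbiased covariance estimate from the partial observations: the product
# x_i * x_j survives masking with probability r/d on the diagonal and
# r(r-1)/(d(d-1)) off the diagonal, so rescale each entry accordingly.
S = X_obs.T @ X_obs / n
C = S * (d * (d - 1)) / (r * (r - 1))
np.fill_diagonal(C, np.diag(S) * d / r)

# Learned subspace: top-k eigenvectors of the estimated covariance.
U = np.linalg.eigh(C)[1][:, -k:]

# The objective, evaluated on the full vectors: expected squared distance
# from an instance to the learned subspace.
resid = X - (X @ U) @ U.T
avg_sq_dist = np.mean(np.sum(resid**2, axis=1))
```

With these settings the learned subspace nearly captures the planted one, so `avg_sq_dist` stays close to the noise floor, far below the residual of a random $k$-dimensional subspace.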
Cite
Text
Gonen et al. "Subspace Learning with Partial Information." Journal of Machine Learning Research, 2016.
Markdown
[Gonen et al. "Subspace Learning with Partial Information." Journal of Machine Learning Research, 2016.](https://mlanthology.org/jmlr/2016/gonen2016jmlr-subspace/)
BibTeX
@article{gonen2016jmlr-subspace,
title = {{Subspace Learning with Partial Information}},
author = {Gonen, Alon and Rosenbaum, Dan and Eldar, Yonina C. and Shalev-Shwartz, Shai},
journal = {Journal of Machine Learning Research},
year = {2016},
pages = {1-21},
volume = {17},
url = {https://mlanthology.org/jmlr/2016/gonen2016jmlr-subspace/}
}