Fixed-Parameter and Approximation Algorithms for PCA with Outliers

Abstract

PCA with Outliers is the fundamental problem of identifying an underlying low-dimensional subspace in a data set corrupted with outliers. A large body of work is devoted to the information-theoretic aspects of this problem. However, from the computational perspective, its complexity is still not well understood. We study this problem from the perspective of parameterized complexity by investigating how parameters such as the dimension of the data, the subspace dimension, the number of outliers and their structure, and the approximation error influence the computational complexity of the problem. Our algorithmic methods are based on techniques from randomized linear algebra and algebraic geometry.
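For context, one standard way to formalize the problem (the notation below is illustrative and follows the usual robust-PCA conventions rather than being quoted from the paper): given points x_1, ..., x_n in R^d and integers r and k, find an r-dimensional linear subspace L and a set O of at most k outliers whose removal minimizes the total squared distance of the remaining points to L,

\min_{\substack{O \subseteq \{1,\dots,n\} \\ |O| \le k}} \;\; \min_{\dim(L) \le r} \;\; \sum_{i \notin O} \operatorname{dist}(x_i, L)^2 .

With k = 0 this reduces to ordinary PCA, which is solvable exactly via the singular value decomposition; the discrete choice of the outlier set is what makes the general problem computationally hard and motivates the parameterized and approximation algorithms studied here.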

Cite

Text

Dahiya et al. "Fixed-Parameter and Approximation Algorithms for PCA with Outliers." International Conference on Machine Learning, 2021.

Markdown

[Dahiya et al. "Fixed-Parameter and Approximation Algorithms for PCA with Outliers." International Conference on Machine Learning, 2021.](https://mlanthology.org/icml/2021/dahiya2021icml-fixedparameter/)

BibTeX

@inproceedings{dahiya2021icml-fixedparameter,
  title     = {{Fixed-Parameter and Approximation Algorithms for PCA with Outliers}},
  author    = {Dahiya, Yogesh and Fomin, Fedor and Panolan, Fahad and Simonov, Kirill},
  booktitle = {International Conference on Machine Learning},
  year      = {2021},
  pages     = {2341--2351},
  volume    = {139},
  url       = {https://mlanthology.org/icml/2021/dahiya2021icml-fixedparameter/}
}