Principal Component Analysis in the Local Differential Privacy Model
Abstract
In this paper, we study the Principal Component Analysis (PCA) problem under the (distributed) non-interactive local differential privacy model. For the low-dimensional case, we show the optimal rate for the private minimax risk of k-dimensional PCA, using the squared subspace distance as the measure of error. For the high-dimensional row-sparse case, we first give a lower bound on the private minimax risk, and then provide an efficient algorithm that achieves a near-optimal upper bound. Experiments on both synthetic and real-world datasets confirm the theoretical guarantees of our algorithms.
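To make the setting concrete, below is a minimal, illustrative sketch of non-interactive locally private PCA in Python. It is not the paper's algorithm: each user simply perturbs their rank-one covariance contribution with Gaussian noise (a generic Gaussian-mechanism-style randomizer; the function names, the unit-norm assumption on the data, and the noise constants are all illustrative assumptions), the server aggregates the noisy reports and takes the top-k eigenvectors, and the error is measured with the squared subspace distance mentioned in the abstract.

```python
# Illustrative sketch only: a generic Gaussian-perturbation approach to
# non-interactive LDP PCA, NOT the algorithm proposed in the paper.
import numpy as np

def ldp_pca(X, k, epsilon, delta=1e-5):
    """X: (n, d) data with rows assumed to satisfy ||x_i||_2 <= 1 (assumption)."""
    n, d = X.shape
    # Noise scale for releasing x_i x_i^T; the constant is illustrative, not tight.
    sigma = 2.0 * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    cov = np.zeros((d, d))
    for x in X:
        noise = np.random.normal(0.0, sigma, size=(d, d))
        noise = (noise + noise.T) / 2.0      # keep each user's report symmetric
        cov += np.outer(x, x) + noise        # each user sends one noisy matrix
    cov /= n
    # Top-k eigenvectors of the aggregated noisy covariance estimate.
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, np.argsort(eigvals)[::-1][:k]]

def squared_subspace_distance(V, V_hat):
    """|| V V^T - V_hat V_hat^T ||_F^2, the subspace error measure."""
    P, P_hat = V @ V.T, V_hat @ V_hat.T
    return np.linalg.norm(P - P_hat, ord="fro") ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 10))
    # Rescale rows so the unit-norm assumption holds.
    X /= np.maximum(np.linalg.norm(X, axis=1, keepdims=True), 1.0)
    V_true = np.linalg.eigh(X.T @ X / len(X))[1][:, -3:]
    V_priv = ldp_pca(X, k=3, epsilon=1.0)
    print(squared_subspace_distance(V_true, V_priv))
```

In this sketch each user communicates once and never reveals raw data, which is the non-interactive local model described in the abstract; the paper's contribution is characterizing how small the subspace error can be made under that constraint.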
Cite
Text
Wang and Xu. "Principal Component Analysis in the Local Differential Privacy Model." International Joint Conference on Artificial Intelligence, 2019. doi:10.24963/IJCAI.2019/666
Markdown
[Wang and Xu. "Principal Component Analysis in the Local Differential Privacy Model." International Joint Conference on Artificial Intelligence, 2019.](https://mlanthology.org/ijcai/2019/wang2019ijcai-principal/) doi:10.24963/IJCAI.2019/666
BibTeX
@inproceedings{wang2019ijcai-principal,
title = {{Principal Component Analysis in the Local Differential Privacy Model}},
author = {Wang, Di and Xu, Jinhui},
booktitle = {International Joint Conference on Artificial Intelligence},
year = {2019},
  pages = {4795--4801},
doi = {10.24963/IJCAI.2019/666},
url = {https://mlanthology.org/ijcai/2019/wang2019ijcai-principal/}
}