FLIP: A Utility Preserving Privacy Mechanism for Time Series
Abstract
Guaranteeing privacy in released data is an important goal for data-producing agencies. There has been extensive research on developing suitable privacy mechanisms in recent years. Particularly notable is the idea of noise addition with the guarantee of differential privacy. There are, however, concerns about compromising data utility when very stringent privacy mechanisms are applied. Such compromises can be quite stark in correlated data, such as time series data. Adding white noise to a stochastic process may significantly change the correlation structure, a facet of the process that is essential to optimal prediction. We propose the use of all-pass filtering as a privacy mechanism for regularly sampled time series data, showing that this procedure preserves certain types of utility while also providing sufficient privacy guarantees to entity-level time series. Numerical studies explore the practical performance of the new method, and an empirical application to labor force data shows the method's favorable utility properties in comparison to other competing privacy mechanisms.
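The key property behind the proposal is that an all-pass filter has unit gain at every frequency, so the filtered series retains the spectral density, and hence the autocovariance structure, of the original, while the sample path itself is perturbed. The following is a minimal sketch of that idea using a first-order all-pass filter; the pole value and the AR(1) input are illustrative assumptions, not the paper's actual FLIP construction.

```python
import numpy as np
from scipy import signal

# Illustrative first-order all-pass filter (not the authors' implementation):
#   H(z) = (-a + z^{-1}) / (1 - a z^{-1}),  |a| < 1,
# which has |H(e^{-iw})| = 1 at every frequency w.
a = 0.7                # hypothetical pole choice for illustration
b_coefs = [-a, 1.0]    # numerator coefficients: -a + z^{-1}
a_coefs = [1.0, -a]    # denominator coefficients: 1 - a z^{-1}

# Verify the unit-gain property across a grid of frequencies.
w, h = signal.freqz(b_coefs, a_coefs, worN=512)
print(np.allclose(np.abs(h), 1.0))  # True: gain is 1 everywhere

# Apply the filter to a simulated AR(1) series: the output y has the
# same spectral density (and autocovariances) as x, but a different path.
rng = np.random.default_rng(0)
x = signal.lfilter([1.0], [1.0, -0.8], rng.standard_normal(2000))
y = signal.lfilter(b_coefs, a_coefs, x)
```

Because the gain is identically one, second-order utility (spectra, autocovariances, and hence linear forecasts) is preserved exactly in population, which is the sense of utility preservation the abstract refers to.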
Cite
Text
McElroy et al. "FLIP: A Utility Preserving Privacy Mechanism for Time Series." Journal of Machine Learning Research, 2023.
Markdown
[McElroy et al. "FLIP: A Utility Preserving Privacy Mechanism for Time Series." Journal of Machine Learning Research, 2023.](https://mlanthology.org/jmlr/2023/mcelroy2023jmlr-flip/)
BibTeX
@article{mcelroy2023jmlr-flip,
title = {{FLIP: A Utility Preserving Privacy Mechanism for Time Series}},
author = {McElroy, Tucker and Roy, Anindya and Hore, Gaurab},
journal = {Journal of Machine Learning Research},
year = {2023},
pages = {1-29},
volume = {24},
url = {https://mlanthology.org/jmlr/2023/mcelroy2023jmlr-flip/}
}