Differentially Private Stochastic Expectation Propagation
Abstract
We are interested in privatizing an approximate posterior inference algorithm called Expectation Propagation (EP). EP approximates the posterior distribution by iteratively refining approximations to the local likelihood terms, and by doing so it typically provides better posterior uncertainties than variational inference (VI), which approximates the likelihood terms globally. However, EP requires a large amount of memory to maintain all the local approximations, one per datapoint in the training data. To overcome this challenge, stochastic expectation propagation (SEP) maintains a single unique local factor that captures the average effect of each likelihood term on the posterior and refines it in a way analogous to EP. In terms of privatization, SEP is more tractable than EP: when refining the factor at each step, the remaining factors are held fixed and, unlike in EP, are independent of the other datapoints. This independence makes the sensitivity analysis straightforward. We provide a theoretical analysis of the privacy-accuracy trade-off in the posterior distributions under our method, which we call differentially private stochastic expectation propagation (DP-SEP). Furthermore, we test the DP-SEP algorithm on both synthetic and real-world datasets and evaluate the quality of posterior estimates at different levels of guaranteed privacy.
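The SEP refinement described in the abstract (form the cavity by removing one copy of the averaged factor, moment-match the tilted distribution, then fold the implicit local factor back into the average) can be sketched in a toy conjugate setting: estimating the mean of a 1D Gaussian with known noise variance, where moment matching is exact. Everything below (variable names, the toy model, the damping schedule) is an illustrative assumption, not the authors' DP-SEP implementation, and no privacy noise is added:

```python
import numpy as np

# Toy SEP sketch (illustrative, not the paper's code): infer the mean theta of
# N(theta, noise_var) under a N(0, 1) prior. Gaussians are stored as natural
# parameters [precision-mean mu/v, precision 1/v], so products are sums.
rng = np.random.default_rng(0)
N, noise_var = 100, 1.0
data = rng.normal(2.0, np.sqrt(noise_var), size=N)

prior = np.array([0.0, 1.0])   # N(0, 1) prior in natural parameters
f_avg = np.array([0.0, 0.0])   # the single averaged site factor, starts flat

for _ in range(5):                                  # a few passes over the data
    for x in data:
        q = prior + N * f_avg                       # global approx q ∝ p0 · f^N
        cavity = q - f_avg                          # remove one copy of the site
        # tilted dist. cavity · N(x | theta, noise_var) is Gaussian, so
        # moment matching is exact: just add the likelihood's natural params
        tilted = cavity + np.array([x / noise_var, 1.0 / noise_var])
        f_new = tilted - cavity                     # implicit local factor
        f_avg = (1 - 1 / N) * f_avg + (1 / N) * f_new  # damped average update

q = prior + N * f_avg
post_mean, post_var = q[0] / q[1], 1.0 / q[1]
```

In this conjugate case the damped average converges to the mean of the per-datapoint likelihood contributions, so `q` approaches the exact posterior while storing only one site factor instead of `N`; in DP-SEP, it is this per-step factor update whose sensitivity is bounded and noised.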
Cite
Text
Vinaroz and Park. "Differentially Private Stochastic Expectation Propagation." Transactions on Machine Learning Research, 2022.
Markdown
[Vinaroz and Park. "Differentially Private Stochastic Expectation Propagation." Transactions on Machine Learning Research, 2022.](https://mlanthology.org/tmlr/2022/vinaroz2022tmlr-differentially/)
BibTeX
@article{vinaroz2022tmlr-differentially,
title = {{Differentially Private Stochastic Expectation Propagation}},
author = {Vinaroz, Margarita and Park, Mijung},
journal = {Transactions on Machine Learning Research},
year = {2022},
url = {https://mlanthology.org/tmlr/2022/vinaroz2022tmlr-differentially/}
}