Differentially Private Partitioned Variational Inference
Abstract
Learning a privacy-preserving model from sensitive data distributed across multiple devices is an increasingly important problem. The problem is often formulated in the federated learning context, with the aim of learning a single global model while keeping the data distributed. Moreover, Bayesian learning is a popular approach for modelling, since it naturally supports reliable uncertainty estimates. However, Bayesian learning is generally intractable even with centralised non-private data, so approximation techniques such as variational inference are a necessity. Variational inference has recently been extended to the non-private federated learning setting via the partitioned variational inference algorithm. For privacy protection, the current gold standard is differential privacy, which guarantees privacy in a strong, mathematically well-defined sense. In this paper, we present differentially private partitioned variational inference, the first general framework for learning a variational approximation to a Bayesian posterior distribution in the federated learning setting while minimising the number of communication rounds and providing differential privacy guarantees for data subjects. We propose three alternative implementations within the general framework: one based on perturbing the local optimisation runs performed by individual parties, and two based on perturbing the updates to the global model (one using a version of federated averaging, the other adding virtual parties to the protocol), and compare their properties both theoretically and empirically. We show that perturbing the local optimisation works well with both simple and complex models as long as each party has enough local data; in this case, however, privacy is always guaranteed independently by each party. In contrast, perturbing the global updates works best with relatively simple models. Given access to suitable secure primitives, such as secure aggregation or secure shuffling, the performance can be improved by all parties guaranteeing privacy jointly.
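Both families of implementations rely on releasing only noise-perturbed quantities. As a minimal illustrative sketch (not the paper's exact mechanisms), a party might clip an update vector to a fixed L2 norm and add Gaussian noise scaled to that clipping bound, the standard Gaussian mechanism used in differentially private learning; the function name and parameters here are assumptions for illustration:

```python
import numpy as np

def perturb_update(update, clip_norm, noise_multiplier, rng):
    """Clip `update` to L2 norm `clip_norm`, then add Gaussian noise with
    standard deviation noise_multiplier * clip_norm (Gaussian mechanism).

    Hypothetical sketch: the actual DP-PVI mechanisms differ in detail
    (e.g. what is perturbed and how the privacy budget is accounted).
    """
    norm = np.linalg.norm(update)
    # Scale down only if the update exceeds the clipping bound.
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

rng = np.random.default_rng(0)
local_update = np.array([3.0, 4.0])  # L2 norm 5.0
private_update = perturb_update(local_update, clip_norm=1.0,
                                noise_multiplier=1.1, rng=rng)
```

In the locally perturbed variant each party applies such a mechanism to its own optimisation before sharing; in the globally perturbed variants the noise is added to the aggregated model update, where secure aggregation lets the parties share the noise cost jointly.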
Heikkilä et al. "Differentially Private Partitioned Variational Inference." Transactions on Machine Learning Research, 2023.
@article{heikkila2023tmlr-differentially,
title = {{Differentially Private Partitioned Variational Inference}},
author = {Heikkilä, Mikko A. and Ashman, Matthew and Swaroop, Siddharth and Turner, Richard E. and Honkela, Antti},
journal = {Transactions on Machine Learning Research},
year = {2023},
url = {https://mlanthology.org/tmlr/2023/heikkila2023tmlr-differentially/}
}