GraphPrivatizer: Improved Structural Differential Privacy for Graph Neural Networks

Abstract

Graph privacy is crucial in graph-structured systems where the confidentiality of participants is essential to the integrity of the system itself. Banking and transaction networks, for instance, must protect customers' financial information and transaction details. We propose GraphPrivatizer, a method that perturbs the structure of a graph and protects it under Differential Privacy. GraphPrivatizer performs a controlled perturbation of the graph structure by randomly replacing the neighbors of a node with other, similar neighbors according to a similarity metric. We find that aggregating features to compute similarities and imposing a minimum similarity score between the original and replacement nodes provides the best privacy-utility trade-off. We use our method to train a Graph Neural Network server-side without disclosing users' private information to the server. We conduct experiments on real-world graph datasets and empirically evaluate the privacy of our models against privacy attacks.
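The core idea described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the mean-aggregation of features, the cosine similarity, and the replacement probability `p_replace` are all illustrative assumptions; the paper's actual mechanism and its privacy calibration differ.

```python
import math
import random

def cosine(u, v):
    # Cosine similarity between two feature vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def aggregate(node, adj, feats):
    # Illustrative aggregation: mean of the node's own features and
    # its neighbors' features (one option for "aggregating features").
    group = [node] + sorted(adj[node])
    dim = len(feats[node])
    return [sum(feats[n][i] for n in group) / len(group) for i in range(dim)]

def privatize_neighbors(adj, feats, p_replace=0.5, min_sim=0.5, seed=0):
    """Randomly replace some neighbors of each node with similar non-neighbors.

    `p_replace` (replacement probability) and `min_sim` (minimum similarity
    between the replaced neighbor and its substitute) are hypothetical knobs
    standing in for the paper's calibrated privacy parameters.
    """
    rng = random.Random(seed)
    agg = {n: aggregate(n, adj, feats) for n in adj}
    new_adj = {}
    for node in adj:
        kept = set()
        for nb in adj[node]:
            if rng.random() < p_replace:
                # Candidate substitutes: non-neighbors whose aggregated
                # features are sufficiently similar to the replaced neighbor.
                cands = [c for c in adj
                         if c != node and c not in adj[node]
                         and cosine(agg[nb], agg[c]) >= min_sim]
                kept.add(rng.choice(cands) if cands else nb)
            else:
                kept.add(nb)
        new_adj[node] = kept
    return new_adj
```

A GNN trained on the perturbed adjacency then never sees a node's true neighbor list, which is the server-side setting the abstract describes.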

Cite

Text

Joshi et al. "GraphPrivatizer: Improved Structural Differential Privacy for Graph Neural Networks." Transactions on Machine Learning Research, 2024.

Markdown

[Joshi et al. "GraphPrivatizer: Improved Structural Differential Privacy for Graph Neural Networks." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/joshi2024tmlr-graphprivatizer/)

BibTeX

@article{joshi2024tmlr-graphprivatizer,
  title     = {{GraphPrivatizer: Improved Structural Differential Privacy for Graph Neural Networks}},
  author    = {Joshi, Rucha Bhalchandra and Indri, Patrick and Mishra, Subhankar},
  journal   = {Transactions on Machine Learning Research},
  year      = {2024},
  url       = {https://mlanthology.org/tmlr/2024/joshi2024tmlr-graphprivatizer/}
}