Reimagining Mutual Information for Enhanced Defense Against Data Leakage in Collaborative Inference
Abstract
Edge-cloud collaborative inference empowers resource-limited IoT devices to support deep learning applications without disclosing their raw data to the cloud server, thus protecting users' data. Nevertheless, prior research has shown that collaborative inference still exposes the inputs and predictions of edge devices. To defend against such data leakage in collaborative inference, we introduce InfoScissors, a defense strategy designed to reduce the mutual information between a model's intermediate outcomes and the device's input and predictions. We evaluate our defense on several datasets against diverse attacks. Beyond the empirical comparison, we provide a theoretical analysis of the inadequacies of recent defense strategies that also utilize mutual information, particularly those based on the Variational Information Bottleneck (VIB) approach. We demonstrate the superiority of our method and offer a theoretical analysis of it.
Cite
Text
Duan et al. "Reimagining Mutual Information for Enhanced Defense Against Data Leakage in Collaborative Inference." Neural Information Processing Systems, 2024. doi:10.52202/079017-1412
Markdown
[Duan et al. "Reimagining Mutual Information for Enhanced Defense Against Data Leakage in Collaborative Inference." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/duan2024neurips-reimagining/) doi:10.52202/079017-1412
BibTeX
@inproceedings{duan2024neurips-reimagining,
title = {{Reimagining Mutual Information for Enhanced Defense Against Data Leakage in Collaborative Inference}},
author = {Duan, Lin and Sun, Jingwei and Jia, Jinyuan and Chen, Yiran and Gorlatova, Maria},
booktitle = {Neural Information Processing Systems},
year = {2024},
doi = {10.52202/079017-1412},
url = {https://mlanthology.org/neurips/2024/duan2024neurips-reimagining/}
}