Co-Dream: Collaborative Data Synthesis with Decentralized Models

Abstract

We present a framework for distributed optimization that addresses the decentralized and siloed nature of data in the real world. Existing works in Federated Learning address this by learning a centralized model from decentralized data. Our framework *Co-Dream* instead focuses on learning the representation of data itself. By starting with random data and jointly synthesizing samples from distributed clients, we aim to create proxies that represent the global data distribution. Importantly, this collaborative synthesis is achieved using only local models, ensuring privacy comparable to sharing the model itself. The collaboration among clients is facilitated through federated optimization in the data space, leveraging shared input gradients based on local loss. This collaborative data synthesis offers various benefits over collaborative model learning, including lower dimensionality, parameter-independent communication, and adaptive optimization. We empirically validate the effectiveness of our framework and compare its performance with traditional federated learning approaches through benchmark experiments.
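The core mechanism described above, federated optimization in the data space using input gradients from local models, can be illustrated with a minimal sketch. This is not the authors' implementation: the linear per-client classifiers, the single target class, and the plain gradient-averaging loop are simplifying assumptions chosen to make the idea concrete. Each client computes the gradient of its local loss with respect to the shared synthetic sample (its model never leaves the client), and the averaged input gradients drive the update:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical setup: 3 clients, each holding its own "pre-trained" linear
# classifier W_k (random stand-ins here) mapping a d-dim input to c logits.
d, c, n_clients = 8, 4, 3
client_weights = [rng.normal(size=(c, d)) for _ in range(n_clients)]
target = 2  # class label the co-dreamed sample should represent

def local_loss(W, x, target):
    """Cross-entropy of the client's local model on the synthetic sample."""
    return -np.log(softmax(W @ x)[target])

def input_gradient(W, x, target):
    """Gradient of the local loss w.r.t. the INPUT x.

    Only this data-space gradient is shared; the model W stays local.
    """
    p = softmax(W @ x)
    onehot = np.eye(c)[target]
    return W.T @ (p - onehot)

# Collaborative synthesis: start from random noise and descend on the
# average of the clients' input gradients (federated optimization in
# the data space rather than the parameter space).
x = rng.normal(size=d)
init_loss = np.mean([local_loss(W, x, target) for W in client_weights])
lr = 0.05
for step in range(500):
    grads = [input_gradient(W, x, target) for W in client_weights]
    x -= lr * np.mean(grads, axis=0)

final_loss = np.mean([local_loss(W, x, target) for W in client_weights])
print(f"avg client loss: {init_loss:.3f} -> {final_loss:.3f}")
```

Note that the communicated object is a gradient of the same dimensionality as the data, independent of model size, which is the "parameter-independent communication" benefit the abstract refers to.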

Cite

Text

Singh et al. "Co-Dream: Collaborative Data Synthesis with Decentralized Models." ICML 2023 Workshops: LLW, 2023.

Markdown

[Singh et al. "Co-Dream: Collaborative Data Synthesis with Decentralized Models." ICML 2023 Workshops: LLW, 2023.](https://mlanthology.org/icmlw/2023/singh2023icmlw-codream/)

BibTeX

@inproceedings{singh2023icmlw-codream,
  title     = {{Co-Dream: Collaborative Data Synthesis with Decentralized Models}},
  author    = {Singh, Abhishek and Gupta, Gauri and Lu, Charles and Koirala, Yogesh and Shankar, Sheshank and Ehab, Mohammed and Raskar, Ramesh},
  booktitle = {ICML 2023 Workshops: LLW},
  year      = {2023},
  url       = {https://mlanthology.org/icmlw/2023/singh2023icmlw-codream/}
}