Co-Dream: Collaborative Dream Synthesis over Decentralized Models
Abstract
Federated Learning (FL) has pioneered the idea of "share wisdom, not raw data" to enable collaborative learning over decentralized data. FL achieves this goal by averaging model parameters instead of centralizing data. However, representing "wisdom" in the form of model parameters has its own limitations, including the requirement for uniform model architectures across clients and communication overhead proportional to model size. In this work, we introduce Co-Dream, a framework for representing "wisdom" in data space instead of in model parameters. Here, clients collaboratively optimize random inputs based on their locally trained models and aggregate the gradients of those inputs. Our proposed approach overcomes the aforementioned limitations and offers additional benefits such as adaptive optimization and an interpretable representation of knowledge. We empirically demonstrate the effectiveness of Co-Dream and compare its performance with existing techniques.
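To make the mechanism concrete, the following is a minimal sketch of one collaborative dreaming round in PyTorch. It assumes each client exposes a locally trained classifier and uses a plain cross-entropy objective as the local "dreaming" loss; the function names (`codream_round`), the learning rate, and the loss choice are illustrative assumptions, not the paper's exact formulation. The key idea it shows is aggregation in data space: input gradients, rather than model parameters, are averaged across clients.

```python
import torch
import torch.nn.functional as F

def codream_round(client_models, dreams, labels, lr=0.1):
    """One illustrative collaborative dreaming step (sketch, not the paper's exact loss).

    Each client scores the shared synthetic batch `dreams` with its own locally
    trained model and returns the gradient with respect to the inputs; those
    input-space gradients are averaged and used to update the shared inputs,
    mirroring FedAvg but in data space instead of parameter space.
    """
    grads = []
    for model in client_models:
        model.eval()
        x = dreams.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x), labels)   # assumed local dreaming objective
        grad, = torch.autograd.grad(loss, x)       # gradient w.r.t. the inputs, not the weights
        grads.append(grad)
    avg_grad = torch.stack(grads).mean(dim=0)      # aggregate the clients' input gradients
    return dreams - lr * avg_grad                  # update the shared "dreams"

# Hypothetical usage: start from random noise and refine it over several rounds.
# dreams = torch.randn(32, 1, 28, 28)
# labels = torch.randint(0, 10, (32,))
# for _ in range(100):
#     dreams = codream_round(client_models, dreams, labels)
```

Because only input gradients are exchanged, clients may use heterogeneous model architectures, and the communication cost scales with the size of the synthetic batch rather than the model.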
Cite
Text
Singh et al. "Co-Dream: Collaborative Dream Synthesis over Decentralized Models." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I19.34258
Markdown
[Singh et al. "Co-Dream: Collaborative Dream Synthesis over Decentralized Models." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/singh2025aaai-co/) doi:10.1609/AAAI.V39I19.34258
BibTeX
@inproceedings{singh2025aaai-co,
title = {{Co-Dream: Collaborative Dream Synthesis over Decentralized Models}},
author = {Singh, Abhishek and Gupta, Gauri and Shi, Yichuan and Dang, Alex and Kapila, Ritvik and Shankar, Sheshank and Ehab, Mohammed and Raskar, Ramesh},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
  pages = {20497--20505},
doi = {10.1609/AAAI.V39I19.34258},
url = {https://mlanthology.org/aaai/2025/singh2025aaai-co/}
}