Rethinking Decentralized Learning: Towards More Realistic Evaluations with a Metadata-Agnostic Approach
Abstract
Decentralized learning has been regarded as a privacy-preserving training paradigm that enables distributed model training without exposing raw data. However, many experimental settings in decentralized learning research assume metadata awareness among participants, which contradicts real-world constraints where participants lack shared metadata knowledge. We distinguish between Metadata-Dependent Supervised Learning (MDSL), which assumes global metadata synchronization, and Metadata-Agnostic Zero-Shot Learning (MAZEL), where participants do not share metadata. Our contributions are to (1) highlight the difference between MAZEL and MDSL; (2) present empirical evidence demonstrating that long-held claims of MDSL-based decentralized learning may not hold under MAZEL settings; (3) provide benchmarks spanning up to 8–16 diverse datasets to rigorously evaluate newly proposed decentralized methods under realistic metadata-agnostic conditions; and (4) propose two-stage and cosine gossip schedulers to optimize communication efficiency. Our code is available at: https://anonymous.4open.science/r/More-Realistic-Evaluations.
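The abstract mentions a cosine gossip scheduler for communication efficiency but does not spell out its form here. As a minimal sketch only, assuming the scheduler anneals the probability of running a gossip (parameter-exchange) round following a standard cosine decay, it might look like the following; all names and parameters are hypothetical, not taken from the paper:

```python
import math

def cosine_gossip_probability(step: int, total_steps: int,
                              p_max: float = 1.0, p_min: float = 0.1) -> float:
    """Hypothetical sketch: anneal the per-step probability of a gossip round
    from p_max down to p_min along a cosine curve, so communication is dense
    early in training and sparse later."""
    progress = min(step / max(total_steps, 1), 1.0)
    return p_min + 0.5 * (p_max - p_min) * (1.0 + math.cos(math.pi * progress))
```

Under this assumed form, the schedule starts at `p_max` at step 0, passes through the midpoint of the range halfway through training, and ends at `p_min`, mirroring the cosine annealing schedules commonly used for learning rates.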
Cite
Text
Zhang et al. "Rethinking Decentralized Learning: Towards More Realistic Evaluations with a Metadata-Agnostic Approach." ICLR 2025 Workshops: MCDC, 2025.
Markdown
[Zhang et al. "Rethinking Decentralized Learning: Towards More Realistic Evaluations with a Metadata-Agnostic Approach." ICLR 2025 Workshops: MCDC, 2025.](https://mlanthology.org/iclrw/2025/zhang2025iclrw-rethinking/)
BibTeX
@inproceedings{zhang2025iclrw-rethinking,
title = {{Rethinking Decentralized Learning: Towards More Realistic Evaluations with a Metadata-Agnostic Approach}},
author = {Zhang, Tianyu and Li, Lu and Zhu, Tongtian and Wang, Suyuchen and Wang, Can and Chen, Yong},
booktitle = {ICLR 2025 Workshops: MCDC},
year = {2025},
url = {https://mlanthology.org/iclrw/2025/zhang2025iclrw-rethinking/}
}