$C^2M^3$: Cycle-Consistent Multi-Model Merging

Abstract

In this paper, we present a novel data-free method for merging neural networks in weight space. Our method optimizes for the permutations of network neurons while ensuring global coherence across all layers, and it outperforms recent layer-local approaches in a set of challenging scenarios. We then generalize the formulation to the $N$-model scenario to enforce cycle consistency of the permutations with guarantees, allowing circular compositions of permutations to be computed without accumulating error along the path. We qualitatively and quantitatively motivate the need for such a constraint, showing its benefits when merging homogeneous sets of models in scenarios spanning varying architectures and datasets. We finally show that, when coupled with activation renormalization, the approach yields the best results on the task.
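To make the cycle-consistency constraint concrete, here is a minimal NumPy sketch (an illustration, not the paper's implementation): if each model $i$ is aligned to a shared reference by a permutation $P_i$, and pairwise maps are defined as $P_{i \to j} = P_j^\top P_i$, then any circular composition of permutations reduces to the identity, so no error accumulates along the path. The factoring through a shared reference is an assumption made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # number of neurons in a layer (hypothetical)


def random_permutation(n, rng):
    """Sample a random n x n permutation matrix by shuffling identity rows."""
    P = np.eye(n)
    rng.shuffle(P)
    return P


# Hypothetical setup: models A, B, C, each aligned to a shared
# reference by its own permutation P_i.
P = [random_permutation(n, rng) for _ in range(3)]


def pairwise(i, j):
    """Permutation mapping model i's neuron ordering to model j's."""
    return P[j].T @ P[i]


# Cycle consistency: A -> B -> C -> A composes to the identity,
# since P_0^T P_2 (P_2^T P_1) (P_1^T P_0) = I.
cycle = pairwise(2, 0) @ pairwise(1, 2) @ pairwise(0, 1)
assert np.allclose(cycle, np.eye(n))
print("cycle composition equals identity:", np.allclose(cycle, np.eye(n)))
```

By contrast, independently estimated pairwise permutations need not satisfy this property, which is the failure mode the constraint is designed to rule out.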

Cite

Text

Crisostomi et al. "$C^2M^3$: Cycle-Consistent Multi-Model Merging." Neural Information Processing Systems, 2024. doi:10.52202/079017-0901

Markdown

[Crisostomi et al. "$C^2M^3$: Cycle-Consistent Multi-Model Merging." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/crisostomi2024neurips-2m/) doi:10.52202/079017-0901

BibTeX

@inproceedings{crisostomi2024neurips-2m,
  title     = {{$C^2M^3$: Cycle-Consistent Multi-Model Merging}},
  author    = {Crisostomi, Donato and Fumero, Marco and Baieri, Daniele and Bernard, Florian and Rodolà, Emanuele},
  booktitle = {Neural Information Processing Systems},
  year      = {2024},
  doi       = {10.52202/079017-0901},
  url       = {https://mlanthology.org/neurips/2024/crisostomi2024neurips-2m/}
}