CoNNect: Connectivity-Based Regularization for Structural Pruning of Neural Networks
Abstract
Pruning encompasses a range of techniques aimed at increasing the sparsity of neural networks (NNs). These techniques can generally be framed as minimizing a loss function subject to an $L_0$ norm constraint. This paper introduces CoNNect, a novel differentiable regularizer for sparse NN training that ensures connectivity between input and output layers. We prove that CoNNect approximates $L_0$ regularization, while preserving essential network structure and preventing the emergence of fragmented or poorly connected subnetworks. Moreover, CoNNect is easily integrated into established structural pruning strategies. Numerical experiments demonstrate that CoNNect can improve classical pruning strategies and enhance state-of-the-art one-shot pruners, such as DepGraph and LLM-Pruner.
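To make the idea concrete, below is a minimal sketch of how a connectivity-based regularizer can be added to a standard training loss. The connectivity surrogate used here (propagating absolute weight mass from the input layer to the output layer) and the regularization weight 1e-3 are illustrative assumptions for a plain feed-forward network, not the exact CoNNect definition from the paper.

import torch
import torch.nn as nn

def connectivity_regularizer(weights):
    """Differentiable connectivity surrogate: accumulate absolute weight
    mass along all input-to-output paths and penalize its loss.
    `weights` is a list of 2-D weight tensors ordered from input to output.
    Illustrative stand-in, not the paper's exact CoNNect regularizer."""
    flow = torch.abs(weights[0])
    for w in weights[1:]:
        flow = torch.abs(w) @ flow  # compose path weights layer by layer
    total = flow.sum()              # total input-output connectivity mass
    return -torch.log(total + 1e-12)  # low connectivity -> large penalty

# Hypothetical usage inside a training step:
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
weights = [m.weight for m in model if isinstance(m, nn.Linear)]
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss = loss + 1e-3 * connectivity_regularizer(weights)  # assumed weighting
loss.backward()

Because the penalty is differentiable in the weights, it can be minimized jointly with the task loss by any gradient-based optimizer, which is what allows it to steer training toward sparse yet well-connected subnetworks.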
Cite
Text
Franssen et al. "CoNNect: Connectivity-Based Regularization for Structural Pruning of Neural Networks." Transactions on Machine Learning Research, 2025.

Markdown

[Franssen et al. "CoNNect: Connectivity-Based Regularization for Structural Pruning of Neural Networks." Transactions on Machine Learning Research, 2025.](https://mlanthology.org/tmlr/2025/franssen2025tmlr-connect/)

BibTeX
@article{franssen2025tmlr-connect,
  title = {{CoNNect: Connectivity-Based Regularization for Structural Pruning of Neural Networks}},
  author = {Franssen, Christian P.C. and Jiang, Jinyang and Peng, Yijie and Heidergott, Bernd},
  journal = {Transactions on Machine Learning Research},
  year = {2025},
  url = {https://mlanthology.org/tmlr/2025/franssen2025tmlr-connect/}
}