Restricted Random Pruning at Initialization for High Compression Range
Abstract
Pruning at Initialization (PaI) makes training overparameterized neural networks more efficient by reducing the overall computational cost from training through inference. Recent PaI studies have shown that random pruning can be more effective than ranking-based pruning, which learns connectivity. However, the effectiveness of each pruning method depends on the presence of skip connections and on the compression ratio (the ratio of parameter counts before and after pruning). While random pruning outperforms ranking-based pruning on architectures with skip connections, this advantage reverses at high compression ratios on architectures without them. This paper proposes Minimum Connection Assurance (MiCA), which achieves higher accuracy than conventional PaI methods for architectures both with and without skip connections, regardless of the compression ratio. MiCA preserves random connectivity between layers and maintains performance at high compression ratios without the costly connection learning that ranking-based pruning requires. Experiments on image classification with CIFAR-10 and CIFAR-100 and node classification with OGBN-ArXiv show that MiCA improves the trade-off between compression ratio and accuracy compared to existing PaI methods. On VGG-16 with CIFAR-10, MiCA improves the accuracy of random pruning by $27.0\%$ at a $10^{4.7}\times$ compression ratio. Furthermore, experimental analysis reveals that increasing the utilization of nodes through which information flows from the first layer is essential for maintaining high performance at high compression ratios.
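The abstract does not spell out MiCA's construction, but the core idea it names, random pruning at initialization combined with a guarantee that some connectivity from the first layer survives, can be sketched generically. The function below is a hypothetical illustration (not the paper's algorithm): each layer keeps a random fraction $1/\text{compression}$ of its weights, and one randomly chosen chain of connections from input to output is then forced to survive, so at least one path through the network always exists even at extreme compression.

```python
import numpy as np

def random_prune_with_min_chain(layer_shapes, compression, seed=None):
    """Random pruning at init with a crude minimum-connectivity guarantee.

    Hypothetical sketch, not the MiCA algorithm itself: sample a Bernoulli
    mask per layer at density 1/compression, then keep one random chain of
    connections alive from the first layer to the last.
    """
    rng = np.random.default_rng(seed)
    density = 1.0 / compression
    # Independent random mask for each layer's weight matrix.
    masks = [rng.random((out_dim, in_dim)) < density
             for out_dim, in_dim in layer_shapes]
    # Force one random input-to-output chain to remain connected.
    node = rng.integers(layer_shapes[0][1])   # random input node
    for mask in masks:
        nxt = rng.integers(mask.shape[0])     # random node in the next layer
        mask[nxt, node] = True                # this connection survives pruning
        node = nxt
    return masks
```

At a $100\times$ compression ratio, a plain random mask over a small layer can easily end up empty; the forced chain guarantees every layer retains at least one connection, which is the flavor of guarantee the abstract attributes to MiCA.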
Cite
Text
Otsuka et al. "Restricted Random Pruning at Initialization for High Compression Range." Transactions on Machine Learning Research, 2024.

Markdown
[Otsuka et al. "Restricted Random Pruning at Initialization for High Compression Range." Transactions on Machine Learning Research, 2024.](https://mlanthology.org/tmlr/2024/otsuka2024tmlr-restricted/)

BibTeX
@article{otsuka2024tmlr-restricted,
  title = {{Restricted Random Pruning at Initialization for High Compression Range}},
  author = {Otsuka, Hikari and Okoshi, Yasuyuki and García-Arias, Ángel López and Kawamura, Kazushi and Van Chu, Thiem and Fujiki, Daichi and Motomura, Masato},
  journal = {Transactions on Machine Learning Research},
  year = {2024},
  url = {https://mlanthology.org/tmlr/2024/otsuka2024tmlr-restricted/}
}