Rope-Net: Deep Convolutional Neural Network via Robust Principal Component Analysis
Abstract
Low-rank decomposition methods can compress parameters and accelerate training of deep convolutional neural networks (DCNNs) by exploiting the low-rankness of their weights. However, for image classification tasks, the test accuracy of DCNNs compressed by low-rank decomposition usually decreases. Our experiments show that the error term of low-rank decomposition is not sparse and is therefore unsuitable for reuse to maintain accuracy. To overcome this problem, in this paper we propose an effective compression approach for DCNNs based on robust principal component analysis (RPCA) applied to the weights of the convolutional and fully connected layers. The low-rank term of RPCA retains the advantages of low-rank decomposition, while the sparse term, because of its sparsity, can be retrained via sparse learning to improve test accuracy. We name this kind of DCNN Rope-Net (DCNN via Robust Principal Component Analysis). Our Rope-ResNet110 (ResNet110 via RPCA) experiments on CIFAR10 demonstrate that Rope-ResNet110 can achieve: (a) a 3.3× parameter compression ratio with a 0.33% accuracy improvement, and (b) a 5× parameter compression ratio without accuracy drop. More remarkably, Rope-VGG16 achieves an 8.5× parameter compression ratio and a 5.48% accuracy improvement on a randomly cutout CIFAR10 test set compared with VGG16, which validates the strong robustness of Rope-Net.
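The RPCA decomposition described in the abstract splits a weight matrix W into a low-rank term L plus a sparse term S. A minimal sketch of one standard way to compute such a split is the inexact augmented Lagrange multiplier (IALM) solver for principal component pursuit; the function below is illustrative only and is not the authors' implementation (the regularization weight `lam` and the penalty schedule are common defaults from the RPCA literature, assumed here).

```python
import numpy as np

def rpca(W, lam=None, tol=1e-7, max_iter=500):
    """Decompose W ~= L + S with L low-rank and S sparse, via the
    inexact-ALM solver for principal component pursuit (a sketch).

    lam: weight on the sparse term; defaults to 1/sqrt(max(m, n)),
         the standard choice in the RPCA literature (assumption here).
    """
    m, n = W.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))
    norm_W = np.linalg.norm(W)            # Frobenius norm, for the stop test
    mu = 1.25 / np.linalg.norm(W, 2)      # common IALM initialization
    rho = 1.5                             # penalty growth factor
    S = np.zeros_like(W)
    Y = np.zeros_like(W)                  # Lagrange multiplier
    for _ in range(max_iter):
        # Low-rank update: singular-value thresholding at level 1/mu
        U, sig, Vt = np.linalg.svd(W - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Sparse update: elementwise soft thresholding at level lam/mu
        T = W - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        # Dual update and penalty growth
        Y = Y + mu * (W - L - S)
        mu = min(mu * rho, 1e7)
        if np.linalg.norm(W - L - S) <= tol * norm_W:
            break
    return L, S
```

In the paper's setting, L would replace the layer's weight in factored (compressed) form, while the sparse S is kept and retrained via sparse learning to recover accuracy.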
Cite

Text

Liu et al. "Rope-Net: Deep Convolutional Neural Network via Robust Principal Component Analysis." Machine Learning, 2025. doi:10.1007/s10994-025-06782-5

Markdown

[Liu et al. "Rope-Net: Deep Convolutional Neural Network via Robust Principal Component Analysis." Machine Learning, 2025.](https://mlanthology.org/mlj/2025/liu2025mlj-ropenet/) doi:10.1007/s10994-025-06782-5

BibTeX
@article{liu2025mlj-ropenet,
title = {{Rope-Net: Deep Convolutional Neural Network via Robust Principal Component Analysis}},
author = {Liu, Baichen and Han, Zhi and Chen, Xi'ai and Wang, Yanmei and Tang, Yandong},
journal = {Machine Learning},
year = {2025},
pages = {150},
doi = {10.1007/s10994-025-06782-5},
volume = {114},
url = {https://mlanthology.org/mlj/2025/liu2025mlj-ropenet/}
}