$\mathcal{D}^2$-Sparse: Navigating the Low Data Learning Regime with Coupled Sparse Networks

Abstract

Deep learning research has extensively explored learning under diverse constraints, with sparsity playing a pivotal role as a practical constraint for improving efficiency. This paper introduces $\mathcal{D}^2$-Sparse, a dual dynamic sparse learning system tailored to scenarios with limited data. In contrast to prior studies that investigate sparsity and low-data learning independently, our work couples these constraints, opening new avenues for sparsity research. $\mathcal{D}^2$-Sparse outperforms typical iterative pruning methods on standard deep networks, particularly on image classification tasks in computer vision: it achieves a notable 5\% improvement in top-1 accuracy over iterative pruning for ResNet-34 on CIFAR-10 classification using only 5000 training samples.

Cite

Text

Misra et al. "$\mathcal{D}^2$-Sparse: Navigating the Low Data Learning Regime with Coupled Sparse Networks." ICLR 2024 Workshops: PML4LRS, 2024.

Markdown

[Misra et al. "$\mathcal{D}^2$-Sparse: Navigating the Low Data Learning Regime with Coupled Sparse Networks." ICLR 2024 Workshops: PML4LRS, 2024.](https://mlanthology.org/iclrw/2024/misra2024iclrw-2sparse/)

BibTeX

@inproceedings{misra2024iclrw-2sparse,
  title     = {{$\mathcal{D}^2$-Sparse: Navigating the Low Data Learning Regime with Coupled Sparse Networks}},
  author    = {Misra, Diganta and Nolte, Niklas and Mishra, Sparsha and Yin, Lu},
  booktitle = {ICLR 2024 Workshops: PML4LRS},
  year      = {2024},
  url       = {https://mlanthology.org/iclrw/2024/misra2024iclrw-2sparse/}
}