Highly Efficient Salient Object Detection with 100k Parameters

Abstract

Salient object detection models often demand considerable computation to make a precise prediction for each pixel, making them hardly applicable to low-power devices. In this paper, we aim to relieve the tension between computation cost and model performance by substantially improving network efficiency. We propose a flexible convolutional module, namely generalized OctConv (gOctConv), which efficiently utilizes both in-stage and cross-stage multi-scale features while reducing representation redundancy through a novel dynamic weight decay scheme. The dynamic weight decay scheme stably increases parameter sparsity during training and supports a learnable number of channels for each scale in gOctConv, allowing an 80% reduction in parameters with a negligible performance drop. Utilizing gOctConv, we build an extremely lightweight model, namely CSNet, which achieves performance comparable to large models on popular salient object detection benchmarks with only about 0.2% of their parameters (100k). The source code is publicly available at https://mmcheng.net/sod100k.
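To make the multi-scale idea behind gOctConv concrete, below is a minimal, illustrative PyTorch sketch of an OctConv-style two-scale convolution, based only on the abstract's description. The class name `TwoScaleConv`, the fixed channel split `alpha`, and the resampling choices are hypothetical simplifications; the authors' gOctConv additionally takes cross-stage inputs and learns the per-scale channel numbers through the dynamic weight decay scheme.

```python
# Minimal sketch (not the authors' implementation): a two-scale, OctConv-style
# convolution that processes a high-resolution and a low-resolution feature map
# and exchanges information between the two scales.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoScaleConv(nn.Module):
    def __init__(self, in_ch, out_ch, alpha=0.5, kernel_size=3):
        super().__init__()
        # Hypothetical fixed split: alpha controls the low-resolution share.
        in_lo, out_lo = int(alpha * in_ch), int(alpha * out_ch)
        in_hi, out_hi = in_ch - in_lo, out_ch - out_lo
        pad = kernel_size // 2
        # Four paths: high->high, high->low, low->high, low->low.
        self.hh = nn.Conv2d(in_hi, out_hi, kernel_size, padding=pad)
        self.hl = nn.Conv2d(in_hi, out_lo, kernel_size, padding=pad)
        self.lh = nn.Conv2d(in_lo, out_hi, kernel_size, padding=pad)
        self.ll = nn.Conv2d(in_lo, out_lo, kernel_size, padding=pad)

    def forward(self, x_hi, x_lo):
        # High-resolution output: same-scale conv + upsampled low-scale conv.
        y_hi = self.hh(x_hi) + F.interpolate(
            self.lh(x_lo), size=x_hi.shape[-2:], mode="nearest")
        # Low-resolution output: same-scale conv + downsampled high-scale conv.
        y_lo = self.ll(x_lo) + self.hl(F.avg_pool2d(x_hi, kernel_size=2))
        return y_hi, y_lo


if __name__ == "__main__":
    m = TwoScaleConv(in_ch=32, out_ch=32, alpha=0.5)
    hi = torch.randn(1, 16, 64, 64)   # high-resolution half of the channels
    lo = torch.randn(1, 16, 32, 32)   # low-resolution half of the channels
    out_hi, out_lo = m(hi, lo)
    print(out_hi.shape, out_lo.shape)  # (1, 16, 64, 64) and (1, 16, 32, 32)
```

In the paper's setting, making the per-scale channel counts learnable (rather than fixed by `alpha` as above) is what lets the sparsity induced by dynamic weight decay prune most of the channels away.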

Cite

Text

Gao et al. "Highly Efficient Salient Object Detection with 100k Parameters." Proceedings of the European Conference on Computer Vision (ECCV), 2020. doi:10.1007/978-3-030-58539-6_42

Markdown

[Gao et al. "Highly Efficient Salient Object Detection with 100k Parameters." Proceedings of the European Conference on Computer Vision (ECCV), 2020.](https://mlanthology.org/eccv/2020/gao2020eccv-highly/) doi:10.1007/978-3-030-58539-6_42

BibTeX

@inproceedings{gao2020eccv-highly,
  title     = {{Highly Efficient Salient Object Detection with 100k Parameters}},
  author    = {Gao, Shang-Hua and Tan, Yong-Qiang and Cheng, Ming-Ming and Lu, Chengze and Chen, Yunpeng and Yan, Shuicheng},
  booktitle = {Proceedings of the European Conference on Computer Vision (ECCV)},
  year      = {2020},
  doi       = {10.1007/978-3-030-58539-6_42},
  url       = {https://mlanthology.org/eccv/2020/gao2020eccv-highly/}
}