Tiny Updater: Towards Efficient Neural Network-Driven Software Updating

Abstract

Deep neural networks have achieved significant advances in diverse visual tasks, which has substantially increased their deployment in edge-device software. However, when neural network-based software is updated, users are required to download all the parameters of the neural network anew, which harms the user experience. Motivated by previous progress in model compression, we propose a novel training methodology named Tiny Updater to address this issue. Specifically, by adopting variants of pruning and knowledge distillation methods, Tiny Updater can update neural network-based software by downloading only a few parameters (10%–20%) instead of all the parameters in the neural network. Experiments on eleven datasets across three tasks, including image classification, image-to-image translation, and video recognition, have demonstrated its effectiveness. Code has been released at https://github.com/ArchipLab-LinfengZhang/TinyUpdater.
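The abstract does not spell out how the partial download works (the paper's actual method builds on pruning and knowledge distillation; see the released code above). As a rough, hypothetical illustration of the general idea of shipping only a small fraction of parameters as an update, the sketch below transmits a sparse "patch" containing the ~15% most-changed weights and applies it on the client. The function names (`make_patch`, `apply_patch`) and the magnitude-based selection are assumptions for illustration, not the authors' algorithm.

```python
# Hypothetical sketch (not the authors' released method): ship only the
# most-changed ~15% of parameters as a sparse patch and apply it on-device.
import torch

def make_patch(old_state, new_state, keep_ratio=0.15):
    """Server side: record indices and values of the most-changed weights."""
    patch = {}
    for name, old in old_state.items():
        new = new_state[name]
        delta = (new - old).flatten()
        k = max(1, int(keep_ratio * delta.numel()))
        idx = delta.abs().topk(k).indices            # largest parameter changes
        patch[name] = (idx, new.flatten()[idx])      # only k values are transmitted
    return patch

def apply_patch(old_state, patch):
    """Client side: overwrite the patched entries, keep everything else."""
    updated = {}
    for name, old in old_state.items():
        flat = old.flatten().clone()
        idx, vals = patch[name]
        flat[idx] = vals
        updated[name] = flat.view_as(old)
    return updated

if __name__ == "__main__":
    old = {"fc.weight": torch.randn(4, 8)}
    new = {"fc.weight": old["fc.weight"] + 0.1 * torch.randn(4, 8)}
    rebuilt = apply_patch(old, make_patch(old, new))
    # Only ~15% of weights traveled over the network; the rest stay as-is.
    print((rebuilt["fc.weight"] != old["fc.weight"]).float().mean())
```

In this toy scheme the download size scales with `keep_ratio` rather than with the full model, which is the user-experience benefit the paper targets; Tiny Updater achieves the same effect in a principled way through its pruning- and distillation-based training procedure.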

Cite

Text

Zhang and Ma. "Tiny Updater: Towards Efficient Neural Network-Driven Software Updating." International Conference on Computer Vision, 2023. doi:10.1109/ICCV51070.2023.02143

Markdown

[Zhang and Ma. "Tiny Updater: Towards Efficient Neural Network-Driven Software Updating." International Conference on Computer Vision, 2023.](https://mlanthology.org/iccv/2023/zhang2023iccv-tiny/) doi:10.1109/ICCV51070.2023.02143

BibTeX

@inproceedings{zhang2023iccv-tiny,
  title     = {{Tiny Updater: Towards Efficient Neural Network-Driven Software Updating}},
  author    = {Zhang, Linfeng and Ma, Kaisheng},
  booktitle = {International Conference on Computer Vision},
  year      = {2023},
  pages     = {23447--23459},
  doi       = {10.1109/ICCV51070.2023.02143},
  url       = {https://mlanthology.org/iccv/2023/zhang2023iccv-tiny/}
}