TIPS: Topologically Important Path Sampling for Anytime Neural Networks

ICML 2023 pp. 19343-19359

Abstract

Anytime neural networks (AnytimeNNs) are a promising solution for adaptively adjusting model complexity at runtime under various hardware resource constraints. However, manually-designed AnytimeNNs are biased by designers’ prior experience and thus provide sub-optimal solutions. To address the limitations of existing hand-crafted approaches, we first model the training process of AnytimeNNs as a discrete-time Markov chain (DTMC) and use it to identify the paths that contribute the most to the training of AnytimeNNs. Based on this new DTMC-based analysis, we further propose TIPS, a framework to automatically design AnytimeNNs under various hardware constraints. Our experimental results show that TIPS can improve the convergence rate and test accuracy of AnytimeNNs. Compared to existing AnytimeNN approaches, TIPS improves accuracy by 2%-6.6% on multiple datasets and achieves SOTA accuracy-FLOPs tradeoffs.
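To give a flavor of the DTMC view mentioned above, the sketch below computes the stationary distribution of a small, hypothetical discrete-time Markov chain by power iteration; states that accumulate probability mass would be the "topologically important" ones. The chain, its transition probabilities, and the `stationary_distribution` helper are illustrative assumptions, not the paper's actual construction.

```python
def stationary_distribution(P, iters=1000):
    """Power-iterate a row-stochastic transition matrix P (list of rows).

    Repeatedly applies pi <- pi @ P starting from the uniform distribution;
    for an ergodic chain this converges to the stationary distribution.
    """
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical 3-state chain: state 1 has a strong self-loop and heavy
# inflow, so it ends up with most of the stationary probability mass.
P = [
    [0.1, 0.8, 0.1],
    [0.2, 0.7, 0.1],
    [0.3, 0.4, 0.3],
]
pi = stationary_distribution(P)
print([round(p, 3) for p in pi])  # state 1 dominates
```

In the paper's setting, an analysis along these lines is used to rank candidate paths through the network by their contribution to training, so that sampling can favor the important ones.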

Cite

Text

Li et al. "TIPS: Topologically Important Path Sampling for Anytime Neural Networks." International Conference on Machine Learning, 2023.

Markdown

[Li et al. "TIPS: Topologically Important Path Sampling for Anytime Neural Networks." International Conference on Machine Learning, 2023.](https://mlanthology.org/icml/2023/li2023icml-tips/)

BibTeX

@inproceedings{li2023icml-tips,
  title     = {{TIPS: Topologically Important Path Sampling for Anytime Neural Networks}},
  author    = {Li, Guihong and Bhardwaj, Kartikeya and Yang, Yuedong and Marculescu, Radu},
  booktitle = {International Conference on Machine Learning},
  year      = {2023},
  pages     = {19343--19359},
  volume    = {202},
  url       = {https://mlanthology.org/icml/2023/li2023icml-tips/}
}