Towards Optimal Network Depths: Control-Inspired Acceleration of Training and Inference in Neural ODEs
Abstract
Neural Ordinary Differential Equations (ODEs) offer potential for learning continuous dynamics, but their slow training and inference limit broader use. This paper proposes spatial and temporal optimization inspired by control theory, seeking an optimal network depth that accelerates both training and inference while maintaining performance. Two approaches are presented: the first treats training as a single-stage minimum-time optimal control problem with an adjustable terminal time, while the second combines pre-training with a Lyapunov method, followed by safe terminal time updates in a secondary stage. Experiments confirm the effectiveness of both approaches in addressing Neural ODEs' speed limitations.
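As an illustrative sketch of the central idea (not the paper's actual method), the terminal time of the ODE solve acts as a continuous analogue of network depth: shrinking it reduces the number of solver steps and hence inference cost. The dynamics function, step size, and all parameter values below are hypothetical, assuming a simple forward-Euler integrator.

```python
import numpy as np

def odefunc(h, W):
    # Hypothetical dynamics h'(t) = tanh(W h); stands in for a learned layer.
    return np.tanh(W @ h)

def neural_ode_forward(h0, W, T, dt=0.05):
    """Integrate h'(t) = odefunc(h) from t = 0 to t = T with forward Euler.

    The terminal time T plays the role of network depth: the number of
    solver steps, and therefore the inference cost, grows with T.
    """
    n_steps = max(1, int(round(T / dt)))
    h = h0.copy()
    for _ in range(n_steps):
        h = h + dt * odefunc(h, W)
    return h, n_steps

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4, 4))
h0 = rng.standard_normal(4)

# A shorter terminal time means fewer solver steps, i.e. cheaper inference.
h_deep, steps_deep = neural_ode_forward(h0, W, T=2.0)        # 40 steps
h_shallow, steps_shallow = neural_ode_forward(h0, W, T=0.5)  # 10 steps
```

In the paper's framing, T itself becomes a decision variable of the training problem, so the optimizer can trade depth against accuracy rather than fixing the integration horizon in advance.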
Cite
Text
Miao and Gatsis. "Towards Optimal Network Depths: Control-Inspired Acceleration of Training and Inference in Neural ODEs." NeurIPS 2023 Workshops: DLDE, 2023.

Markdown
[Miao and Gatsis. "Towards Optimal Network Depths: Control-Inspired Acceleration of Training and Inference in Neural ODEs." NeurIPS 2023 Workshops: DLDE, 2023.](https://mlanthology.org/neuripsw/2023/miao2023neuripsw-optimal/)

BibTeX
@inproceedings{miao2023neuripsw-optimal,
title = {{Towards Optimal Network Depths: Control-Inspired Acceleration of Training and Inference in Neural ODEs}},
author = {Miao, Keyan and Gatsis, Konstantinos},
booktitle = {NeurIPS 2023 Workshops: DLDE},
year = {2023},
url = {https://mlanthology.org/neuripsw/2023/miao2023neuripsw-optimal/}
}