The Road Less Scheduled
Abstract
Existing learning rate schedules that do not require specification of the optimization stopping step $T$ are greatly outperformed by learning rate schedules that depend on $T$. We propose an approach that avoids the need for this stopping time by eschewing the use of schedules entirely, while exhibiting state-of-the-art performance compared to schedules across a wide family of problems, from convex optimization to large-scale deep learning. Our Schedule-Free approach introduces no additional hyper-parameters over standard optimizers with momentum. Our method is a direct consequence of a new theory we develop that unifies scheduling and iterate averaging. An open-source implementation of our method is available at https://github.com/facebookresearch/schedule_free. Schedule-Free AdamW is the core algorithm behind our winning entry to the MLCommons 2024 AlgoPerf Algorithmic Efficiency Challenge Self-Tuning track.
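For concreteness, the sketch below shows the Schedule-Free SGD recursion that the abstract alludes to: gradients are taken at an interpolation `y` between the base iterate `z` and a running average `x`, so the averaging replaces the schedule and no stopping time $T$ is needed. This is a minimal illustration, not the released library; the function name `schedule_free_sgd`, the toy quadratic, and the default values of `lr`, `beta`, and `steps` are all illustrative assumptions. The official PyTorch implementation is in the repository linked above.

```python
import torch

def schedule_free_sgd(w0, grad_fn, lr=0.2, beta=0.9, steps=200):
    # Sketch of the Schedule-Free SGD recursion (paper notation):
    #   y_t     = (1 - beta) * z_t + beta * x_t            (gradient point)
    #   z_{t+1} = z_t - lr * grad(y_t)                     (base SGD step)
    #   x_{t+1} = (1 - c) * x_t + c * z_{t+1},  c = 1/(t+1) (uniform average)
    z = w0.clone()
    x = w0.clone()
    for t in range(1, steps + 1):
        y = (1 - beta) * z + beta * x   # evaluate the gradient here
        z = z - lr * grad_fn(y)         # ordinary SGD step, constant lr
        x = (1 - 1.0 / t) * x + (1.0 / t) * z  # running average of z
    return x  # the averaged sequence x is what you evaluate/deploy

# Toy usage (hypothetical example): minimize f(w) = (w - 3)^2,
# whose gradient is 2 * (w - 3).
w = schedule_free_sgd(torch.tensor([0.0]), lambda y: 2 * (y - 3))
print(w)  # tensor close to [3.0]
```

Note the design choice this sketch highlights: the interpolation weight `beta` plays a role analogous to momentum, which is why the method adds no hyper-parameters beyond those of a standard momentum optimizer.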
Cite
Text
Defazio et al. "The Road Less Scheduled." Neural Information Processing Systems, 2024. doi:10.52202/079017-0320
Markdown
[Defazio et al. "The Road Less Scheduled." Neural Information Processing Systems, 2024.](https://mlanthology.org/neurips/2024/defazio2024neurips-road/) doi:10.52202/079017-0320
BibTeX
@inproceedings{defazio2024neurips-road,
  title = {{The Road Less Scheduled}},
  author = {Defazio, Aaron and Yang, Xingyu and Mehta, Harsh and Mishchenko, Konstantin and Khaled, Ahmed and Cutkosky, Ashok},
  booktitle = {Neural Information Processing Systems},
  year = {2024},
  doi = {10.52202/079017-0320},
  url = {https://mlanthology.org/neurips/2024/defazio2024neurips-road/}
}