MetaOptimize: A Framework for Optimizing Step Sizes and Other Meta-Parameters
Abstract
We address the challenge of optimizing meta-parameters (hyperparameters) in machine learning, a key factor for efficient training and high model performance. Rather than relying on expensive meta-parameter search methods, we introduce MetaOptimize: a dynamic approach that adjusts meta-parameters, particularly step sizes (also known as learning rates), during training. More specifically, MetaOptimize can wrap around any first-order optimization algorithm, tuning step sizes on the fly to minimize a specific form of regret that accounts for the long-term effect of step sizes on training, through a discounted sum of future losses. We also introduce lower-complexity variants of MetaOptimize that, in conjunction with its adaptability to various optimization algorithms, achieve performance comparable to that of the best hand-crafted learning rate schedules across diverse machine learning tasks.
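To make the idea concrete, below is a minimal sketch of the general pattern the abstract describes: wrapping a first-order optimizer and adjusting its step size online from gradient information. This is an illustrative hypergradient-style heuristic, not the paper's actual MetaOptimize algorithm; the function name, update rule, and all constants are assumptions for demonstration.

```python
import numpy as np

def sgd_with_adaptive_step_size(grad_fn, w, alpha=0.01, meta_lr=0.01, steps=100):
    """Run SGD while adapting the step size alpha on the fly.

    grad_fn: function returning the gradient of the loss at w.
    alpha:   initial step size (learning rate).
    meta_lr: step size of the meta-update applied to log(alpha).

    Illustrative sketch only -- MetaOptimize itself derives its meta-update
    from a discounted sum of future losses rather than this simple heuristic.
    """
    prev_grad = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        # Meta-update: grow alpha when successive gradients align
        # (consistent progress), shrink it when they oppose (overshooting).
        alpha *= np.exp(meta_lr * np.sign(np.dot(g, prev_grad)))
        w = w - alpha * g  # the wrapped base optimizer: plain SGD
        prev_grad = g
    return w, alpha

# Toy quadratic: loss(w) = 0.5 * ||w||^2, so the gradient at w is w itself.
w0 = np.array([5.0, -3.0])
w_final, alpha_final = sgd_with_adaptive_step_size(lambda w: w, w0)
```

On this toy quadratic, successive gradients stay aligned, so the step size grows and the iterate converges faster than fixed-step SGD with the same initial alpha would.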
Cite
Text
Sharifnassab et al. "MetaOptimize: A Framework for Optimizing Step Sizes and Other Meta-Parameters." Proceedings of the 42nd International Conference on Machine Learning, 2025.
Markdown
[Sharifnassab et al. "MetaOptimize: A Framework for Optimizing Step Sizes and Other Meta-Parameters." Proceedings of the 42nd International Conference on Machine Learning, 2025.](https://mlanthology.org/icml/2025/sharifnassab2025icml-metaoptimize/)
BibTeX
@inproceedings{sharifnassab2025icml-metaoptimize,
title = {{MetaOptimize: A Framework for Optimizing Step Sizes and Other Meta-Parameters}},
author = {Sharifnassab, Arsalan and Salehkaleybar, Saber and Sutton, Richard S.},
booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
year = {2025},
pages = {54300--54325},
volume = {267},
url = {https://mlanthology.org/icml/2025/sharifnassab2025icml-metaoptimize/}
}