Applications of Fractional Calculus in Learned Optimization
Abstract
Fractional gradient descent has been studied extensively, with a focus on its ability to extend traditional gradient descent methods by incorporating fractional-order derivatives. This approach allows for more flexibility in navigating complex optimization landscapes and may offer advantages in certain types of problems, particularly those involving nonlinearity and chaotic dynamics. Yet the challenge of fine-tuning the fractional-order parameter remains unresolved. In this work, we demonstrate that a neural network can be trained to effectively predict the fractional order of the gradient.
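The abstract does not detail the optimizer itself, but the general idea behind fractional gradient descent is easy to sketch. Below is a minimal, illustrative Python implementation of one common Caputo-style update, not the authors' method: it uses the first-order approximation D_c^α f(θ) ≈ f'(θ)·|θ − c|^(1−α)/Γ(2−α), taking the previous iterate as the reference point c. The function names, the fixed order alpha, and all hyperparameters are assumptions made for illustration.

```python
import math
import numpy as np

def fractional_gd_step(grad_f, theta, prev, alpha=0.9, lr=0.1, eps=1e-8):
    """One fractional gradient descent step (illustrative sketch, not the paper's method).

    Uses the first-order Caputo approximation of order 0 < alpha <= 1:
        D_c^alpha f(theta) ~= f'(theta) * |theta - c|**(1 - alpha) / Gamma(2 - alpha)
    with the previous iterate serving as the reference point c.
    """
    g = grad_f(theta)
    frac_g = g * np.abs(theta - prev + eps) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    return theta - lr * frac_g

# Toy usage: minimize f(x) = x^2 (gradient 2x). Setting alpha = 1 recovers
# plain gradient descent, since |.|**0 = 1 and Gamma(1) = 1.
theta, prev = np.array([3.0]), np.array([3.0 + 1e-3])
for _ in range(200):
    theta, prev = fractional_gd_step(lambda t: 2.0 * t, theta, prev), theta
print(theta)  # converges toward the minimum at 0
```

In the setting the abstract describes, the order alpha would presumably be produced per step by a trained network rather than held fixed; the sketch keeps it constant only to isolate the fractional update rule.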
Cite
Text
Szente et al. "Applications of Fractional Calculus in Learned Optimization." NeurIPS 2024 Workshops: OPT, 2024.Markdown
[Szente et al. "Applications of Fractional Calculus in Learned Optimization." NeurIPS 2024 Workshops: OPT, 2024.](https://mlanthology.org/neuripsw/2024/szente2024neuripsw-applications/)BibTeX
@inproceedings{szente2024neuripsw-applications,
  title     = {{Applications of Fractional Calculus in Learned Optimization}},
  author    = {Szente, Teodor Alexandru and Harrison, James and Zanfir, Mihai and Sminchisescu, Cristian},
  booktitle = {NeurIPS 2024 Workshops: OPT},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/szente2024neuripsw-applications/}
}