Memory-Reduced Meta-Learning with Guaranteed Convergence
Abstract
The optimization-based meta-learning approach is gaining traction because of its unique ability to adapt quickly to a new task using only small amounts of data. However, existing optimization-based meta-learning approaches, such as MAML, ANIL, and their variants, generally employ backpropagation for upper-level gradient estimation, which requires storing historical lower-level parameters/gradients and thus increases computational and memory overhead in each iteration. In this paper, we propose a meta-learning algorithm that avoids using historical parameters/gradients and significantly reduces per-iteration memory costs compared to existing optimization-based meta-learning approaches. Beyond memory reduction, we prove that our proposed algorithm converges sublinearly in the number of upper-level iterations, and that the convergence error decays sublinearly with the batch size of sampled tasks. In the special case of deterministic meta-learning, we further prove that our proposed algorithm converges to an exact solution. Moreover, we quantify the computational complexity of the algorithm, which matches existing convergence results on meta-learning even without using any historical parameters/gradients. Experimental results on meta-learning benchmarks confirm the efficacy of our proposed algorithm.
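The memory overhead described above comes from backpropagating through the unrolled inner-loop updates: the backward pass needs the Hessian (or Hessian-vector product) at every stored inner iterate. The sketch below illustrates this MAML-style pattern on a toy scalar quadratic task; the loss, targets, step size, and iteration count are illustrative assumptions, not taken from the paper, and the paper's own memory-reduced algorithm is not reproduced here.

```python
def inner_loss_grad(w, t):
    # Lower-level (task) loss f(w) = 0.5 * (w - t)^2 with task target t.
    return w - t

def inner_loss_hess(w, t):
    # Hessian of the quadratic lower-level loss. It is constant here, but in
    # general it depends on w, which is why MAML-style backpropagation must
    # store the whole inner trajectory.
    return 1.0

def maml_meta_gradient(theta, t, y, alpha=0.1, K=5):
    # Forward: K inner gradient steps, storing every iterate w_k.
    # This stored trajectory is the per-iteration memory cost at issue.
    trajectory = [theta]
    w = theta
    for _ in range(K):
        w = w - alpha * inner_loss_grad(w, t)
        trajectory.append(w)
    # Upper-level loss L(w_K) = 0.5 * (w_K - y)^2, so dL/dw_K = w_K - y.
    g = w - y
    # Backward: propagate through the unrolled updates, multiplying by
    # (1 - alpha * Hessian) evaluated at each stored inner iterate.
    for w_k in reversed(trajectory[:-1]):
        g = g * (1.0 - alpha * inner_loss_hess(w_k, t))
    return w, g

# Toy check: for this quadratic, dw_K/dtheta = (1 - alpha)^K in closed form.
theta, t, y, alpha, K = 0.0, 2.0, 1.5, 0.1, 5
w_K, meta_grad = maml_meta_gradient(theta, t, y, alpha, K)
analytic = (w_K - y) * (1.0 - alpha) ** K
```

The point of the sketch is the `trajectory` list: its length grows with the number of inner steps K, whereas a memory-reduced scheme of the kind the paper proposes avoids retaining these historical parameters/gradients.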
Cite
Text
Yang et al. "Memory-Reduced Meta-Learning with Guaranteed Convergence." AAAI Conference on Artificial Intelligence, 2025. doi:10.1609/AAAI.V39I20.35501
Markdown
[Yang et al. "Memory-Reduced Meta-Learning with Guaranteed Convergence." AAAI Conference on Artificial Intelligence, 2025.](https://mlanthology.org/aaai/2025/yang2025aaai-memory/) doi:10.1609/AAAI.V39I20.35501
BibTeX
@inproceedings{yang2025aaai-memory,
title = {{Memory-Reduced Meta-Learning with Guaranteed Convergence}},
author = {Yang, Honglin and Ma, Ji and Yu, Xiao},
booktitle = {AAAI Conference on Artificial Intelligence},
year = {2025},
  pages = {21938--21946},
doi = {10.1609/AAAI.V39I20.35501},
url = {https://mlanthology.org/aaai/2025/yang2025aaai-memory/}
}