Learning to Learn Kernels with Variational Random Features
Abstract
We introduce kernels with random Fourier features in the meta-learning framework for few-shot learning. We propose meta variational random features (MetaVRF) to learn adaptive kernels for the base-learner, which is developed in a latent variable model by treating the random feature basis as the latent variable. We formulate the optimization of MetaVRF as a variational inference problem by deriving an evidence lower bound under the meta-learning framework. To incorporate shared knowledge from related tasks, we propose a context inference of the posterior, which is established by an LSTM architecture. The LSTM-based inference network can effectively integrate the context information of previous tasks with task-specific information, generating informative and adaptive features. The learned MetaVRF can produce kernels of high representational power with a relatively low spectral sampling rate and also enables fast adaptation to new tasks. Experimental results on a variety of few-shot regression and classification tasks demonstrate that MetaVRF delivers much better, or at least competitive, performance compared to existing meta-learning alternatives.
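The abstract builds on the classical random Fourier feature (RFF) approximation, where a shift-invariant kernel is approximated by the inner product of randomized cosine features whose frequencies are sampled from the kernel's spectral density. As background for the "spectral sampling rate" mentioned above, here is a minimal sketch of standard RFFs for the RBF kernel (not the authors' MetaVRF, which learns the feature distribution adaptively; the function name and parameters below are illustrative):

```python
import numpy as np

def rff_features(X, n_features=100, gamma=1.0, rng=None):
    """Map X (n, d) to random Fourier features whose inner products
    approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Spectral sampling: for the RBF kernel the frequency distribution
    # (its Fourier transform) is Gaussian with std sqrt(2 * gamma).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# The approximate kernel matrix is the Gram matrix of the feature maps.
X = np.random.default_rng(0).normal(size=(5, 3))
Z = rff_features(X, n_features=2000, gamma=0.5, rng=0)
K_approx = Z @ Z.T

# Exact RBF kernel for comparison.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)
```

The number of sampled frequencies `n_features` is the spectral sampling rate: larger values tighten the approximation at higher cost. MetaVRF's contribution is to treat these random bases as latent variables inferred per task, so that informative kernels can be obtained at comparatively low sampling rates.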
Cite
Text
Zhen et al. "Learning to Learn Kernels with Variational Random Features." International Conference on Machine Learning, 2020.
Markdown
[Zhen et al. "Learning to Learn Kernels with Variational Random Features." International Conference on Machine Learning, 2020.](https://mlanthology.org/icml/2020/zhen2020icml-learning/)
BibTeX
@inproceedings{zhen2020icml-learning,
title = {{Learning to Learn Kernels with Variational Random Features}},
author = {Zhen, Xiantong and Sun, Haoliang and Du, Yingjun and Xu, Jun and Yin, Yilong and Shao, Ling and Snoek, Cees},
booktitle = {International Conference on Machine Learning},
year = {2020},
pages = {11409--11419},
volume = {119},
url = {https://mlanthology.org/icml/2020/zhen2020icml-learning/}
}