Latent Bottlenecked Attentive Neural Processes
Abstract
Neural Processes (NPs) are popular methods in meta-learning that can estimate predictive uncertainty on target datapoints by conditioning on a context dataset. The previous state-of-the-art method, Transformer Neural Processes (TNPs), achieves strong performance but requires computation quadratic in the number of context datapoints, significantly limiting its scalability. Conversely, existing sub-quadratic NP variants perform significantly worse than TNPs. To tackle this issue, we propose Latent Bottlenecked Attentive Neural Processes (LBANPs), a new computationally efficient sub-quadratic NP variant whose querying computational complexity is independent of the number of context datapoints. The model encodes the context dataset into a constant number of latent vectors on which self-attention is performed. When making predictions, the model retrieves higher-order information from the context dataset via multiple cross-attention mechanisms over the latent vectors. We empirically show that LBANPs achieve results competitive with the state-of-the-art on meta-regression, image completion, and contextual multi-armed bandits. We demonstrate that LBANPs can trade off computational cost against performance via the number of latent vectors. Finally, we show that LBANPs can scale beyond existing attention-based NP variants to settings with larger datasets.
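To make the latent-bottleneck idea concrete, the following is a minimal PyTorch sketch of the mechanism the abstract describes: a fixed set of learnable latent vectors cross-attends to the context embeddings and self-attends among themselves, and target points are decoded by cross-attending only to those latents, so querying cost does not depend on the context size. All module names and hyperparameters (e.g. `LatentBottleneckEncoder`, `num_latents=8`) are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class LatentBottleneckEncoder(nn.Module):
    """Encode a context set into a constant number of latent vectors (sketch)."""
    def __init__(self, dim=64, num_latents=8, num_layers=2, num_heads=4):
        super().__init__()
        # A fixed number of learnable latents bounds the cost of attending to the context.
        self.latents = nn.Parameter(torch.randn(num_latents, dim))
        self.cross_attn = nn.ModuleList(
            [nn.MultiheadAttention(dim, num_heads, batch_first=True) for _ in range(num_layers)]
        )
        self.self_attn = nn.ModuleList(
            [nn.MultiheadAttention(dim, num_heads, batch_first=True) for _ in range(num_layers)]
        )

    def forward(self, context_emb):
        # context_emb: (batch, num_context, dim) embeddings of (x_c, y_c) pairs.
        batch = context_emb.shape[0]
        lat = self.latents.unsqueeze(0).expand(batch, -1, -1)
        per_layer = []
        for cross, self_attn in zip(self.cross_attn, self.self_attn):
            # Latents (queries) cross-attend to the context: O(num_latents * num_context).
            lat = lat + cross(lat, context_emb, context_emb)[0]
            # Self-attention only among the latents: O(num_latents^2), constant in context size.
            lat = lat + self_attn(lat, lat, lat)[0]
            per_layer.append(lat)
        return per_layer  # one set of latents per layer, to be queried by the decoder

class LatentQueryDecoder(nn.Module):
    """Predict at target points by cross-attending to the latents only (sketch)."""
    def __init__(self, dim=64, num_layers=2, num_heads=4, out_dim=2):
        super().__init__()
        self.cross_attn = nn.ModuleList(
            [nn.MultiheadAttention(dim, num_heads, batch_first=True) for _ in range(num_layers)]
        )
        self.head = nn.Linear(dim, out_dim)  # e.g. predictive mean and log-variance

    def forward(self, target_emb, per_layer_latents):
        # target_emb: (batch, num_targets, dim); querying never touches the raw context,
        # so its cost is independent of the number of context datapoints.
        h = target_emb
        for cross, lat in zip(self.cross_attn, per_layer_latents):
            h = h + cross(h, lat, lat)[0]
        return self.head(h)
```

Under these assumptions, encoding is linear in the context size and querying depends only on the number of latents, which is the trade-off knob the abstract refers to.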
Cite
Text
Feng et al. "Latent Bottlenecked Attentive Neural Processes." International Conference on Learning Representations, 2023.Markdown
BibTeX
@inproceedings{feng2023iclr-latent,
title = {{Latent Bottlenecked Attentive Neural Processes}},
author = {Feng, Leo and Hajimirsadeghi, Hossein and Bengio, Yoshua and Ahmed, Mohamed Osama},
booktitle = {International Conference on Learning Representations},
year = {2023},
url = {https://mlanthology.org/iclr/2023/feng2023iclr-latent/}
}