Meta-Learning Neural Bloom Filters
Abstract
There has been a recent trend in training neural networks to replace data structures that have been crafted by hand, with the aim of faster execution, better accuracy, or greater compression. In this setting, a neural data structure is instantiated by training a network over many epochs of its inputs until convergence. In applications where inputs arrive at high throughput, or are ephemeral, training a network from scratch is not practical. This motivates the need for few-shot neural data structures. In this paper we explore the learning of approximate set membership over a set of data in one shot via meta-learning. We propose a novel memory architecture, the Neural Bloom Filter, which achieves significant compression gains over classical Bloom Filters and existing memory-augmented neural networks.
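For context, the classical baseline works as follows: a Bloom Filter represents a set approximately in a fixed-size bit array by setting k hashed positions per stored item; queries may return false positives but never false negatives. The sketch below is a minimal Python illustration of that interface (the hash construction is chosen for brevity and is not taken from the paper), showing the add/query behaviour the Neural Bloom Filter is meta-learned to replicate in one shot.

```python
import hashlib

class BloomFilter:
    """Classical Bloom Filter: a bit array plus k hash functions.

    Membership queries may return false positives but never false
    negatives -- the approximate set membership guarantee against
    which the Neural Bloom Filter's compression is compared.
    """

    def __init__(self, num_bits: int, num_hashes: int):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray((num_bits + 7) // 8)

    def _indices(self, item: bytes):
        # Derive k bit positions by salting one digest per hash;
        # illustrative only, not the construction used in the paper.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(i.to_bytes(4, "big") + item).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item: bytes) -> None:
        for idx in self._indices(item):
            self.bits[idx // 8] |= 1 << (idx % 8)

    def __contains__(self, item: bytes) -> bool:
        return all(self.bits[idx // 8] & (1 << (idx % 8))
                   for idx in self._indices(item))


bf = BloomFilter(num_bits=1024, num_hashes=3)
bf.add(b"stored-key")
assert b"stored-key" in bf   # stored items are never reported absent
print(b"other-key" in bf)    # usually False; occasionally a false positive
```

With m bits, k hashes, and n stored items, the false-positive rate is approximately (1 - e^(-kn/m))^k, which defines the space/error trade-off that the paper's reported compression gains are measured against.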
Cite
Text
Rae et al. "Meta-Learning Neural Bloom Filters." International Conference on Machine Learning, 2019.
Markdown
[Rae et al. "Meta-Learning Neural Bloom Filters." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/rae2019icml-metalearning/)
BibTeX
@inproceedings{rae2019icml-metalearning,
title = {{Meta-Learning Neural Bloom Filters}},
author = {Rae, Jack and Bartunov, Sergey and Lillicrap, Timothy},
booktitle = {International Conference on Machine Learning},
year = {2019},
pages = {5271--5280},
volume = {97},
url = {https://mlanthology.org/icml/2019/rae2019icml-metalearning/}
}