Sample Compression Hypernetworks: From Generalization Bounds to Meta-Learning

Abstract

Reconstruction functions are pivotal in sample compression theory, a framework for deriving tight generalization bounds. From a small subset of the training set (the compression set) and an optional stream of side information (the message), they recover a predictor previously learned from the whole training set. While reconstruction functions are usually fixed, we propose to learn them. To ease optimization and increase the expressiveness of the message, we derive a new sample compression generalization bound for real-valued messages. Building on this theoretical analysis, we present a new hypernetwork architecture that outputs predictors with tight generalization guarantees when trained with a novel meta-learning framework. We conclude by reporting promising preliminary experimental results.
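
To make the core idea concrete, here is a minimal sketch of a learned reconstruction function implemented as a hypernetwork: it consumes a small compression set together with a real-valued message vector and emits the weights of a downstream predictor. This is an illustration under assumed choices (PyTorch, a two-layer predictor, and names like `SampleCompressionHypernet`), not the authors' actual architecture or training procedure.

```python
# Hypothetical sketch: a hypernetwork mapping (compression set, real-valued message)
# to the weights of a small predictor. Names and dimensions are illustrative only.
import torch
import torch.nn as nn


class SampleCompressionHypernet(nn.Module):
    def __init__(self, input_dim, compression_size, message_dim, predictor_hidden=16):
        super().__init__()
        # Shapes of the reconstructed two-layer predictor: input_dim -> hidden -> 1
        self.predictor_shapes = [(predictor_hidden, input_dim), (predictor_hidden,),
                                 (1, predictor_hidden), (1,)]
        n_out = sum(int(torch.tensor(s).prod()) for s in self.predictor_shapes)
        # The hypernetwork reads the flattened compression set (inputs + labels) and the message
        self.net = nn.Sequential(
            nn.Linear(compression_size * (input_dim + 1) + message_dim, 64),
            nn.ReLU(),
            nn.Linear(64, n_out),
        )

    def forward(self, comp_x, comp_y, message):
        # comp_x: (compression_size, input_dim), comp_y: (compression_size,), message: (message_dim,)
        flat = torch.cat([comp_x.flatten(), comp_y.flatten(), message])
        params = self.net(flat)
        # Split the flat output into the predictor's weight and bias tensors
        weights, i = [], 0
        for shape in self.predictor_shapes:
            n = int(torch.tensor(shape).prod())
            weights.append(params[i:i + n].view(shape))
            i += n
        return weights

    @staticmethod
    def predict(weights, x):
        # Apply the reconstructed predictor to new inputs x: (batch, input_dim)
        w1, b1, w2, b2 = weights
        h = torch.relu(x @ w1.T + b1)
        return (h @ w2.T + b2).squeeze(-1)


# Example usage (random placeholder data): reconstruct a predictor from a
# 4-point compression set and a 3-dimensional message, then evaluate it.
hnet = SampleCompressionHypernet(input_dim=5, compression_size=4, message_dim=3)
w = hnet(torch.randn(4, 5), torch.randn(4), torch.randn(3))
preds = SampleCompressionHypernet.predict(w, torch.randn(10, 5))
```

In a meta-learning setup along the lines sketched by the abstract, the hypernetwork parameters would be trained across tasks while the sample compression bound certifies each reconstructed predictor; the details of the bound and the training loop are in the paper itself.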

Cite

Text

Leblanc et al. "Sample Compression Hypernetworks: From Generalization Bounds to Meta-Learning." NeurIPS 2024 Workshops: Compression, 2024.

Markdown

[Leblanc et al. "Sample Compression Hypernetworks: From Generalization Bounds to Meta-Learning." NeurIPS 2024 Workshops: Compression, 2024.](https://mlanthology.org/neuripsw/2024/leblanc2024neuripsw-sample/)

BibTeX

@inproceedings{leblanc2024neuripsw-sample,
  title     = {{Sample Compression Hypernetworks: From Generalization Bounds to Meta-Learning}},
  author    = {Leblanc, Benjamin and Bazinet, Mathieu and D'Amours, Nathaniel and Drouin, Alexandre and Germain, Pascal},
  booktitle = {NeurIPS 2024 Workshops: Compression},
  year      = {2024},
  url       = {https://mlanthology.org/neuripsw/2024/leblanc2024neuripsw-sample/}
}