Minimal Achievable Sufficient Statistic Learning

Abstract

We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective for which the minima are minimal sufficient statistics with respect to a class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that — unlike standard mutual information — can be usefully applied to deterministically-dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.

Cite

Text

Cvitkovic and Koliander. "Minimal Achievable Sufficient Statistic Learning." International Conference on Machine Learning, 2019.

Markdown

[Cvitkovic and Koliander. "Minimal Achievable Sufficient Statistic Learning." International Conference on Machine Learning, 2019.](https://mlanthology.org/icml/2019/cvitkovic2019icml-minimal/)

BibTeX

@inproceedings{cvitkovic2019icml-minimal,
  title     = {{Minimal Achievable Sufficient Statistic Learning}},
  author    = {Cvitkovic, Milan and Koliander, Günther},
  booktitle = {International Conference on Machine Learning},
  year      = {2019},
  pages     = {1465--1474},
  volume    = {97},
  url       = {https://mlanthology.org/icml/2019/cvitkovic2019icml-minimal/}
}