The Committee Machine: Computational to Statistical Gaps in Learning a Two-Layers Neural Network
Abstract
Heuristic tools from statistical physics have been used in the past to compute the optimal learning and generalization errors in the teacher-student scenario in multi-layer neural networks. In this contribution, we provide a rigorous justification of these approaches for a two-layer neural network model called the committee machine. We also introduce a version of the approximate message passing (AMP) algorithm for the committee machine that allows one to perform optimal learning in polynomial time for a large set of parameters. We find that there are regimes in which a low generalization error is information-theoretically achievable while the AMP algorithm fails to deliver it, strongly suggesting that no efficient algorithm exists for those cases and unveiling a large computational gap.
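To make the teacher-student setup concrete, below is a minimal sketch (not the authors' code) of data generation for a committee machine, assuming K hidden units with sign activations, a majority-vote output, and i.i.d. Gaussian inputs; the names n, K, m, and committee_output are illustrative choices, not taken from the paper.

```python
# Minimal sketch of the teacher-student scenario for a committee machine.
# Assumptions (not from the paper's code): sign activations, majority-vote
# output, i.i.d. Gaussian inputs and teacher weights.
import numpy as np

rng = np.random.default_rng(0)

n, K, m = 1000, 3, 5000  # input dimension, hidden units, number of samples

# Teacher: one n-dimensional weight vector per hidden unit.
W_teacher = rng.standard_normal((K, n))

def committee_output(W, X):
    """Label = sign of the sum of the K hidden sign-units (majority vote)."""
    hidden = np.sign(X @ W.T / np.sqrt(n))  # shape (m, K)
    return np.sign(hidden.sum(axis=1))     # K odd, so the sum is never zero

# The student observes (X, y) produced by the teacher and must recover
# W_teacher (up to permutations and sign flips of the hidden units).
X = rng.standard_normal((m, n))
y = committee_output(W_teacher, X)
```

In the paper's regime, both n and m grow large at a fixed ratio m/n, and the question is for which ratios AMP-style algorithms can match the information-theoretically optimal generalization error.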
Cite
Text
Aubin et al. "The Committee Machine: Computational to Statistical Gaps in Learning a Two-Layers Neural Network." Neural Information Processing Systems, 2018.Markdown
[Aubin et al. "The Committee Machine: Computational to Statistical Gaps in Learning a Two-Layers Neural Network." Neural Information Processing Systems, 2018.](https://mlanthology.org/neurips/2018/aubin2018neurips-committee/)
BibTeX
@inproceedings{aubin2018neurips-committee,
  title     = {{The Committee Machine: Computational to Statistical Gaps in Learning a Two-Layers Neural Network}},
  author    = {Aubin, Benjamin and Maillard, Antoine and Barbier, Jean and Krzakala, Florent and Macris, Nicolas and Zdeborov\'{a}, Lenka},
  booktitle = {Neural Information Processing Systems},
  year      = {2018},
  pages     = {3223--3234},
  url       = {https://mlanthology.org/neurips/2018/aubin2018neurips-committee/}
}