Result: The committee machine: Computational to statistical gaps in learning a two-layers neural network

Title:
The committee machine: Computational to statistical gaps in learning a two-layers neural network
Source:
WoS
Publisher Information:
NEURAL INFORMATION PROCESSING SYSTEMS (NIPS)
La Jolla
Publication Year:
2019
Collection:
Ecole Polytechnique Fédérale Lausanne (EPFL): Infoscience
Document Type:
conference object
Language:
unknown
ISSN:
1049-5258
Relation:
Advances In Neural Information Processing Systems 31 (Nips 2018); Advances in Neural Information Processing Systems; 32nd Conference on Neural Information Processing Systems (NIPS); https://infoscience.epfl.ch/handle/20.500.14299/158004; WOS:000461823303024
Accession Number:
edsbas.D8F05C21
Database:
BASE

Further Information

Heuristic tools from statistical physics have been used in the past to locate phase transitions and to compute the optimal learning and generalization errors in the teacher-student scenario for multi-layer neural networks. In this contribution, we provide a rigorous justification of these approaches for a two-layer neural network model called the committee machine. We also introduce a version of the approximate message passing (AMP) algorithm for the committee machine that makes it possible to perform optimal learning in polynomial time for a large set of parameters. We find that there are regimes in which a low generalization error is information-theoretically achievable while the AMP algorithm fails to deliver it, strongly suggesting that no efficient algorithm exists for those cases and unveiling a large computational gap.
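To make the setting concrete, the following is a minimal sketch of the teacher-student scenario for a committee machine, in the common formulation where the second-layer weights are fixed and the output is a majority vote over sign-activated hidden units. All variable names and the specific dimensions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def committee_machine(W, X):
    # Two-layer committee machine: each of the K hidden units applies a
    # sign activation to its own linear projection of the input, and the
    # output is the sign of the sum (a majority vote) of those units.
    return np.sign(np.sign(X @ W.T).sum(axis=1))

# Teacher-student scenario: a "teacher" network with random Gaussian
# weights labels i.i.d. Gaussian inputs; the "student" only observes
# (X, y) and must learn weights that reproduce the teacher's labels.
n, d, K = 1000, 50, 3              # samples, input dimension, hidden units (K odd avoids ties)
W_teacher = rng.standard_normal((K, d))
X = rng.standard_normal((n, d))
y = committee_machine(W_teacher, X)
```

With an odd number of hidden units the vote can never tie, so every label is exactly ±1; the statistical-versus-computational question studied in the paper is how many such samples n (relative to d) a student needs to recover the teacher, and whether an efficient algorithm such as AMP can do so.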