2021
DOI: 10.48550/arxiv.2106.09798
Preprint

Wide stochastic networks: Gaussian limit and PAC-Bayesian training

Abstract: The limit of infinite width allows for substantial simplifications in the analytical study of overparameterized neural networks. With a suitable random initialization, an extremely large network is well approximated by a Gaussian process, both before and during training. In the present work, we establish a similar result for a simple stochastic architecture whose parameters are random variables. The explicit evaluation of the output distribution allows for a PAC-Bayesian training procedure that directly optimi…
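The sketch below is a minimal numerical illustration of the Gaussian-limit claim in the abstract, assuming a one-hidden-layer ReLU network with i.i.d. Gaussian parameters under the usual 1/fan-in variance scaling. It is not the paper's architecture or its PAC-Bayesian procedure, and all names (network_output, N_SAMPLES) are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 10
x = rng.normal(size=dim)   # a fixed input
N_SAMPLES = 20_000         # independent draws of the random parameters

def network_output(width: int) -> np.ndarray:
    """One sample of f(x) per parameter draw, for a one-hidden-layer net.

    Hidden weights w_j ~ N(0, I/dim), output weights v_j ~ N(0, 1/width),
    f(x) = sum_j v_j * relu(w_j . x). Since only the projections w_j . x
    enter the output, we sample them directly: w_j . x ~ N(0, |x|^2/dim).
    """
    pre = rng.normal(scale=np.linalg.norm(x) / np.sqrt(dim),
                     size=(N_SAMPLES, width))
    v = rng.normal(scale=1.0 / np.sqrt(width), size=(N_SAMPLES, width))
    return np.sum(v * np.maximum(pre, 0.0), axis=1)

for width in (10, 100, 1000):
    out = network_output(width)
    z = (out - out.mean()) / out.std()
    # Excess kurtosis -> 0 is a quick (necessary, not sufficient) check
    # that the output distribution is approaching a Gaussian.
    print(f"width={width:5d}  excess kurtosis={np.mean(z**4) - 3:+.3f}")
```

As the width grows, the excess kurtosis of the sampled outputs shrinks toward zero, consistent with the output distribution converging to a Gaussian.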

Cited by 0 publications
References 10 publications
