2021
DOI: 10.48550/arxiv.2107.01163
Preprint

Unveiling the structure of wide flat minima in neural networks

Carlo Baldassi, Clarissa Lauditi, Enrico M. Malatesta, et al.

Cited by 2 publications (6 citation statements). References 22 publications.
Citation types: 1 supporting, 5 mentioning, 0 contrasting.
“…3. The overall picture closely resembles that of simpler models [19], and we point out some noteworthy results:…”
Section: πœ™ = lim… (supporting)
confidence: 69%
“…Phase diagram. Before diving into analytical details, we anticipate how the geometry of the space of solutions changes […]. For 𝛼_T β†’ 0 we recover the critical capacity [18] and the local entropy transition [19] of the standard binary perceptron. In the inset we show the training error of SA, BP and SBPI versus the degree of overparameterization 1/𝛼 for 𝐷 = 201 and 𝛼_T = 3.…”
Section: πœ™ = lim… (mentioning)
confidence: 99%
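In the two statements above, the section label “πœ™ = lim…” is a truncated equation heading from the citing paper; the original equation is not recoverable from this report. As context for the “local entropy transition” both snippets invoke [19], πœ™ in that literature denotes a local entropy density. A plausible sketch of the standard definition (the symbols w̃, d, 𝒩, 𝕏 are introduced here for illustration, not taken from the citing paper):

\[
\phi(d) = \lim_{N\to\infty} \frac{1}{N}\, \mathbb{E}_{\xi}\, \ln \mathcal{N}_{\xi}(\tilde{w}, d),
\qquad
\mathcal{N}_{\xi}(\tilde{w}, d) = \sum_{w \in \{-1,+1\}^N} \mathbb{X}_{\xi}(w)\, \delta\!\big( d_H(w, \tilde{w}),\, N d \big),
\]

where 𝕏_ΞΎ(w) = 1 if w classifies every training pattern in ΞΎ correctly (0 otherwise) and d_H is the Hamming distance, so 𝒩 counts solutions at fixed distance d from a reference configuration w̃. The local entropy transition is then commonly identified as the constraint density 𝛼 above which πœ™(d) ceases to be monotone at small d, signaling that the wide, dense regions of solutions break up. (In the quoted inset, SA, BP and SBPI presumably expand to simulated annealing, belief propagation and the SBPI algorithm for binary synapses; this expansion is inferred, not stated in the snippet.)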
“…This phenomenon has been justified numerically by the existence of subdominant and dense connected regions of solutions [Bal+15]. We also refer to [Bal+21] for heuristic descriptions of the solution space.…”
Section: Introduction (mentioning)
confidence: 97%
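
The dense, connected regions of solutions invoked via [Bal+15] can be probed directly in a toy storage problem. Below is a minimal, illustrative Python sketch (sizes and variable names are hypothetical, not from the cited papers): it enumerates every Β±1 weight vector of a small perceptron, keeps those that classify all random patterns correctly, and counts the connected components of the solution set under single-spin flips.

import itertools
from collections import deque

import numpy as np

rng = np.random.default_rng(0)
N, P = 15, 10  # N odd keeps xi @ w away from 0, so every pattern gets a definite sign

xi = rng.choice([-1, 1], size=(P, N))  # random binary input patterns
y = rng.choice([-1, 1], size=P)        # random binary target labels

# Brute-force enumeration of all 2^N binary weight vectors; keep the solutions.
sols = set()
for bits in itertools.product((-1, 1), repeat=N):
    w = np.array(bits)
    if np.all(y * (xi @ w) > 0):  # w classifies every pattern correctly
        sols.add(bits)

# BFS over the solution set: two solutions are adjacent when they differ
# by a single spin flip (Hamming distance 1).
seen, components = set(), 0
for start in sols:
    if start in seen:
        continue
    components += 1
    queue = deque([start])
    seen.add(start)
    while queue:
        cur = queue.popleft()
        for i in range(N):
            nb = cur[:i] + (-cur[i],) + cur[i + 1:]
            if nb in sols and nb not in seen:
                seen.add(nb)
                queue.append(nb)

print(f"{len(sols)} solutions, {components} single-flip connected component(s)")

In small instances like this, a low 𝛼 = P/N generally yields many solutions gathered into few single-flip components, while raising P toward capacity shrinks and fragments the set; this finite-size probe only mirrors, at toy scale, the geometric change described in the phase-diagram statement above.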