2019
DOI: 10.48550/arxiv.1912.02729
Preprint
Rademacher complexity and spin glasses: A link between the replica and statistical theories of learning

Abstract: Statistical learning theory provides bounds on the generalization gap, using in particular the Vapnik-Chervonenkis dimension and the Rademacher complexity. An alternative approach, mainly studied in the statistical physics literature, is the study of generalization in simple synthetic-data models. Here we discuss the connections between these approaches and focus on the link between the Rademacher complexity in statistical learning and the theories of generalization for typical-case synthetic models from stati…

Cited by 2 publications (3 citation statements)
References 35 publications
“…This is much worse than what we observe in practice, where we reach the Bayes rate e_g = Θ(α^{-1}). Tighter bounds can be obtained using the Rademacher complexity, and this was studied recently (using the aforementioned replica method) in [49] for the very same problem. We reproduced their results and plotted the Rademacher complexity generalization bound in Fig.…”
Section: Generalization Errors (mentioning)
confidence: 99%
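For context on the quantity this excerpt discusses: the empirical Rademacher complexity of a function class on a sample measures how well the class can correlate with random sign labels. The sketch below is a minimal Monte Carlo estimate for the illustrative class of unit-norm linear functions, where the supremum over the class has a closed form; the function name, data, and parameters are hypothetical choices for illustration, not part of the cited papers' setups.

```python
import numpy as np

def empirical_rademacher(X, n_trials=200, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of
    unit-norm linear functions f(x) = <w, x> with ||w||_2 <= 1 on sample X.

    For this class the inner supremum is attained in closed form:
        sup_{||w||<=1} (1/n) * sum_i sigma_i <w, x_i>
            = || (1/n) * sum_i sigma_i x_i ||_2,
    so each trial just needs a norm, no optimization.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    vals = []
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        vals.append(np.linalg.norm(sigma @ X) / n)
    return float(np.mean(vals))

# Illustrative data: rows with roughly unit norm. The estimate decays
# like O(1/sqrt(n)), the scaling behind standard Rademacher bounds.
X = np.random.default_rng(1).normal(size=(400, 20)) / np.sqrt(20)
print(empirical_rademacher(X))
```

For i.i.d. data of this kind the printed value is close to 1/sqrt(n) ≈ 0.05, illustrating why distribution-independent Rademacher bounds give the O(α^{-1/2}) rates that the excerpt contrasts with the faster Θ(α^{-1}) Bayes rate.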
“…Indeed, one of the main pursuits of contemporary SLT is to provide better results on the generalization error, going beyond distribution-independent bounds. Several strategies have been proposed, advocating the importance of considering data-dependent hypothesis classes [25] and data-dependent measures of complexity (such as the Rademacher complexity [36], which was recently connected to the statistical mechanics of disordered systems [37]), also in relation to the original concept of VC entropy itself [38].…”
Section: Taking Data Structure Into Account In Statistical Learning T... (mentioning)
confidence: 99%
“…This approximation has negligible effects in the large-n limit [31]. Equation (37) fixes the recurrence relation satisfied by the generating function g_n(z):…”
Section: Non-monotonicity Of The VC Entropy (mentioning)
confidence: 99%