2021
DOI: 10.48550/arxiv.2104.00526
Preprint

Fitting Elephants

Abstract: Textbook wisdom advocates for smooth function fits and implies that interpolation of noisy data should lead to poor generalization. A related heuristic is that fitting parameters should be fewer than measurements (Occam's Razor). Surprisingly, contemporary machine learning (ML) approaches, cf. deep nets (DNNs), generalize well despite interpolating noisy data. This may be understood via Statistically Consistent Interpolation (SCI), i.e. data interpolation techniques that generalize optimally for big data. In t…
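
The abstract above is truncated, so the paper's own construction of SCI is not reproduced here. As a minimal, hedged sketch of the core idea (an estimator that exactly interpolates noisy labels yet can still generalize as the dataset grows), the Python snippet below implements inverse-distance-weighted (Shepard-type) regression with a singular kernel. Every name and parameter in it (true_f, singular_kernel_predict, the sample sizes, the noise level) is an illustrative assumption, not taken from the paper.

```python
# Illustrative sketch (not from the paper): a singular-kernel interpolator.
# It reproduces every noisy training label exactly (interpolation), yet its
# predictions at new points are weighted averages over many training samples,
# so test error can still shrink as the dataset grows.
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # Hypothetical noiseless target used only for this demonstration.
    return np.sin(2 * np.pi * x)

# Noisy training data; the labels are NOT smoothed away -- we interpolate them.
n_train = 2000
x_train = rng.uniform(0.0, 1.0, size=n_train)
y_train = true_f(x_train) + rng.normal(scale=0.3, size=n_train)

def singular_kernel_predict(x_query, x_train, y_train, power=1.0, eps=1e-12):
    """Inverse-distance-weighted (Shepard-type) regression in 1D.

    The weight 1/|x - x_i|**power diverges at training points, so the fit
    passes exactly through every (noisy) training label, while predictions
    elsewhere average over the whole training set.
    """
    d = np.abs(x_query[:, None] - x_train[None, :])   # (n_query, n_train)
    exact = d < eps                                    # query coincides with a training point
    w = 1.0 / np.maximum(d, eps) ** power
    pred = (w * y_train).sum(axis=1) / w.sum(axis=1)
    # Enforce exact interpolation where a query hits a training point.
    hit_rows = exact.any(axis=1)
    if hit_rows.any():
        first_hit = exact[hit_rows].argmax(axis=1)
        pred[hit_rows] = y_train[first_hit]
    return pred

# Sanity checks: exact fit on (noisy) training points, reasonable error on fresh points.
train_pred = singular_kernel_predict(x_train[:200], x_train, y_train)
print("max train residual:", np.max(np.abs(train_pred - y_train[:200])))   # ~0

x_test = rng.uniform(0.0, 1.0, size=500)
test_mse = np.mean((singular_kernel_predict(x_test, x_train, y_train) - true_f(x_test)) ** 2)
print("test MSE vs noiseless target:", test_mse)
```

The singular weight forces the fit through every noisy training point, while predictions away from the data pool information from many samples; this is the sense in which exact interpolation of noise and good generalization can coexist on large datasets.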

Cited by 0 publications

References 31 publications