2022
DOI: 10.1088/1742-5468/ac9829

Generalization error rates in kernel regression: the crossover from the noiseless to noisy regime*

Abstract: In this manuscript we consider kernel ridge regression (KRR) under the Gaussian design. Exponents for the decay of the excess generalization error of KRR have been reported in various works under the assumption of power-law decay of eigenvalues of the features covariance. These decays were, however, provided for sizeably different setups, namely in the noiseless case with constant regularization and in the noisy optimally regularized case. Intermediary settings have been left substantially uncharted. In this …
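
As a concrete illustration of the setting in the abstract, here is a minimal Monte-Carlo sketch of ridge regression under a Gaussian design with a power-law feature spectrum (not the authors' code; the exponents, teacher decay, noise level, and ridge below are illustrative placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: covariance eigenvalues lambda_k = k^(-alpha)
# (the power-law "capacity" condition) and a power-law teacher.
p = 1000                                   # truncation of the spectrum
alpha = 1.5
lam_k = np.arange(1, p + 1) ** (-alpha)    # feature covariance eigenvalues
theta = np.arange(1, p + 1) ** (-1.0)      # teacher coefficients (placeholder decay)
noise_std = 0.1                            # set to 0.0 for the noiseless regime
ridge = 1e-3                               # constant regularization

def excess_error(n, trials=20):
    """Monte-Carlo estimate of the population excess risk of ridge regression."""
    errs = []
    for _ in range(trials):
        X = rng.standard_normal((n, p)) * np.sqrt(lam_k)   # Gaussian design
        y = X @ theta + noise_std * rng.standard_normal(n)
        # ridge estimator in feature space (equivalent to KRR for this finite map)
        w = np.linalg.solve(X.T @ X + n * ridge * np.eye(p), X.T @ y)
        d = w - theta
        errs.append(d @ (lam_k * d))       # E_x[(x . d)^2] for x ~ N(0, diag(lam_k))
    return float(np.mean(errs))

for n in (125, 250, 500, 1000):
    print(n, excess_error(n))
```

Doubling n makes the power-law decay of the error visible on a log-log plot; varying noise_std and ridge interpolates between the noiseless constant-regularization and noisy optimally-regularized setups contrasted in the abstract.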

Cited by 7 publications (5 citation statements)
References 42 publications
“…Methods of statistical physics have traditionally been the tool of choice for obtaining closed-form formulae in this setting (Engel and Van den Broeck 2012). Recent works have provided analytical expressions for the generalisation error of high-dimensional kernel regressions (Canatar, Bordelon, and Pehlevan 2021; Bordelon, Canatar, and Pehlevan 2020; Jacot et al. 2020; Simon et al. 2022; Cui et al. 2022). In particular, (Jacot et al. 2020) and (Simon et al. 2022) rely on the spectral universality assumption, just as we do to estimate the coefficients in our formula.…”
Section: Related Work
confidence: 99%
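
The closed-form predictions referenced in this quote can be sketched under the same spectral universality assumption. Below is the test-error formula in the "eigenlearning" form associated with Simon et al. (2022), equivalent up to notation to the replica results of Bordelon, Canatar, and Pehlevan; the spectrum, teacher, noise, and ridge values are illustrative assumptions, not taken from the cited papers:

```python
import numpy as np
from scipy.optimize import brentq

k_idx = np.arange(1, 2001.0)
lam = k_idx ** -1.5             # eigenvalues lambda_k = k^(-alpha), illustrative
theta2 = k_idx ** -2.0          # squared teacher coefficients, illustrative
sigma2, ridge = 1e-2, 1e-3      # label-noise variance and ridge

def effective_reg(n):
    # kappa solves the self-consistent equation
    #   n = sum_k lam_k / (lam_k + kappa) + ridge / kappa
    f = lambda k: np.sum(lam / (lam + k)) + ridge / k - n
    return brentq(f, 1e-12, 1e6)

def predicted_test_error(n):
    kap = effective_reg(n)
    L = lam / (lam + kap)                  # per-mode "learnabilities"
    e0 = n / (n - np.sum(L ** 2))          # overfitting (variance) factor
    return e0 * (np.sum((1 - L) ** 2 * theta2) + sigma2)

for n in (100, 400, 1600):
    print(n, predicted_test_error(n))
```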
“…A few recent works, appearing after the completion of this manuscript, also investigate the scaling of test error in related settings. Cui et al. (24) study the decay of test error with dataset size for kernel regression in a high-dimensional limit with Gaussian design. Maloney et al. (25) examine further a teacher-student framework similar to ours, deriving joint scaling laws using techniques from random matrix theory.…”
Section: Scaling Laws For Deep Neural Networks
confidence: 99%
“…It can then be shown, if our formulas apply, that κ(0) ∝ 1/n^α. See [11] for a detailed analysis of the consequences of the ridge regression asymptotic equivalents when such assumptions are made.…”
Section: Isotropic Covariance Matrices
confidence: 99%
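
The scaling κ(0) ∝ 1/n^α quoted here can be checked numerically. A small sketch, assuming the power-law capacity condition λ_k = k^(−α) and the standard zero-ridge self-consistent equation for κ (an illustration, not the cited papers' code):

```python
import numpy as np
from scipy.optimize import brentq

alpha = 1.5
lam = np.arange(1, 200001.0) ** -alpha      # long power-law tail (illustrative)

def kappa0(n):
    # zero-ridge self-consistent equation: n = sum_k lam_k / (lam_k + kappa)
    f = lambda k: np.sum(lam / (lam + k)) - n
    return brentq(f, 1e-14, 1.0)

ns = np.array([100, 200, 400, 800, 1600])
kappas = np.array([kappa0(n) for n in ns])
slope, _ = np.polyfit(np.log(ns), np.log(kappas), 1)
print(slope)                                # close to -alpha = -1.5
```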
“…• We consider in Section 4 the ridge regression estimator and re-interpret the results of [14,31,11,36,5] using classical notions from non-parametric statistics, namely the degrees of freedom, a.k.a. effective dimensionality [38,8].…”
Section: Introduction
confidence: 99%
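
For reference, the degrees of freedom, a.k.a. effective dimensionality, invoked in this quote is df(λ) = tr[Σ(Σ + λI)^(−1)] = Σ_k λ_k/(λ_k + λ). A minimal sketch, assuming an illustrative power-law spectrum:

```python
import numpy as np

def degrees_of_freedom(eigs, ridge):
    # df(ridge) = tr[Sigma (Sigma + ridge I)^(-1)] when eigs are Sigma's eigenvalues
    return float(np.sum(eigs / (eigs + ridge)))

eigs = np.arange(1, 1001.0) ** -1.5          # lambda_k = k^(-1.5), illustrative
for ridge in (1e-1, 1e-2, 1e-3):
    print(ridge, degrees_of_freedom(eigs, ridge))   # grows like ridge^(-1/alpha) as ridge -> 0
```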