2020
DOI: 10.1609/aaai.v34i04.6020

Improved PAC-Bayesian Bounds for Linear Regression

Abstract: In this paper, we improve the PAC-Bayesian error bound for linear regression derived in Germain et al. (2016). The improvements are two-fold. First, the proposed error bound is tighter, and converges to the generalization loss with a well-chosen temperature parameter. Second, the error bound also holds for training data that are not independently sampled. In particular, the error bound applies to certain time series generated by well-known classes of dynamical models, such as ARX models.
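For orientation only, the generic shape of a PAC-Bayesian bound of this kind, together with the standard ARX recursion mentioned in the abstract, is sketched below in LaTeX. The exact constants, the precise role of the temperature parameter, and the correction terms that handle non-independently sampled data are specific to the paper and are not reproduced here; the term \Psi(\lambda, n) is a placeholder, not the paper's bound.

% Hedged sketch only: a generic Catoni/Alquier-style PAC-Bayes shape with a
% placeholder term Psi(lambda, n); NOT the exact bound derived in the paper.
\[
  \mathbb{E}_{w\sim\rho}\bigl[L(w)\bigr]
  \;\le\;
  \mathbb{E}_{w\sim\rho}\bigl[\widehat{L}_S(w)\bigr]
  \;+\;
  \frac{\mathrm{KL}(\rho\,\|\,\pi) \;+\; \ln\tfrac{1}{\delta} \;+\; \Psi(\lambda, n)}{\lambda\, n},
\]
% Standard ARX(n_a, n_b) recursion, as commonly defined:
\[
  y_t \;=\; \sum_{i=1}^{n_a} a_i\, y_{t-i} \;+\; \sum_{j=1}^{n_b} b_j\, u_{t-j} \;+\; \varepsilon_t .
\]

Here \rho is the learned posterior, \pi a data-free prior, L and \widehat{L}_S the population and empirical quadratic losses on an n-point sample S, \delta the confidence level, and \lambda > 0 plays the role of the temperature parameter referred to in the abstract.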

Cited by 4 publications (8 citation statements)
References 7 publications
“…In particular, we make explicit all terms in the right-hand side. Section 5.1 extends our results to linear regression (which has been studied from the perspective of PAC-Bayes in the literature, most recently by [15]). We also experimentally illustrate the behaviour of our bound.…”
Section: Introduction (supporting)
confidence: 52%
“…Proof of Theorem III.1. The proof follows the same lines as that of [17, Theorem 2]; for the sake of completeness, we repeat the basic steps. From Theorem II.1 it follows that (11) holds with probability at least 1 − δ.…”
Section: Systems (mentioning)
confidence: 98%
“…In [7], PAC-Bayesian bounds for linear regression with a quadratic loss function were developed; this bound was later improved and extended to non-i.i.d. data in [17]. In particular, in [17] the derived PAC-Bayesian error bound for the non-i.i.d. linear regression problem was applied to learning ARX models.…”
Section: Introduction (mentioning)
confidence: 99%
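As a concrete illustration of the phrase "applied to learning ARX models" in the statement above, the following minimal Python sketch shows how an ARX(n_a, n_b) model reduces to ordinary linear regression by stacking lagged outputs and inputs into a regressor vector. The function name, lag orders, and simulated parameters are hypothetical and are not taken from [7] or [17].

import numpy as np

def fit_arx_least_squares(y, u, na=2, nb=2):
    """Hedged sketch: fit an ARX(na, nb) model
    y_t = sum_i a_i * y_{t-i} + sum_j b_j * u_{t-j} + e_t
    by ordinary least squares. Hypothetical helper, not the estimator from the cited papers."""
    start = max(na, nb)
    # Build the regression matrix: each row stacks the lagged outputs and inputs for one time step.
    rows = []
    for t in range(start, len(y)):
        past_y = [y[t - i] for i in range(1, na + 1)]
        past_u = [u[t - j] for j in range(1, nb + 1)]
        rows.append(past_y + past_u)
    Phi = np.asarray(rows)
    target = y[start:]
    # Least-squares estimate of the stacked parameter vector [a_1..a_na, b_1..b_nb].
    theta, *_ = np.linalg.lstsq(Phi, target, rcond=None)
    return theta

# Usage: simulate a short stable ARX trajectory and recover the parameters approximately.
rng = np.random.default_rng(0)
u = rng.standard_normal(500)
y = np.zeros(500)
for t in range(2, 500):
    y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] + 1.0 * u[t - 1] + 0.3 * u[t - 2] + 0.05 * rng.standard_normal()
print(fit_arx_least_squares(y, u, na=2, nb=2))  # roughly [0.5, -0.2, 1.0, 0.3]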