2019
DOI: 10.1016/j.jmva.2018.12.004

PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting

Abstract: In this paper, we consider a high-dimensional non-parametric regression model with fixed design and i.i.d. random errors. We propose an estimator by exponential weighted aggregation (EWA) with a group-analysis sparsity promoting prior on the weights. We prove that our estimator satisfies a sharp group-analysis sparse oracle inequality with a small remainder term ensuring its good theoretical performances. We also propose a forward-backward proximal Langevin Monte-Carlo algorithm to sample from the target distr…
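The abstract's EWA estimator averages candidate estimators with weights that decay exponentially in their empirical risk. As an illustrative sketch only (the paper works with a continuous prior over a high-dimensional parameter, not a finite dictionary, and the function `ewa` and its `beta` temperature are placeholders), the finite-dictionary version of exponential weighting can be written as:

```python
import numpy as np

def ewa(preds, y, beta=1.0):
    """Exponential weighted aggregation over a finite dictionary (sketch).

    preds: (M, n) array; row j holds the predictions of the j-th
           preliminary estimator at the n design points.
    y:     (n,) vector of observed responses.
    beta:  temperature parameter; weights are proportional to
           exp(-empirical_risk_j / beta).
    """
    risks = np.mean((preds - y) ** 2, axis=1)   # empirical risk of each estimator
    w = np.exp(-(risks - risks.min()) / beta)   # shift by min risk for stability
    w /= w.sum()                                # normalize to a probability vector
    return w @ preds                            # convex combination of predictions
```

With a small temperature the aggregate concentrates on the lowest-risk estimator; larger temperatures spread the weight and behave more like model averaging.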

Cited by 3 publications (7 citation statements) · References 47 publications
“…This is similar to the scaling that has been provided in the literature for EWA with other group sparsity priors and noises [48,26]. Similar rates were given for θ^PEN_n with the group Lasso in [40,37,67].…”
Section: Group Lasso (supporting)
confidence: 78%
“…The estimators θ^EWA_n and θ^PEN_n can be easily implemented thanks to the framework of proximal splitting methods, and more precisely forward-backward type splitting. While the latter is well-known to solve (1.1) [64], its application within a proximal Langevin Monte-Carlo algorithm to compute θ^EWA_n with provable guarantees has been recently developed by the authors in [26] to sample from log-semiconcave densities, see also [25] for log-concave densities.…”
Section: Contributions (mentioning)
confidence: 99%
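The forward-backward proximal Langevin scheme mentioned above combines a gradient step on the smooth data-fidelity term, a proximal step on the non-smooth sparsity-promoting prior, and injected Gaussian noise. A minimal sketch of one such iteration, assuming for illustration a plain ℓ1 prior (soft-thresholding) in place of the paper's group-analysis prior, and with `fb_langevin_step`, `gamma`, and `lam` as hypothetical names:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fb_langevin_step(theta, X, y, gamma, lam, rng):
    """One forward-backward proximal Langevin iteration (illustrative sketch).

    Forward step:  gradient of the smooth fidelity 0.5 * ||y - X @ theta||^2.
    Backward step: prox of the non-smooth prior lam * ||.||_1.
    Langevin noise: Gaussian with variance 2 * gamma, added before the prox.
    """
    grad = X.T @ (X @ theta - y)                           # gradient of smooth part
    noise = np.sqrt(2.0 * gamma) * rng.standard_normal(theta.shape)
    return soft_threshold(theta - gamma * grad + noise, gamma * lam)
```

Iterating this step produces (approximate) samples from the posterior whose mean is the EWA estimator; the paper's actual algorithm replaces the ℓ1 prox with the prox of the group-analysis prior.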
“…• sparse linear regression: see [59] where the authors prove a rate of convergence similar to the one of the LASSO for the Gibbs posterior, under more general assumptions. See also [60,7,57] for many variants and improvements, also [116] for group-sparsity.…”
Section: Ln (mentioning)
confidence: 99%
“…The optimal rates are derived in [173]. Many aggregates share a formal similarity with the EWA of online learning and with the Gibbs posterior of the PAC-Bayes approach; we refer the reader to [133,97,184,128,185,40,188,109,106,59,38,166,60,55,58,54,57,116,56]. In some of these papers, the connection to PAC-Bayes bounds is explicit: Theorem 1 in [59] is referred to as a PAC-Bayes bound in the paper.…”
Section: Aggregation Of Estimators In Statistics (mentioning)
confidence: 99%
“…Aggregation by exponential weighting has been widely considered in the statistical and machine learning literatures, see e.g. (Dalalyan and Tsybakov, 2007, 2008, 2012; Nemirovski, 2000; Yang, 2004; Rigollet and Tsybakov, 2007; Lecué, 2007; Guedj and Alquier, 2013; Duy Luu et al., 2016) to name a few.…”
Section: Problem Statement (mentioning)
confidence: 99%