2020
DOI: 10.48550/arxiv.2010.15817
Preprint

$σ$-Ridge: group regularized ridge regression via empirical Bayes noise level cross-validation

Nikolaos Ignatiadis,
Panagiotis Lolas

Abstract: Features in predictive models are not exchangeable, yet common supervised models treat them as such. Here we study ridge regression when the analyst can partition the features into K groups based on external side-information. For example, in high-throughput biology, features may represent gene expression, protein abundance, or clinical data, so that each feature group represents a distinct modality. The analyst's goal is to choose optimal regularization parameters λ = (λ1, …, λK), one for each group. In thi…
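The setup in the abstract — one ridge penalty per feature group — admits a closed-form solution once the penalties are fixed. The sketch below is our own minimal illustration, not the paper's method; the function name `group_ridge` and all variable names are ours.

```python
import numpy as np

def group_ridge(X, y, groups, lambdas):
    """Closed-form solution of min_b ||y - Xb||^2 + sum_k lambda_k ||b_k||^2.

    groups  : length-p sequence mapping each column of X to a group index 0..K-1
    lambdas : length-K sequence of per-group penalties
    """
    # Expand the K group penalties into one penalty per column of X,
    # giving a diagonal penalty matrix diag(penalties).
    penalties = np.asarray(lambdas, dtype=float)[np.asarray(groups)]
    # Solve (X'X + diag(penalties)) b = X'y
    return np.linalg.solve(X.T @ X + np.diag(penalties), X.T @ y)

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4))
y = X @ np.array([1.0, -1.0, 0.5, 0.0]) + 0.1 * rng.standard_normal(50)
groups = [0, 0, 1, 1]                      # two feature groups
beta = group_ridge(X, y, groups, lambdas=[0.1, 10.0])
```

With all penalties set to zero the estimate reduces to ordinary least squares, which is a convenient sanity check.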

Cited by 1 publication (2 citation statements)
References 34 publications
“…One of the main assumptions underlying generalised linear models is that all variables are exchangeable. In many high-dimensional settings, however, this assumption is questionable (Ignatiadis and Lolas, 2020). For example, in cancer genomics, variables may be grouped according to some biological function.…”
Section: Introduction
confidence: 99%
“…Their results show that the uncertainty intervals of the hybrid model and the full Bayes alternative correspond well, in the sense that the resulting credible intervals are competitive in terms of frequentist coverage probabilities. In light of this, more recently, Ignatiadis and Lolas (2020) proposed to tune one global parameter and to view the local parameters as a function of the global parameter in a group-regularised ridge setting (see Section 1.4.2). They proved that this approach matches the best predictive performance among the class of group-regularised ridge optimisers for a family of models including the linear model.…”
confidence: 99%
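The idea of tuning one global parameter, with the local per-group penalties derived from it, can be sketched with plain cross-validation over a single scale. Note this is only an illustration: the mapping λ_k = τ · w_k with fixed weights w_k is a stand-in of our own, whereas the paper derives the local penalties from an empirical Bayes model of the noise level.

```python
import numpy as np

def group_ridge(X, y, groups, lambdas):
    """Closed-form group-ridge solve: (X'X + diag(penalties))^{-1} X'y."""
    penalties = np.asarray(lambdas, dtype=float)[np.asarray(groups)]
    return np.linalg.solve(X.T @ X + np.diag(penalties), X.T @ y)

def cv_global_scale(X, y, groups, weights, taus, n_folds=5):
    """Cross-validate a single global scale tau; each group's penalty is
    lambda_k = tau * weights[k] (an illustrative parameterization, not the
    paper's empirical Bayes mapping)."""
    folds = np.arange(len(y)) % n_folds
    best_tau, best_err = taus[0], np.inf
    for tau in taus:
        lambdas = tau * np.asarray(weights, dtype=float)
        err = 0.0
        for f in range(n_folds):
            train, test = folds != f, folds == f
            b = group_ridge(X[train], y[train], groups, lambdas)
            err += np.mean((y[test] - X[test] @ b) ** 2)
        if err < best_err:
            best_tau, best_err = tau, err
    return best_tau

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 4))
y = X @ np.array([1.0, -1.0, 0.0, 0.0]) + 0.2 * rng.standard_normal(60)
tau_hat = cv_global_scale(X, y, groups=[0, 0, 1, 1],
                          weights=[1.0, 1.0],
                          taus=[0.01, 0.1, 1.0, 10.0])
```

Reducing the K-dimensional search over (λ1, …, λK) to a one-dimensional search over τ is what makes this style of tuning tractable when K is large.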