2010
DOI: 10.1111/j.1467-9868.2010.00740.x

Stability Selection

Abstract: Estimation of structure, such as in variable selection, graphical modelling or cluster analysis, is notoriously difficult, especially for high dimensional data. We introduce stability selection. It is based on subsampling in combination with (high dimensional) selection algorithms. As such, the method is extremely general and has a very wide range of applicability. Stability selection provides finite sample control for some error rates of false discoveries and hence a transparent principle to choose a…
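As a rough illustration of the idea in the abstract, the sketch below runs a generic base selection algorithm on repeated random subsamples and keeps the variables whose empirical selection frequency exceeds a threshold. The function name, the half-sized subsamples, the number of subsamples and the 0.6 threshold are illustrative assumptions, not the paper's exact procedure.

import numpy as np

def stability_selection(X, y, base_selector, n_subsamples=100, threshold=0.6, seed=None):
    """Hedged sketch of stability selection: selection frequencies over subsamples.

    base_selector(X_sub, y_sub) is assumed to return a boolean mask of
    length p marking the variables selected on that subsample.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(n_subsamples):
        idx = rng.choice(n, size=n // 2, replace=False)  # subsample without replacement
        counts += base_selector(X[idx], y[idx])
    freq = counts / n_subsamples                         # empirical selection probabilities
    stable = np.flatnonzero(freq >= threshold)           # the "stable" variable set
    return freq, stable

Any sparse selector can be plugged in as base_selector, for example an L1-penalised regression whose nonzero support is returned as the mask.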

Cited by 1,995 publications (2,446 citation statements). References 79 publications.

“…The stability selection method (Meinshausen and Bühlmann, 2010) was used to discover the best subset of variables S_stable that will have nonzero weight in the model. Let's assume that we have a generic structure estimation algorithm (i.e.…”
Section: Data (mentioning, confidence: 99%)
“…This is still an active field of research in machine learning; e.g., stability selection, which refers to the consistency of features when subsampling the feature space (Meinshausen and Bühlmann 2010), including applications to neuroimaging (Langs et al 2011).…”
Section: Dysbalance Between Number Of Features and Number Of Subjects (mentioning, confidence: 99%)
“…Especially when the correlation is weak between genetic variant and brain activity, it is more important to detect reliable associations while reducing the FDR (Grellmann et al, 2015). Stability selection is a recently proposed strategy that can better control the Type-1 error rate (Meinshausen and Bühlmann, 2010; Wang et al, 2014), hence it is adopted in our proposed method. The simulations in the supplementary data also demonstrate that stability selection can yield better results than cross validation.…”
Section: Potential Limitations (mentioning, confidence: 99%)
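For context, the finite-sample control of false discoveries referred to in these excerpts is the bound given in Meinshausen and Bühlmann (2010); stated here from memory rather than verbatim, it takes roughly the form

E(V) \le \frac{1}{2\pi_{\mathrm{thr}} - 1} \cdot \frac{q_\Lambda^2}{p}

where V is the number of falsely selected variables, \pi_{\mathrm{thr}} \in (1/2, 1) is the selection-frequency threshold, q_\Lambda is the average number of variables selected by the base procedure over the regularisation region \Lambda, and p is the total number of variables.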
“…On the other hand, since the imaging and genetic correlation is quite low (Grellmann et al, 2015), the selected parameters vary a lot during repeated trials. As an alternative, we apply a hybrid of Monte Carlo validation and stability selection (Meinshausen and Bühlmann, 2010) to select the parameter s and the correlated features. Specifically, we perform random sampling from the original dataset without replacement B times with the same proportion of observations, leading to training samples X_bk, Y_bk, b = 1, …”
Section: Parameter Selection (mentioning, confidence: 99%)
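The scheme in this last excerpt, repeated subsampling without replacement used to pick both the sparsity parameter and the stable features, can be sketched as below. The lasso base learner, the penalty grid and the subsampling fraction are illustrative assumptions rather than the cited authors' exact setup.

import numpy as np
from sklearn.linear_model import Lasso

def stability_path(X, y, lambdas, B=100, frac=0.5, seed=None):
    """Hedged sketch: selection frequencies over a grid of lasso penalties.

    Returns an array of shape (len(lambdas), p); its maximum over the grid
    gives each variable's peak selection probability across the path.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    m = int(frac * n)
    freq = np.zeros((len(lambdas), p))
    for _ in range(B):
        idx = rng.choice(n, size=m, replace=False)   # subsample without replacement
        for j, lam in enumerate(lambdas):
            coef = Lasso(alpha=lam, max_iter=10_000).fit(X[idx], y[idx]).coef_
            freq[j] += (coef != 0)                   # record the nonzero support
    return freq / B

# Illustrative use: keep variables whose peak frequency along the path exceeds 0.6.
# freq = stability_path(X, y, lambdas=np.logspace(-2, 0, 10))
# stable = np.flatnonzero(freq.max(axis=0) >= 0.6)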