2018
DOI: 10.5705/ss.202015.0473
Variable Selection via Partial Correlation

Abstract: A partial-correlation-based variable selection method was proposed for normal linear regression models by Bühlmann, Kalisch and Maathuis (2010) as a comparable alternative to regularization methods for variable selection. This paper addresses two important issues related to this method: (a) whether it is sensitive to the normality assumption, and (b) whether it is valid when the dimension of the predictor increases at an exponential rate of the sample size…

Cited by 11 publications (19 citation statements) | References 19 publications
“…The second condition (C2) allows the dimension p to grow at an exponential rate of the sample size n, which is a fairly standard assumption in high-dimensional analysis. Many sure screening methods, such as SIS, DC-SIS and TPC, use this assumption (Fan and Lv, 2008; Li et al., 2012, 2017). Although the PC-simple algorithm (Bühlmann et al., 2010) assumes polynomial growth of p_n as a function of n, we note that this can readily be relaxed to an exponential rate in n.…”
Section: Theoretical Properties 41 Assumptions and Conditionsmentioning
confidence: 99%
“…For the sake of simplicity, the least squares estimates β̂_j are computed for the nonzero coefficients, and the nonparametric function is estimated by ĝ(u) = S(y − Xβ̂) with the plug-in β̂. We summarize the whole procedure in Algorithm 1, in which we follow [10] to set T(α, n, κ̂, |S|) = (exp{2√(1+κ̂) Φ⁻¹(1−α/2)/√(n−|S|−1)} − 1) / (exp{2√(1+κ̂) Φ⁻¹(1−α/2)/√(n−|S|−1)} + 1) and κ̂ = (1/p) Σ_{j=1}^p [n⁻¹…”
Section: Variable Selection Via Partial Correlations Of Partial Residmentioning
confidence: 99%
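The threshold in the excerpt above is the inverse Fisher z-transform of a standard-normal quantile, scaled by a kurtosis adjustment κ̂ and the effective degrees of freedom. A minimal sketch of that cutoff, assuming the reconstructed form of the formula (the function name `pcor_threshold` is hypothetical, not from the paper):

```python
import math
from statistics import NormalDist

def pcor_threshold(alpha, n, kappa, s_size):
    """|partial correlation| cutoff at level alpha with kurtosis
    adjustment kappa and conditioning-set size s_size.
    Sketch of T(alpha, n, kappa, |S|) as reconstructed from the excerpt."""
    z = (2 * math.sqrt(1 + kappa) * NormalDist().inv_cdf(1 - alpha / 2)
         / math.sqrt(n - s_size - 1))
    # (e^z - 1) / (e^z + 1) is tanh(z / 2), i.e. the inverse Fisher transform
    return (math.exp(z) - 1) / (math.exp(z) + 1)
```

The cutoff shrinks as n grows and inflates with |S| and κ̂, so weaker partial correlations survive testing only at larger sample sizes.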
“…In this paper, we propose a new variable selection procedure for PLM. This procedure differs from the aforementioned penalized least squares methods in that it is a partial correlation learning procedure based on the notion of partial faithfulness, first advocated by Bühlmann et al. [1] for normal linear models and further used for elliptical linear models in [10]. We first use partial residual techniques to eliminate the nonparametric baseline function, and then conduct variable selection by recursively testing the partial correlations between the partial residual of the response and those of the linear covariates.…”
Section: Introductionmentioning
confidence: 99%
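The recursive testing described in the excerpt above starts from order-zero (marginal) correlations and then moves to higher-order partial correlations among the survivors. A sketch of that first, marginal stage under a plain Fisher z test (the helper `marginal_screen` is a hypothetical illustration, not the authors' implementation, and degrees-of-freedom conventions vary across papers):

```python
import math
from statistics import NormalDist
import numpy as np

def marginal_screen(X, y, alpha=0.05):
    """Stage one of a PC-simple-style selection: keep predictor j only if
    the marginal correlation corr(X_j, y) passes the Fisher z test."""
    n, p = X.shape
    # |r| cutoff implied by sqrt(n - 3) * atanh(|r|) > Phi^{-1}(1 - alpha/2)
    cutoff = math.tanh(NormalDist().inv_cdf(1 - alpha / 2) / math.sqrt(n - 3))
    keep = []
    for j in range(p):
        r = np.corrcoef(X[:, j], y)[0, 1]
        if abs(r) > cutoff:
            keep.append(j)
    return keep
```

A full PC-simple-style recursion would then re-test the surviving predictors against partial correlations of increasing conditioning-set size until no set shrinks further.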
“…The extension of SIS is iterative sure independence screening (ISIS), which can recover non-negligible features that are marginally uncorrelated with, yet indirectly related to, the response variables [11]. Beyond Pearson-correlation-based SIS, statisticians have developed SIS methods built on other measures, such as rank correlation [14], distance correlation [15], and partial correlation [16]. Among these, Pearson correlation and distance correlation based screening have been applied successfully in GWAS [6, 17], and genes associated with crop quantitative traits such as rice salt tolerance and poplar growth have been identified [18, 19].…”
Section: Introductionmentioning
confidence: 99%