2014
DOI: 10.1080/00949655.2014.902460
A stepwise regression algorithm for high-dimensional variable selection

Abstract: We propose a new stepwise regression algorithm with a simple stopping rule for identifying influential predictors and interactions among a huge number of variables in various statistical models. As in conventional stepwise regression, at each forward selection step a variable is added to the current model if the test statistic of the enlarged model against the current model yields the minimum p-value among all candidates and that p-value is smaller than a predetermined threshold. Instead of…
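The forward-selection rule described in the abstract can be illustrated with a minimal sketch for the ordinary linear model: at each step, fit every one-variable enlargement of the current model, take the candidate with the smallest partial F-test p-value, and stop once that minimum p-value exceeds a threshold. This is only an assumed simplification of the idea, not the authors' algorithm (which handles interactions and more general models); the function name `forward_stepwise` and the threshold parameter `alpha` are illustrative choices.

```python
import numpy as np
from scipy.stats import f as f_dist


def forward_stepwise(X, y, alpha=0.05):
    """Greedy forward selection with a p-value stopping rule (linear model sketch).

    At each step, the candidate predictor whose partial F-test against the
    current model has the smallest p-value is added, provided that p-value
    is below `alpha`; otherwise selection stops.
    """
    n, p = X.shape
    selected = []
    remaining = list(range(p))
    X_cur = np.ones((n, 1))                      # intercept-only start
    rss_cur = np.sum((y - y.mean()) ** 2)
    while remaining:
        best = None
        for j in remaining:
            X_try = np.column_stack([X_cur, X[:, j]])
            beta, *_ = np.linalg.lstsq(X_try, y, rcond=None)
            rss_try = np.sum((y - X_try @ beta) ** 2)
            df2 = n - X_try.shape[1]
            # Partial F statistic for adding one predictor
            F = (rss_cur - rss_try) / (rss_try / df2)
            pval = f_dist.sf(F, 1, df2)
            if best is None or pval < best[0]:
                best = (pval, j, X_try, rss_try)
        pval, j, X_try, rss_try = best
        if pval >= alpha:                        # stopping rule
            break
        selected.append(j)
        remaining.remove(j)
        X_cur, rss_cur = X_try, rss_try
    return selected


# Toy usage: y depends only on columns 0 and 2
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
y = 3 * X[:, 0] - 2 * X[:, 2] + 0.1 * rng.standard_normal(200)
sel = forward_stepwise(X, y, alpha=0.001)
```

A stringent threshold is used in the toy call because, with many noise candidates, a lax threshold inflates the false discovery rate — the very issue discussed in the multiple-testing citation below.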

Cited by 25 publications (10 citation statements) · References 13 publications
“…Examples 1 and 3 were adopted from [ 7 ], while Examples 2 and 4 were borrowed from [ 5 , 15 ], respectively. We then generated the responses from the following three models.…”
Section: Simulation Studies
confidence: 99%
“…To recap, our proposed method is distinguished from existing stepwise approaches in high-dimensional settings. For example, it improves on [ 13 , 14 ] by extending the work to a broader GLM setting, and on [ 15 ] by establishing theoretical properties.…”
Section: Introduction
confidence: 98%
“…It is a fundamental problem in high-dimensional variable selection. Several methods have been developed to address these issues, such as LASSO [34, 35], stepwise regression [36–38], penalized logistic regression [39] and penalized multiple regression [40].…”
Section: Introduction
confidence: 99%
“…A problem when using statistical tests for feature selection is that, due to multiple testing, the test statistics do not have the claimed distribution [69] and the resulting p-values are too small [56, 67], leading to a high false discovery rate. Approaches to deal with this problem include methods that dynamically adjust significance levels [76], or methods that directly address the sequential testing inherent in stepwise selection [158, 154]. A method for dealing with model misspecification in model selection with information criteria is presented in [98].…”
Section: Statistical Tests and Conditional Independence
confidence: 99%