2021
DOI: 10.1101/2021.08.03.454798
Preprint

ExhauFS: exhaustive search-based feature selection for classification and survival regression

Abstract: Motivation: Feature selection is one of the main techniques used to prevent overfitting in machine learning applications. The most straightforward approach to feature selection is exhaustive search: one can go over all possible feature combinations and pick the model with the highest accuracy. This method, together with its optimizations, has been actively used in biomedical research; however, a publicly available implementation is missing. Results: We present ExhauFS, a user-friendly command-line implementation…
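The exhaustive search described in the abstract can be illustrated with a short, generic sketch. This is not the ExhauFS implementation; it is a minimal illustration assuming NumPy feature matrices, a scikit-learn logistic regression classifier, and held-out validation accuracy as the selection criterion, with a hypothetical function name exhaustive_feature_search.

# Minimal sketch of exhaustive-search feature selection (illustration only).
# For a fixed subset size k, fit a classifier on every k-feature combination
# and keep the subset with the highest validation accuracy.
from itertools import combinations

from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score


def exhaustive_feature_search(X_train, y_train, X_val, y_val, feature_names, k):
    best_subset, best_acc = None, -1.0
    for subset in combinations(range(len(feature_names)), k):
        cols = list(subset)
        model = LogisticRegression(max_iter=1000)
        model.fit(X_train[:, cols], y_train)
        acc = accuracy_score(y_val, model.predict(X_val[:, cols]))
        if acc > best_acc:
            best_subset, best_acc = cols, acc
    return [feature_names[i] for i in best_subset], best_acc

Because the number of k-feature combinations grows combinatorially with the number of candidate features, practical pipelines (including those described by the citing papers below) first reduce the candidate feature set before running the exhaustive loop.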

Cited by 3 publications (5 citation statements)
References 46 publications
“…The ExhauFS tool (Nersisyan et al., 2021c) was used to fit Cox survival regression models. For each length of prognostic signature (k = 1, 2, …, 10), we first selected n most individually predictive features (see the next paragraph for the details) and then fit Cox models for all possible feature subsets.…”
Section: Methods (mentioning)
confidence: 99%
“…The ExhauFS tool (Nersisyan et al., 2021c) was used to fit Cox survival regression models. For each length of prognostic signature (k = 1, 2, …”
Section: Construction of Prognostic Signatures (mentioning)
confidence: 99%
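The procedure quoted above (for each signature length k, keep the n most individually predictive features and then fit Cox models on every k-feature subset) can be sketched as follows. This is an illustration of the described workflow, not the ExhauFS code; it assumes the lifelines package, a pandas DataFrame with "time" and "event" columns, ranking by the concordance index, and hypothetical helper names.

# Sketch of the quoted Cox workflow (illustration only, not ExhauFS):
# 1) rank features by the concordance index of single-feature Cox models,
# 2) keep the top n features,
# 3) fit a Cox model on every k-feature subset and keep the best one.
from itertools import combinations

from lifelines import CoxPHFitter


def rank_by_univariate_cox(df, features, time_col="time", event_col="event"):
    scores = {}
    for feature in features:
        cph = CoxPHFitter()
        cph.fit(df[[feature, time_col, event_col]],
                duration_col=time_col, event_col=event_col)
        scores[feature] = cph.concordance_index_
    return sorted(features, key=lambda f: scores[f], reverse=True)


def exhaustive_cox_search(df, features, n, k, time_col="time", event_col="event"):
    top_n = rank_by_univariate_cox(df, features, time_col, event_col)[:n]
    best_subset, best_ci = None, -1.0
    for subset in combinations(top_n, k):
        cph = CoxPHFitter()
        cph.fit(df[list(subset) + [time_col, event_col]],
                duration_col=time_col, event_col=event_col)
        if cph.concordance_index_ > best_ci:
            best_subset, best_ci = subset, cph.concordance_index_
    return best_subset, best_ci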
“…Condition 3.2 specified the same input variables X as Condition 3.1; however, to optimize CSF prediction, we filtered the input variables X and proposed an ABSA-CSF model with a feature selection technique, specifically the exhaustive search method. This approach can choose the model with optimal accuracy by reviewing all possible feature combinations (Nersisyan et al., 2022). It thus does not ignore feature sets that might display optimal prediction performance.…”
Section: Framework and Data (mentioning)
confidence: 99%
“…Most machine learning methods require the data to be normalized [25]. Data normalization is a technique for standardizing the range of features without affecting the data's dimensionality.…”
Section: Data Preprocessing (mentioning)
confidence: 99%
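As a generic illustration of the normalization mentioned in this citation (not code from the cited paper), min-max scaling rescales each feature to the [0, 1] range while leaving the number of features, i.e. the data's dimensionality, unchanged.

# Min-max normalization: rescale each feature (column) to [0, 1]
# without changing the number of features.
import numpy as np


def min_max_normalize(X):
    X = np.asarray(X, dtype=float)
    mins, maxs = X.min(axis=0), X.max(axis=0)
    ranges = np.where(maxs > mins, maxs - mins, 1.0)  # guard against constant columns
    return (X - mins) / ranges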