2018 IEEE Symposium Series on Computational Intelligence (SSCI)
DOI: 10.1109/ssci.2018.8628897
Performance comparison of feature reduction techniques in-terms of compactness, computation time and accuracy

Cited by 2 publications (1 citation statement)
References: 15 publications
“…The information-rich features in metabolomics data are sparse; in fact, they make up only a relatively small portion of the total detected features. Besides, a ML model with too many features takes longer to train, is more prone to overfitting, and suffers from a loss of interpretability; therefore, feature selection strategies which reduce dimensionality are crucial to alleviate the dense data problem that is usually the case in untargeted metabolomics. To start the feature selection procedure in this study, first, we applied a k-Nearest-Neighbors imputation method to find the k nearest samples and imputed the missing elements (Python’s scikit-learn package)…”
Section: Results
Citation type: mentioning
Confidence: 99%
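The citing study describes running k-Nearest-Neighbors imputation on the metabolomics feature matrix before feature selection, using Python's scikit-learn. A minimal sketch of that step with scikit-learn's KNNImputer is given below; the toy matrix and the neighbor count are illustrative assumptions, not values reported by the citing paper.

    import numpy as np
    from sklearn.impute import KNNImputer

    # Toy feature matrix (rows = samples, columns = metabolite features)
    # with missing intensities. Values and n_neighbors are assumptions
    # chosen only to illustrate the imputation step.
    X = np.array([
        [1.2, np.nan, 3.4,    0.8],
        [1.0, 2.1,    np.nan, 0.9],
        [1.3, 2.0,    3.6,    np.nan],
        [np.nan, 1.9, 3.5,    0.7],
    ])

    # Each missing entry is filled with the mean of that feature over the
    # k nearest samples; distances are computed with nan-aware Euclidean
    # distance, ignoring coordinates that are missing.
    imputer = KNNImputer(n_neighbors=2, weights="uniform")
    X_imputed = imputer.fit_transform(X)
    print(X_imputed)

After this imputation, the completed matrix can be passed to whatever dimensionality-reduction or feature-selection procedure the study applies next.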