2019 IEEE International Conference on Data Science and Advanced Analytics (DSAA)
DOI: 10.1109/dsaa.2019.00032
On the Classification Consistency of High-Dimensional Sparse Neural Network

Cited by 2 publications (2 citation statements)
References 13 publications
“…In the machine learning framework, we assessed three feature selection methods (logistic regression, LASSO, a neural-network-based approach ENNS 41) along with three tree-based classifiers: Decision Tree (DT), Random Forest (RF), and Extreme Gradient Boosting (XGBoost). Our approach limited each feature selection method to choose a maximum of ten features to mitigate the risk of overfitting and, more importantly, to reduce the time and effort required to measure the selected features in the laboratory (see Discussion).…”
Section: Results (mentioning)
Confidence: 99%
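The pipeline described in this citation statement (an L1-regularized selector capped at ten features, feeding a tree-based classifier) can be sketched as follows. This is a minimal illustration assuming scikit-learn and synthetic data, not the citing paper's actual code; the estimator settings are assumptions.

```python
# Minimal sketch of the capped feature-selection pipeline quoted above.
# Synthetic data and estimator settings are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

# Synthetic high-dimensional data standing in for the laboratory features.
X, y = make_classification(n_samples=200, n_features=500, n_informative=8,
                           random_state=0)

# LASSO-style selector: L1-penalized logistic regression, capped at ten
# features, mirroring the "maximum of ten features" constraint in the quote.
selector = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.5),
    max_features=10,
)

# Tree-based classifier on the selected features (RF as one of the three
# classifiers mentioned; DT or XGBoost would slot in the same way).
pipe = Pipeline([
    ("select", selector),
    ("clf", RandomForestClassifier(random_state=0)),
])

print(cross_val_score(pipe, X, y, cv=5).mean())
```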
“…When the number of non-zero features was fewer than “n_features”, all non-zero features were selected. The ENNS algorithm has been implemented previously 41 , see code at https://github.com/KaixuYang/ENNS.…”
Section: Methods (mentioning)
Confidence: 99%
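The quoted rule can be read as: keep at most n_features features ranked by coefficient magnitude, but never pad the selection with zero-weight features. A minimal sketch of that logic, reconstructed from the quote; the function name `select_features` and the `coefs` input are illustrative, not taken from the ENNS repository.

```python
# Hedged reconstruction of the selection rule quoted above, not the
# actual code at https://github.com/KaixuYang/ENNS.
import numpy as np

def select_features(coefs: np.ndarray, n_features: int) -> np.ndarray:
    """Return indices of selected features given fitted coefficients."""
    nonzero = np.flatnonzero(coefs)        # candidates: non-zero weights only
    if nonzero.size <= n_features:         # fewer non-zeros than the cap:
        return nonzero                     # select all non-zero features
    order = np.argsort(-np.abs(coefs[nonzero]))  # rank by |coefficient|
    return nonzero[order[:n_features]]     # keep only the top n_features

# Example: 3 non-zero coefficients with a cap of 5 -> all 3 are selected.
print(select_features(np.array([0.0, 0.8, 0.0, -0.3, 0.0, 0.1, 0.0]), 5))
```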