2019
DOI: 10.1007/978-981-13-6861-5_17
Extra-Tree Classifier with Metaheuristics Approach for Email Classification

Cited by 131 publications (73 citation statements)
References 11 publications
“…Features were selected based on their relevance to the classification task that this study proposed. This was accomplished using three techniques: Pearson's Correlation, Recursive Feature Elimination (RFE) [23] and Extra Tree Classifier [24], used to estimate feature importance. The common least important features from each method were dropped from both training and testing datasets; Figure 2 illustrates this process.…”
Section: Feature Selection
confidence: 99%
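The three-technique selection process quoted above can be sketched as follows. This is a minimal illustration on synthetic data, not the study's procedure: the dataset, the number of flagged features per method, and the estimator used inside RFE are all assumptions.

```python
# Hedged sketch: flag the least-important features by Pearson's correlation,
# RFE, and extra-trees importance, then drop only the features common to all
# three lists, as the quoted passage describes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=10,
                           n_informative=4, random_state=0)
k = 3  # features flagged as least important per method (assumed)

# 1) Pearson's correlation of each feature with the target
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
low_corr = set(np.argsort(corr)[:k])

# 2) Recursive Feature Elimination with a linear model (estimator assumed)
rfe = RFE(LogisticRegression(max_iter=1000),
          n_features_to_select=X.shape[1] - k).fit(X, y)
low_rfe = set(np.where(~rfe.support_)[0])

# 3) Extra-trees feature importances
et = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(X, y)
low_et = set(np.argsort(et.feature_importances_)[:k])

# Drop only the features all three methods agree are least important
to_drop = low_corr & low_rfe & low_et
X_reduced = np.delete(X, sorted(to_drop), axis=1)
print(sorted(to_drop), X_reduced.shape)
```

Requiring agreement across all three methods is a conservative choice: a feature survives if any single technique still considers it useful.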
“…We took advantage of several algorithms, with the aim of understanding which ones work best in this specific context. In particular, we evaluated: K-nearest neighbor (KNN) [ 34 ], classification and regression tree (CART) [ 35 ], support vector machine (SVM) [ 36 ], multi-layer perceptron (MLP) [ 37 ], Ada boosting with decision tree (AB) [ 38 ], gradient boosting (GB) [ 39 ], random forest (RF) [ 40 ], and extra tree (ET) [ 41 ].…”
Section: Methods
confidence: 99%
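A comparison like the one quoted above is commonly run with cross-validation. The sketch below evaluates the eight listed classifiers on a stand-in synthetic dataset; the cited study's data, hyperparameters, and evaluation protocol are not reproduced here.

```python
# Hedged sketch: cross-validate the eight classifiers named in the quoted
# passage (KNN, CART, SVM, MLP, AdaBoost, GB, RF, ET) on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, ExtraTreesClassifier,
                              GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=8, random_state=0)

models = {
    "KNN": KNeighborsClassifier(),
    "CART": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(),
    "MLP": MLPClassifier(max_iter=500, random_state=0),
    "AB": AdaBoostClassifier(random_state=0),   # boosts decision trees by default
    "GB": GradientBoostingClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "ET": ExtraTreesClassifier(random_state=0),
}

scores = {name: cross_val_score(m, X, y, cv=3).mean()
          for name, m in models.items()}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.3f}")
```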
“…It handles missing values in the data efficiently. [24] ETC: The extra trees classifier works quite similarly to the random forest, differing only in the method used to construct the trees in the forest. Every decision tree in the extra trees classifier is built from the original training sample.…”
Section: E. Evaluation Matrices
confidence: 99%
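The construction difference noted in the quote above is visible directly in scikit-learn's defaults: a random forest trains each tree on a bootstrap resample, while an extra-trees ensemble trains every tree on the original sample (and additionally randomizes split thresholds). A minimal check, with an assumed synthetic dataset:

```python
# Sketch: contrast the default sampling behaviour of RandomForestClassifier
# (bootstrap resamples) and ExtraTreesClassifier (whole original sample).
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier

X, y = make_classification(n_samples=200, n_features=6, random_state=0)

rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
et = ExtraTreesClassifier(n_estimators=50, random_state=0).fit(X, y)

# bootstrap=True by default for random forest, False for extra trees
print(rf.bootstrap, et.bootstrap)  # True False
```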
“…This motivated our attempts to help healthcare professionals by developing machine learning techniques in the diagnosis of CVD patients' survival. We employed nine machine learning models: Decision Tree (DT) [18], Adaptive Boosting model (AdaBoost) [19], Logistic Regression (LR) [20], Stochastic Gradient Descent (SGD) [21], Random Forest (RF) [22], Gradient Boosting classifier (GBM) [23], Extra Tree Classifier (ETC) [24], Gaussian Naive Bayes (G-NB) [25] and Support Vector Machine (SVM) [26]. Synthetic Minority Oversampling Technique (SMOTE) is applied to handle class-imbalance problem.…”
Section: Introduction
confidence: 99%
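The SMOTE technique mentioned in the quote above synthesizes new minority-class samples by interpolating between a minority point and one of its minority-class nearest neighbours. Real work would use `imblearn.over_sampling.SMOTE`; the hand-rolled toy below (random data, neighbour count, and sample counts all assumed) only illustrates the mechanism.

```python
# Minimal hand-rolled sketch of the SMOTE idea: synthetic minority samples
# are generated on the line segments between minority points and their
# minority-class nearest neighbours.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X_min = rng.normal(0, 1, size=(20, 3))  # minority-class samples (assumed)
n_new = 30                              # synthetic samples to generate

# k=4 neighbours: each point's own index plus its 3 nearest neighbours
nn = NearestNeighbors(n_neighbors=4).fit(X_min)
_, idx = nn.kneighbors(X_min)

synth = []
for _ in range(n_new):
    i = rng.integers(len(X_min))        # pick a minority sample at random
    j = idx[i][rng.integers(1, 4)]      # one of its 3 nearest neighbours
    gap = rng.random()                  # interpolation factor in [0, 1)
    synth.append(X_min[i] + gap * (X_min[j] - X_min[i]))
synth = np.array(synth)
print(synth.shape)  # (30, 3)
```

The synthesized points are then appended to the minority class before training, balancing the class distribution without simply duplicating existing rows.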