2017
DOI: 10.1088/1757-899x/173/1/012008

Feature Selection for Natural Language Call Routing Based on Self-Adaptive Genetic Algorithm

Abstract: The text classification problem for natural language call routing is considered in this paper. Seven different term weighting methods were applied. As a dimensionality reduction method, feature selection based on a self-adaptive genetic algorithm (GA) is considered. k-NN, linear SVM and ANN were used as classification algorithms. The tasks of the research are as follows: to investigate text classification for natural language call routing with different term weighting methods and classification algorithms and in…
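
The abstract outlines a pipeline of term-weighted text features, a GA-driven feature selection step, and standard classifiers (k-NN, linear SVM, ANN). The sketch below is a minimal illustration of that kind of pipeline under stated assumptions, not the authors' implementation: it uses TF-IDF as one example term weighting scheme, a plain GA with fixed crossover and mutation rates rather than a self-adaptive one, and scikit-learn's k-NN and linear SVM; the function names, parameters, and toy utterances are all illustrative.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import LinearSVC


def ga_feature_selection(X, y, clf, n_gen=20, pop_size=30, mut_rate=0.02, seed=0):
    """Evolve a boolean feature mask that maximises cross-validated accuracy."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Random initial population of binary masks (True = keep the feature).
    pop = rng.integers(0, 2, size=(pop_size, n_features)).astype(bool)

    def fitness(mask):
        if not mask.any():                     # an empty mask is useless
            return 0.0
        return cross_val_score(clf, X[:, mask], y, cv=3).mean()

    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        elite = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = elite[rng.integers(len(elite), size=2)]
            cut = rng.integers(1, n_features)                   # one-point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            child ^= rng.random(n_features) < mut_rate          # bit-flip mutation
            children.append(child)
        pop = np.vstack([elite, np.array(children)])

    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()]


# Toy usage: TF-IDF stands in for one of several possible term weighting schemes.
texts = ["billing question", "payment failed", "refund my charge",
         "reset my password", "change password", "forgot my login"]
labels = np.array([0, 0, 0, 1, 1, 1])          # 0 = billing route, 1 = account route
X = TfidfVectorizer().fit_transform(texts).toarray()

for clf in (KNeighborsClassifier(n_neighbors=3), LinearSVC()):
    mask = ga_feature_selection(X, labels, clf, n_gen=5, pop_size=10)
    print(type(clf).__name__, "kept", mask.sum(), "of", mask.size, "features")
```

A self-adaptive variant would additionally evolve control parameters such as the mutation and crossover rates alongside each candidate mask, rather than fixing them up front as this sketch does.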

Cited by 5 publications (4 citation statements)
References 20 publications
“…This section describes the prediction outcome for four machinery datasets. The result tabulation begins with a classifier prediction performance review when subjected to multiclass classification datasets, according to Tables 11, 12, and 13. In general, the objective of a classifier is to obtain optimal prediction accuracy by minimising undesirable prediction error.…”
Section: Results (mentioning)
confidence: 99%
“…In cross-validation, the loss function kfoldLoss (13) is applied to calibrate the average misclassification proportion of the testing-fold target output. The robustness of a cross-validated classifier is inversely proportional to mis_kfold (13). The prediction outcomes for 10 simulation cycles (including the number of feature subsets, computation time, prediction accuracy, confusion matrix statistical analysis, and classification loss for the ten-fold cross-validated classifier) are tabulated in the following section. The simulation is performed using the 2019a version of Matlab software as the Integrated Development Environment (IDE).…”
Section: Methods (mentioning)
confidence: 99%
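
The quoted passage reports the average misclassification proportion over the test folds via MATLAB's kfoldLoss (which returns the average classification error by default). As a rough cross-check rather than the cited authors' code, the same quantity can be computed as one minus the mean cross-validated accuracy; the dataset and classifier below are illustrative stand-ins.

```python
# Hedged sketch: average ten-fold misclassification proportion, computed as
# 1 - mean per-fold accuracy. The iris data and k-NN classifier are stand-ins.
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

# Accuracy on each of the ten test folds, then the average misclassification.
fold_acc = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=cv)
mis_kfold = 1.0 - fold_acc.mean()
print(f"10-fold misclassification loss: {mis_kfold:.3f}")
```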
“…They found that feature selection with a self-adaptive GA provides improvement of classification effectiveness and significant dimensionality reduction with all term weighting methods and with all classification algorithms [24].…”
Section: Related Work (mentioning)
confidence: 99%