2018
DOI: 10.1007/s00521-018-3880-8
Feature selection generating directed rough-spanning tree for crime pattern analysis

Cited by 12 publications (4 citation statements)
References 31 publications
“…For the evaluation, the authors assessed the generated clusters, each representing a different crime aspect, against the ground-truth clusters obtained from domain experts. External cluster-evaluation measures [25] such as Purity (Pr), Precision (P), Recall (R), F-measure (F), and Rand Index (RI) were computed using (3)–(7):…”
Section: B. Evaluation Results (mentioning)
confidence: 99%
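The equations (3)–(7) referenced in the statement above are not reproduced on this page, but the named measures have standard pair-counting definitions. A minimal sketch, assuming the usual definitions (the example labels are hypothetical, not from the paper):

```python
from collections import Counter

def purity(true_labels, pred_labels):
    # Purity: for each generated cluster, count its majority ground-truth
    # class; sum those counts and divide by the number of points.
    clusters = {}
    for t, p in zip(true_labels, pred_labels):
        clusters.setdefault(p, []).append(t)
    majority = sum(Counter(m).most_common(1)[0][1] for m in clusters.values())
    return majority / len(true_labels)

def pair_counts(true_labels, pred_labels):
    # Pairwise agreement counts between the clustering and the ground truth.
    tp = fp = fn = tn = 0
    n = len(true_labels)
    for i in range(n):
        for j in range(i + 1, n):
            same_class = true_labels[i] == true_labels[j]
            same_cluster = pred_labels[i] == pred_labels[j]
            if same_cluster and same_class:
                tp += 1
            elif same_cluster:
                fp += 1
            elif same_class:
                fn += 1
            else:
                tn += 1
    return tp, fp, fn, tn

true_y = [0, 0, 0, 1, 1, 1]   # ground-truth clusters (domain experts)
pred_y = [0, 0, 1, 1, 1, 1]   # generated clusters

tp, fp, fn, tn = pair_counts(true_y, pred_y)
precision = tp / (tp + fp)                                  # 4/7
recall = tp / (tp + fn)                                     # 4/6
f_measure = 2 * precision * recall / (precision + recall)   # 8/13
rand_index = (tp + tn) / (tp + fp + fn + tn)                # 10/15
pr = purity(true_y, pred_y)                                 # 5/6
```

Note that Precision, Recall, and F-measure here are computed over pairs of points, the common convention for external cluster evaluation.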
“…The comparison is based on the number of features selected and the accuracy of the classifiers used. The feature selection algorithms compared are (i) rough-spanning-tree-based feature selection (RMST) [43], (ii) classification of vocal and non-vocal segments in audio clips using genetic-algorithm-based feature selection (GAFS) [55], (iii) relevant feature selection and ensemble classifier design using a bi-objective genetic algorithm (RFSA) [56], (iv) acoustic feature selection for automatic emotion recognition from speech (AFSS) [57], (v) exploring the boundary region of rough set theory for feature selection (RSFS) [52], and (vi) speech-based emotion recognition with feature selection by a self-adaptive multi-criteria genetic algorithm (SFGA) [58]. To measure classifier accuracy on the reduced feature set, we have considered eight different classifiers, namely Support vector machine (SVM), K-nearest neighbors (KNN), Decision tree (DT), Neural network (NN), Random forest (RF), Naïve Bayes (NB), AdaBoost (BST), and Sequential minimal optimization (SMO).…”
Section: Evaluation of Proposed BFFSBR Feature Selection Method (mentioning)
confidence: 99%
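The evaluation scheme described above, scoring a reduced feature set by classifier accuracy, can be sketched with a toy 1-nearest-neighbour classifier. This is an illustration of the idea only, not the paper's setup; the data and feature indices are invented for the example:

```python
def knn_accuracy(train_X, train_y, test_X, test_y, feature_idx):
    """1-nearest-neighbour accuracy using only the features in feature_idx."""
    def sq_dist(a, b):
        # Squared Euclidean distance restricted to the selected features.
        return sum((a[i] - b[i]) ** 2 for i in feature_idx)
    correct = 0
    for x, y in zip(test_X, test_y):
        nearest = min(range(len(train_X)), key=lambda j: sq_dist(x, train_X[j]))
        correct += int(train_y[nearest] == y)
    return correct / len(test_y)

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
train_X, train_y = [(0.0, 0.0), (10.0, 10.0)], [0, 1]
test_X, test_y = [(1.0, 9.5), (9.0, 0.5)], [0, 1]

acc_full = knn_accuracy(train_X, train_y, test_X, test_y, [0, 1])  # 0.0
acc_reduced = knn_accuracy(train_X, train_y, test_X, test_y, [0])  # 1.0
```

On this toy data the noisy feature misleads the full-feature classifier, while the reduced set classifies both test points correctly, which is exactly the kind of contrast the accuracy-based comparison is meant to expose.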
“…Feature selection [43] is the process of selecting a subset of the original features that still yields the same, or nearly the same, analytical results. It seeks a minimal feature set such that the probability distribution of the classes given the values of those features is as close as possible to the distribution given the values of all features.…”
Section: Step-wise Floating Forward Selection and Backward Removal (mentioning)
confidence: 99%
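The section title above refers to a floating forward selection with backward removal. A minimal sketch of that general scheme, not the paper's exact algorithm, where `score` is a hypothetical merit function supplied by the caller:

```python
def sffs(features, score, k):
    """Step-wise floating forward selection with conditional backward
    removal (a sketch). score(subset) returns a merit value; higher is
    better. Stops once k features are selected."""
    selected = []
    while len(selected) < k:
        # Forward step: add the candidate feature that maximises the score.
        best = max((f for f in features if f not in selected),
                   key=lambda f: score(selected + [f]))
        selected.append(best)
        # Conditional backward step: drop one earlier feature (never the
        # one just added) if removing it strictly improves the score.
        for f in [g for g in selected if g != best]:
            reduced = [g for g in selected if g != f]
            if score(reduced) > score(selected):
                selected = reduced
                break
    return selected

def score(subset):
    # Hypothetical merit function: rewards the informative features {0, 2}
    # and penalises every other feature.
    good = {0, 2}
    return sum(1.0 if f in good else -0.5 for f in subset)

chosen = sffs(list(range(4)), score, 2)  # selects features 0 and 2
```

The backward step is what makes the search "floating": a feature added early can be discarded later once better companions are in the subset, unlike plain greedy forward selection.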
“…The era of powerful computing brought a massive upsurge in Deep Learning approaches, which handle unstructured, high-dimensional data with ease and deliver astounding accuracy wherever they are applied. A plethora of Deep Learning work has accordingly opened the door to classifying high-dimensional microarray data, but such work still relies on transformation to a lower-order representation via earlier dimensionality-reduction algorithms [3][4][5][6][7], heatmaps [8], statistical inference [9], etc., and does not exploit the full utility of neural networks, which are themselves excellent feature extractors.…”
Section: Introduction (mentioning)
confidence: 99%