2014
DOI: 10.1016/j.engappai.2014.03.007

An unsupervised feature selection algorithm based on ant colony optimization

Cited by 348 publications (135 citation statements)
References 42 publications
“…In the case of S-shaped and V-shaped transfer functions, the number of times each feature dimension appears in the optimal feature subsets is counted, as shown in Figure 7. Figure 7 shows that the 3rd, 13th, 22nd, 23rd, 24th, and 31st dimensions appear most often, yet the final optimal feature subset {1, 3, 10, 11, 12, 13, 14, 18, 19, 21, 23, 24, 29, 30, 31} does not incorporate all of these higher-frequency features; that is, the optimal feature subset is not a simple combination of high-frequency features. The optimal subset need not include the most frequently selected features, since a subset built only from high-frequency features may not yield the best classification performance.…”
Section: V1
confidence: 99%
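The frequency count this excerpt describes, tallying how often each feature index is chosen across runs, can be sketched in a few lines. The subsets below are illustrative stand-ins, not the paper's actual per-run selections.

```python
from collections import Counter

# Hypothetical feature subsets selected over several independent runs
# (indices are illustrative only).
runs = [
    [1, 3, 13, 22, 23, 24, 31],
    [3, 10, 13, 22, 23, 24, 31],
    [1, 3, 11, 13, 22, 24, 31],
]

# Count how often each feature index appears across runs.
freq = Counter(f for subset in runs for f in subset)

# Rank features by selection frequency (most frequently chosen first).
ranking = [f for f, _ in freq.most_common()]
print(ranking[:5])
```

As the excerpt notes, such a frequency ranking is only a diagnostic: the subset of the top-ranked features is not necessarily the subset with the best classification performance.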
“…Therefore, methods using this approach are typically fast, but they need a threshold as the stopping criterion for feature selection. Several filter-based methods have been proposed in the literature, including information gain,[10] gain ratio,[11] term variance,[12] Gini index,[13] Laplacian score,[14] Fisher score,[15] minimal-redundancy-maximal-relevance,[16] the random subspace method,[17] relevance-redundancy feature selection,[18] unsupervised feature selection based on ant colony optimization (UFSACO),[19] relevance-redundancy feature selection based on ant colony optimization (RRFSACO),[20] graph clustering with node centrality for feature selection (GCNC),[21] and graph clustering based ant colony optimization feature selection (GCACO).[22] Wrapper-based methods combine feature selection with the design of the classifier and evaluate feature subsets on the basis of classification accuracy.…”
Section: Related Work
confidence: 99%
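As an illustration of the filter approach with a threshold stopping criterion that the excerpt mentions, a minimal term-variance filter might look like the following sketch. The data matrix and threshold are made up for the example.

```python
import numpy as np

# Illustrative data matrix: 6 samples x 4 features (values are made up).
X = np.array([
    [1.0, 0.0, 5.0, 2.0],
    [1.1, 0.0, 1.0, 2.0],
    [0.9, 0.0, 9.0, 2.1],
    [1.0, 0.0, 3.0, 1.9],
    [1.2, 0.0, 7.0, 2.0],
    [0.8, 0.0, 2.0, 2.0],
])

# Term-variance filter: score each feature by its variance over the samples;
# a constant (zero-variance) feature carries no discriminative information.
scores = X.var(axis=0)

# Keep features whose variance exceeds a user-chosen threshold -- this
# threshold is the stopping criterion the text refers to.
threshold = 0.01
selected = np.where(scores > threshold)[0]
print(selected)
```

Because the score is computed independently per feature, with no classifier in the loop, the method is fast, which is the trade-off the excerpt contrasts with wrapper-based methods.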
“…Thus, the unsupervised feature selection problem becomes even more challenging than the supervised feature selection problem [5].…”
Section: Introduction
confidence: 99%
“…S. Tabakhi et al. presented an unsupervised feature selection method based on ant colony optimization (UFSACO); the method seeks the optimal feature subset over several iterations without using any learning algorithm [5]. Pabitra Mitra et al. described an unsupervised feature selection algorithm, based on measuring similarity between features, that suits datasets large in both dimension and size [6].…”
Section: Introduction
confidence: 99%
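The similarity-based idea attributed to Mitra et al. above can be sketched as a greedy pass that discards any feature too similar to one already kept. This is a loose sketch, not their actual algorithm: the original work uses a different similarity measure, whereas this example assumes absolute Pearson correlation and an arbitrary threshold.

```python
import numpy as np

def select_dissimilar_features(X, threshold=0.9):
    """Greedy sketch: keep a feature only if it is not too similar
    (|correlation| >= threshold) to any feature already kept."""
    n_features = X.shape[1]
    corr = np.abs(np.corrcoef(X, rowvar=False))  # feature-feature similarity
    kept = []
    for j in range(n_features):
        if all(corr[j, k] < threshold for k in kept):
            kept.append(j)
    return kept

# Toy data: columns 0 and 1 are nearly identical, column 2 is independent.
rng = np.random.default_rng(0)
a = rng.normal(size=100)
b = rng.normal(size=100)
X = np.column_stack([a, a + 0.01 * rng.normal(size=100), b])
print(select_dissimilar_features(X))
```

No class labels appear anywhere in the procedure, which is what makes this family of methods unsupervised.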
“…EAs mimic various collective behaviors found in nature to solve optimization problems. Popular EAs include the genetic algorithm (GA) [1,2], particle swarm optimization (PSO) [3,4], artificial immune systems (AIS) [5], differential evolution (DE) [6,7], ant colony optimization (ACO) [8], artificial bee colony (ABC) [9,10], and the simulated annealing algorithm (SA) [11].…”
Section: Introduction
confidence: 99%
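The EAs listed in this excerpt share a common loop: maintain candidate solutions, vary them, and prefer the fitter ones. A minimal genetic-algorithm sketch of that loop on a toy one-max objective (maximize the number of 1-bits) is shown below; the fitness function, operators, and parameters are illustrative choices, not those of any cited method.

```python
import random

def one_max(bits):
    """Toy fitness: count of 1-bits; a stand-in for a real objective."""
    return sum(bits)

def genetic_algorithm(n_bits=20, pop_size=30, generations=50,
                      mutation_rate=0.05, seed=42):
    rng = random.Random(seed)
    # Random initial population of bitstrings.
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament selection: the fitter of two random individuals.
            a, b = rng.sample(pop, 2)
            return a if one_max(a) >= one_max(b) else b
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Bit-flip mutation with small per-bit probability.
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=one_max)

best = genetic_algorithm()
print(one_max(best))
```

For feature selection specifically, a bitstring of this kind is a natural encoding: bit j set means feature j is included in the candidate subset.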