2015
DOI: 10.1016/j.neucom.2015.02.043
Efficient sequential feature selection based on adaptive eigenspace model


Cited by 36 publications (12 citation statements)
References 31 publications
“…labeling adaptation and instance adaptation. In addition, some effective methods for feature selection problems have been proposed, such as the improved Fisher score algorithm [87] and enhanced bare-bones particle swarm optimization (BPSO) [88]. Moreover, for specific problems such as unreliable data [89], [90], incomplete data [91]-[94], text data [95], and costly data [96], researchers have also proposed corresponding feature selection methods.…”
Section: Variants and Extensions of Feature Selection (A. Hybrid)
confidence: 99%
“…a small number of parameters to be tuned, are easy to implement, and are independent of the gradient of the optimization objective; hence more and more studies have focused on using these heuristic algorithms for feature selection problems. Representative heuristic algorithms include genetic algorithms [68], [81], [97], [98], differential evolution algorithms [99], [100], simulated annealing [14], particle swarm optimization [101]-[103], tabu search [104]-[106], and Fisher score algorithms [87]. These methods can generally reach a good feature subset quickly, making feature selection combined with search strategies a new trend.…”
Section: B. Feature Selection Based on Heuristic Algorithms
confidence: 99%
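The gradient-free search idea behind these heuristic wrapper methods can be sketched as a minimal stochastic local search over feature subsets. The objective below is a toy stand-in for classifier accuracy, and the feature weights and penalty are illustrative assumptions, not values from the cited works:

```python
import random

def fitness(mask, weights, penalty=0.02):
    """Toy objective: reward informative features, penalize subset size.
    A real wrapper method would train a classifier here instead."""
    gain = sum(w for w, keep in zip(weights, mask) if keep)
    return gain - penalty * sum(mask)

def local_search(weights, iters=200, seed=0):
    """Hill-climbing over feature subsets: flip one feature in/out per step,
    keep improving moves, revert worsening ones."""
    rng = random.Random(seed)
    n = len(weights)
    mask = [rng.random() < 0.5 for _ in range(n)]  # random initial subset
    best = fitness(mask, weights)
    for _ in range(iters):
        i = rng.randrange(n)            # propose flipping one feature
        mask[i] = not mask[i]
        cand = fitness(mask, weights)
        if cand >= best:
            best = cand                 # accept the improving move
        else:
            mask[i] = not mask[i]       # revert the worsening move
    return mask, best

# Features whose weight exceeds the size penalty should survive the search.
weights = [0.3, 0.0, 0.25, 0.01, 0.2]
mask, best = local_search(weights)
print([i for i, keep in enumerate(mask) if keep])
```

Simulated annealing, genetic algorithms, and PSO differ mainly in how they propose and accept moves (e.g., annealing occasionally accepts worsening flips to escape local optima), but all share this train-evaluate-perturb loop that needs no gradient of the objective.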
“…Starting by measuring performance on the original (unchanged) dataset, it proceeds to measure classification performance using classifiers induced on datasets in which a single feature is omitted. The least significant feature is then identified as the one whose omission caused the lowest drop (or highest gain) in classifier performance [30]. This feature is removed from the dataset, and the procedure is repeated recursively until the minimal required number of features remains or a stopping criterion is reached.…”
Section: No Function (A)
confidence: 99%
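The sequential backward-elimination loop described in this excerpt can be sketched as follows, assuming a toy scoring function in place of actually inducing a classifier; the feature-signal table and all names are hypothetical illustrations:

```python
# Toy stand-in for "induce a classifier and measure performance":
# each feature contributes a fixed amount of signal to the score.
# In practice, score() would train and evaluate a classifier on the
# dataset restricted to the given feature subset.
FEATURE_SIGNAL = {0: 0.30, 1: 0.05, 2: 0.25, 3: 0.01, 4: 0.20}

def score(subset):
    """Proxy for classifier performance using only `subset` features."""
    return sum(FEATURE_SIGNAL[f] for f in subset)

def backward_elimination(features, k):
    """Repeatedly drop the least significant feature until k remain.

    The least significant feature is the one whose omission causes the
    lowest drop (or highest gain) in performance, per the excerpt above.
    """
    selected = set(features)
    while len(selected) > k:
        least_significant = max(selected, key=lambda f: score(selected - {f}))
        selected.remove(least_significant)
    return selected

print(sorted(backward_elimination(FEATURE_SIGNAL, 3)))  # → [0, 2, 4]
```

Each pass re-evaluates every leave-one-out subset, so eliminating down to k features costs O(d²) classifier inductions for d original features, which is why the adaptive eigenspace model in the tracked paper and other speed-ups matter for high-dimensional data.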
“…Other abnormalities, such as mastitis, adenopathy, and granuloma, may also be found in breast images [4]. Machine learning (ML) techniques have found wide application in many fields, such as prediction problems in education [5]-[9], bankruptcy prediction [10]-[16], pattern recognition [17]-[28], image editing [29]-[39], feature reduction [40]-[44], fault diagnosis [45]-[50], face recognition and micro-expression recognition [51]-[57], natural language processing [58], [59], and medical diagnosis [60]-[74]. In particular, ML has shown great potential in BC diagnosis.…”
Section: Introduction
confidence: 99%