2018
DOI: 10.1049/iet-gtd.2018.5482
Hybrid feature selection approach for power transformer fault diagnosis based on support vector machine and genetic algorithm

Cited by 83 publications (40 citation statements)
References 27 publications
“…In addition, embedded methods cost more in terms of computation than filter ones. Popular embedded methods are Recursive Feature Elimination for Support Vector Machines (SVM-RFE) [68]- [70] and Feature Selection-Perceptron (FS-P) [71]- [73].…”
Section: Embedded Techniques (mentioning, confidence: 99%)
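The SVM-RFE method mentioned above can be sketched in a few lines. This is a minimal illustration on hypothetical synthetic data, with a perceptron-style linear model standing in for a trained SVM: train, rank features by the magnitude of their weights, drop the weakest feature, and repeat.

```python
# Sketch of the SVM-RFE idea (hypothetical data; a perceptron-style
# linear model stands in for a trained linear SVM).
import random

def train_linear(X, y, epochs=50, lr=0.1):
    """Stand-in for a linear SVM fit; returns the weight vector."""
    w = [0.0] * len(X[0])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = sum(wj * xj for wj, xj in zip(w, xi))
            if yi * margin <= 0:  # misclassified: nudge weights
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
    return w

def svm_rfe(X, y, n_keep):
    """Recursive feature elimination: repeatedly retrain and drop
    the feature with the smallest absolute weight."""
    active = list(range(len(X[0])))
    while len(active) > n_keep:
        Xa = [[row[j] for j in active] for row in X]
        w = train_linear(Xa, y)
        weakest = min(range(len(active)), key=lambda j: abs(w[j]))
        del active[weakest]
    return active

random.seed(0)
# Feature 0 equals the label sign (informative); features 1-2 are noise.
y = [1 if i % 2 == 0 else -1 for i in range(40)]
X = [[yi * 1.0, random.uniform(-1, 1), random.uniform(-1, 1)] for yi in y]
print(svm_rfe(X, y, n_keep=1))  # → [0]: the informative feature survives
```

In practice one would use a real SVM (e.g. scikit-learn's `RFE` with `LinearSVC`), but the elimination loop is the same.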
See 1 more Smart Citation
“…In addition, embedded methods cost more in terms of computation than filter ones. Popular embedded methods are Recursive Feature Elimination for Support Vector Machines (SVM-RFE) [68]- [70] and Feature Selection-Perceptron (FS-P) [71]- [73].…”
Section: ) Embedded Techniquesmentioning
confidence: 99%
“…a small number of parameters to be tuned, ease of implementation, and independence from the gradient of an optimization objective, more and more studies have focused on utilizing these heuristic algorithms to deal with feature selection problems. Representative heuristic algorithms include genetic algorithms [68], [81], [97], [98], differential evolution algorithms [99], [100], simulated annealing [14], particle swarm optimization [101]-[103], tabu search [104]-[106], and Fisher score algorithms [87], etc. These methods can generally reach a good feature subset quickly, making feature selection incorporated with such search strategies a new trend.…”
Section: B. Feature Selection Based on Heuristic Algorithms (mentioning, confidence: 99%)
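A GA-based feature selector of the kind cited above can be sketched as follows. The fitness function here is hypothetical: a hidden "ideal" subset stands in for a wrapper-style evaluation (e.g. cross-validated classifier accuracy), which is what such methods actually optimize.

```python
# Sketch of GA-based binary feature selection (hypothetical fitness:
# agreement with a hidden "ideal" mask stands in for classifier accuracy).
import random

def ga_select(n_features, fitness, pop_size=20, generations=40,
              p_mut=0.05, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # tournament selection: better of two random picks
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_features)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

IDEAL = [1, 1, 0, 0, 1, 0, 0, 0]   # pretend these are the useful features
def fitness(mask):                  # proxy for cross-validated accuracy
    return sum(1 for m, i in zip(mask, IDEAL) if m == i)

print(ga_select(len(IDEAL), fitness))
```

Real wrapper methods plug a classifier evaluation into `fitness`, which is why they are accurate but computationally expensive compared with filter methods.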
“…The generalization error indicates the prediction precision of the prediction model on different data sets. The generalization error formulas are shown in (20) and (21).…”
Section: Generalization Ability of Predictive Model (mentioning, confidence: 99%)
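Formulas (20) and (21) from the cited paper are not reproduced in the excerpt. As a hedged illustration only, generalization error is commonly estimated as the model's error on held-out data, compared against its training error; the data and predictor below are hypothetical.

```python
# Illustration of estimating generalization via a train/test error gap
# (hypothetical predictor and data; not the cited formulas (20)-(21)).
def mse(model, data):
    """Mean squared error of a predictor over (x, y) pairs."""
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

model = lambda x: 2.0 * x            # hypothetical fitted predictor
train = [(1.0, 2.0), (2.0, 4.0)]     # fitted perfectly: training error 0
test = [(3.0, 6.5), (4.0, 7.5)]      # unseen data: nonzero error
print(mse(model, train), mse(model, test))  # → 0.0 0.25
```

A large gap between the two errors signals poor generalization, which is the property the quoted passage is assessing.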
“…Compared with the traditional neural network model, the catenary CPCM status prediction model has a hidden layer of nonlinear transformations, which can handle more complex functional relationships. The global optimization ability of genetic algorithms is better than that of other evolutionary algorithms [20]-[22]. Therefore, to avoid the prediction model falling into a local optimum, the global search of the genetic algorithm is used to optimize the catenary CPCM status prediction model and achieve higher prediction accuracy.…”
Section: Introduction (mentioning, confidence: 99%)
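The "global search avoids local optima" claim can be made concrete with a real-valued GA on a deliberately multimodal function. The setup below is hypothetical (a 1-D Rastrigin function, not the catenary model): a local method started in the wrong basin would stall at a local minimum, while the population-based search can still reach the global basin.

```python
# Sketch of a real-valued GA escaping local optima on a multimodal
# function (1-D Rastrigin; hypothetical stand-in for the model tuning).
import math
import random

def rastrigin(x):
    """Many local minima near the integers; global minimum f(0) = 0."""
    return x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0

def ga_minimize(f, lo=-5.0, hi=5.0, pop_size=30, generations=60, seed=2):
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)
        elite = pop[: pop_size // 3]            # keep the best third
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = (a + b) / 2 + rng.gauss(0.0, 0.3)  # blend + mutation
            children.append(min(hi, max(lo, child)))
        pop = elite + children
    return min(pop, key=f)

best = ga_minimize(rastrigin)
print(best)  # should land near the global optimum at x = 0
```

Crossover between individuals in different basins plus mutation is what lets the search jump out of local minima, which is the advantage the quoted passage attributes to the GA.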
“…The binary particle swarm optimization (BPSO) algorithm is a population-based optimizer similar to the GA. BPSO has strong global search ability, but it may fail to converge to the globally optimal particle position [15]. Moreover, as the algorithm iterates, the randomness of BPSO grows, and it lacks local search ability in the later stages [16], [17]. In the iterative process of the imperialist competitive algorithm, the number of empires is continuously reduced, which decreases population diversity; this is unfavorable for solving high-dimensional multimodal optimization problems, and the algorithm easily falls into a local optimum [18].…”
Section: Introduction (mentioning, confidence: 99%)
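A minimal BPSO sketch makes the quoted weakness visible: velocities are real-valued, and a sigmoid maps each velocity to a bit-flip probability, so bits keep being resampled stochastically even late in the run. The fitness function here is hypothetical, standing in for a feature-subset evaluation.

```python
# Sketch of binary PSO (hypothetical fitness: agreement with a hidden
# target mask stands in for a feature-subset evaluation).
import math
import random

def bpso(n_bits, fitness, n_particles=15, iters=50, seed=3):
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    X = [[rng.randint(0, 1) for _ in range(n_bits)]
         for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]                 # personal bests
    gbest = max(X, key=fitness)[:]            # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                r1, r2 = rng.random(), rng.random()
                V[i][d] += 2.0 * r1 * (pbest[i][d] - X[i][d]) \
                         + 2.0 * r2 * (gbest[d] - X[i][d])
                V[i][d] = max(-4.0, min(4.0, V[i][d]))  # clamp velocity
                # Sigmoid turns velocity into a bit probability: the
                # resampling here is the source of late-stage randomness.
                X[i][d] = 1 if rng.random() < sig(V[i][d]) else 0
            if fitness(X[i]) > fitness(pbest[i]):
                pbest[i] = X[i][:]
            if fitness(X[i]) > fitness(gbest):
                gbest = X[i][:]
    return gbest

TARGET = [1, 0, 1, 1, 0, 0]          # pretend: the useful feature subset
score = lambda m: sum(a == b for a, b in zip(m, TARGET))
print(bpso(len(TARGET), score))
```

Because even a saturated velocity (clamped at ±4) gives a bit probability of only about 0.98, particles never fully settle, which matches the excerpt's point about weak local search in later iterations.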