2016 IEEE Congress on Evolutionary Computation (CEC)
DOI: 10.1109/cec.2016.7744248

Ensemble classifier selection using multi-objective PSO for fault diagnosis of power transformers

Cited by 15 publications (5 citation statements)
References 20 publications
“…Over time, oversampling algorithms have become increasingly popular and are now commonly integrated into such methods. In 2016, Peimankar et al. [35] used a binary variant of the Multi-Objective Particle Swarm Optimization (bi-MOPSO) algorithm together with adaptive synthetic (ADASYN) oversampling to handle data imbalance, evaluating SVM, Fuzzy k-NN, NN, Naive Bayes, and RF models. In 2017, Peimankar et al. [36] proposed a two-step algorithm for power transformer fault diagnosis, exploring different architectural approaches and employing ADASYN for data imbalance.…”
Section: Related Work
confidence: 99%
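As a rough illustration of the pipeline described in that statement, the sketch below rebalances a dataset with ADASYN and then scores the five candidate base classifiers. It is a minimal sketch assuming a scikit-learn/imbalanced-learn setup, with a plain k-NN standing in for Fuzzy k-NN and an MLP for the NN; it is not the exact configuration of the cited papers.

```python
# Minimal sketch: ADASYN oversampling followed by evaluation of several
# candidate base classifiers. Assumes scikit-learn and imbalanced-learn;
# a plain k-NN stands in for Fuzzy k-NN, an MLP for the NN.
from imblearn.over_sampling import ADASYN
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier

def evaluate_candidates(X, y):
    # Rebalance the minority fault classes with adaptive synthetic sampling.
    X_res, y_res = ADASYN(random_state=0).fit_resample(X, y)

    candidates = {
        "SVM": SVC(),
        "k-NN": KNeighborsClassifier(n_neighbors=5),   # stand-in for Fuzzy k-NN
        "NN": MLPClassifier(max_iter=1000),
        "Naive Bayes": GaussianNB(),
        "RF": RandomForestClassifier(n_estimators=100),
    }
    # Mean cross-validated accuracy for each candidate base learner.
    return {name: cross_val_score(clf, X_res, y_res, cv=5).mean()
            for name, clf in candidates.items()}
```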
“…EAs for attribute selection vary in the number of objectives to be optimized, in their integration with other stages, and in the distribution of base learners. In Peimankar et al. (2016, 2017), a multi-objective Particle Swarm Optimization algorithm provided different attribute subsets to heterogeneous base learners in order to predict whether power transformers will fail in the near future. In Sikdar et al. (2014b, 2015, 2016), two Pareto-based multi-objective differential evolution algorithms perform attribute selection and then linear voting-weight optimization, in a pipeline fashion (the generation stage is performed before the integration stage).…”
Section: Attribute Selection
confidence: 99%
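To make the attribute-selection idea concrete, the fragment below sketches how a binary MOPSO particle (a 0/1 mask over attributes) could be scored against two competing objectives, classification error and subset size, for a given base learner. The particle encoding and objective pair are illustrative assumptions, not the exact formulation of Peimankar et al. or Sikdar et al.

```python
# Hedged sketch of a two-objective fitness for a binary attribute-selection
# particle: cross-validated error and number of selected attributes, both
# to be minimized by the multi-objective search.
import numpy as np
from sklearn.model_selection import cross_val_score

def particle_objectives(mask, X, y, base_learner):
    """Score one binary particle (a 0/1 mask over the attribute columns of X)."""
    selected = np.flatnonzero(mask)
    if selected.size == 0:                 # an empty subset is infeasible
        return 1.0, X.shape[1]
    error = 1.0 - cross_val_score(base_learner, X[:, selected], y, cv=5).mean()
    return error, int(selected.size)       # both objectives are minimized
```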
“…The fourth category of ensemble approaches incorporates different strategies for classifier selection. For example, in [49] researchers used genetic algorithms to generate an ensemble classifier for unbalanced datasets; in [50], [51] researchers used multi-objective Particle Swarm Optimization (PSO) to generate an ensemble classifier. In [52], [53] researchers used PSO as a model selection tool to select the best set of classifiers for building an ensemble classifier.…”
Section: Introduction
confidence: 99%
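The classifier-selection strategy in the last two references can be pictured as a PSO searching over binary masks of a pool of already-trained classifiers. The sketch below shows one plausible fitness function, the validation accuracy of the majority vote of the selected members; the mask encoding and accuracy fitness are assumptions for illustration, not the exact objective used in [52], [53].

```python
# Hedged sketch of an ensemble-selection fitness: validation accuracy of the
# majority vote of the classifiers chosen by a binary mask. Assumes
# integer-encoded class labels; an empty selection scores 0.
import numpy as np

def ensemble_fitness(mask, trained_classifiers, X_val, y_val):
    chosen = [clf for bit, clf in zip(mask, trained_classifiers) if bit]
    if not chosen:
        return 0.0
    votes = np.stack([clf.predict(X_val) for clf in chosen])   # (n_members, n_samples)
    majority = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
    return float(np.mean(majority == y_val))                   # PSO maximizes this score
```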