2018 4th International Conference on Information Management (ICIM)
DOI: 10.1109/infoman.2018.8392845

On-line voltage stability monitoring using an Ensemble AdaBoost classifier

Cited by 15 publications (9 citation statements)
References 17 publications
“…The same algorithms are compared in [58], but this time also considering the random forest algorithm. Reference [85] compares decision trees, SVMs, core vector machines and naive Bayes models, while [91] compares several ensemble methods (XGBoost, Bagging, Random Forest, and AdaBoost) with naive Bayes, k-Nearest Neighbor (kNN) and decision trees, this time for voltage stability assessment. In [59], the authors propose an automated multi-model approach for online security assessment.…”
Section: Learning a Model
confidence: 99%
“…32 XGBoost is another boosting technique implementing gradient-boosted decision trees with advanced speed and performance. 33 Gradient boosting, which is a gradient descent method in function space capable of fitting nonparametric predictive models, has been empirically demonstrated to be accurate when applied to tree models. 34 The parameters of AdaBoost and XGBoost were tuned experimentally: the best case was chosen among multiple cases with randomly selected parameters.…”
Section: Algorithms
confidence: 99%
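
The tuning strategy described in this excerpt, picking the best configuration among several randomly sampled parameter sets, can be illustrated with a short sketch. This is a hypothetical example using scikit-learn's RandomizedSearchCV on placeholder data; the parameter ranges and dataset are assumptions, not the setup of the cited work, and xgboost's XGBClassifier could be tuned the same way.

# Minimal sketch of random-parameter tuning for AdaBoost (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import RandomizedSearchCV

# Placeholder data standing in for voltage-stability features and labels.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Candidate hyperparameter distributions; the "best case" is picked among
# randomly sampled configurations, mirroring the strategy in the excerpt.
param_dist = {
    "n_estimators": np.arange(50, 501, 50),
    "learning_rate": np.logspace(-3, 0, 20),
}

search = RandomizedSearchCV(
    AdaBoostClassifier(random_state=0),
    param_distributions=param_dist,
    n_iter=15,          # number of randomly selected parameter cases
    cv=5,
    scoring="accuracy",
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)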
“…AdaBoost is an iterative boosting ensemble classifier. Badly classified data take a higher weight than well-classified data (Maaji, Cosma, Taherkhani, Alani, & McGinnity, 2018). NASNet is a scalable CNN architecture that consists of a cell that combines blocks that are optimized using reinforcement learning (Zoph, Vasudevan, Shlens, & Le, 2018).…”
Section: Experimental Setting
confidence: 99%
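
The reweighting behaviour described in this excerpt can be made concrete with a minimal sketch of one standard AdaBoost update (a discrete, exponential-loss variant for binary labels). The dataset, stump learner, and number of rounds are illustrative assumptions, not taken from the cited papers.

# Minimal sketch of AdaBoost's sample reweighting (binary labels in {-1, +1}).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=1)
y = 2 * y - 1                      # recode labels from {0, 1} to {-1, +1}

w = np.full(len(y), 1.0 / len(y))  # start with uniform sample weights
for t in range(3):                 # a few boosting rounds
    stump = DecisionTreeClassifier(max_depth=1, random_state=t)
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = np.sum(w[pred != y]) / np.sum(w)           # weighted training error
    alpha = 0.5 * np.log((1 - err) / (err + 1e-12))  # weight of this learner
    # Misclassified samples are up-weighted, correctly classified ones down-weighted.
    w *= np.exp(-alpha * y * pred)
    w /= w.sum()
    print(f"round {t}: weighted error {err:.3f}, max sample weight {w.max():.4f}")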