2016
DOI: 10.1016/j.ins.2016.01.090
BNC-PSO: structure learning of Bayesian networks by Particle Swarm Optimization

Cited by 92 publications (34 citation statements)
References 37 publications
“…In ALARM and INSURANCE, the convergence of ABC-BN was better than that of CMABC, which indicated that for the more complex Bayesian network structure, the convergence accuracy of ALARM was significantly improved when there was no significant difference in convergence speed between ALARM and ABC-BN. For a simpler […] To better illustrate the performance advantages of CMABC-BNL, the algorithm was compared with genetic algorithms (GA) [22], Bayesian network structure learning based on the artificial bee colony algorithm (ABC-BN) [28], the Bayesian network construction algorithm using PSO (BNC-PSO) [29], and GES [30] on 1000-sample and 5000-sample data sets. The population of each algorithm was set to 50, and the maximum number of iterations was set to 100.…”
Section: Simulation Experiments and Results Analysis
confidence: 99%
“…The latter demonstrates its relevance for learning the BN structure. Table 7 shows the experimental results generated by the algorithms IK2-BN, NDPSO-BN [19], ABC-B [40], BNC-PSO [20], SCA [41] and MMHC [42] for each database (repeated 500 times). This table illustrates the difference between the original network topology and the learned structure.…”
Section: Evaluation and Discussion
confidence: 99%
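The difference between the original network topology and the learned structure mentioned in the excerpt is commonly measured by the structural Hamming distance (SHD). A minimal sketch, assuming both networks are given as directed adjacency matrices (the function name and the exact error-counting convention are illustrative assumptions, not taken from the cited works):

```python
def structural_hamming_distance(true_adj, learned_adj):
    """Count edge disagreements between two DAG adjacency matrices.

    Each unordered node pair contributes one error if its edge is
    missing, extra, or reversed (a reversed edge counts once, not twice).
    """
    n = len(true_adj)
    shd = 0
    for i in range(n):
        for j in range(i + 1, n):  # examine each unordered pair once
            t = (true_adj[i][j], true_adj[j][i])
            l = (learned_adj[i][j], learned_adj[j][i])
            if t != l:
                shd += 1  # missing, extra, or reversed edge
    return shd

# Toy 3-node example: true graph 0->1, 1->2; learned graph 0->1, 2->1.
true_adj    = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]
learned_adj = [[0, 1, 0], [0, 0, 0], [0, 1, 0]]
print(structural_hamming_distance(true_adj, learned_adj))  # → 1 (one reversed edge)
```

Averaging this count over repeated runs (e.g. the 500 repetitions reported above) is a typical way to summarize how far a learned structure is from the reference network.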
“…Later on, optimization starts as particles pass on their next velocities and positions, and the fitness value is calculated. The fitness function differs depending on the related study (Gheisari and Meybodi, 2016). In Eq.…”
Section: Particle Swarm Optimization Algorithm
confidence: 99%
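The particle update described in the excerpt can be sketched with the standard PSO velocity and position rules. A minimal continuous-space sketch; the sphere fitness function and all parameter values here are illustrative assumptions, not the BN scoring function or settings used in the cited study:

```python
import random

def pso_minimize(fitness, dim=2, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5):
    """Standard PSO: each particle carries a position, a velocity, and a
    personal best; the swarm shares a global best."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull (personal best) + social pull (global best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])  # fitness evaluated after the move
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy run: minimize the sphere function sum(x_d^2); the optimum is 0 at the origin.
best, best_val = pso_minimize(lambda x: sum(v * v for v in x))
print(best_val)
```

In structure-learning variants such as BNC-PSO, the same update scheme operates on an encoding of the DAG (e.g. an adjacency representation) and the fitness is a network scoring function rather than a continuous objective.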