2010 International Conference on Intelligent Systems, Modelling and Simulation
DOI: 10.1109/isms.2010.31

Harmony Search Based Supervised Training of Artificial Neural Networks

Cited by 57 publications (21 citation statements)
References 19 publications
“…In addition, the proposed MMPSO algorithm is suitable for parallel implementation, and the runtime of the MMPSO algorithm can be reduced to a much shorter time with parallel programming. Finally, the performance of the proposed MMPSO algorithm is compared with the performance reported in the literature for the HSA [16], KHA, GA [15], and the fireworks algorithm (FWA) [17], which split the data into 80% training and 20% testing, for six datasets. To make this comparison under the same conditions, the six datasets are split into 80% training and 20% testing for this experiment.…”
Section: Results (citation type: mentioning; confidence: 99%)
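For readers who want to reproduce the evaluation protocol mentioned above, here is a minimal sketch of an 80%/20% shuffled split. The function name, seed, and use of NumPy are assumptions for illustration; the cited papers do not specify an implementation:

```python
import numpy as np

def split_80_20(X, y, seed=0):
    """Shuffle a dataset and split it 80% training / 20% testing.
    Illustrative only; the seed and API are assumed, not from the cited work."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))   # random permutation of sample indices
    cut = int(0.8 * len(X))         # 80% boundary
    tr, te = idx[:cut], idx[cut:]
    return X[tr], y[tr], X[te], y[te]
```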
“…Bolaji et al. used the fireworks optimization algorithm (FWA) for ANN training and performed experimental tests with UCI datasets. The experimental results were compared to those obtained with the krill herd algorithm (KHA) [15], the harmony search algorithm (HSA), and GA [16]. According to the experimental results, the FWA showed better classification performance [17].…”
Section: Introduction (citation type: mentioning; confidence: 99%)
“…Furthermore, some other evolutionary training algorithms can be integrated into the supervised learning part to train the network [24,25], which would overcome some known shortcomings of the backpropagation training method. Moreover, the smartphone application could incorporate built-in preprocessing capabilities and the ability to connect to a cloud service that would help municipalities take proper action to repair roads suffering from hazards as soon as possible.…”
Section: Results (citation type: mentioning; confidence: 99%)
“…Similarly, considering the efficiency of the HS algorithm (which has a slow convergence rate but guarantees a near-optimum solution [117]), many researchers applied HS to optimizing the weight vectors of FNNs [175,176]. Moreover, the efficiency of HS comes from using m harmonies (weight vectors) and iteratively improvising each harmony by computing a new harmony (a new solution vector) using a heuristic inspired by musical pitch modification [117,177,178].…”
Section: Weight Optimization (citation type: mentioning; confidence: 99%)
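The improvisation mechanism summarized in this passage translates directly into code. Below is a minimal harmony search sketch for optimizing a flat FNN weight vector, using the conventional HS parameters (harmony memory considering rate hmcr, pitch adjusting rate par, bandwidth bw). All hyperparameter values, the toy XOR loss, and the network shape are illustrative assumptions, not settings taken from the cited works:

```python
import numpy as np

def harmony_search(loss, dim, m=20, hmcr=0.9, par=0.3, bw=0.05,
                   bounds=(-1.0, 1.0), iters=5000, seed=0):
    """Minimal harmony search over a dim-dimensional weight vector.
    loss: maps a flat weight vector to a scalar training error.
    All hyperparameter values are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    # Harmony memory: m candidate weight vectors plus their losses.
    hm = rng.uniform(lo, hi, size=(m, dim))
    fit = np.array([loss(h) for h in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:            # memory consideration
                new[j] = hm[rng.integers(m), j]
                if rng.random() < par:         # pitch adjustment
                    new[j] = np.clip(new[j] + bw * rng.uniform(-1.0, 1.0), lo, hi)
            else:                              # random re-initialisation
                new[j] = rng.uniform(lo, hi)
        f = loss(new)
        worst = int(np.argmax(fit))
        if f < fit[worst]:                     # replace the worst harmony
            hm[worst], fit[worst] = new, f
    best = int(np.argmin(fit))
    return hm[best], fit[best]

# Illustrative use: fit a tiny 2-4-1 network (architecture assumed) to XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def mse(w):
    W1, b1 = w[:8].reshape(2, 4), w[8:12]      # hidden layer: 2 -> 4
    W2, b2 = w[12:16], w[16]                   # output layer: 4 -> 1
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    return float(np.mean((out - y) ** 2))

best_w, best_err = harmony_search(mse, dim=17, bounds=(-3.0, 3.0))
```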