2015 International Conference on Futuristic Trends on Computational Analysis and Knowledge Management (ABLAZE)
DOI: 10.1109/ablaze.2015.7155008

Training feedforward neural networks using hybrid flower pollination-gravitational search algorithm

Abstract: Error minimization using the conventional backpropagation algorithm for training a feedforward neural network (FNN) suffers from problems such as slow convergence and entrapment in local minima. In this paper, gradient-free optimization is used for error minimization to avoid local minima. We therefore introduce a new hybrid algorithm integrating the physics-inspired gravitational search algorithm (GSA) with the biology-inspired flower pollination algorithm (FPA). The gravitational search algorithm is a novel meta-heuristic optimization…
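The paper's exact hybridization scheme is truncated in the abstract above, so the following is only a minimal Python sketch of the general idea under stated assumptions: the FNN's weights and biases are flattened into candidate vectors, and a population-based optimizer minimizes the network's error without using gradients. The `fpgsa_step` function, its parameters (`p_switch`, `g0`), and the XOR task are illustrative assumptions rather than the authors' implementation; it simply combines an FPA-style global pollination move with a GSA-style mass-weighted attraction.

```python
import numpy as np

rng = np.random.default_rng(0)

def unpack(w, n_in, n_hid, n_out):
    """Split a flat weight vector into the parameters of a 1-hidden-layer FNN."""
    i = n_in * n_hid
    W1 = w[:i].reshape(n_in, n_hid)
    j = i + n_hid
    b1 = w[i:j]
    k = j + n_hid * n_out
    W2 = w[j:k].reshape(n_hid, n_out)
    b2 = w[k:]
    return W1, b1, W2, b2

def mse(w, X, y, shape):
    """Fitness: mean squared error of the encoded network on (X, y)."""
    W1, b1, W2, b2 = unpack(w, *shape)
    hidden = np.tanh(X @ W1 + b1)
    out = hidden @ W2 + b2
    return np.mean((out - y) ** 2)

def fpgsa_step(pop, fit, t, max_t, lb, ub, p_switch=0.8, g0=1.0):
    """One hypothetical hybrid update: FPA-style pollination or GSA-style attraction."""
    n, d = pop.shape
    best = pop[np.argmin(fit)]
    # GSA ingredient: masses from normalized fitness; gravity decays over time
    worst, best_fit = fit.max(), fit.min()
    m = (worst - fit + 1e-12) / (worst - best_fit + 1e-12)
    mass = m / m.sum()
    g = g0 * np.exp(-20.0 * t / max_t)
    new = np.empty_like(pop)
    for i in range(n):
        if rng.random() < p_switch:
            # FPA ingredient: heavy-tailed (Levy-like) jump toward the global best
            step = 0.01 * rng.standard_cauchy(d)
            new[i] = pop[i] + step * (best - pop[i])
        else:
            # GSA ingredient: attraction toward a random agent, scaled by its mass
            j = int(rng.integers(n))
            dist = np.linalg.norm(pop[j] - pop[i]) + 1e-12
            new[i] = pop[i] + rng.random(d) * g * mass[j] * (pop[j] - pop[i]) / dist
    return np.clip(new, lb, ub)

# Toy usage: fit a 2-4-1 network to XOR without any gradient information.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
shape = (2, 4, 1)
dim = 2 * 4 + 4 + 4 * 1 + 1          # total number of weights and biases
lb, ub = -80.0, 80.0                 # bounds reported by a citing study (quoted below)
pop = rng.uniform(-1.0, 1.0, (30, dim))
for t in range(500):
    fit = np.array([mse(w, X, y, shape) for w in pop])
    pop = fpgsa_step(pop, fit, t, 500, lb, ub)
print("best MSE:", min(mse(w, X, y, shape) for w in pop))
```

Any population-based update rule could be substituted for `fpgsa_step`; the point of the sketch is that, unlike backpropagation, each candidate is evaluated only through its fitness, so no error gradient is ever computed.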

Cited by 24 publications (11 citation statements); references 22 publications. Citing publications span 2016–2022. Citation types: 0 supporting, 7 mentioning, 0 contrasting.
“…QIFPNN, FPNN, BANN, and PSONN were vulnerable to the parameters used, similar to other metaheuristic algorithms, which are also susceptible to their parameters. In this experiment, the population parameter n = 30 [20] and the original search space for all dimensions Lb_Real = −80 and Ub_Real = 80 were used. For FPNN, BANN, and PSONN, the search is performed directly in the original search space, so that Lb = Lb_Real = −80 and Ub = Ub_Real = 80.…”
Section: Parameter Settings (mentioning, confidence: 99%)
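For concreteness, here is a minimal sketch of the quoted parameter settings; this is a hypothetical reconstruction, since the citing study's code is not shown in the quote, and the dimensionality `DIM` is illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

N = 30                          # population size n from the quoted study
DIM = 17                        # illustrative; depends on the network being trained
LB_REAL, UB_REAL = -80.0, 80.0  # original search space reported for all dimensions

# FPNN/BANN/PSONN case: search directly in the original real-valued space
lb, ub = LB_REAL, UB_REAL
population = rng.uniform(lb, ub, size=(N, DIM))

def clamp(candidates):
    """Project updated candidates back into the reported bounds [lb, ub]."""
    return np.clip(candidates, lb, ub)
```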
“…This algorithm, inspired by nature, simulates the features of flowering plants and the key mechanisms that lead to finding globally and locally feasible regions of the search space. Owing to these features, it has been used to solve the optimal power flow problem [31][32][33][34].…”
Section: Introduction (mentioning, confidence: 99%)
“…The performance of the hybrid FPA was observed to be significantly superior to the performances of the FPA and DE algorithms individually. In solving numerical problems [44], feed-forward neural networks (FNNs) optimized by an FP-GSA outperform both the FPA and the GSA in terms of classification accuracy. Thus, the FPA can be considered a promising evolutionary algorithm for obtaining better performance in diverse fields.…”
Section: Introduction (mentioning, confidence: 99%)