2016 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2016.7727191
A firefly algorithm for modular granular neural networks optimization applied to iris recognition

Cited by 5 publications (13 citation statements)
References 18 publications
“…The proposed method uses modular granular neural networks; this kind of artificial neural network was proposed in [7] and [37], and its optimization was performed using, respectively, a hierarchical genetic algorithm and a firefly algorithm. In this work, the optimization is performed using a grey wolf optimizer, and a comparison among HGA, FA, and GWO is performed to determine which of these techniques is better for MGNN optimization.…”
Section: Proposed Methods
confidence: 99%
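The firefly algorithm named in the paper's title can be illustrated with a minimal, generic sketch. This is not the paper's MGNN-specific implementation: the objective, parameter values, and bounds below are illustrative assumptions only. Brighter (lower-objective) fireflies attract dimmer ones, attractiveness decays with squared distance, and a small random walk maintains exploration.

```python
import numpy as np

def firefly_optimize(objective, dim, n_fireflies=20, n_iter=100,
                     alpha=0.25, beta0=1.0, gamma=1.0,
                     bounds=(-5.0, 5.0), seed=0):
    """Minimal firefly algorithm: minimize `objective` over a box."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_fireflies, dim))
    light = np.array([objective(p) for p in pos])   # lower = brighter
    best_i = int(np.argmin(light))
    best_pos, best_val = pos[best_i].copy(), float(light[best_i])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] < light[i]:             # j is brighter: i moves toward j
                    r2 = float(np.sum((pos[i] - pos[j]) ** 2))
                    beta = beta0 * np.exp(-gamma * r2)  # attractiveness decays with distance
                    pos[i] += beta * (pos[j] - pos[i]) \
                        + alpha * rng.uniform(-0.5, 0.5, dim)
                    np.clip(pos[i], lo, hi, out=pos[i])
                    light[i] = objective(pos[i])
                    if light[i] < best_val:         # track the best solution seen
                        best_val = float(light[i])
                        best_pos = pos[i].copy()
    return best_pos, best_val

# Toy usage: minimize the sphere function in 3 dimensions.
x, fx = firefly_optimize(lambda v: float(np.sum(v ** 2)), dim=3)
```

In the paper's setting, the candidate solution would instead encode MGNN architecture parameters (modules, layers, neurons) and the objective would be recognition error, but the attraction-and-move loop is the same.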
“…One of the most important parameters of the architecture is its learning algorithm. Backpropagation algorithms are used in the training phase to perform the learning, and 3 variations of this algorithm can be selected by the proposed optimizer: gradient descent with scaled conjugate gradient (SCG), gradient descent with adaptive learning and momentum (GDX), and gradient descent with adaptive learning (GDA). These 3 algorithms were selected because they have been demonstrated to be among the fastest algorithms, and with them better performances and results have been obtained [6, 7, 37–39].…”
Section: Proposed Methods
confidence: 99%
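The selection the excerpt describes, where the optimizer itself chooses among SCG, GDX, and GDA, can be sketched as one gene of a candidate solution. The encoding and helper below are hypothetical illustrations, not the cited paper's actual chromosome layout:

```python
import random

# Hypothetical encoding: one real-valued gene in [0, 1) selects the
# backpropagation variant, as the quoted statement describes.
TRAINERS = ["SCG", "GDX", "GDA"]  # the three variants named in the excerpt

def decode_trainer(gene: float) -> str:
    """Map a gene in [0, 1) to one of the three training algorithms."""
    return TRAINERS[min(int(gene * len(TRAINERS)), len(TRAINERS) - 1)]

# Usage: a random candidate solution picks its trainer.
choice = decode_trainer(random.random())
```

The rest of the solution vector would encode the remaining architecture parameters, so the optimizer searches over trainer choice and topology jointly.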