2011
DOI: 10.1016/j.parco.2011.06.001
A parallel evolving algorithm for flexible neural tree

Cited by 10 publications (5 citation statements)
References 25 publications
“…To improve computational efficiency, Peng et al. [42] proposed a parallel evolving algorithm for FNT, where the parallelization took place in both the tree-structure and parameter-vector populations. In another parallel approach, Wang et al. [43] used gene expression programming (GEP) [44] for evolving the FNT and PSO for parameter optimization.…”
Section: Literature Review
confidence: 99%
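The statement above describes parallelizing fitness evaluation over both the tree-structure population and the parameter-vector population. A minimal sketch of that idea follows; it is not the paper's implementation, and a thread pool merely stands in for the MPI- or GPU-level parallelism the cited works use. All function and variable names here are illustrative.

```python
# Hedged sketch: scoring a tree-structure population and a
# parameter-vector population concurrently. Fitness functions are
# toy stand-ins for the real structure/parameter objectives.
import random
from concurrent.futures import ThreadPoolExecutor

def tree_error(depth):
    # stand-in fitness for a candidate tree structure (smaller is better)
    return abs(depth - 3)

def param_error(vec):
    # stand-in fitness for a parameter vector (smaller is better)
    return sum(x * x for x in vec)

def evaluate_populations(trees, vectors, workers=4):
    # both populations are scored in parallel, mirroring the idea of
    # parallelizing structure search and parameter search
    with ThreadPoolExecutor(max_workers=workers) as pool:
        tree_scores = list(pool.map(tree_error, trees))
        vec_scores = list(pool.map(param_error, vectors))
    return tree_scores, vec_scores

random.seed(0)
trees = [random.randint(1, 6) for _ in range(8)]
vectors = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(8)]
t_scores, v_scores = evaluate_populations(trees, vectors)
```

In a real implementation each worker would train and score one candidate model, which is where the bulk of the computation (and thus the parallel speedup) lies.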
“…It was observed that embedding the beta-basis function at FNT nodes has advantages over the other two parametric activation functions. Parallel evolution of FNT using MPI programming and GPU programming was proposed in [274] and [275], respectively.…”
Section: Combination of FNN Components Optimization
confidence: 99%
“…The flexible neural tree (FNT) [3] is a hierarchical neural network that is created automatically to solve a given problem. Its structure is usually determined by an adaptive mechanism and is intended to adapt to the problem and data under investigation [11,10,4]. Owing to this property of FNTs, there is no need to set up a generic, static network structure unrelated to the problem domain beforehand.…”
Section: Flexible Neural Tree
confidence: 99%
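Each internal FNT node is a flexible neuron: it applies a parametric activation to a weighted sum of its children's outputs. A minimal sketch, assuming the commonly used Gaussian-style flexible activation f(a, b; x) = exp(-((x - a)/b)^2); the function and parameter names are illustrative, not taken from the paper:

```python
# Hedged sketch of one flexible neuron. The parameters a and b are
# per-node activation parameters that the evolutionary search tunes
# alongside the connection weights.
import math

def flexible_neuron(inputs, weights, a, b):
    # weighted sum of child outputs, passed through the flexible
    # Gaussian-style activation exp(-((net - a)/b)^2)
    net = sum(w * x for w, x in zip(weights, inputs))
    return math.exp(-((net - a) / b) ** 2)
```

For example, with inputs [1.0, 2.0], weights [0.5, 0.25], a = 1.0 and b = 1.0, the weighted sum equals a and the neuron outputs its maximum value of 1.0.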
“…Finding an optimal or near-optimal flexible neural tree can be accomplished by various evolutionary and bio-inspired algorithms [11,10,4]. The general learning procedure for constructing the FNT model can be described at a high level as follows [3]:…”
Section: Flexible Neural Tree
confidence: 99%
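The high-level procedure referenced above alternates between evolving the tree structure and optimizing the parameters of the current structures until an acceptable error is reached. A minimal sketch of that alternating loop, with toy stand-ins (simple mutation instead of GP operators, local search instead of PSO) and illustrative error functions:

```python
# Hedged sketch of the alternating FNT learning loop: structure
# evolution and parameter optimization are interleaved until the
# combined error falls below a target or the round budget runs out.
import random

def evolve_structure(structures, error_fn):
    # keep the better half, mutate copies to refill (toy genetic step)
    ranked = sorted(structures, key=error_fn)
    half = ranked[: len(ranked) // 2]
    children = [max(1, s + random.choice((-1, 1))) for s in half]
    return half + children

def optimize_parameters(params, error_fn, steps=10):
    # toy local search standing in for PSO
    best = params
    for _ in range(steps):
        cand = [p + random.uniform(-0.1, 0.1) for p in best]
        if error_fn(cand) < error_fn(best):
            best = cand
    return best

def learn_fnt(target=0.05, max_rounds=20):
    random.seed(1)
    structures = [random.randint(1, 6) for _ in range(8)]
    params = [random.uniform(-1, 1) for _ in range(4)]
    struct_err = lambda s: abs(s - 3) / 10        # toy structure error
    param_err = lambda p: sum(x * x for x in p)   # toy parameter error
    for _ in range(max_rounds):
        structures = evolve_structure(structures, struct_err)
        params = optimize_parameters(params, param_err)
        best = min(structures, key=struct_err)
        if struct_err(best) + param_err(params) < target:
            break
    return min(structures, key=struct_err), params
```

In the actual FNT literature the structure step is typically carried out by genetic programming or PIPE and the parameter step by PSO; the point of the sketch is only the interleaving of the two searches.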