2016
DOI: 10.1016/j.asoc.2016.03.006
Evolving flexible beta basis function neural tree using extended genetic programming & Hybrid Artificial Bee Colony

Cited by 19 publications (18 citation statements)
References 31 publications
“…To improve the classification accuracy of FNT, Yang et al. [37] proposed a hybridization of FNT with a further-division-of-partition-space method. In [38], the authors illustrated crossover and mutation operators for evolving FNTs using GP and optimized the tree parameters using the PSO algorithm.…”
Section: Literature Review
confidence: 99%
“…In the crossover operation, randomly selected sub-trees of two parent trees are swapped. The swapping includes the exchange of activation nodes, weights, and inputs, as described in [38,64,68].…”
Section: Tree Construction
confidence: 99%
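The sub-tree swap described in that citation statement can be sketched as a standard GP crossover. This is a minimal illustration, not the authors' implementation from [38,64,68]; the `Node`, `all_nodes`, and `crossover` names are assumptions introduced here, and weights/activation details are folded into a single node label for brevity.

```python
import random

class Node:
    """A GP tree node: a label (activation node or input leaf) plus children."""
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

def all_nodes(tree):
    """Collect every node of the tree in pre-order."""
    nodes = [tree]
    for child in tree.children:
        nodes.extend(all_nodes(child))
    return nodes

def crossover(parent_a, parent_b, rng=random):
    """Swap one randomly chosen sub-tree between the two parents.

    Exchanging a node's label and children in place carries its whole
    sub-tree along, so activation nodes, weights, and inputs travel together.
    """
    a_node = rng.choice(all_nodes(parent_a))
    b_node = rng.choice(all_nodes(parent_b))
    a_node.label, b_node.label = b_node.label, a_node.label
    a_node.children, b_node.children = b_node.children, a_node.children
    return parent_a, parent_b
```

Because the swap only moves material between the two parents, the combined multiset of nodes across both trees is preserved, which is one property real GP crossover operators also maintain.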
“…The FNT topology is adapted in our work using the Beta function as the transfer function (FBBFNT). Although FBBFNTs can solve complex problems [12], the model suffers from a high time cost. In addition, real-world problems involve large sets of features/inputs, which increases the time complexity.…”
Section: Introduction
confidence: 99%
“…It has shown great performance in several successful research areas such as classification [1], [3] (pattern recognition) and prediction [12], [13], [31]–[35]. Network structure evolution and parameter optimization are the two main issues that influence the BBF neural network's performance.…”
Section: Introduction
confidence: 99%