2013
DOI: 10.1016/j.neucom.2013.01.024

A hybrid learning algorithm for evolving Flexible Beta Basis Function Neural Tree Model

Abstract: In this paper, a tree-based encoding method is introduced to represent the Beta basis function neural network. The proposed model, called Flexible Beta Basis Function Neural Tree (FBBFNT), can be created and optimized based on predefined Beta operator sets. A hybrid learning algorithm is used to evolve the FBBFNT model: the structure is developed using Extended Genetic Programming (EGP), and the Beta parameters and connection weights are optimized by the Opposite-based Particle Swarm Optimization algorithm…
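
As background for readers, the building block named in the abstract is the Beta basis function attached to the tree's non-leaf nodes. The snippet below is a minimal sketch of one common one-dimensional parameterization of the Beta function from the BBFNN literature; the exact form and parameter names used in the paper may differ, so this is an illustrative assumption rather than the authors' definition.

```python
import numpy as np

def beta_basis(x, x0, x1, p, q):
    """One common one-dimensional Beta basis function (assumed form).

    beta(x) = ((x - x0)/(xc - x0))**p * ((x1 - x)/(x1 - xc))**q  for x0 < x < x1,
              0 elsewhere, with centre xc = (p*x1 + q*x0)/(p + q) and p, q > 0.
    """
    x = np.asarray(x, dtype=float)
    xc = (p * x1 + q * x0) / (p + q)      # centre of the support ]x0, x1[
    out = np.zeros_like(x)
    inside = (x > x0) & (x < x1)          # the function vanishes outside its support
    out[inside] = (((x[inside] - x0) / (xc - x0)) ** p
                   * ((x1 - x[inside]) / (x1 - xc)) ** q)
    return out

# Example: a single Beta node with support ]-1, 1[ and shape parameters p=2, q=3
xs = np.linspace(-1.5, 1.5, 7)
print(beta_basis(xs, x0=-1.0, x1=1.0, p=2.0, q=3.0))
```

In a tree model such as FBBFNT, each internal node would evaluate a function of this kind on a weighted combination of its children's outputs; parameters such as p, q, x0 and x1 are among the quantities adjusted at the parameter-optimization level.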

Cited by 45 publications (30 citation statements) | References 27 publications
“…A performance analysis of various activation functions is available in [47]. Bouaziz et al. [48,49] proposed to use the Beta basis function at non-leaf nodes of an FNT.…”
Section: Literature Review (mentioning)
confidence: 99%
“…We introduce in this section the proposed model for designing the Beta basis function neural network through some definitions, basic concepts, and the corresponding mathematical model. The proposed model is named Flexible Beta Basis Function Neural Tree (FBBFNT) [9], [10], [11], [12].…”
Section: Flexible Beta Basis Function Neural Tree (mentioning)
confidence: 99%
“…These reasons encourage us to use the tree-based encoding method for representing a BBF neural network. The new representation, called Flexible Beta Basis Function Neural Tree (FBBFNT) [9], [10], [11], [12], is more flexible than the classical BBFNN since it can automatically determine the number of nodes as well as the number of hidden layers. The FBBFNT is evolved by a hybrid algorithm with two levels: structure evolution and parameter evolution using evolutionary computation [13], [14], [15].…”
Section: Introduction (mentioning)
confidence: 99%
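
The excerpt above summarizes the hybrid scheme as two nested levels: structure evolution and parameter evolution. Purely as an illustration of that nesting, the sketch below wraps a plain particle swarm optimizer (standing in for the paper's Opposite-based PSO) inside a simple structural mutation loop (standing in for Extended Genetic Programming); the toy fitness, the representation of "structure" as a node count, and all names are assumptions, not the authors' implementation.

```python
import random

def fitness(structure_size, params):
    # Toy objective: prefer small structures whose parameters sit near 1.0.
    return sum((p - 1.0) ** 2 for p in params) + 0.05 * structure_size

def pso_tune(structure_size, iters=30, swarm=10):
    # Inner level: minimal PSO over `structure_size` real-valued parameters.
    dim = structure_size
    pos = [[random.uniform(-2.0, 2.0) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    best_p = [list(p) for p in pos]
    best_f = [fitness(structure_size, p) for p in pos]
    g = min(range(swarm), key=lambda i: best_f[i])
    g_pos, g_f = list(best_p[g]), best_f[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (best_p[i][d] - pos[i][d])
                             + 1.5 * random.random() * (g_pos[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(structure_size, pos[i])
            if f < best_f[i]:
                best_f[i], best_p[i] = f, list(pos[i])
                if f < g_f:
                    g_f, g_pos = f, list(pos[i])
    return g_f, g_pos

# Outer level: mutate the structure (here just a node count) and keep the best pair.
best = None
size = 3
for _ in range(10):
    f, params = pso_tune(size)
    if best is None or f < best[0]:
        best = (f, size, params)
    size = max(1, best[1] + random.choice([-1, 0, 1]))  # structural mutation
print("best fitness %.4f with %d nodes" % (best[0], best[1]))
```

In the actual FBBFNT algorithm the outer level manipulates whole tree structures with EGP operators rather than a single integer, but the alternation between shaping the model and re-tuning its parameters is the same.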
“…Recently, computational intelligence techniques such as fuzzy logic [1], artificial neural networks [2,3] and evolutionary algorithms (EAs) [4][5][6] have become popular research subjects. They can deal with complex problems that are difficult to solve with classical techniques [7].…”
Section: Introduction (mentioning)
confidence: 99%
“…Thanks to their ability to find near-optimal solutions without a precise description of the problem, many intelligent optimization techniques have been employed to generate fuzzy models from numerical data and to tune the structure and the rule parameters of fuzzy systems. Among these intelligent techniques we can find clustering [19], artificial neural networks [2,3], evolutionary computation [20], and so on. In this context, this paper discusses a new approach to the fuzzy model identification problem making use of subtractive clustering, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Particle Swarm Optimization (PSO) algorithm.…”
Section: Introduction (mentioning)
confidence: 99%