2014 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2014.6889671

Universal approximation propriety of Flexible Beta Basis Function Neural Tree

Abstract: In this paper, the universal approximation property is proved for the Flexible Beta Basis Function Neural Tree (FBBFNT) model. This model is a tree-encoding method for designing Beta basis function neural networks. The performance of FBBFNT is evaluated on benchmark problems drawn from the time series approximation area and compared with other methods in the literature.
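The Beta basis function at the heart of FBBFNT is a bounded, asymmetric bump. As a rough illustration, here is a minimal Python sketch of its one-dimensional form as commonly defined in the Beta basis function neural network literature; the parameter names (x0, x1 for the support endpoints, p, q for the shape exponents) follow that convention and are assumptions, not taken from this abstract.

import numpy as np

def beta_basis(x, x0=0.0, x1=1.0, p=2.0, q=2.0):
    # Sketch of the classic one-dimensional Beta basis function:
    # positive on (x0, x1), zero elsewhere, peaking at the center
    # xc = (p*x1 + q*x0) / (p + q); p and q control the asymmetry.
    xc = (p * x1 + q * x0) / (p + q)
    x = np.atleast_1d(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    inside = (x > x0) & (x < x1)
    xi = x[inside]
    out[inside] = ((xi - x0) / (xc - x0)) ** p * ((x1 - xi) / (x1 - xc)) ** q
    return out

# Example: an asymmetric bump on [0, 1] that peaks at xc = 2/3 when p=2, q=1
print(beta_basis([0.0, 0.25, 0.5, 2/3, 0.9], p=2.0, q=1.0))

The shape, size, and center parameters this function exposes are what make it attractive as a flexible node activation in a neural tree.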

Cited by 6 publications (3 citation statements)
References 31 publications
“…A performance analysis of various activation functions is available in [47]. Bouaziz et al. [48,49] proposed to use the Beta basis function at non-leaf nodes of an FNT.…”
Section: Literature Review
Confidence: 99%
“…3). Hence, a Beta basis function, which has several controlling parameters such as shape, size, and center, was used at the non-leaf nodes of an FNT [273]. It was observed that embedding the Beta basis function at FNT nodes has advantages over other two-parameter activation functions.…”
Section: Combination of FNN Components Optimization
Confidence: 99%
“…It is called the Flexible Beta Basis Function Neural Tree (FBBFNT). The FBBFNT, as a universal approximator [22], has proved its efficiency on benchmark prediction problems. Starting with a random population of feasible ANNs, the FBBFNT model goes through a learning/optimization phase in order to reach the optimal ANN.…”
Section: Pareto Multi-agent Flexible Neural Tree (PMA-FNT)
Confidence: 99%
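The last statement outlines FBBFNT's population-based training: start from random candidate trees and iteratively learn/optimize toward the best network. Below is a minimal, hypothetical sketch of such an evolve-and-select loop; random_tree, mutate, and fitness are placeholder callables standing in for the paper's actual tree-generation, variation, and evaluation operators, which this page does not describe.

import random

def evolve_fbbfnt(random_tree, mutate, fitness, pop_size=30, generations=100):
    # Hypothetical skeleton of a population-based tree search:
    # start from random candidate networks (trees), keep the fitter half,
    # and refill the population with mutated copies of the survivors.
    population = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)      # lower fitness = better
        survivors = population[: pop_size // 2]
        offspring = [mutate(random.choice(survivors))
                     for _ in range(pop_size - len(survivors))]
        population = survivors + offspring
    return min(population, key=fitness)   # best tree found

In the actual FBBFNT model, the variation step would alter both the tree structure and the Beta node parameters; this sketch abstracts those operators away.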