2011
DOI: 10.1155/2011/107498

A Novel Learning Scheme for Chebyshev Functional Link Neural Networks

Abstract: A hybrid learning scheme (ePSO-BP) to train the Chebyshev Functional Link Neural Network (CFLNN) for classification is presented. The proposed method is referred to as hybrid CFLNN (HCFLNN). The HCFLNN is a type of feed-forward neural network that has the ability to transform the nonlinear input space into a higher-dimensional space where linear separability is possible. Moreover, the proposed HCFLNN combines the best attributes of particle swarm optimization (PSO), back-propagation learning (BP learning), and functi…
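The core of a CFLNN is a fixed Chebyshev expansion of the input pattern followed by a single trainable linear layer. The sketch below is a minimal illustration of that expansion only, not the paper's implementation: the expansion order, the shared bias column, and the assumption that inputs are pre-scaled to [-1, 1] are choices made here for clarity.

```python
import numpy as np

def chebyshev_expand(x, order=4):
    """Expand each feature with Chebyshev polynomials of the first kind.

    Uses the recurrence T0(z) = 1, T1(z) = z, T_{n+1}(z) = 2*z*T_n(z) - T_{n-1}(z).
    Inputs are assumed to be scaled to [-1, 1], the natural domain of the polynomials.
    """
    x = np.atleast_2d(x)                      # shape (n_samples, n_features)
    terms = [np.ones((x.shape[0], 1))]        # single shared T0 (bias) column
    for j in range(x.shape[1]):
        z = x[:, j:j + 1]
        t_prev, t_curr = np.ones_like(z), z   # T0(z), T1(z)
        terms.append(t_curr)
        for _ in range(2, order + 1):
            t_prev, t_curr = t_curr, 2 * z * t_curr - t_prev
            terms.append(t_curr)
    return np.hstack(terms)                   # the higher-dimensional pattern
```

Because a functional link network has no hidden layer, a training scheme such as the paper's ePSO-BP combination only has to optimize the weights of the single linear layer applied to this expanded pattern.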

Cited by 11 publications (4 citation statements)
References: 40 publications
“…This section broadly describes the datasets that were used. In terms of the real-world datasets, the basic Iris dataset [38,39] was used (referred to as Dataset 1), where the input features (sepal length, sepal width, petal length, and petal width) served as the input to the unsupervised algorithms. The data are divided into three main classes: Setosa, Versicolour, and Virginica.…”
Section: A. Dataset Description (mentioning)
confidence: 99%
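For orientation, the Iris data referred to above (four input features, three classes) can be loaded directly from scikit-learn. This convenience snippet is an assumption of this write-up, not code from the citing paper.

```python
from sklearn.datasets import load_iris

iris = load_iris()
X, y = iris.data, iris.target     # 150 samples x 4 features; labels 0, 1, 2
print(iris.feature_names)         # sepal length/width and petal length/width (cm)
print(iris.target_names)          # ['setosa' 'versicolor' 'virginica']
```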
“…For this type of neural network, an online and an off-line training algorithm have been defined with fast convergence properties. Actually, the Orthogonal Activation Function based neural network is a type of feed-forward neural network that has the ability to transform the nonlinear input space into a higher-dimensional space where linear separability is possible [8].…”
Section: Orthogonal Activation Function Based Neural Network (mentioning)
confidence: 99%
“…For this type of neural network, online and off-line training algorithms have been defined with fast convergence properties. Actually, the OAF-NN is a type of feed-forward neural network having the ability to transform the nonlinear input space into a higher-dimensional space where linear separability is possible [16]. Many useful properties argue for exploiting the OAF-based neural network in system identification:
• Function approximation capability: any arbitrary function can be approximated by the network to any desired level of accuracy.…”
Section: Neural Network For Input-output Mapping Of Nonlinear Sy… (mentioning)
confidence: 99%
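The "linear separability in the expanded space" claim made in both statements can be illustrated with a toy experiment. The sketch below reuses the hypothetical chebyshev_expand helper from the abstract section and substitutes plain logistic regression for the paper's ePSO-BP training scheme, so it illustrates the general functional-expansion idea rather than the paper's method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# A toy problem that is not linearly separable in the raw 2-D input space:
# label 1 for points inside a disc of radius 0.6, label 0 outside it.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(400, 2))
y = (np.sqrt((X ** 2).sum(axis=1)) < 0.6).astype(int)

raw_model = LogisticRegression(max_iter=1000).fit(X, y)
exp_model = LogisticRegression(max_iter=1000).fit(chebyshev_expand(X, order=4), y)

print("accuracy in raw input space:", raw_model.score(X, y))
print("accuracy in expanded space: ", exp_model.score(chebyshev_expand(X, order=4), y))
```

The circular class boundary is linear in the second-order Chebyshev terms (T2(z) = 2z^2 - 1), so the linear model on the expanded pattern can fit it almost perfectly, while the same model on the raw inputs can do little better than predicting the majority class.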