2015
DOI: 10.1016/j.neucom.2014.01.065
A fast learning method for feedforward neural networks

Cited by 18 publications (17 citation statements)
References 28 publications
“…Thus, according to the above theoretical analysis, we may realize an LA between the output layer and the last hidden layer, with randomly assigned kernel parameter vectors in every hidden layer. When we understand the behavior of multi-layer FKNN networks in this way, we can see the same benefits [14,28] as LLM, which are summarized as follows:…”
Section: Remark
confidence: 99%
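The scheme quoted above — hidden-layer parameters assigned randomly, with learning reduced to a linear analysis (LA) between the last hidden layer and the output — can be sketched in NumPy. This is a hedged illustration of the general idea only, not the cited paper's exact method; the target function, layer sizes, and activation are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (invented for illustration)
X = rng.normal(size=(200, 3))
T = np.sin(X).sum(axis=1, keepdims=True)

# Hidden-layer parameters are assigned randomly and never trained
W = rng.normal(size=(3, 50))
b = rng.normal(size=(1, 50))
H = np.tanh(X @ W + b)  # activations of the last hidden layer

# The only learning step: a linear least-squares fit of the
# output weights on the hidden activations
beta, *_ = np.linalg.lstsq(H, T, rcond=None)

mse = float(np.mean((H @ beta - T) ** 2))
print(mse)
```

Because the only trained parameters solve a linear least-squares problem in closed form, training is fast — the benefit the excerpt attributes to LLM-style networks.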
“…KLDA = KPCA + LA. In [14,25,26], Wang et al. proved that kernelized SVM and its variants are equivalent to KPCA + SVM or SVM's variants. In the same year, Zhang [23] gave more general results.…”
Section: Remark
confidence: 99%
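The "KLDA = KPCA + LA" decomposition quoted above can be illustrated with scikit-learn: kernel PCA first extracts nonlinear features, and a plain linear discriminant analysis is then fitted on them. This is a hedged sketch of the decomposition, not a reproduction of the cited proofs; the dataset, kernel, and hyperparameters are invented for the example.

```python
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

# Nonlinearly separable toy data (invented for illustration)
X, y = make_moons(n_samples=200, noise=0.05, random_state=0)

# Kernelized discriminant analysis realized as KPCA followed by a
# linear analysis (here, ordinary LDA on the kernel-PCA features)
klda = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf", gamma=2.0),
    LinearDiscriminantAnalysis(),
)
klda.fit(X, y)
acc = klda.score(X, y)  # training accuracy of the two-stage pipeline
print(acc)
```

The point of the decomposition is that all the nonlinearity lives in the KPCA stage, so the discriminant step itself stays linear.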
“…In comparison with the brain, the function of ANNs is extremely simplified; however, their capacity to solve various nonlinear problems is well known to researchers working on the modeling of systems with complex nonlinear behavior. Multilayer feed-forward neural networks (FNNs) are a kind of ANN that have the potential to solve complex nonlinear functions and are therefore one of the popular modeling techniques [24][25][26][27][28][29][30][31][32][33].…”
Section: Introduction
confidence: 99%