Wiley Encyclopedia of Electrical and Electronics Engineering 1999
DOI: 10.1002/047134608x.w5106
Feedforward Neural Nets

Abstract: The sections in this article are: Neural Network Elements and Notation; The Literature; Roles for Artificial Neural Networks; Mathematical Setup for Feedforward Neural Networks; Basic Properties of the Representation by Neural Networks; The Representational Power of a Single-Hidden-Layer Network; …

Cited by 21 publications (3 citation statements)
References 40 publications
“…There are several modeling techniques such as MultiDimensional Scaling (MDS) [5], [6], [7], Sammon Mapping [8], Principal Component Analysis PCA [9] or Feed Forward Neural Networks [11]. In this paper the selected one is Self Organizing Maps [15], two reasons support this decision, the possibility of dimension reduction and the possibility of visualization.…”
Section: A Modelization of the Segregation Phenomenon
confidence: 99%
“…are differentiable to arbitrary order through the chain rule of differentiation, which implies that the error is also differentiable to arbitrary order. Hence, it is possible to make a Taylor's series expansion in w for T [36]. The algorithms for minimizing T assume a Taylor's series expansion about a point w_0 that is possibly a local minimum.…”
Section: Theory
confidence: 99%
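The Taylor expansion invoked in the excerpt above can be written out explicitly. As a sketch (the symbols T for the error and w_0 for the expansion point follow the excerpt; the Hessian notation H is an assumption):

```latex
T(w) \approx T(w_0) + \nabla T(w_0)^{\top}(w - w_0)
      + \tfrac{1}{2}(w - w_0)^{\top} H(w_0)\,(w - w_0)
```

At a local minimum the gradient term \nabla T(w_0) vanishes, leaving the quadratic form in H(w_0) that second-order minimization algorithms exploit.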
“…In detail, the MLP presented under FNN is one type of NN. The FNN receives input on one side and generates output on the other side (Fine, 1999). In between, neurons in different layers are connected in one direction.…”
Section: Introduction
confidence: 99%
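The one-directional, layer-to-layer flow described in the excerpt above can be sketched as a minimal forward pass. This is a hypothetical illustration, not code from the cited works; the layer sizes, tanh hidden activation, and linear output are all assumptions:

```python
import numpy as np

def forward(x, weights, biases):
    """Propagate input x through successive layers in one direction only."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.tanh(W @ a + b)              # hidden layers: affine map + tanh
    return weights[-1] @ a + biases[-1]     # output layer: affine map, no squashing

rng = np.random.default_rng(0)
# One hidden layer of 4 neurons: 3 inputs -> 4 hidden units -> 2 outputs.
weights = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
biases = [np.zeros(4), np.zeros(2)]
y = forward(np.ones(3), weights, biases)    # y has shape (2,)
```

Because each layer's output feeds only the next layer, there are no cycles: activations move strictly from input side to output side, which is what distinguishes a feedforward net from a recurrent one.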