2004
DOI: 10.1016/j.camwa.2003.06.008
An approximation by neural networks with a fixed weight

Cited by 43 publications (17 citation statements)
References 9 publications
“…We investigate the sigmoidal neural network approximation to f (x). This function was also considered in [13]. Note that in [13] the authors chose the sigmoidal function as…”
Section: Numerical Results
confidence: 99%
“…In many applications, it is convenient to take the activation function σ as a sigmoidal function which is defined as lim t→−∞ σ(t) = 0 and lim t→+∞ σ(t) = 1. The literature on neural networks abounds with the use of such functions and their superpositions (see, e.g., [2,4,6,8,10,11,13,15,20,22,29]). The possibility of approximating a continuous function on a compact subset of the real line or d-dimensional space by SLFNs with a sigmoidal activation function has been well studied in a number of papers.…”
Section: Introduction
confidence: 99%
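The sigmoidal property quoted in the statement above (σ(t) → 0 as t → −∞ and σ(t) → 1 as t → +∞) can be made concrete with a small sketch. The logistic function is a standard example of such a σ, and a single-hidden-layer feedforward network (SLFN) built from it takes the form N(x) = Σᵢ cᵢ σ(wᵢx − θᵢ). The function names below are illustrative, not taken from the cited papers:

```python
import math

def logistic(t):
    """Logistic sigmoid: one common choice of sigmoidal function.
    It satisfies lim_{t -> -inf} logistic(t) = 0 and
    lim_{t -> +inf} logistic(t) = 1."""
    return 1.0 / (1.0 + math.exp(-t))

def slfn(x, c, w, theta, sigma=logistic):
    """Single-hidden-layer feedforward network:
    N(x) = sum_i c_i * sigma(w_i * x - theta_i),
    with outer coefficients c, inner weights w, and thresholds theta."""
    return sum(ci * sigma(wi * x - ti) for ci, wi, ti in zip(c, w, theta))
```

For instance, `slfn(x, c=[1.0], w=[1.0], theta=[0.0])` is just the logistic function itself; adding more hidden units (longer `c`, `w`, `theta`) gives the superpositions whose approximation power the cited literature studies.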
“…Many mathematicians ([2], [4], [5], [6], [7], [9], [10], [11]) have studied neural network approximation in recent years. In [1], Chui, Li and Mhaskar pointed out the limitations of approximation by neural networks with one hidden layer.…”
Section: Introduction
confidence: 99%