1995
DOI: 10.1109/72.471356

Fault-tolerant training for optimal interpolative nets

Cited by 29 publications (10 citation statements)
References 7 publications (7 reference statements)
“…Further discussion on OI Net can be seen in Defigueiredo (1990), Sin and Defigueiredo (1992), and Simon and El-Sherief (1995b).…”
Section: Neural Network For Approximation and Classification (mentioning; confidence: 99%)
“…The efficient recursive learning formulation presented in [20] makes the OI Net an attractive architecture. In addition, fault tolerance can be implemented in OI Net training in a straightforward manner [17,18].…”
Section: Neural-Network Based Classification (mentioning; confidence: 99%)
“…Researchers have employed the conventional BPNN to estimate GDOP; see, for example, [19,20,22]. This can reduce the computational complexity required to compute the matrix inversion in calculating GDOP.…”
Section: Proposed Network Architectures For WGDOP Approximation (mentioning; confidence: 99%)
“…Simon and El-Sherief [19,20] employed a backpropagation neural network (BPNN), a supervised learning neural network [21], to obtain an approximation to the GDOP function. The BPNN was trained to "learn" the relationship between the entries of a measurement matrix and the eigenvalues of its inverse.…”
Section: Introduction (mentioning; confidence: 99%)
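The GDOP approximation idea in the statements above can be illustrated with a small numerical sketch. The code below is not the cited authors' implementation; it is a minimal example under stated assumptions: a GPS-style geometry matrix H, exact GDOP taken as sqrt(trace((H^T H)^{-1})), hand-picked inversion-free features of M = H^T H (traces of its powers and its determinant), and a one-hidden-layer network trained with plain backpropagation. All names, feature choices, and training settings are illustrative assumptions, not the method of [19,20].

```python
# Illustrative sketch only: approximating GDOP with a small feed-forward
# network instead of inverting H^T H at evaluation time.
import numpy as np

rng = np.random.default_rng(0)

def gdop(H):
    """Exact GDOP: sqrt of the trace of (H^T H)^{-1} (requires a matrix inverse)."""
    M = H.T @ H
    return np.sqrt(np.trace(np.linalg.inv(M)))

def features(H):
    """Scalar features of M = H^T H that avoid explicit inversion (assumed choice)."""
    M = H.T @ H
    return np.array([np.trace(M), np.trace(M @ M), np.trace(M @ M @ M), np.linalg.det(M)])

# Synthetic data set of random satellite geometries (8 satellites, 4 unknowns).
X, y = [], []
for _ in range(2000):
    H = np.hstack([rng.normal(size=(8, 3)), np.ones((8, 1))])
    X.append(features(H))
    y.append(gdop(H))
X, y = np.array(X), np.array(y)

# Normalize features and targets for stable training.
Xn = (X - X.mean(0)) / X.std(0)
yn = (y - y.mean()) / y.std()

# One-hidden-layer network trained with batch gradient descent (backpropagation).
n_hidden, lr = 16, 0.05
W1 = rng.normal(scale=0.5, size=(4, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)

for epoch in range(3000):
    h = np.tanh(Xn @ W1 + b1)          # hidden-layer activations
    pred = (h @ W2 + b2).ravel()       # network output (normalized GDOP estimate)
    err = pred - yn
    # Backpropagate the mean-squared-error gradient through both layers.
    g_pred = 2 * err[:, None] / len(yn)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = g_pred @ W2.T * (1 - h**2)
    gW1 = Xn.T @ g_h; gb1 = g_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

rmse = np.sqrt(np.mean((pred * y.std() + y.mean() - y) ** 2))
print(f"approximation RMSE on training geometries: {rmse:.3f}")
```

The point of the sketch is only that a small network can map inversion-free functions of H^T H to GDOP, which is the computational saving the cited statements describe.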