1997
DOI: 10.1162/neco.1997.9.5.1109

The Faulty Behavior of Feedforward Neural Networks with Hard-Limiting Activation Function

Abstract: With the progress in hardware implementation of artificial neural networks, the ability to analyze their faulty behavior has become increasingly important to their diagnosis, repair, reconfiguration, and reliable application. The behavior of feedforward neural networks with hard-limiting activation function under stuck-at faults is studied in this article. It is shown that the stuck-at-M faults have a larger effect on the network's performance than the mixed stuck-at faults, which in turn have a larger effect t…
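The fault model named in the abstract can be illustrated with a small simulation. The sketch below is a hypothetical illustration only, not the paper's analysis or experimental setup: it builds a single-layer feedforward network with a hard-limiting (sign) activation, injects stuck-at-0, stuck-at-M, and mixed stuck-at faults into randomly chosen weights, and measures the fraction of inputs whose outputs change. All function names, sizes, and parameters are assumptions.

```python
# Hypothetical sketch of stuck-at fault injection in a feedforward network
# with a hard-limiting (sign) activation. Illustrates the general fault
# model only; it is not the paper's method or experiments.
import numpy as np

rng = np.random.default_rng(0)

def hard_limit(x):
    """Hard-limiting activation: +1 if x >= 0, else -1."""
    return np.where(x >= 0, 1.0, -1.0)

def forward(W, X):
    """Single-layer feedforward pass with hard-limiting units."""
    return hard_limit(X @ W.T)

def inject_stuck_at(W, fraction, mode, M=1.0, rng=rng):
    """Return a copy of W with a given fraction of weights stuck.

    mode = "zero"  -> stuck-at-0 (weight forced to 0)
    mode = "M"     -> stuck-at-M (weight forced to +M or -M)
    mode = "mixed" -> each faulty weight stuck at 0, +M, or -M at random
    """
    Wf = W.copy()
    n_faults = int(fraction * W.size)
    idx = rng.choice(W.size, size=n_faults, replace=False)
    if mode == "zero":
        vals = np.zeros(n_faults)
    elif mode == "M":
        vals = rng.choice([-M, M], size=n_faults)
    else:  # mixed stuck-at faults
        vals = rng.choice([0.0, -M, M], size=n_faults)
    Wf.flat[idx] = vals
    return Wf

# Small random network and bipolar test inputs (assumed sizes).
n_out, n_in, n_samples = 4, 16, 2000
W = rng.normal(size=(n_out, n_in))
X = rng.choice([-1.0, 1.0], size=(n_samples, n_in))
y_ref = forward(W, X)

for mode in ("M", "mixed", "zero"):
    Wf = inject_stuck_at(W, fraction=0.10, mode=mode, M=np.abs(W).max())
    err = np.mean(np.any(forward(Wf, X) != y_ref, axis=1))
    print(f"stuck-at-{mode}: fraction of inputs with changed output = {err:.3f}")
```

Under this toy setup, stuck-at-M faults would typically disturb more outputs than mixed stuck-at faults, which is consistent with the ordering the abstract reports; the exact effect sizes here are artifacts of the assumed random network, not results from the paper.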

Cited by 1 publication (1 citation statement)
References 8 publications
“…Sensitivity analysis provides the ability to analyze the faulty behavior of neural hardware [3]. Errors/faults in a feedforward artificial neural network (FFANN) are classified as in [20]. The study of fault tolerance and robustness in neural networks requires the following three areas to be explored [18]: …”
Section: Introduction
confidence: 99%