1997
DOI: 10.1109/72.641455
Nonlinear backpropagation: doing backpropagation without derivatives of the activation function

Abstract: The conventional linear backpropagation algorithm is replaced by a nonlinear version, which avoids the need to calculate the derivative of the activation function. This may be exploited in hardware realizations of neural processors. In this paper we derive the nonlinear backpropagation algorithms in the framework of recurrent backpropagation and present some numerical simulations of feedforward networks on the NetTalk problem. A discussion of implementation in analog very large scale integrati…
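The abstract's central idea, propagating the error through the activation function itself rather than through its derivative, can be illustrated with a small finite-difference sketch. This is an illustrative reconstruction, not the paper's exact formulation: the activation `g`, the step parameter `lam`, and the test values are all assumptions.

```python
import numpy as np

def g(h):
    """Activation function (tanh chosen here for illustration)."""
    return np.tanh(h)

def backprop_error_linear(h, err):
    """Conventional (linear) backprop: multiply the error by g'(h)."""
    return (1.0 - np.tanh(h) ** 2) * err  # g'(h) * err for tanh

def backprop_error_nonlinear(h, err, lam=1e-3):
    """Nonlinear variant: a difference of two activation evaluations
    approximates g'(h) * err, so no derivative circuit is needed."""
    return (g(h + lam * err) - g(h)) / lam

h = np.array([0.5, -1.2, 2.0])
err = np.array([0.1, -0.3, 0.2])
print(backprop_error_linear(h, err))
print(backprop_error_nonlinear(h, err))
# The two error signals agree to first order in lam.
```

Because the nonlinear rule only evaluates `g`, it can reuse the same forward-pass circuitry, which is why the abstract points to analog hardware as the natural beneficiary.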

Cited by 23 publications (13 citation statements)
References 11 publications
“…The training process of the NICE model can be accomplished by using the back propagation algorithm which mainly includes two stages: incentive propagation and weight update [33]. As shown in Fig.…”
Section: B. The Training Process of the NICE
Confidence: 99%
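The two stages the quote describes, forward ("incentive") propagation followed by a weight update, can be sketched for a tiny one-hidden-layer network. The network shapes, learning rate, and data here are all illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative one-hidden-layer network: 3 inputs, 4 hidden units, 1 output.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(1, 4))
x = rng.normal(size=(3, 1))
y = np.array([[1.0]])
lr = 0.01

# Stage 1: forward propagation.
h = np.tanh(W1 @ x)
y_hat = W2 @ h

# Stage 2: backpropagate the error and update the weights.
e = y_hat - y                    # output error
dW2 = e @ h.T                    # gradient for the output layer
dh = (W2.T @ e) * (1.0 - h**2)   # chain rule through tanh
dW1 = dh @ x.T                   # gradient for the hidden layer
W2 -= lr * dW2
W1 -= lr * dW1
```

One such step nudges the squared error downward; training repeats the two stages until the loss converges.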
“…Firstly, the loss function of the discriminator is calculated after the samples are input into the discriminator. Then, the parameters of the discriminator are updated by the back propagation algorithm [27]. The loss function of the discriminator is:…”
Section: A. Traditional GAN
Confidence: 99%
“…By contrast, analog hardware is ideal for implementing continuous-time dynamics such as those of leaky integrator neurons. Previous work has proposed such implementations [Hertz et al, 1997].…”
Section: Possible Implementation on Analog Hardware
Confidence: 99%
“…Other alternatives to recurrent back-propagation in the framework of fixed point recurrent networks were proposed by O'Reilly [1996] and Hertz et al [1997]. Their algorithms are called 'Generalized Recirculation algorithm' (or 'GeneRec' for short) and 'Non-Linear Back-propagation', respectively.…”
Section: Related Work
Confidence: 99%