2014
DOI: 10.1016/j.neunet.2014.07.015

New approximation method for smooth error backpropagation in a quantron network

Cited by 4 publications (2 citation statements)
References 27 publications
“…The quantron, a structural element designed for artificial neural networks (ANN), has been proposed and analyzed in previous works [1][2][3][4][5]. It has proven capable of effectively solving the benchmark XOR problem [1].…”
Section: Introduction (mentioning)
confidence: 99%
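The quantron's own dynamics are defined in the cited works [1-5] and are not reproduced here; as a minimal illustration of why XOR is the standard benchmark for such models, the sketch below verifies by exhaustive search that no single linear threshold unit (a classical perceptron) can represent XOR, so any model that solves it must be non-linear:

```python
import itertools

# XOR truth table: the benchmark problem the quantron is reported to solve.
xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def linear_unit(w1, w2, b):
    """A single linear threshold unit (classical perceptron)."""
    return lambda x1, x2: 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# Scan a coarse weight grid: no parameter choice fits all four XOR cases,
# because XOR is not linearly separable.
grid = [i / 2 for i in range(-4, 5)]  # weights and bias in -2.0 .. 2.0
solved = any(
    all(linear_unit(w1, w2, b)(*x) == y for x, y in xor.items())
    for w1, w2, b in itertools.product(grid, repeat=3)
)
print(solved)  # → False
```

The grid and threshold form here are illustrative choices, not part of the quantron model itself.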
“…Training a neural network amounts to minimizing a cost function defined from the output of the network and measurements from the modeled system (Zhang and Suganthan, 2016). Classical training approaches use derivatives of the cost function to update the weights of the neural network (Gori and Tesi, 1992; Montingy, 2014). Unfortunately, classical approaches do not guarantee finding the global minimum of the optimization problem and are often trapped in a local minimum.…”
Section: Introduction (mentioning)
confidence: 99%
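The local-minimum issue raised in the statement above can be demonstrated with a one-dimensional stand-in for a cost function; the quartic below is a hypothetical example chosen only because it has two basins of attraction, not a cost taken from the cited works:

```python
# Illustrative non-convex "cost" f(x) = x^4 - 3x^2 + x with two minima.
def f(x):
    return x**4 - 3 * x**2 + x

def df(x):
    """Analytic derivative, playing the role of the backpropagated gradient."""
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    """Plain gradient descent from starting point x."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

# Two starting points settle into two different stationary points:
# derivative-based training only finds the minimum of its own basin.
left = descend(-2.0)   # reaches the deeper (global) minimum
right = descend(2.0)   # trapped in the shallower local minimum
print(left, right, f(left) < f(right))
```

Both end points satisfy the first-order condition df(x) ≈ 0, yet only one is the global minimum, which is exactly the failure mode the citing authors contrast with global optimization methods.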