Scaled Conjugate Gradient Algorithm in Neural Network Based Approach for Handwritten Text Recognition

2011 · DOI: 10.1007/978-3-642-24043-0_21

Cited by 18 publications (11 citation statements) · References 18 publications
“The SCG has been applied successfully in many areas, such as inverse natural convection modeling (Wong and Protas, 2013), handwritten text identification (Chel et al., 2011), fault location in high-voltage transmission lines (Gayathri and Kumarappan, 2010), solving unconstrained optimization problems (Al-Bayati and Muhammad, 2010), and estimating radar pulse modulation (Lundén and Koivunen, 2007). The basis of the SCG is the gradient descent and conjugate gradient methods (Hestenes and Stiefel, 1952).”
Section: Scaled Conjugate Gradient Method — mentioning
confidence: 99%
“The most important problem with training the ANN using known algorithms (Levenberg-Marquardt [28,29], Bayesian Regularization [30,31], Scaled Conjugate Gradient [32,33]) is that even if the ANN is trained to high correlation and low error between outputs and reference data (divided into training, validation, and testing sets), it does not converge when run recurrently during hybrid simulation. When the outputs of the ANN become its inputs in subsequent time steps, the results commonly become inconsistent with reality, either through large error or through lack of convergence as u_I → ±∞.”
Section: U(t) + (C — mentioning
confidence: 99%
“Møller (1993) and Chel et al. (2011) suggested avoiding the time-consuming line search in the scaled conjugate gradient method (SCGM) by combining the model-trust-region approach with the conjugate-gradient approach (Borkar et al., 2016). The memory requirements of the SCGM are similar to those of the Fletcher-Reeves variant of the CGM.”
Section: Scaled Conjugate Gradient Descent — mentioning
confidence: 99%
“SCGM depends on the computation of conjugate directions but avoids a time-consuming line search in every iteration. In addition, the method avoids the tedious and memory-consuming computation of the Hessian required by traditional second-order methods (Møller, 1993; Chel et al., 2011).”
Section: Scaled Conjugate Gradient Descent — mentioning
confidence: 99%
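The mechanism the excerpts above describe — conjugate search directions, curvature estimated from a finite difference of gradients instead of an explicit Hessian, and a trust-region-style scaling parameter replacing the line search — can be sketched as follows. This is a minimal illustration on a generic objective, not Møller's full algorithm as applied in the cited paper; the function name `scg_minimize` and its parameters are our own.

```python
import numpy as np

def scg_minimize(grad, w, n_iters=50, sigma0=1e-4, lam=1e-6):
    """Minimal sketch of a scaled conjugate gradient (SCG) minimizer.

    grad : callable returning the gradient of the objective at w.
    The Hessian is never formed: curvature along the search direction
    is estimated by a finite difference of gradients, and the scaling
    parameter lam (the model-trust-region part) keeps the step size
    well-defined without a line search.
    """
    r = -grad(w)          # negative gradient (steepest-descent residual)
    p = r.copy()          # initial conjugate direction
    for _ in range(n_iters):
        if r @ r < 1e-20:                 # converged
            break
        # Curvature along p via a finite difference of gradients,
        # i.e. an approximation of (Hessian @ p) without the Hessian:
        sigma = sigma0 / (np.linalg.norm(p) + 1e-12)
        s = (grad(w + sigma * p) - grad(w)) / sigma
        delta = p @ s + lam * (p @ p)     # scaled curvature estimate
        if delta <= 0:                    # indefinite: raise the scaling
            lam = 2 * (lam - delta / (p @ p))
            delta = p @ s + lam * (p @ p)
        mu = p @ r
        alpha = mu / delta                # step size, no line search
        w = w + alpha * p
        r_new = -grad(w)
        # Polak-Ribiere-style update of the conjugate direction:
        beta = (r_new @ r_new - r_new @ r) / (r @ r + 1e-12)
        r, p = r_new, r_new + beta * p
    return w
```

On a quadratic objective the finite-difference curvature is exact, so the sketch reduces to ordinary conjugate gradients; the scaling term only matters on non-convex surfaces such as neural-network loss landscapes, where `delta` can go non-positive.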