2009
DOI: 10.1007/s11075-009-9350-8

Accelerated gradient descent methods with line search

Abstract: We introduce an algorithm for unconstrained optimization based on transforming the Newton method with line search into a gradient descent method. The main idea in the construction of the algorithm is the approximation of the Hessian by an appropriate diagonal matrix. The step-length calculation is based on a Taylor expansion at two successive iterative points combined with a backtracking line search procedure. The linear convergence of the algorithm is proved for uniformly convex functions and stric…
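The abstract describes the method only at a high level. Below is a minimal Python sketch of an accelerated gradient iteration of this kind: the Hessian is approximated by gamma * I, gamma is updated from a Taylor expansion at two successive iterates, and the step length comes from Armijo backtracking. The function names, parameter defaults, safeguard, and test problem are illustrative assumptions, not the authors' code.

```python
import numpy as np

def backtracking(f, x, g, d, t0=1.0, shrink=0.5, sigma=1e-4):
    """Armijo backtracking: shrink t until sufficient decrease holds."""
    t, fx, gTd = t0, f(x), g @ d
    while f(x + t * d) > fx + sigma * t * gTd:
        t *= shrink
    return t

def accelerated_gd(f, grad, x0, tol=1e-6, max_iter=1000):
    """Gradient descent with a scalar acceleration parameter gamma
    approximating the Hessian by gamma * I (two-point Taylor update)."""
    x, gamma = np.asarray(x0, dtype=float), 1.0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g / gamma                      # quasi-Newton step with H ~ gamma*I
        t = backtracking(f, x, g, d)
        x_new = x + t * d
        # Taylor expansion at the two successive iterates, solved for gamma:
        # f(x_new) ~ f(x) - s*||g||^2 + 0.5*gamma_new*s^2*||g||^2,  s = t/gamma
        s, gg = t / gamma, g @ g
        num, den = 2.0 * (f(x_new) - f(x) + s * gg), s * s * gg
        gamma = num / den if num > 0 else 1.0   # reset if nonpositive (assumed safeguard)
        x = x_new
    return x

# Illustrative use on a convex quadratic (assumed test function):
A = np.diag([1.0, 10.0])
print(accelerated_gd(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.array([5.0, 3.0])))
```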

Cited by 50 publications (95 citation statements: 1 supporting, 94 mentioning, 0 contrasting; citing publications span 2013 to 2022). References 22 publications.
“…There are many schemes with differently defined search direction vectors and step length parameters (see [4, 6-8, 14, 17, 18, 21, 22]). Some of those ideas have directly influenced the development of the method described in this paper.…”
Section: (1.2) (mentioning)
confidence: 99%
“…Some of those ideas have directly influenced the development of the method described in this paper. Certainly, one of the contributions of great importance for this research is the algorithm presented in [21], stated in this paper as the SM method. An interesting approach for calculating the accelerated parameter from [21] is applied to the iteration defined in Section 3.…”
Section: (1.2) (mentioning)
confidence: 99%
“…There are several iterative methods, each defined in a specific way, relevant for this work. Some of them are presented in (Andrei, 2006), (Stanimirović et al., 2010), (Petrović et al., 2014), (Stanimirović et al.…”
Section: Theoretical Part (mentioning)
confidence: 99%
“…Speed of convergence and efficiency of gradient descent are improved using variations such as conjugate gradient descent (in which the weights are adjusted in directions conjugate to the gradient in order to obtain fast convergence) and gradient descent with adaptive line search (in which the learning rate is adjusted during training in order to produce an optimal convergence rate and error) (Stanimirovic and Miladinovic 2010).…”
Section: Gradient Descent (mentioning)
confidence: 99%