Abstract: We introduce an algorithm for unconstrained optimization based on the transformation of the Newton method with line search into a gradient descent method. The main idea used in the construction of the algorithm is the approximation of the Hessian by an appropriate diagonal matrix. The step-length calculation is based on a Taylor expansion at two successive iterative points and a backtracking line search procedure. The linear convergence of the algorithm is proved for uniformly convex functions and stric…
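For orientation, here is a minimal sketch of the kind of scheme the abstract describes: a gradient iteration in which the Hessian is replaced by a scalar surrogate gamma * I and the step length is chosen by backtracking line search. The function names, default parameters, and the secant-type gamma update below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def accelerated_gd(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
                   tol=1e-6, max_iter=1000):
    """Gradient descent with a scalar (diagonal) Hessian surrogate gamma * I
    and a backtracking (Armijo) line search. All names and the particular
    gamma update are illustrative choices, not the paper's exact formulas."""
    x = np.asarray(x0, dtype=float)
    gamma = 1.0                          # scalar approximation of the Hessian
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g / gamma                   # quasi-Newton direction with Hessian ~ gamma * I
        alpha, fx = alpha0, f(x)
        # Backtracking line search: shrink alpha until the Armijo condition holds
        while f(x + alpha * d) > fx + sigma * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= beta
        x_new = x + alpha * d
        # Refresh gamma from the two successive iterates (a secant-type update,
        # shown only as one possible realisation of a diagonal Hessian estimate)
        s, y = x_new - x, grad(x_new) - g
        if s.dot(s) > 0 and y.dot(s) > 0:
            gamma = y.dot(s) / s.dot(s)
        x = x_new
    return x

# Usage on a simple convex quadratic (illustrative):
x_min = accelerated_gd(lambda x: x.dot(x), lambda x: 2.0 * x, np.ones(5))
```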
“…There are many schemes with differently defined search direction vectors and step length parameters (see [4,6–8,14,17,18,21,22]). Some of those ideas have directly influenced the development of the method described in this paper.…”
Section: (1.2)
Mentioning, confidence: 99%
“…Some of those ideas have directly influenced the development of the method described in this paper. Certainly, one of the contributions of great importance for this research is the algorithm presented in [21], referred to in this paper as the SM method. An interesting approach for calculating the acceleration parameter from [21] is applied to the iteration defined in Section 3.…”
Section: (1.2)
Mentioning, confidence: 99%
“…In [21], the authors named this type of iterative scheme accelerated gradient descent methods. Generally, the mentioned acceleration parameter improves the behavior of gradient descent algorithms.…”
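As a hedged illustration of where such an acceleration parameter can come from, assume an iteration of the form x_{k+1} = x_k − α_k γ_k^{−1} g_k with g_k = ∇f(x_k); this notation is introduced here for illustration and is not quoted from [21]. Matching a second-order Taylor model in which the Hessian is replaced by γ_{k+1} I at two successive iterates gives:

```latex
% Taylor model at two successive iterates with Hessian surrogate \gamma_{k+1} I:
% f(x_{k+1}) \approx f(x_k) + g_k^\top (x_{k+1} - x_k)
%              + \tfrac{1}{2}\,\gamma_{k+1}\,\lVert x_{k+1} - x_k \rVert^2 .
% With x_{k+1} - x_k = -\alpha_k \gamma_k^{-1} g_k, solving for \gamma_{k+1} yields
\gamma_{k+1} \;\approx\;
  2\,\gamma_k^{2}\,
  \frac{f(x_{k+1}) - f(x_k) + \alpha_k \gamma_k^{-1}\,\lVert g_k \rVert^{2}}
       {\alpha_k^{2}\,\lVert g_k \rVert^{2}} .
```

A positive γ_{k+1} then plays the role of the acceleration parameter in the next step; when the expression turns out non-positive, a common safeguard is to reset it to a default value such as 1.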
“…There are several iterative methods, each defined in a specific way, that are relevant for this work. Some of them are presented in the articles (Andrei, 2006), (Stanimirović et al., 2010), (Petrović et al., 2014), (Stanimirović et al.…
Underage costs are not easily quantifiable in spare parts management. These costs occur when a spare part is required and none are available in inventory. This paper provides another approach to underage cost optimization for sub-assemblies and assemblies in the aviation industry. The quantity of spare parts is determined using a method for airplane spare parts forecasting based on Rayleigh's model. Based on that quantity, the underage cost per unit is determined using the Newsvendor model. Then, by applying a transformed accelerated double-step size gradient method, the underage costs for spare sub-assemblies and assemblies in the airline industry are optimized.
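A minimal sketch of the newsvendor step described above, assuming Rayleigh-distributed demand: the function names, the overage cost parameter co, and all numbers are illustrative assumptions, and the paper's exact cost model may differ. The newsvendor identity F(q*) = cu / (cu + co) is simply inverted, so a forecast quantity and an overage cost imply the underage cost per unit.

```python
import math

def rayleigh_cdf(q, sigma):
    """CDF of a Rayleigh demand distribution with scale parameter sigma."""
    return 1.0 - math.exp(-q * q / (2.0 * sigma * sigma))

def implied_underage_cost(q_forecast, co, sigma):
    """Back out the underage cost per unit that makes the forecast quantity
    optimal in the newsvendor sense: F(q*) = cu / (cu + co)  =>
    cu = co * F(q*) / (1 - F(q*))."""
    p = rayleigh_cdf(q_forecast, sigma)
    return co * p / (1.0 - p)

# Illustrative numbers only: forecast of 25 spare parts, overage cost 50 per
# part, Rayleigh scale of 12 parts per planning period.
cu = implied_underage_cost(q_forecast=25.0, co=50.0, sigma=12.0)
print(round(cu, 2))
```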
“…The speed of convergence and the efficiency of gradient descent are improved by variations such as conjugate gradient descent (in which the weights are adjusted in directions conjugate to the gradient in order to obtain fast convergence) and gradient descent with adaptive line search (in which the learning rate is adjusted during training in order to produce an optimum convergence rate and error) (Stanimirovic and Miladinovic 2010).…”
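Since this snippet mentions conjugate gradient descent, the following is a generic Fletcher–Reeves sketch with a backtracking line search; it is a textbook variant given for illustration, not the specific scheme of (Stanimirovic and Miladinovic 2010), and all parameter values are assumptions.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient (Fletcher-Reeves) with a simple
    backtracking line search; illustrative only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search along the conjugate direction d
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d                # new direction conjugate to the previous one
        g = g_new
    return x
```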