2023
DOI: 10.3390/math11102264

A Family of Multi-Step Subgradient Minimization Methods

Abstract: For solving non-smooth multidimensional optimization problems, we present a family of relaxation subgradient methods (RSMs) with a built-in algorithm for finding the descent direction that forms an acute angle with all subgradients in the neighborhood of the current minimum. Minimizing the function along the opposite direction (with a minus sign) enables the algorithm to go beyond the neighborhood of the current minimum. The family of algorithms for finding the descent direction is based on solving systems of …
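To make the abstract's construction concrete, below is a minimal Python sketch of the core idea: relax a vector s until it forms an acute angle with every subgradient in a given set, then use -s as the descent direction. The function name acute_direction, the margin delta, and the mean-subgradient starting point are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def acute_direction(subgrads, delta=1.0, max_iter=1000):
    """Find s with <s, g> > 0 for every subgradient g by relaxation.

    Hypothetical sketch of the idea in the abstract; the paper's own
    algorithm for this subproblem is not reproduced here.
    """
    G = np.asarray(subgrads, dtype=float)
    s = G.mean(axis=0)                    # start from the mean subgradient
    for _ in range(max_iter):
        dots = G @ s
        k = int(np.argmin(dots))
        if dots[k] > 0.0:                 # acute angle with all subgradients
            return s
        g = G[k]
        # relaxation step: correct s so that <s, g> = delta > 0
        s = s + ((delta - s @ g) / (g @ g)) * g
    raise RuntimeError("no separating direction found (0 may lie in the hull)")

# Usage: -s is then a descent direction for the non-smooth function,
# since every subgradient has a positive inner product with s.
subgrads = [np.array([1.0, 0.2]), np.array([0.5, -0.1])]
descent_dir = -acute_direction(subgrads)
```

If the zero vector lies in the convex hull of the subgradients, no such s exists, which is exactly the stopping situation at a minimum; the error branch above marks that case.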

Cited by 1 publication (3 citation statements)
References 68 publications (172 reference statements)
“…We implemented and compared the quasi-Newton BFGS and DFP methods. A one-dimensional search procedure with cubic interpolation [41] (exact one-dimensional descent) and a one-dimensional minimization procedure [34] (inexact one-dimensional descent) were used. We used both the classical QNM with the iterations of (1)-(4) (denoted as BFGS and DFP) and the QNM including iterations with additional orthogonalization (116)-(119) in the form of a sequence of iterations (120) (denoted as BFGS_V and DFP_V).…”
Section: Numerical Study of Ways to Increase the Orthogonality of Lea…
Confidence: 99%
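As context for this comparison, the sketch below shows one classical BFGS iteration of the kind the statement refers to as (1)-(4): a quasi-Newton direction, a one-dimensional search, and the inverse-Hessian update. SciPy's Wolfe line search stands in for the inexact one-dimensional descent; the cubic-interpolation exact search and the orthogonalized variants BFGS_V/DFP_V are not reproduced.

```python
import numpy as np
from scipy.optimize import line_search

def bfgs_step(f, grad, x, H):
    """One classical BFGS iteration: direction, line search, H update.

    Illustrative sketch only; SciPy's Wolfe search replaces the
    cubic-interpolation procedure [41] used in the cited experiments.
    """
    g = grad(x)
    p = -H @ g                              # quasi-Newton descent direction
    alpha = line_search(f, grad, x, p)[0]   # inexact (Wolfe) 1-D descent
    if alpha is None:
        alpha = 1e-3                        # conservative fallback step
    x_new = x + alpha * p
    s = x_new - x                           # step taken
    y = grad(x_new) - g                     # change in gradient
    rho = 1.0 / (y @ s)
    I = np.eye(len(x))
    # standard BFGS update of the inverse-Hessian approximation
    H_new = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
    return x_new, H_new
```

The DFP scheme differs only in the rank-two correction applied to H; both families were run with exact and inexact searches in the cited study.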
“…In this paper, we consider a method for deriving matrix updating equations in QNMs by forming a quality functional based on learning relations for the matrices, and then obtaining the matrix updating equations as a step of the gradient method for minimizing that functional. This approach has shown high efficiency in organizing subgradient minimization methods [34,35].…”
Section: Introduction
Confidence: 99%
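To illustrate the derivation style this statement describes, here is a minimal sketch under an assumed quadratic quality functional J(H) = 0.5‖Hy − s‖² built from the secant-type learning relation Hy ≈ s; the paper's actual functional and learning relations may differ.

```python
import numpy as np

def gradient_matrix_update(H, y, s, eta=None):
    """One gradient step on J(H) = 0.5 * ||H y - s||^2.

    Hypothetical functional chosen for illustration; dJ/dH = (H y - s) y^T,
    so the gradient step is a rank-one correction to H.
    """
    r = H @ y - s                   # residual of the learning relation
    if eta is None:
        eta = 1.0 / (y @ y)         # step size that zeros the residual
    return H - eta * np.outer(r, y)
```

With the step size eta = 1/(yᵀy) the update enforces H_new y = s exactly and reduces to a Broyden-type rank-one correction, which is the sense in which matrix updating equations emerge as gradient steps on a quality functional.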