Relaxation methods with step regulation for solving constrained optimization problems (1995)
DOI: 10.1007/bf02367672

Cited by 2 publications (7 citation statements). References 4 publications.
“…where f(x) is a continuously differentiable pseudo-convex function satisfying the so-called Condition A (introduced in [8]) on a convex and closed subset D of the Euclidean space R^n. For solving this problem, we present a new efficient algorithm, which comes with estimates of its convergence rate and allows one to adaptively control both the parameter of an ε-normalization of the descent direction and the step length.…”
Section: Definitions and Preliminaries (mentioning; confidence: 99%)
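The quote names two controls (an ε-normalized descent direction and a regulated step length) without spelling out the method. The sketch below illustrates how such controls can work together in a projected descent loop; the quadratic objective, the ball-shaped feasible set D, the projection operator, and all parameter values are illustrative assumptions, not the algorithm of the cited paper.

import numpy as np

def project_ball(x, radius=5.0):
    # Euclidean projection onto the ball ||x|| <= radius (an assumed feasible set D)
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else x * (radius / nrm)

def relaxation_descent(f, grad, x0, eps=1e-3, beta=0.5, sigma=1e-4, max_iter=500):
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        # eps-normalization of the descent direction: dividing by
        # max(||g||, eps) keeps the direction well scaled even when
        # the gradient becomes small near a solution
        d = -g / max(np.linalg.norm(g), eps)
        # step regulation: Armijo-type backtracking, shrinking t until a
        # sufficient-decrease test holds for the projected trial point
        t = 1.0
        while f(project_ball(x + t * d)) > f(x) + sigma * t * g.dot(d):
            t *= beta
            if t < 1e-12:
                return x  # no acceptable step: treat x as near-stationary
        x = project_ball(x + t * d)
    return x

# usage on a smooth convex (hence pseudo-convex) quadratic with minimizer (1, 1, 1)
f = lambda x: 0.5 * x.dot(x) - x.sum()
grad = lambda x: x - np.ones_like(x)
print(relaxation_descent(f, grad, np.zeros(3)))

Normalizing the direction makes the Armijo test depend on the gradient only through its direction, which is one simple way the two controls named in the quote can interact.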
“…Condition A describes a sufficiently broad class of functions A(μ, τ(x, y)). It was shown in [8, 14, 15] that the class A(μ, ‖x − y‖²), in particular, is wider than C^{1,1}(D), the well-known class of functions whose gradients satisfy the Lipschitz condition on the convex set D ⊆ R^n. We note that Lipschitz continuity of the gradient has long served as a favorable assumption in justifying theoretical convergence-rate estimates for various modern differentiable optimization algorithms.…”
Section: Definition 2.2 (Condition A) (mentioning; confidence: 99%)
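For orientation: the class C^{1,1}(D) named in the quote is exactly the class satisfying the standard descent lemma below (a known fact). The second inequality is only one plausible shape for Condition A, inferred from the notation A(μ, τ(x, y)) in the quote; it is an assumption, not the definition from [8].

\[ f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{L}{2}\,\|y - x\|^2 \quad \text{for all } x, y \in D \]

\[ f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + \mu\,\tau(x, y) \quad \text{(assumed shape of Condition A)} \]

With τ(x, y) = ‖x − y‖², the second inequality holds for every f ∈ C^{1,1}(D) with μ = L/2, which is consistent with the quoted claim that A(μ, ‖x − y‖²) is the wider class.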
“…The sublinear convergence rate holds under the following conditions on the original problem: 1) the objective function f(x) is pseudo-convex on some convex set D ⊆ R^n (the set D may coincide, for instance, with the Lebesgue level set of the objective function corresponding to the starting point of the iterate sequence, or with the whole Euclidean space); 2) the function f(x) is required to satisfy the so-called Condition A introduced in [13]. We note that this condition is defined explicitly below in Section 2 (see Definition 2.2).…”
Section: Introduction (mentioning; confidence: 99%)
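For context, a sublinear convergence rate in this setting is conventionally an estimate of the form

\[ f(x_k) - f_* \le \frac{C}{k}, \qquad k = 1, 2, \ldots, \]

where x_k are the iterates, f_* is the optimal value, and C is a constant depending on the problem data. This is the standard reading of the term, stated here for orientation; the citing work's precise estimate is not reproduced in the quote.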