2011
DOI: 10.1007/s10107-011-0440-8
Subsmooth semi-infinite and infinite optimization problems

Abstract: We first consider subsmoothness for a function family and provide formulas of the subdifferential of the pointwise supremum of a family of subsmooth functions. Next, we consider subsmooth infinite and semi-infinite optimization problems. In particular, we provide several dual and primal characterizations for a point to be a sharp minimum or a weak sharp minimum for such optimization problems.
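The paper's general formulas require its subsmoothness machinery, but the flavor of a supremum-subdifferential formula can be seen in the classical convex case. The following is a standard textbook illustration (Ioffe–Tikhomirov/Valadier type), not the paper's own result:

```latex
% Pointwise supremum of a function family over an index set T:
%   f(x) = \sup_{t \in T} f_t(x).
% In the classical convex setting (T compact, t \mapsto f_t(x) upper
% semicontinuous, each f_t convex continuous), with active index set
%   T(x) = \{ t \in T : f_t(x) = f(x) \},
% the subdifferential of the supremum is
\[
  \partial f(x) \;=\; \overline{\operatorname{co}}
  \Bigl( \bigcup_{t \in T(x)} \partial f_t(x) \Bigr),
\]
% the closed convex hull of the subdifferentials of the active functions.
% The paper extends formulas of this type to subsmooth families.
```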

Cited by 18 publications (9 citation statements)
References 34 publications (34 reference statements)
“…The next corollary of Theorem 4.1, inspired by the corresponding result of [30], provides precise calculations of the generalized gradient of the supremum of semismooth functions as a direct consequence of Corollary 4.2 and Corollary 4.3. Subsmooth sets were introduced and comprehensively studied in [1].…”
Section: Corollary 4.3 (Evaluation of Generalized Gradients of Suprema)
confidence: 94%
“…; see, e.g., [6,30,31] and the references therein. However, we are not familiar with any results in the literature concerning counterparts of (1.5) with no topological requirements on T .…”
Section: Tε(x)
confidence: 99%
“…In addition, the condition imposed in the foregoing theorem only requires the function to be lower semicontinuous, a rather weak condition in optimization. Hence, our result is applicable even when the subgradient of f does not exist, whereas in [2][3][4][5][6], f is required, at least, to be subdifferentiable.…”
Section: Nonconvex Case
confidence: 99%
“…Of particular note in this field is the paper by Burke and Ferris [1], which gave an extensive exposition of the notion and its impact on convex programming and convergence analysis. Since then, this notion has been extensively studied by many authors: for example, necessary or sufficient conditions for weak sharp minima in nonconvex programming [2,3], and necessary and sufficient conditions for local weak sharp minima of sup-type (or lower-C 1 ) functions [4,5]. Recent developments on weak sharp minima and its relation to other issues can be found in [5][6][7][8].…”
Section: Introduction
confidence: 99%
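For context, the notion discussed in the excerpt above can be stated as follows. This is the standard Burke–Ferris definition of weak sharp minima, not a formula quoted from the paper under discussion:

```latex
% Let S be the solution set of \min_{x \in X} f(x), with optimal value
% f^{*} = \inf_{X} f. Then S is a set of weak sharp minima with modulus
% \alpha > 0 if
\[
  f(x) \;\ge\; f^{*} + \alpha \, d(x, S) \qquad \text{for all } x \in X,
\]
% where d(x, S) = \inf_{s \in S} \|x - s\| is the distance from x to S.
% A sharp minimum corresponds to the case where S is a singleton \{x^{*}\}.
```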