1980
DOI: 10.1007/bf01588325

First and second order conditions for a class of nondifferentiable optimization problems

Cited by 50 publications (56 citation statements) · References 5 publications
“…These duality results generalize some results of Abrham and Buie [2] for the differentiable case and give a dynamic analogue of certain nondifferentiable fractional programming problems considered by Mond [20]. They also give duality results to the fractional analogues of problems considered by Watson [32], Fletcher and Watson [18], which have not been studied explicitly in the literature.…”
Section: Introduction (supporting)
confidence: 79%
“…The results of this paper also give duals to the (static) mathematical programming problems of Fletcher and Watson [18] (and some of its variants), which have not been reported in the literature explicitly. As a special case of this, we get the duality results of Mond [2], Mond and Schechter [23] and Watson [32].…”
Section: Introduction (mentioning)
confidence: 73%
“…As observed by Burke and Ferris in their seminal paper [4], a wide variety of its applications can be found throughout the mathematical programming literature, especially in convex inclusion, minimax problems, penalization methods and goal programming; see also [2,6,7,15,22]. The study of (1.1) not only provides a unifying framework for the development and analysis of algorithms for solutions but also a convenient tool for the study of first- and second-order optimality conditions in constrained optimization [3,5,7,22]. As in [4,13], the study of (1.1) naturally relates to the convex inclusion problem…”
(mentioning)
confidence: 98%
“…Computation of the second-order correction term is costly, so we only try d_SOC if the step d_k has been rejected, x_k is already nearly feasible, and v_k is small compared with Z_k u_k. These conditions attempt to identify the occurrence of the Maratos effect and are much simpler than the rules given by Fletcher [19]. Combining this with the ideas of section 4.6, the final if statement of Algorithm 2.1 is more precisely defined by the following rules.…”
Section: Second-order Correction (mentioning)
confidence: 99%
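
The decision rule quoted in the last statement can be illustrated with a minimal sketch. The function name, argument names, and threshold constants below are illustrative assumptions; the citing paper only states the three qualitative conditions (the step d_k was rejected, x_k is nearly feasible, and v_k is small relative to Z_k u_k).

```python
# Illustrative sketch of the quoted second-order correction test.
# FEAS_TOL and RATIO_TOL are assumed values, not taken from the cited paper.

FEAS_TOL = 1e-4    # "nearly feasible": constraint violation at x_k below this
RATIO_TOL = 1e-1   # "v_k small compared with Z_k u_k": ratio below this

def should_try_soc(step_rejected: bool,
                   constraint_violation: float,  # infeasibility measure at x_k
                   v_norm: float,                # ||v_k||, normal (feasibility) part of the step
                   Zu_norm: float) -> bool:      # ||Z_k u_k||, tangential part of the step
    """Return True only when all three quoted conditions hold, i.e. when the
    rejection is likely caused by the Maratos effect and computing a
    second-order correction step d_SOC is worthwhile."""
    if not step_rejected:
        return False
    nearly_feasible = constraint_violation <= FEAS_TOL
    normal_part_small = v_norm <= RATIO_TOL * Zu_norm
    return nearly_feasible and normal_part_small
```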