In classical ("smooth") mathematical analysis, a differentiable function is studied by means of its derivative (the gradient in the multidimensional case). For nondifferentiable functions, the tools of nonsmooth analysis must be employed. In convex analysis and minimax theory, the corresponding classes of functions are investigated by means of the subdifferential (a convex set in the dual space), while quasidifferentiable functions are treated via the notion of the quasidifferential (a pair of sets). To study an arbitrary directionally differentiable function, the notions of upper and lower exhausters (each being a family of convex sets) are used. It turns out that conditions for a minimum are described by an upper exhauster, while conditions for a maximum are stated in terms of a lower exhauster. This is why an upper exhauster is called a proper exhauster for the minimization problem (and an adjoint exhauster for the maximization problem), while a lower exhauster is referred to as a proper exhauster for the maximization problem (and an adjoint exhauster for the minimization problem).

The directional derivatives (and hence exhausters) provide first-order approximations of the increment of the function under study. These approximations are positively homogeneous as functions of direction. They allow one to formulate optimality conditions, to find steepest ascent and descent directions, and to construct numerical methods. However, if, for example, a maximizer of the function is to be found but only an upper exhauster (which is not proper for the maximization problem) is available, a lower exhauster would be required. Instead, one can try to express conditions for a maximum in terms of the upper exhauster (which is an adjoint one for the maximization problem). The first such conditions were obtained by Roshchina. New optimality conditions in terms of adjoint exhausters were recently obtained by Abbasov.
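To fix ideas, we recall the standard representations behind this terminology (the symbols $h$, $E^*$, $E_*$ below are introduced here only for illustration). If $f$ is directionally differentiable at $x \in \mathbb{R}^n$ and $h(g) = f'(x; g)$ denotes its directional derivative, then a family $E^*$ of convex compact sets is an upper exhauster of $h$ if
\[
h(g) \;=\; \inf_{C \in E^*} \, \max_{v \in C} \langle v, g \rangle \qquad \forall\, g \in \mathbb{R}^n,
\]
while a family $E_*$ of convex compact sets is a lower exhauster of $h$ if
\[
h(g) \;=\; \sup_{C \in E_*} \, \min_{w \in C} \langle w, g \rangle \qquad \forall\, g \in \mathbb{R}^n.
\]
In these terms, the condition $0_n \in C$ for every $C \in E^*$ is necessary for a minimum of $f$ at $x$ (it is equivalent to $h(g) \ge 0$ for all $g$), and the condition $0_n \in C$ for every $C \in E_*$ is necessary for a maximum (it is equivalent to $h(g) \le 0$ for all $g$), which explains why the upper exhauster is the proper object for minimization and the lower one for maximization.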
The exhauster mappings are, in general, discontinuous in the Hausdorff metric; therefore, computational problems arise. To overcome these difficulties, the notions of upper and lower coexhausters are used. They provide first-order approximations of the increment of the function which are no longer positively homogeneous. These approximations also allow one to formulate optimality conditions, to find ascent and descent directions (though not the steepest ones), and to construct numerical methods possessing good convergence properties. Conditions for a minimum are described in terms of an upper coexhauster (which is therefore called a proper coexhauster for the minimization problem), while conditions for a maximum are described in terms of a lower coexhauster (which is called a proper one for the maximization problem).

In the present paper, we derive optimality conditions in terms of adjoint coexhausters.
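Schematically (again, the notation $\overline{E}(x)$, $\underline{E}(x)$ is used here only to fix ideas), an upper coexhauster of $f$ at $x$ is a family of convex compact sets in $\mathbb{R}^{n+1}$ providing an expansion of the form
\[
f(x+\Delta) \;=\; f(x) \;+\; \min_{C \in \overline{E}(x)} \, \max_{[a,v] \in C} \bigl[\, a + \langle v, \Delta \rangle \,\bigr] \;+\; o_x(\Delta),
\]
while a lower coexhauster is a family $\underline{E}(x)$ with
\[
f(x+\Delta) \;=\; f(x) \;+\; \max_{C \in \underline{E}(x)} \, \min_{[b,w] \in C} \bigl[\, b + \langle w, \Delta \rangle \,\bigr] \;+\; o_x(\Delta).
\]
The approximating functions of $\Delta$ appearing here are not positively homogeneous, which is precisely what distinguishes coexhausters from exhausters and makes them better suited for numerical methods.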