Abstract: A proximal linearized algorithm for minimizing the difference of two convex functions is proposed. If the sequence generated by the algorithm is bounded, every cluster point is proved to be a critical point of the function under consideration, even when the auxiliary minimizations are performed inexactly at each iteration. Linear convergence of the sequence is established under suitable additional assumptions.
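As a rough illustration of this kind of scheme (a minimal 1-D sketch, not the paper's exact algorithm), a proximal linearized step for f = g - h linearizes h at the current iterate and takes a proximal step on g; the toy decomposition, step size, and stopping rule below are assumptions chosen for the sketch:

```python
def prox_linearized_dc(x0, subgrad_h, prox_g, lam=0.5, tol=1e-10, max_iter=1000):
    """Proximal linearized scheme for f = g - h:
    x_{k+1} = prox_{lam*g}(x_k + lam * w_k), with w_k a subgradient of h at x_k."""
    x = x0
    for _ in range(max_iter):
        w = subgrad_h(x)
        x_new = prox_g(x + lam * w, lam)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x_new

# Toy DC decomposition (illustrative only): f(x) = 0.5*x**2 - |x|,
# with g(x) = 0.5*x**2 and h(x) = |x|.
prox_g = lambda y, lam: y / (1.0 + lam)        # prox of lam*g for g = 0.5*x^2
subgrad_h = lambda x: 1.0 if x >= 0 else -1.0  # a subgradient of |x|
x_star = prox_linearized_dc(2.0, subgrad_h, prox_g)
print(x_star)  # approaches the critical point x = 1
```

Starting from x0 = 2, the update contracts toward the fixed point x = 1, where the subgradient of h matches the gradient of g, illustrating convergence to a critical point rather than necessarily a global minimizer.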
“…The development of local search methods in DC programming has attracted less attention. There exist several methods specifically designed for nonsmooth DC programming problems using their explicit DC representations [2,5,17,35]. In addition, a gradient splitting method introduced in [12] can be modified for minimizing DC functions.…”
Section: Introduction (A Class of Functions Represented as a Difference…)
The aim of this paper is to introduce a new proximal double bundle method for unconstrained nonsmooth optimization, where the objective function is presented as a difference of two convex (DC) functions. The novelty in our method is a new escape procedure which enables us to guarantee approximate Clarke stationarity for solutions by utilizing the DC components of the objective function. This optimality condition is stronger than the criticality condition typically used in DC programming. Moreover, if a candidate solution is not approximate Clarke stationary, then the escape procedure returns a descent direction. With this escape procedure, we can avoid some shortcomings encountered when criticality is used. The finite termination of the double bundle method to an approximate Clarke stationary point is proved by assuming that the subdifferentials of DC components are polytopes. Finally, some encouraging numerical results are presented.
“…Further, Souza, Oliveira, and Soubeyran [15] gave the following convergence theorem for problem (DCP). In this paper, we want to study the split DC program:…”
In this paper, we study the split DC program using the split proximal linearized algorithm. Further, a linear convergence theorem for the proposed algorithm is established under suitable conditions. As applications, we first study the DC program (DCP). Finally, we give numerical results supporting the proposed convergence results. MSC: 49J50; 49J53; 49M30; 49M37; 90C26
“…So, many researchers focus their attention on finding points x such that ∂h(x) ∩ ∂g(x) ≠ ∅; such a point x is called a critical point of f [14]. For more details about DC functions and DC programming, we refer to [7,8,9,10,11,12,13,14,17,18,19].…”
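The criticality condition ∂h(x) ∩ ∂g(x) ≠ ∅ can be checked directly when both subdifferentials are simple intervals; a minimal sketch, with g(x) = 0.5·x² and h(x) = |x| chosen purely for illustration:

```python
def dg(x):
    # Subdifferential of g(x) = 0.5*x**2: the singleton {x}, stored as an interval.
    return (x, x)

def dh(x):
    # Subdifferential of h(x) = |x|: the interval [-1, 1] at 0, {sign(x)} elsewhere.
    if x == 0.0:
        return (-1.0, 1.0)
    s = 1.0 if x > 0 else -1.0
    return (s, s)

def is_critical(x, eps=1e-8):
    # x is critical for f = g - h iff the intervals dg(x) and dh(x) intersect.
    lo = max(dg(x)[0], dh(x)[0])
    hi = min(dg(x)[1], dh(x)[1])
    return lo <= hi + eps

print([x for x in (-1.0, -0.5, 0.0, 0.5, 1.0) if is_critical(x)])  # [-1.0, 0.0, 1.0]
```

For this toy f(x) = 0.5·x² - |x|, the critical points are x = ±1 (the two global minimizers) and x = 0 (a local maximizer), which illustrates why criticality is a weaker condition than Clarke stationarity or optimality.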
Section: Introduction
“…In 2016, Souza, Oliveira, and Soubeyran [18] proposed a proximal linearized algorithm to study DC programming.…”
In this paper, we introduce a forward-backward algorithm to solve the DC programming problem and establish convergence theorems in the framework of finite-dimensional real Hilbert spaces. Our results can also be extended to generalized DC programming.
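A hedged 1-D sketch of a forward-backward step for f = g - h with smooth h (the toy decomposition and parameters below are assumptions, not the paper's setting): the forward step follows the gradient of h, the backward step applies the proximal operator of g.

```python
def forward_backward_dc(x0, grad_h, prox_g, lam=0.5, tol=1e-10, max_iter=1000):
    """Forward-backward iteration for f = g - h with differentiable h:
    x_{k+1} = prox_{lam*g}(x_k + lam * grad_h(x_k))."""
    x = x0
    for _ in range(max_iter):
        x_new = prox_g(x + lam * grad_h(x), lam)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x_new

# Toy DC decomposition (illustrative only): f(x) = |x| - 0.25*x**2,
# with g(x) = |x| and h(x) = 0.25*x**2.
soft = lambda y, lam: y - lam if y > lam else y + lam if y < -lam else 0.0  # prox of lam*|.|
x_star = forward_backward_dc(1.0, lambda x: 0.5 * x, soft)
print(x_star)  # the iterates reach the critical point x = 0
```

Here the soft-thresholding operator is the proximal map of the absolute value; from x0 = 1 the iterates are driven to the critical point x = 0, where 0 = ∇h(0) lies in ∂g(0) = [-1, 1].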