1998
DOI: 10.1137/s1052623494274313
A D.C. Optimization Algorithm for Solving the Trust-Region Subproblem

Abstract: This paper is devoted to difference of convex functions (d.c.) optimization: d.c. duality, local and global optimality conditions in d.c. programming, the d.c. algorithm (DCA), and its application to solving the trust-region problem. The DCA is an iterative method that is quite different from well-known related algorithms. Thanks to the particular structure of the trust-region problem, the DCA is very simple (requiring only matrix-vector products) and, in practice, converges to the global solution. T…
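The d.c. structure mentioned in the abstract is what makes each iteration so cheap. Below is a minimal Python sketch, assuming the standard d.c. split g(x) = (ρ/2)||x||² + bᵀx plus the indicator of the ball ||x|| ≤ r, and h(x) = ½ xᵀ(ρI − A)x with ρ ≥ λmax(A); the function name, starting point, and stopping rule are illustrative choices, not the paper's exact pseudocode.

import numpy as np

def dca_trust_region(A, b, r, rho=None, x0=None, max_iter=1000, tol=1e-8):
    # Illustrative DCA iteration for:  min 0.5*x'Ax + b'x  subject to  ||x|| <= r.
    # d.c. split (an assumption stated above):
    #   g(x) = (rho/2)||x||^2 + b'x + indicator(||x|| <= r)
    #   h(x) = 0.5 * x'(rho*I - A)x,  with rho >= lambda_max(A),
    # so each step costs one matrix-vector product plus a projection onto the ball.
    n = len(b)
    if rho is None:
        rho = np.linalg.norm(A, 2) + 1e-6          # any upper bound on lambda_max(A) works
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = rho * x - A @ x                        # gradient of h at the current iterate
        v = (y - b) / rho                          # unconstrained minimizer of the linearized subproblem
        nv = np.linalg.norm(v)
        x_new = v if nv <= r else (r / nv) * v     # project onto the trust-region ball
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

Typical usage would be x = dca_trust_region(A, b, r) with a symmetric matrix A; restarting from a different x0 is a common safeguard if the iterate stalls at a non-global stationary point (for example, at the origin when b = 0).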

Cited by 468 publications (140 citation statements)
References 21 publications
“…This confirms the observations of Tao and An (1998) whereby DCA converges to "good" local minima, and often to global minima in practice.…”
Section: Last Vs Stochastic Gradient Descent (supporting)
confidence: 87%
“…DCA is an iterative algorithm that consists in solving, at each iteration, the convex optimization problem obtained by linearizing h (i.e., the nonconvex part of f = g − h) around the current solution. The local convergence of DCA is proven in Theorem 3.7 of Tao and An (1998), and we refer to this paper for further theoretical guarantees on the stability and robustness of the algorithm. Although DCA is only guaranteed to reach a local minimum, the authors of Tao and An (1998) state that DCA often converges to a global optimum.…”
Section: Optimization (mentioning)
confidence: 99%
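For readers unfamiliar with the scheme this excerpt summarizes, the generic DCA update can be sketched as follows, using the standard d.c. notation f = g − h with g and h convex (a paraphrase of the linearization step, not the paper's exact statement):

    y_k ∈ ∂h(x_k),    x_{k+1} ∈ argmin_x { g(x) − ⟨y_k, x⟩ }

that is, h is replaced by its affine minorant at x_k and the resulting convex subproblem is solved to produce the next iterate.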
“…Since the matrix K in these examples can be diagonalized by the Fourier transform, our algorithms can be carried out effectively. We compare our methods with ℓ1 [14], ℓ1 − ℓ2 [3], and ℓ1 − αℓ2 [4]. To verify the proposed methods thoroughly, we use 14 test images from [15], including synthetic and natural images, which are shown in Fig.…”
Section: Methods (mentioning)
confidence: 99%
“…DCA is appropriate when the objective function can be decoupled into a difference of two convex functions; therefore, we employ the DCA to solve our model. Tao and An proposed the DCA in [11,12]. In what follows, we briefly review the SBI for the sake of completeness. The SBI is useful when the ℓ1 minimization problem has the form argmin_u ||φ(u)||_1 + E(u) (10), where E and φ are convex.…”
Section: Related Work (mentioning)
confidence: 99%