2016
DOI: 10.1016/j.cam.2016.01.007
On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems

Cited by 19 publications (10 citation statements)
References 39 publications
“…is the Yuan steplength [37]. The interest in this steplength is motivated by its spectral properties, which dramatically speed up convergence [14,18], while also exhibiting regularization properties useful for dealing with linear ill-posed problems [15]. Similar properties hold for the SDA gradient method [16], but for the sake of space we do not show the results of its application in the minimization phase.…”
Section: Identification Phase
confidence: 99%
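To make the discussion concrete, here is a minimal sketch of the classical steepest-descent iteration for a quadratic that methods such as SDA and SDC build on. This is only the baseline scheme with the exact Cauchy steplength; the Yuan-type steplength modifications the quotation refers to are not reproduced here.

```python
import numpy as np

def steepest_descent(A, b, x0, iters=50):
    """Classical steepest descent for f(x) = 0.5 x'Ax - b'x, with A
    symmetric positive definite, using the exact (Cauchy) steplength
    alpha_k = g'g / g'Ag.  SDA/SDC-type methods replace some of these
    Cauchy steps with specially chosen steplengths to improve the
    spectral behavior of the iteration (not shown in this sketch)."""
    x = x0.astype(float)
    for _ in range(iters):
        g = A @ x - b              # gradient of the quadratic
        Ag = A @ g
        denom = g @ Ag
        if denom <= 1e-15:         # gradient numerically zero: stop
            break
        alpha = (g @ g) / denom    # Cauchy (exact line-search) steplength
        x = x - alpha * g
    return x
```

On a well-conditioned problem this baseline already converges; the spectral steplengths discussed in the quotation are aimed at the ill-conditioned case, where plain Cauchy steps zigzag.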
“…In the minimization phase we use the CG method and, in the strictly convex case, also the SDC method proposed in [14]. This provides a way to extend SDC to the constrained case, with the goal of exploiting the smoothing and regularizing effect observed on certain unconstrained ill-posed inverse problems [15]. Of course, the CG solver remains the reference choice in general, especially because it can deal with nonconvexity through directions of negative curvature (as done, e.g., in [30]), whereas handling negative curvature with spectral gradient methods may be non-trivial (see, e.g., [10] and the references therein).…”
confidence: 99%
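The quotation names CG as the reference solver. For orientation, a textbook conjugate-gradient sketch for symmetric positive definite systems follows; this is the plain, unpreconditioned variant, assumed here for illustration rather than the exact solver used in the cited works.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, maxiter=200):
    """Plain conjugate gradient for Ax = b with A symmetric positive
    definite.  A textbook sketch: no preconditioning, no handling of
    negative curvature (which the quotation notes is needed in the
    nonconvex case)."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    r = b - A @ x          # residual = negative gradient of the quadratic
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(maxiter):
        if np.sqrt(rs) < tol:
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p  # conjugate direction update
        rs = rs_new
    return x
```

In exact arithmetic CG terminates in at most n steps for an n-by-n system, which is one reason it is the reference choice the quotation mentions.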
“…Many real-life applications lead to nonlinear optimization problems whose very large size makes first-order methods the most suitable choice. Among first-order approaches, gradient methods have widely proved their effectiveness in solving challenging unconstrained and constrained problems arising in signal and image processing, compressive sensing, machine learning, optics, chemistry and other areas (see, e.g., [1,2,3,4,5,6,7,8,9,10,11] and the references therein).…”
Section: Introduction
confidence: 99%
From a paper in Mathematical Problems in Engineering:
“…gradient descent-based algorithms have been widely adopted in image processing tasks [18,32-37]. As to TV minimization, gradient projection-based PDE methods [1] were originally adopted to solve the associated nonlinear Euler-Lagrange equation.…”
Section: Introduction
confidence: 99%
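The constrained first-order schemes alluded to in these quotations reduce, in their simplest form, to a projected-gradient iteration. The sketch below uses a box constraint, a fixed steplength, and a caller-supplied gradient, all chosen as illustrative assumptions rather than taken from the cited works.

```python
import numpy as np

def projected_gradient_box(grad_f, x0, lo, hi, alpha=0.1, iters=100):
    """Minimal projected-gradient sketch for min f(x) s.t. lo <= x <= hi.
    Each iteration takes a gradient step and projects back onto the box;
    grad_f, the bounds, and the fixed steplength alpha are illustrative
    assumptions, not details of any specific cited method."""
    x = np.clip(x0.astype(float), lo, hi)  # start from a feasible point
    for _ in range(iters):
        x = np.clip(x - alpha * grad_f(x), lo, hi)  # step, then project
    return x
```

For a separable smooth objective the iteration converges to the projection of the unconstrained minimizer onto the box, which is the behavior the gradient-projection methods above generalize.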