2018
DOI: 10.1137/17m1162354

Variational Image Regularization with Euler's Elastica Using a Discrete Gradient Scheme

Abstract: This paper concerns an optimization algorithm for unconstrained non-convex problems where the objective function has sparse connections between the unknowns. The algorithm is based on applying a dissipation-preserving numerical integrator, the Itoh-Abe discrete gradient scheme, to the gradient flow of an objective function, guaranteeing energy decrease regardless of step size. We introduce the algorithm, prove a convergence rate estimate for non-convex problems with Lipschitz continuous gradients, and show an …
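The abstract is truncated above; as a quick orientation, the following sketch records what a discrete gradient scheme for the gradient flow looks like, written in generic notation (the symbols V, x_k, τ_k are illustrative and not necessarily the paper's own).

```latex
% Sketch in generic notation; V, x_k, tau_k are illustrative symbols,
% not necessarily the paper's own.
\begin{align*}
  &\text{Gradient flow of an objective } V:\mathbb{R}^n\to\mathbb{R}: &&
    \dot{x} = -\nabla V(x),\\
  &\text{discrete gradient conditions:} &&
    \langle \overline{\nabla} V(x,y),\, y-x\rangle = V(y)-V(x), \quad
    \overline{\nabla} V(x,x) = \nabla V(x),\\
  &\text{discrete gradient scheme:} &&
    x_{k+1} = x_k - \tau_k\, \overline{\nabla} V(x_k, x_{k+1}),\\
  &\text{Itoh--Abe choice:} &&
    \bigl(\overline{\nabla} V(x,y)\bigr)_i =
    \frac{V(y_1,\dots,y_i,x_{i+1},\dots,x_n) - V(y_1,\dots,y_{i-1},x_i,\dots,x_n)}{y_i - x_i}.
\end{align*}
```

The Itoh-Abe choice replaces every partial derivative by a coordinate-wise difference quotient, which is why the resulting method is derivative-free.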

Cited by 27 publications (23 citation statements)
References 35 publications
“…Therefore, slight modifications are required to apply the aforementioned line search methods to the SOR method. To achieve this task, we shall directly use the approach proposed in [17].…”
Section: Adaptive SOR Methods
confidence: 99%
“…Ehrhardt et al [21] provided additional analysis for the methods, including convergence rates for smooth, convex problems and Polyak-Łojasiewicz functions, as well as well-posedness of the implicit equation. Ringholm et al [46] applied the Itoh-Abe discrete gradient method to non-convex image problems with Euler's elastica regularisation.…”
Section: Related Literature
confidence: 99%
“…These are methods from geometric numerical integration that preserve the aforementioned geometric structures in a general setting. In recent papers [21,26,45,46], optimisation schemes based on discretising gradient flows with discrete gradients have been analysed and implemented for various problems. Favourable properties of the discrete gradient methods include unconditional dissipation, i.e.…”
Section: Introduction
confidence: 99%
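The "unconditional dissipation" mentioned in the statement above follows in one line from the discrete gradient identity; a sketch in the same generic notation as before (my paraphrase, not the citing paper's exact argument):

```latex
% Insert x_{k+1} - x_k = -tau_k * \overline{\nabla} V(x_k, x_{k+1}) into the
% discrete gradient identity <\overline{\nabla} V(x,y), y - x> = V(y) - V(x):
\begin{align*}
  V(x_{k+1}) - V(x_k)
  &= \langle \overline{\nabla} V(x_k, x_{k+1}),\, x_{k+1} - x_k \rangle
   = -\tau_k \bigl\|\overline{\nabla} V(x_k, x_{k+1})\bigr\|^2 \;\le\; 0
  \qquad \text{for every step size } \tau_k > 0.
\end{align*}
```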
“…To compute the α_k^j at each coordinate step one can use any suitable root finder, yet to stay in line with the derivative-free nature of Algorithm 1, one may wish to use a solver like the Brent-Dekker algorithm [3]. Also worth noting is that the parallelization procedure used in [22] works for Algorithm 1 as well.…”
Section: Itoh-abe Discrete Riemannian Gradientmentioning
confidence: 99%
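The statement above suggests solving for the scalar step with a derivative-free root finder such as Brent-Dekker. Below is a minimal sketch of one such coordinate update, assuming a sign-changing bracket for the scalar residual is available; the function and variable names are mine, not taken from the cited papers, and SciPy's brentq stands in for the Brent-Dekker solver.

```python
# Minimal sketch of one Itoh-Abe coordinate update solved with Brent's method.
# Helper names (itoh_abe_coordinate_step, V, lo, hi) are illustrative only.
# The bracket [lo, hi] must exclude the trivial root alpha = 0 and contain a
# sign change of the residual.
import numpy as np
from scipy.optimize import brentq

def itoh_abe_coordinate_step(V, x, i, tau, lo, hi):
    """Solve alpha**2 + tau * (V(x + alpha*e_i) - V(x)) = 0 for alpha in [lo, hi]
    and return the point x + alpha*e_i (one coordinate of the implicit update)."""
    Vx = V(x)
    e_i = np.zeros_like(x, dtype=float)
    e_i[i] = 1.0

    def residual(alpha):
        return alpha**2 + tau * (V(x + alpha * e_i) - Vx)

    alpha = brentq(residual, lo, hi)  # derivative-free scalar root finder
    return x + alpha * e_i

# Toy usage on a smooth quartic-plus-quadratic objective.
if __name__ == "__main__":
    V = lambda z: 0.5 * np.sum(z**2) + 0.25 * np.sum(z**4)
    x = np.array([1.0, -2.0])
    # Descent in coordinate 0 means alpha < 0 here, so bracket on the negative side.
    x_new = itoh_abe_coordinate_step(V, x, i=0, tau=0.5, lo=-1.0, hi=-1e-12)
    print(x_new, V(x_new) <= V(x))  # energy does not increase
```

In this toy example the bracket is chosen on the descent side of coordinate 0 by hand; a practical implementation would pick the side from the sign of a finite-difference estimate of the partial derivative before calling the root finder.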
“…Discrete gradients were first used in optimization algorithms for image analysis in [11] and [22]. As an example of a manifold-valued imaging problem, consider Total Variation (TV) denoising of manifold valued images [30], where one wishes to minimize, based on generalizations of the L^β and L^γ norms:…”
Section: Manifold Valued Imaging
confidence: 99%
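The quoted objective is cut off above. For orientation only, TV models for manifold-valued images are typically written with the geodesic distance d in place of absolute differences, along the following lines (an illustrative form in the spirit of [30], not the citing paper's exact expression):

```latex
% Illustrative form only; u is the manifold-valued image, f the noisy data,
% d the geodesic distance, and i ~ j ranges over neighbouring pixels.
\begin{equation*}
  \min_{u}\; \sum_{i} d\bigl(u_i, f_i\bigr)^{\beta}
  \;+\; \lambda \sum_{i \sim j} d\bigl(u_i, u_j\bigr)^{\gamma},
  \qquad \beta, \gamma \ge 1 .
\end{equation*}
```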