2012
DOI: 10.1007/978-3-642-35740-4_5

Polyakov Action Minimization for Efficient Color Image Processing

Cited by 18 publications (15 citation statements)
References 29 publications
“…To overcome this issue, we use the iterative re-weighted least squares (IRLS) technique, which has been successfully used in the context of Beltrami in [36]. IRLS iteratively minimizes the square root term.…”
Section: Algorithm 1 Lagrangian Methods for Minimization of HAC Methods
Mentioning, confidence: 99%
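The IRLS idea quoted above can be sketched generically: replace an L1-like objective containing a square-root term by a sequence of weighted least-squares problems, with weights recomputed from the previous residual. This is a minimal illustration of the general technique, not the Beltrami-specific scheme of [36]; the matrix `A`, the line-fit data, and all parameter values are made up for the example.

```python
import numpy as np

def irls(A, b, n_iter=50, eps=1e-8):
    # Minimize sum_i sqrt((A x - b)_i^2 + eps), an L1-like objective,
    # by repeatedly solving a weighted least-squares problem whose
    # weights come from the previous residual (IRLS).
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # plain least-squares start
    for _ in range(n_iter):
        r = A @ x - b
        w = 1.0 / np.sqrt(r**2 + eps)          # re-weights from the sqrt term
        Aw = A * w[:, None]                    # row-scaled system, A^T W A
        x = np.linalg.solve(A.T @ Aw, A.T @ (w * b))
    return x

# Robust line fit: one gross outlier barely moves the IRLS solution.
t = np.linspace(0.0, 1.0, 20)
A = np.column_stack([t, np.ones_like(t)])
b = 2.0 * t + 1.0                              # clean samples of 2t + 1
b[0] += 10.0                                   # corrupt one sample
x = irls(A, b)                                 # x stays close to [2, 1]
```

The smoothing constant `eps` keeps the weights finite where the residual vanishes, which is the standard way to handle the non-differentiability of the square root at zero.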
“…Some of these techniques make use of the structure tensor of the image (see for instance [40], [42], [44], [46] for anisotropic diffusion and [33], [38] for denoising). In particular, the approaches in [38], [42] are based on a generalization of the gradient operator to Riemannian manifolds. Let us also cite the recent work of Lenzen et al [32] who introduced spatial adaptivity in the TGV for denoising purposes.…”
Section: A Short Overview on Variational Methods for Image Regularization
Mentioning, confidence: 99%
“…where the regularizing term is the L1 norm of the gradient associated with this geometric triplet, is the denoising model of Rosman et al. [38].…”
Section: Geometric Triplets That Lead to Existing Regularization Methods
Mentioning, confidence: 99%
“…Since then, this model has been extended in several ways (see e.g. [4], [10], [15], [16], [17], [18], [24] for local methods based on a modification of the regularizing term, and [9], [11] for nonlocal methods). In this paper, we construct a new regularizing term by the introduction of a generalization of the gradient operator.…”
Section: Introduction
Mentioning, confidence: 99%
“…The ROF denoising model based on this new gradient operator generalizes the Euclidean approach of [19] and its multidimensional extension [4], as well as the Riemannian ROF denoising model in [18]. The key idea is to treat the term ∇I as a vector-valued differential 1-form ∇^E I, which we call the connection gradient of I, where the operator ∇^E is a covariant derivative (also called a connection).…”
Section: Introduction
Mentioning, confidence: 99%
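For readers unfamiliar with the ROF model that the statement above generalizes, here is a minimal sketch of the plain Euclidean version: gradient descent on the smoothed total-variation energy with a quadratic fidelity term. This is only the baseline Euclidean model, not the connection-gradient extension of the cited work; the parameters `lam`, `tau`, `eps` and the periodic boundary handling are illustrative choices.

```python
import numpy as np

def rof_denoise(f, lam=1.0, tau=0.05, eps=1e-3, n_iter=300):
    # Explicit gradient descent on the smoothed ROF energy
    #   E(u) = sum sqrt(|grad u|^2 + eps) + (lam/2) * sum (u - f)^2,
    # with forward differences and periodic boundaries.
    u = f.copy()
    for _ in range(n_iter):
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2 + eps)
        px, py = ux / mag, uy / mag            # normalized gradient field
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u = u + tau * (div - lam * (u - f))    # descend -div(p) + lam*(u - f)
    return u

rng = np.random.default_rng(0)
f = 0.5 + 0.1 * rng.standard_normal((32, 32)) # flat image plus noise
u = rof_denoise(f)                            # u is smoother than f
```

Replacing the Euclidean gradient in `ux`, `uy` by a covariant derivative ∇^E is, per the quoted statement, exactly the step that yields the connection-gradient generalization.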