2002
DOI: 10.1088/0266-5611/18/5/313
Penalized maximum likelihood image restoration with positivity constraints: multiplicative algorithms

Abstract: In this paper, we propose a general method to devise maximum likelihood penalized (regularized) algorithms with positivity constraints. Moreover, we explain how to obtain ‘product forms’ of these algorithms. The algorithmic method is based on Kuhn–Tucker first-order optimality conditions. Its application domain is not restricted to the cases considered in this paper, but it can be applied to any convex objective function with linear constraints. It is specially adapted to the case of objective functions with a…
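The Kuhn–Tucker construction the abstract alludes to can be sketched in generic notation (a standard derivation, not quoted from the paper): for a convex, differentiable objective $J$ minimized subject to positivity, the first-order conditions are

```latex
\[
  x_i\,\frac{\partial J}{\partial x_i}(x) = 0, \qquad
  x_i \ge 0, \qquad
  \frac{\partial J}{\partial x_i}(x) \ge 0 .
\]
% Splitting the negative gradient as -\nabla J(x) = U(x) - V(x),
% with U(x) \ge 0 and V(x) \ge 0, turns the first condition into a
% multiplicative fixed-point equation, suggesting the iteration
\[
  x_i^{(k+1)} = x_i^{(k)}\,\frac{U_i\big(x^{(k)}\big)}{V_i\big(x^{(k)}\big)},
\]
% which preserves positivity whenever the starting point x^{(0)} > 0.
```

This is the ‘product form’ pattern: each iterate is the previous one multiplied by a non-negative ratio, so the positivity constraint never needs to be enforced explicitly.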

Cited by 116 publications (108 citation statements)
References 47 publications
“…However, many problems in applications do not conform to these hypotheses, for example when the underlying spaces are Euclidean and ψ is the Boltzmann–Shannon entropy. This type of function appears in many problems in image and signal processing, in statistics, and in machine learning [2,13,14,21,22,23]. Another difficulty in the implementation of (1.3) is that the operators (prox_{γ_n φ})_{n∈ℕ} are not always easy to evaluate.…”
Section: Introduction
Confidence: 99%
“…The aim of this iterative multiplicative algorithm is to minimize the negative log-likelihood of data corrupted by a Gaussian noise process, that is, the Euclidean distance between the noisy data and the convolutive model, subject to the non-negativity constraint on the successive estimates. The convergence of this algorithm was analyzed by De Pierro (1987) and, in its relaxed form, by Lantéri et al (2002).…”
Section: Brief Presentation of the Image Space Reconstruction (ISRA)
Confidence: 99%
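The ISRA update described in this excerpt can be sketched as follows; the matrix sizes, synthetic data, and iteration count are illustrative assumptions, not taken from the cited works:

```python
import numpy as np

# ISRA sketch: minimize ||y - Hx||^2 subject to x >= 0 via the
# multiplicative update x <- x * (H^T y) / (H^T H x), which keeps
# every iterate non-negative when the starting point is positive.
rng = np.random.default_rng(0)
H = rng.random((20, 10))       # non-negative system (blur) matrix, assumed
x_true = rng.random(10)
y = H @ x_true                 # noiseless synthetic data for the sketch

x = np.ones(10)                # strictly positive initial estimate
Ht_y = H.T @ y                 # fixed numerator of the update
for _ in range(500):
    x *= Ht_y / (H.T @ (H @ x))
```

Each step multiplies the current estimate by a non-negative ratio, so positivity is maintained automatically, and fixed points of the iteration satisfy the Kuhn–Tucker conditions of the constrained least-squares problem.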
“…The main drawback of the iterative algorithm, as with all other non-regularized deconvolution algorithms, is that the deconvolution problem is ill-posed (Bertero & Boccacci 1998); as a consequence, the reconstructed image becomes very noisy when the iteration number grows too large. The classical remedies are either to regularize the problem explicitly (Lantéri et al 2002) by introducing a smoothness penalty at some level, or to stop the iterative process before noise amplification, which corresponds to a smoothing operation.…”
Section: Brief Presentation of the Image Space Reconstruction (ISRA)
Confidence: 99%
“…5.1. About this second step, in a series of recent papers Lanteri et al [Lan01, Lan02] proposed a general approach, denoted the split gradient method (SGM), for the design of iterative methods for the minimization of a wide class of convex (and also non-convex) functionals of the following type:…”
Section: Split Gradient Method (SGM)
Confidence: 99%
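A minimal sketch of the SGM recipe, assuming a quadratic data term plus a Tikhonov penalty (the functional, the weight `mu`, and the problem sizes are illustrative, not taken from [Lan01, Lan02]): write the negative gradient as U(x) − V(x) with U, V ≥ 0, then iterate x ← x · U(x)/V(x).

```python
import numpy as np

# SGM sketch for J(x) = 0.5*||y - Hx||^2 + 0.5*mu*||x||^2, x >= 0.
# Decompose -grad J(x) = U(x) - V(x) with U(x) = H^T y >= 0 and
# V(x) = H^T H x + mu*x >= 0, then iterate multiplicatively.
rng = np.random.default_rng(1)
H = rng.random((30, 12))       # non-negative system matrix, assumed
y = H @ rng.random(12)         # synthetic data for the sketch
mu = 0.1                       # assumed regularization weight

x = np.ones(12)                # strictly positive start
for _ in range(300):
    U = H.T @ y                # non-negative part of the descent direction
    V = H.T @ (H @ x) + mu * x # non-negative part of the ascent direction
    x *= U / V                 # fixed points satisfy the KKT conditions
```

With mu = 0 and this data term, the scheme reduces to the ISRA update; the point of SGM is that any differentiable penalty whose gradient admits such a non-negative split plugs into the same multiplicative template.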
“…Then, according to the split gradient method (SGM) proposed by Lanteri et al ([Lan01], [Lan02]), we also write the gradient of the regularization functional in a similar way:…”
Section: Regularized RLM and OSEM
Confidence: 99%