2023
DOI: 10.1088/1361-6420/acbdb9

Block delayed Majorize-Minimize subspace algorithm for large scale image restoration

Abstract: In this work, we propose an asynchronous Majorization-Minimization (MM) algorithm for solving large scale differentiable non-convex optimization problems. The proposed algorithm runs efficient MM memory gradient updates on blocks of coordinates, in a parallel and possibly asynchronous manner. We establish the convergence of the resulting sequence of iterates under mild assumptions. The performance of the algorithm is illustrated on the restoration of 3D images degraded by depth-variant 3D blur, arising in mult…
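As a companion to the abstract, the sketch below illustrates the structure of one MM memory gradient update. It is a hypothetical, synchronous, single-block illustration, not the paper's implementation: the search subspace spanned by the negative gradient and the previous step is standard in MM subspace methods, but the majorant metric is crudely taken as L·I for an L-Lipschitz gradient, and the block decomposition, parallelism, and delays of the proposed algorithm are omitted.

```python
import numpy as np

def mm_memory_gradient_step(x, x_prev, grad, L):
    """One synchronous MM memory gradient update (illustrative sketch).

    Assumptions (not from the paper): the quadratic majorant of F at x
    uses the metric L * I, valid when grad(F) is L-Lipschitz; the
    subspace is spanned by -grad(F)(x) and the memory term x - x_prev.
    """
    g = grad(x)
    # Subspace basis D = [-g, x - x_prev]; shape (n, 2).
    D = np.stack([-g, x - x_prev], axis=1)
    # Minimize u -> F(x) + g.T @ (D @ u) + (L/2) * ||D @ u||**2 in closed form.
    u = -np.linalg.pinv(L * (D.T @ D)) @ (D.T @ g)
    return x + D @ u

# Toy usage on a smooth least-squares problem F(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x_prev = x = np.zeros(10)
for _ in range(50):
    x, x_prev = mm_memory_gradient_step(x, x_prev, grad, L), x
```

In the paper's setting the same update would be applied to one block of coordinates at a time, with workers reading possibly delayed copies of the other blocks.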

Cited by 2 publications (3 citation statements). References 75 publications.
“…and we have used the fact that D + ϵ₁I₃ ≻ 0. Finally, given the definition of Φ in (12), it is clear the lower bound in (36) goes to +∞ when ‖D‖_F → +∞. Therefore, F admits a minimizer.…”
Section: Convergence Analysis (mentioning)
confidence: 95%
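The quoted passage is a standard coercivity argument; the step left implicit (how a diverging lower bound yields a minimizer) can be sketched as follows, in generic notation rather than that of the citing paper.

```latex
% Coercivity + continuity imply existence of a minimizer (Weierstrass).
% Assume F is continuous and fix any D_0 in its domain.
\[
\lim_{\|D\|_F \to +\infty} F(D) = +\infty
\;\Longrightarrow\;
\mathcal{S}_0 := \{\, D \;:\; F(D) \le F(D_0) \,\} \text{ is bounded,}
\]
\[
\mathcal{S}_0 \text{ closed and bounded}
\;\Longrightarrow\;
\mathcal{S}_0 \text{ compact}
\;\Longrightarrow\;
F \text{ attains its minimum on } \mathcal{S}_0,
\]
% and any minimizer over S_0 is a global minimizer of F.
```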
“…To illustrate the performance of our heteroscedastic constrained formulation for the MPM image restoration task, we first conducted an experiment using simulated data. For this experiment, we chose an image of a fly brain from [12] as the object of interest x. This image, with dimensions M = 128 × 128 × 40, was artificially degraded through a convolution operator H mimicking the effect of a normalized 3D Gaussian kernel with inverse covariance matrix parameterized by the angles θ = 5π/6 rad, φ = 0 rad, and eigenvalues s = (50, 50, 20).…”
Section: Validation of the Restoration Methods on Simulated Data (mentioning)
confidence: 99%
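The degradation model in this quoted experiment can be emulated as follows. This is a hedged sketch: the rotation convention linking (θ, φ) to the inverse covariance, the kernel support, and the grid units are assumptions on our part, since the citing paper's exact parameterization is not reproduced here.

```python
import numpy as np
from scipy.ndimage import convolve

def gaussian3d_kernel(theta, phi, s, size=9):
    """Normalized 3D Gaussian kernel with inverse covariance R diag(s) R^T.

    Assumed convention (hypothetical): R rotates by theta about the
    z-axis, then by phi about the y-axis; grid coordinates are in voxels.
    """
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(phi), np.sin(phi)
    Rz = np.array([[ct, -st, 0], [st, ct, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    R = Ry @ Rz
    C = R @ np.diag(s) @ R.T                      # inverse covariance matrix
    ax = np.arange(size) - size // 2
    X = np.stack(np.meshgrid(ax, ax, ax, indexing="ij"), axis=-1)
    q = np.einsum("...i,ij,...j->...", X, C, X)   # quadratic form x^T C x
    k = np.exp(-0.5 * q)
    return k / k.sum()                            # normalize to sum to 1

# Degrade a synthetic 128 x 128 x 40 volume, mimicking the quoted setup.
h = gaussian3d_kernel(theta=5 * np.pi / 6, phi=0.0, s=(50, 50, 20))
x_true = np.random.default_rng(0).random((128, 128, 40))
y = convolve(x_true, h, mode="reflect")           # y = H x
```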
“…When (1.1) is solved using a deterministic scheme, the main advantage of the KL condition lies in its ability to promote interesting asymptotic behavior when F is not necessarily convex. In this context, the KL condition has been used to prove convergence of proximal point algorithms [1,2], of simple splitting algorithms such as the forward-backward algorithm and its variants [2,16,7,24,12,44,8,9], as well as of other algorithms based on the majorization-minimization principle [15,13,11]. A natural question is to investigate the transfer of the proof techniques from the deterministic setting to the stochastic setting, for asymptotic analysis including a.s. convergence of stochastic processes.…”
Section: Introduction (mentioning)
confidence: 99%
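For reference, the KL condition invoked in this passage is commonly stated as follows; this is the standard textbook form, not a quotation from the citing paper.

```latex
% Kurdyka-Lojasiewicz (KL) inequality at a point \bar{x}:
% there exist \eta > 0, a neighborhood U of \bar{x}, and a concave
% desingularizing function \varphi with \varphi(0) = 0,
% \varphi \in C^1(0, \eta), \varphi' > 0, such that
\[
\varphi'\!\bigl(F(x) - F(\bar{x})\bigr)\,
\operatorname{dist}\bigl(0, \partial F(x)\bigr) \;\ge\; 1
\]
% for all x \in U with F(\bar{x}) < F(x) < F(\bar{x}) + \eta.
```

Descent sequences satisfying a sufficient-decrease and a relative-error condition then have finite length and converge to a critical point, which is the asymptotic behavior the quoted passage refers to.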