2012
DOI: 10.1137/11085760x

MCMC-Based Image Reconstruction with Uncertainty Quantification

Abstract: The connection between Bayesian statistics and the technique of regularization for inverse problems has been given significant attention in recent years. For example, Bayes' Law is frequently used as motivation for variational regularization methods of Tikhonov type. In this setting, the regularization function corresponds to the negative-log of the prior probability density; the fit-to-data function corresponds to the negative-log of the likelihood; and the regularized solution corresponds to the ma…

Cited by 83 publications (154 citation statements)
References 24 publications
“…However, for more complicated structures, the problem remains critical, especially when H^T Λ H and G_x cannot be diagonalized in the same basis. Other recently proposed algorithms for sampling Gaussian distributions in high dimension follow a two-step perturbation-optimization approach [24, 29–33], which can be summarized as follows:…”
Section: Sampling From High-dimensional Gaussian Distribution
confidence: 99%
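A minimal NumPy sketch of the two-step perturbation-optimization idea referenced in the quotation above, for the linear Gaussian case: perturb the data-fit and regularization terms with appropriately scaled noise, then solve one regularized least-squares (normal-equations) problem to obtain an exact draw from the Gaussian posterior. The names A, L, b, lam, delta, the identity noise covariance, and the dense solve are illustrative assumptions, not the notation or implementation of the cited papers.

```python
import numpy as np

def perturbation_optimization_sample(A, L, b, lam, delta, rng):
    """Draw one sample from N(mu, Q^{-1}) with Q = lam*A.T@A + delta*L.T@L and
    mu = Q^{-1}(lam*A.T@b): perturb both terms, then solve the normal equations."""
    m = A.shape[0]
    eps_data = rng.standard_normal(m) / np.sqrt(lam)              # ~ N(0, lam^-1 I)
    eps_prior = rng.standard_normal(L.shape[0]) / np.sqrt(delta)  # ~ N(0, delta^-1 I)
    Q = lam * A.T @ A + delta * L.T @ L
    rhs = lam * A.T @ (b + eps_data) + delta * L.T @ eps_prior
    # Dense solve for clarity; large-scale codes would use CG or a sparse factorization.
    return np.linalg.solve(Q, rhs)

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 30))
L = np.eye(30)
b = A @ rng.standard_normal(30) + 0.1 * rng.standard_normal(50)
x_sample = perturbation_optimization_sample(A, L, b, lam=100.0, delta=1.0, rng=rng)
```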
“…In this second experiment, we consider the observation problem defined in (29), where H corresponds to a spatially invariant blur with periodic boundary conditions and the noise is a two-term mixed Gaussian variable; i.e., for every i ∈ {1, . .…”
Section: Problem Formulation
confidence: 99%
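The setup in that (truncated) quotation can be mocked up in a few lines: a spatially invariant blur with periodic boundary conditions is a circular convolution, which the 2-D FFT applies exactly, and two-term mixed Gaussian noise draws each pixel's noise from one of two zero-mean Gaussians. The PSF, mixture weight, and the two variances below are illustrative placeholders, not values from the citing paper.

```python
import numpy as np

def blur_periodic(x, psf):
    """Spatially invariant blur with periodic boundary conditions
    (circular convolution), applied via the 2-D FFT."""
    return np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(psf, s=x.shape)))

def two_term_gaussian_noise(shape, w=0.9, sigma1=0.01, sigma2=0.1, rng=None):
    """Two-term Gaussian mixture: each pixel's noise is N(0, sigma1^2) with
    probability w and N(0, sigma2^2) otherwise (placeholder parameters)."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = np.where(rng.random(shape) < w, sigma1, sigma2)
    return sigma * rng.standard_normal(shape)

rng = np.random.default_rng(1)
x = rng.random((64, 64))                 # stand-in image
psf = np.full((5, 5), 1.0 / 25.0)        # toy uniform PSF (no recentering applied)
v = blur_periodic(x, psf) + two_term_gaussian_noise(x.shape, rng=rng)
```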
“…This optimization-based approach for sampling from the posterior density function is implemented in the case of large-scale linear inverse problems in [3]. Also in that paper, the case in which the parameters λ and γ are unknown is treated.…”
Section: P(γ|v, λ, δ) ∝ P(v|γ, λ)P(γ|δ)
confidence: 99%
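A rough sketch of what treating the hyperparameters as unknown looks like in the linear, conjugate case discussed in [3]: Gamma hyperpriors on the noise and prior precisions give Gamma full conditionals, and the Gaussian conditional for the unknown is drawn with the same perturbation-optimization step as above. The hyperprior values, variable names, and dense solve are assumptions made for illustration only.

```python
import numpy as np

def hierarchical_gibbs(A, L, v, n_iter=2000, a=1.0, b=1e-4, c=1.0, d=1e-4, seed=0):
    """Hierarchical Gibbs sketch for v = A @ x + noise, with Gamma(a, b) and
    Gamma(c, d) hyperpriors on the noise precision lam and prior precision delta."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    chain = []
    for _ in range(n_iter):
        # Conjugate Gamma full conditionals for the two precision parameters
        # (the delta update assumes L has full column rank).
        lam = rng.gamma(a + 0.5 * m, 1.0 / (b + 0.5 * np.sum((A @ x - v) ** 2)))
        delta = rng.gamma(c + 0.5 * n, 1.0 / (d + 0.5 * x @ (L.T @ L) @ x))
        # Gaussian full conditional for x, sampled by perturbation-optimization.
        eps_v = rng.standard_normal(m) / np.sqrt(lam)
        eps_p = rng.standard_normal(L.shape[0]) / np.sqrt(delta)
        Q = lam * A.T @ A + delta * L.T @ L
        x = np.linalg.solve(Q, lam * A.T @ (v + eps_v) + delta * L.T @ eps_p)
        chain.append((x, lam, delta))
    return chain

rng = np.random.default_rng(2)
A, L = rng.standard_normal((40, 20)), np.eye(20)
v = A @ np.ones(20) + 0.05 * rng.standard_normal(40)
samples = hierarchical_gibbs(A, L, v, n_iter=500)
```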
“…As is done in [3] and was mentioned above, one can also make the assumption that λ and γ are gamma distributed in the nonlinear case, allowing for γ, λ, and δ to all be sampled from the full posterior p(γ, λ, δ|v) ∝ p(γ|v, λ, δ)p(λ)p(δ). Since RTO requires the Metropolis correction, the resulting sampling scheme has Metropolis-within-Gibbs form.…”
Section: P(γ|v, λ, δ) ∝ P(v|γ, λ)P(γ|δ)
confidence: 99%
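A structural sketch of the Metropolis-within-Gibbs scheme mentioned above: the noise precision is drawn from its conjugate Gamma conditional (the Gibbs step), while the nonlinear parameter is updated with an independence proposal followed by a Metropolis accept/reject. The toy cubic forward model, the flat prior on the parameter, and the Gaussian independence proposal are stand-ins; in the cited work the proposal comes from RTO, which is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy nonlinear model: v_i = gamma^3 + noise, with unknown noise precision lam.
gamma_true, lam_true, m = 1.2, 25.0, 50
v = gamma_true**3 + rng.standard_normal(m) / np.sqrt(lam_true)
a, b = 1.0, 1e-4                    # illustrative Gamma hyperprior on lam

def log_like(gamma, lam):
    # Log-likelihood; with a flat prior on gamma this is also its log conditional.
    return 0.5 * m * np.log(lam) - 0.5 * lam * np.sum((v - gamma**3) ** 2)

# Gaussian independence proposal standing in for the RTO proposal.
mu_q, s_q = np.cbrt(v.mean()), 0.1
def q_logpdf(gamma):
    return -0.5 * ((gamma - mu_q) / s_q) ** 2 - np.log(s_q)

gamma, chain = mu_q, []
for _ in range(5000):
    # Gibbs step: conjugate Gamma conditional for the noise precision.
    lam = rng.gamma(a + 0.5 * m, 1.0 / (b + 0.5 * np.sum((v - gamma**3) ** 2)))
    # Metropolis step for gamma using the independence proposal.
    cand = mu_q + s_q * rng.standard_normal()
    log_alpha = (log_like(cand, lam) - q_logpdf(cand)) - (log_like(gamma, lam) - q_logpdf(gamma))
    if np.log(rng.random()) < log_alpha:
        gamma = cand
    chain.append((gamma, lam))
```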
“…For example, the variance is often estimated from repeated measurements or residual analysis [5], or it can be estimated using a hierarchical model [1,6]. In a number of cases, the Bayesian posterior density function is of least squares form as well.…”
Section: Introduction
confidence: 99%
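For the residual-analysis route mentioned in that quotation, one standard recipe is the unbiased estimator sigma^2 ≈ ||v − A x_LS||^2 / (m − n) computed from an ordinary least-squares fit; the synthetic data below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((200, 10))
v = A @ rng.standard_normal(10) + 0.5 * rng.standard_normal(200)

# Ordinary least-squares fit, then the residual-based noise-variance estimate
# sigma2_hat = ||v - A x_ls||^2 / (m - n).
x_ls, *_ = np.linalg.lstsq(A, v, rcond=None)
m, n = A.shape
sigma2_hat = np.sum((v - A @ x_ls) ** 2) / (m - n)
print(sigma2_hat)   # close to 0.25 (= 0.5^2) for this synthetic example
```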