2002
DOI: 10.1109/tip.2002.1006401
A joint inter- and intrascale statistical model for Bayesian wavelet based image denoising

Abstract: This paper presents a new wavelet-based image denoising method, which extends a "geometrical" Bayesian framework. The new method combines three criteria for distinguishing supposedly useful coefficients from noise: coefficient magnitudes, their evolution across scales and spatial clustering of large coefficients near image edges. These three criteria are combined in a Bayesian framework. The spatial clustering properties are expressed in a prior model. The statistical properties concerning coefficient magnitud…

Cited by 243 publications (167 citation statements)
References 27 publications (69 reference statements)
“…Related priors were used, e.g. in [27,28] where the signal of interest is defined as a noise-free wavelet coefficient component that exceeds the noise standard deviation. Compared to the Bernoulli-Gaussian, this prior models more realistically the subband statistics, but is also more complex and no multicomponent version has been studied yet.…”
Section: The Laplacian Mixture Model
confidence: 99%
“…Standard wavelet thresholding [10] treats the coefficients with magnitudes below a certain threshold as "non significant" and sets these to zero; the remaining, "significant" coefficients are kept unmodified (hard-thresholding) or reduced in magnitude (soft-thresholding). Shrinkage estimators can also result from a Bayesian approach [11][12][13][14][15][16][17][18][19][20][21][22][23][24][25][26][27][28][29], which imposes a prior distribution on noise-free data. Common priors for noise-free data include (generalized) Laplacians [11,18,21], alpha-stable models [20], double stochastic (Gaussian scale mixture) models [24,25] and mixtures of two distributions [13][14][15][16][17] where one distribution models the statistics of "significant" coefficients and the other one models the statistics of "insignificant" data.…”
Section: Introduction
confidence: 99%
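The hard and soft thresholding rules quoted above have simple closed forms: hard thresholding keeps a coefficient unchanged if its magnitude exceeds the threshold and zeroes it otherwise, while soft thresholding additionally shrinks the surviving coefficients toward zero by the threshold amount. A minimal NumPy sketch (illustrative code, not from the paper; function names are my own):

```python
import numpy as np

def hard_threshold(coeffs, t):
    """Keep coefficients with |w| > t unchanged; set the rest to zero."""
    return np.where(np.abs(coeffs) > t, coeffs, 0.0)

def soft_threshold(coeffs, t):
    """Shrink every surviving coefficient toward zero by t;
    coefficients with |w| <= t become exactly zero."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

w = np.array([-3.0, -0.5, 0.2, 1.5, 4.0])
hard_threshold(w, 1.0)  # -> [-3.,  0.,  0.,  1.5, 4.]
soft_threshold(w, 1.0)  # -> [-2.,  0.,  0.,  0.5, 3.]
```

Soft thresholding produces a continuous shrinkage curve (no jump at the threshold), which is why it is the form that typically emerges from Bayesian estimators with heavy-tailed priors such as the Laplacian.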
“…As a primary low-level image processing procedure, noise removal has been extensively studied and many denoising schemes have been proposed, from the earlier smoothing filters and frequency domain denoising methods [25] to the lately developed wavelet [1][2][3][4][5][6][7][8][9][10], curvelet [11] and ridgelet [12] based methods, sparse representation [13] and K-SVD [14] methods, shape-adaptive transform [15], bilateral filtering [16,17], non-local mean based methods [18,19] and non-local collaborative filtering [20]. With the rapid development of modern digital imaging devices and their increasingly wide applications in our daily life, there are increasing requirements of new denoising algorithms for higher image quality.…”
Section: Introduction
confidence: 99%
“…Wavelet transform (WT) [24] has proved to be effective in noise removal [1][2][3][4][5][6][7][8][9][10]. It decomposes the input signal into multiple scales, which represent different time-frequency components of the original signal.…”
Section: Introduction
confidence: 99%
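The multiscale decomposition described in the last excerpt can be sketched with the simplest wavelet, the orthonormal Haar transform: each level splits the signal into a coarse approximation (pairwise averages) and detail coefficients (pairwise differences), and applying the split recursively to the approximation yields the coarser scales. A minimal sketch under that assumption (the function name is illustrative; real denoisers use longer filters via libraries such as PyWavelets):

```python
import numpy as np

def haar_level(signal):
    """One level of the orthonormal Haar wavelet transform.
    Pairwise scaled sums give the coarse approximation; pairwise
    scaled differences give the detail (high-frequency) coefficients.
    Assumes the input length is even."""
    s = np.asarray(signal, dtype=float).reshape(-1, 2)
    approx = (s[:, 0] + s[:, 1]) / np.sqrt(2.0)
    detail = (s[:, 0] - s[:, 1]) / np.sqrt(2.0)
    return approx, detail

x = np.array([4.0, 4.0, 2.0, 6.0, 8.0, 8.0, 0.0, 0.0])
a1, d1 = haar_level(x)   # scale 1: smooth trend + local differences
a2, d2 = haar_level(a1)  # scale 2: recurse on the approximation
```

Because the transform is orthonormal, the total energy is preserved across scales (sum of squares of `a1` and `d1` equals that of `x`), which is what makes per-coefficient thresholding in the wavelet domain a sensible denoising strategy.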