2009 16th IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2009.5413808

PCA Gaussianization for image processing

Abstract: The estimation of high-dimensional probability density functions (PDFs) is not an easy task for many image processing applications. The linear models assumed by widely used transforms are often quite restrictive to describe the PDF of natural images. In fact, additional non-linear processing is needed to overcome the limitations of the model. On the contrary, the class of techniques collectively known as projection pursuit, which solve the high-dimensional problem by sequential univariate solutions, may be app…

Cited by 9 publications (10 citation statements). References 12 publications.
“…120000 pairs of coefficients were used in each MI estimation. Two kinds of MI estimators were used: (1) direct computation of MI, which involves 2D histogram estimation [43], and (2) estimation of MI by PCA-based Gaussianization (GPCA) [44], which only involves univariate histogram estimations. Table 2 shows the MI results (in bits) for pairs of coefficients in the wavelet and the divisive normalized domains.…”
Section: D Statistical Effect Of the Divisive Normalizationmentioning
confidence: 99%
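The statement above contrasts two MI estimators: direct computation from a 2-D histogram [43] and PCA-based Gaussianization (GPCA) [44]. As a minimal sketch of the first kind only — the function name, bin count, and sanity check below are my own choices, not taken from the cited works — a plug-in MI estimate from a joint histogram looks like this:

```python
import numpy as np

def mutual_information_2d(x, y, bins=64):
    """Plug-in estimate of I(X;Y) in bits from a 2-D histogram of
    paired samples (biased upward for small samples / many bins)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Sanity check on a correlated Gaussian pair, where the analytic value is
# I = -0.5 * log2(1 - rho**2) ≈ 0.737 bits for rho = 0.8.
rng = np.random.default_rng(0)
rho = 0.8
z = rng.standard_normal((120_000, 2))
x = z[:, 0]
y = rho * x + np.sqrt(1 - rho**2) * z[:, 1]
print(mutual_information_2d(x, y))
```

The GPCA estimator of [44] avoids exactly this 2-D binning step by reducing the problem to univariate histogram estimations, which scales better as the dimension grows.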
“…In the utilized thresholding function, which is based on a maximum a posteriori (MAP) estimate, the modified version of the dominant coefficients was estimated by optimal linear interpolation between each coefficient and the mean value of the corresponding subband [31]. In this work, the authors proposed a fast alternative to iterative Gaussianization methods that makes it suitable for image processing while ensuring its theoretical convergence [32]. The authors performed a comparison between two source-signal extraction algorithms, namely Wavelet De-noising (WD) by Soft Thresholding and Independent Component Analysis (ICA), on simulated functional optical imaging data.…”
Section: Literature Reviewmentioning
confidence: 99%
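As a rough illustration of what one iteration of such an iterative, PCA-based Gaussianization involves, the sketch below uses a rank-based marginal Gaussianization followed by a PCA rotation. This is a simplified stand-in under my own assumptions (rank transform for the marginals, SVD for the rotation), not a transcription of the method in [32]:

```python
import numpy as np
from statistics import NormalDist

def marginal_gaussianize(x):
    """Map each column of x (n samples x d dims) to N(0,1) via its
    empirical CDF (a rank transform); ranks stay strictly inside (0, 1)."""
    n, d = x.shape
    out = np.empty((n, d))
    nd = NormalDist()
    for j in range(d):
        ranks = np.argsort(np.argsort(x[:, j])) + 1   # 1..n
        u = ranks / (n + 1)                           # empirical CDF values
        out[:, j] = [nd.inv_cdf(v) for v in u]
    return out

def gpca_step(x):
    """One iteration: marginal Gaussianization, then a PCA rotation.
    Iterating this map drives the joint density toward a Gaussian."""
    g = marginal_gaussianize(x)
    g -= g.mean(axis=0)
    _, _, vt = np.linalg.svd(g, full_matrices=False)
    return g @ vt.T   # projected onto principal axes: exactly decorrelated
```

After one step the output columns are exactly decorrelated but the marginals are no longer Gaussian in general, which is why the scheme must iterate; the speed question addressed in [32] is how many such iterations are needed and how cheap each one can be made.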
“…Instead of estimating the biases of the clusters jointly 3 we follow a greedy approach. First, estimate the bias b j of each cluster C j independently, as the minimizer of the norm of the mean of the Gaussianized data 4 ,…”
Section: Radial Gaussianization After Bias Compensationmentioning
confidence: 99%
“…3 Joint optimization is a slow process when the optimization function is not given in closed form or is not differentiable. 4 We also experimented with an alternative where the mean of each cluster C j was used as an estimate of its bias. Our experiments showed that this did not improve RG due to the fact that the clusters were not estimated perfectly.…”
Section: Radial Gaussianization After Bias Compensationmentioning
confidence: 99%
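The two statements above describe a greedy per-cluster bias estimate: initialize (per the footnote's alternative) at the cluster mean, then minimize the norm of the mean of the Gaussianized data. The sketch below is only a loose illustration under heavy assumptions of my own: a rank-based radial Gaussianization for 2-D data stands in for the paper's RG transform, and a coordinate-wise grid search stands in for whatever optimizer the authors used.

```python
import numpy as np

def radial_gaussianize(x):
    """Radially Gaussianize centered 2-D data: remap each sample's radius
    through the empirical radial CDF onto chi(2) radii (the radial law of
    a standard 2-D Gaussian), keeping directions fixed."""
    r = np.linalg.norm(x, axis=1)
    n = len(r)
    u = (np.argsort(np.argsort(r)) + 1) / (n + 1)   # empirical CDF of radii
    r_new = np.sqrt(-2.0 * np.log1p(-u))            # inverse chi(2) CDF
    return x * (r_new / r)[:, None]

def greedy_bias(x, grid=None):
    """Greedy bias estimate for one cluster: start from the cluster mean,
    then refine one coordinate at a time by minimizing the norm of the
    mean of the radially Gaussianized, bias-compensated data."""
    if grid is None:
        grid = np.linspace(-1.0, 1.0, 21)
    d = x.shape[1]
    b = x.mean(axis=0).copy()
    for j in range(d):
        e = np.eye(d)[j]
        step = min(grid, key=lambda t: np.linalg.norm(
            radial_gaussianize(x - (b + t * e)).mean(axis=0)))
        b[j] += step
    return b
```

For a well-separated elliptical cluster the refinement barely moves the mean-based initializer, which is consistent with footnote 4's observation that the cluster mean alone is already a reasonable bias estimate when clustering is accurate.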