2016
DOI: 10.1109/tpami.2015.2441053

Cascades of Regression Tree Fields for Image Restoration

Abstract: Conditional random fields (CRFs) are popular discriminative models for computer vision and have been successfully applied in the domain of image restoration, especially to image denoising. For image deblurring, however, discriminative approaches have been mostly lacking. We posit two reasons for this: First, the blur kernel is often only known at test time, requiring any discriminative approach to cope with considerable variability. Second, given this variability it is quite difficult to construct suitable fea…
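The abstract's central idea, a cascade in which each restoration stage refines the estimate produced by the previous one, can be illustrated with a short sketch. The snippet below is a conceptual, numpy-only toy and not the authors' implementation: in the paper each stage is a regression tree field (a Gaussian CRF whose potentials are parameterized by regression trees), trained stage by stage on the preceding stage's outputs, whereas here a hypothetical fixed smoothing stage (make_toy_stage) stands in so that only the cascade structure itself is shown.

```python
# Conceptual sketch of a stage-wise restoration cascade (not the paper's RTF code).
# Assumption: every stage sees both the degraded observation and the current
# estimate and returns a refined estimate; stages would normally be learned.
import numpy as np

def make_toy_stage(strength):
    """Hypothetical stand-in stage: pulls the estimate toward a 3x3 local mean,
    anchored on the degraded observation. In the paper this role is played by a
    regression tree field; here it only illustrates the cascade mechanics."""
    def stage(degraded, estimate):
        padded = np.pad(estimate, 1, mode="edge")
        # 3x3 box filter built from shifted views of the padded estimate.
        local_mean = sum(
            padded[i:i + estimate.shape[0], j:j + estimate.shape[1]]
            for i in range(3) for j in range(3)
        ) / 9.0
        return (1 - strength) * estimate + strength * 0.5 * (local_mean + degraded)
    return stage

def run_cascade(degraded, stages):
    """Apply the stages in sequence, starting from the degraded observation."""
    estimate = degraded.copy()
    for stage in stages:
        estimate = stage(degraded, estimate)
    return estimate

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))              # toy image
    noisy = clean + rng.normal(scale=25 / 255.0, size=clean.shape)   # sigma = 25
    restored = run_cascade(noisy, [make_toy_stage(s) for s in (0.8, 0.6, 0.4)])
    print("MSE noisy    :", float(np.mean((noisy - clean) ** 2)))
    print("MSE restored :", float(np.mean((restored - clean) ** 2)))
```

The detail worth noting is that each stage conditions on both the original degraded input and the current estimate, which is what allows later stages in such a cascade to correct errors introduced earlier.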

Cited by 80 publications (50 citation statements)
References: 44 publications
“…The results in Tab. 1 show that we outperform [2,3], and can also compete with the RTF-based cascade model [24] (trained with non-quantized images), whose additional flexibility does not seem to pay off here since the image noise is truly Gaussian. The results further show that we can also compete for noise level σ = 15, for which we trained additional models.…”
Section: Methods
Mentioning confidence: 86%
“…The results in Tab. 2 show that a (5-stage) cascade of regression tree fields (RTFs) [24] achieves the best performance (trained with the same data as our models). This is not surprising, since the more flexible RTFs do not make any noise assumption (in contrast to all other approaches in Tab.…”
Section: Methods
Mentioning confidence: 91%
“…For example, some works adjust the trade-off parameters [7,31] or iteratively change the prior terms [49] based on experiences to manually control the optimization process to avoid trivial solutions. Very recently, the learnable strategies [5,17,25,26,36,45,46] have also been introduced to help estimate the sharp image structures. However, both the manually designed tricks and trained networks will break the convergence guarantees of the standard optimization schemes.…”
Section: Related Work
Mentioning confidence: 99%
“…3 standard deviations σ ∈ {15, 25, 50} are chosen to measure the performance of 3DCF. Under such conditions, we compare our 3DCF with state-of-the-art DN methods as described in the introductory section 1: BM3D [8], LSSC [23], EPLL [42], opt-MRF [6], CRTF [29], WNNM [12], CSF [30], TRD [7], MLP [4], as well as the NN [5] fusion method. We use the same training data mentioned in [7], i.e.…”
Section: DN
Mentioning confidence: 99%