2016
DOI: 10.1109/msp.2016.2594277

Efficient Adjoint Computation for Wavelet and Convolution Operators [Lecture Notes]

Abstract: First-order optimization algorithms, often preferred for large problems, require the gradient of the differentiable terms in the objective function. These gradients often involve linear operators and their adjoints, which must be applied rapidly. We consider two example problems and derive methods for quickly evaluating the required adjoint operator. The first example is an image deblurring problem, where we must compute efficiently the adjoint of multi-stage wavelet reconstruction. Our formulation of the adjoint…
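As a minimal sketch of the kind of forward/adjoint pair the abstract refers to (not the paper's own implementation): the snippet below applies a circular convolution via the FFT, applies its adjoint via the conjugate transfer function, and checks the pairing with the dot-product test. The kernel, signal length, and all other values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
h = rng.standard_normal(9)                 # illustrative blur kernel (assumption)
H = np.fft.fft(h, n)                       # transfer function, circular boundary conditions

def G(x):
    # Forward blur: circular convolution with h, applied via the FFT.
    return np.real(np.fft.ifft(np.fft.fft(x) * H))

def G_adj(y):
    # Adjoint blur: multiplication by the conjugate transfer function
    # (equivalently, circular correlation with h).
    return np.real(np.fft.ifft(np.fft.fft(y) * np.conj(H)))

# Dot-product (adjoint) test: <G x, y> must equal <x, G* y>.
x, y = rng.standard_normal(n), rng.standard_normal(n)
print(np.dot(G(x), y), np.dot(x, G_adj(y)))
```

If the kernel is circularly symmetric about the origin, the transfer function H is real-valued and the blur operator is self-adjoint, i.e. G* = G.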

Help me understand this report
View preprint versions

Search citation statements

Order By: Relevance

Paper Sections

Select...
1
1
1

Citation Types

0
3
0

Year Published

2018
2018
2024
2024

Publication Types

Select...
3
2
2
1

Relationship

0
8

Authors

Journals

Cited by 10 publications (3 citation statements)
References 6 publications
“…Furthermore, we require the adjoints of W and G for the gradients. We compute W* with the technique from [32], which involves handling the padding of the boundary conditions manually. Regarding the blurring operator, we have G* = G.…”
Section: Sampling the Exact Posterior (mentioning)
confidence: 99%
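As a hedged illustration of the dot-product identity that the quoted W* must satisfy (a sketch only, not the boundary-handling technique of [32]): with an orthogonal wavelet and periodized boundary handling, the decomposition is exactly the adjoint of the reconstruction W. The wavelet, signal length, and level below are assumed values, and the code uses the PyWavelets API.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(0)
n, wavelet, level = 64, 'db2', 3                      # illustrative choices

# Fix the coefficient layout once so vectors can be (un)flattened consistently.
template = pywt.wavedec(np.zeros(n), wavelet, mode='periodization', level=level)
sizes = [c.size for c in template]

def unflatten(vec):
    parts, i = [], 0
    for s in sizes:
        parts.append(vec[i:i + s])
        i += s
    return parts

def W(vec):
    # Multi-stage wavelet reconstruction: coefficient vector -> signal.
    return pywt.waverec(unflatten(vec), wavelet, mode='periodization')

def W_adj(x):
    # Candidate adjoint: the decomposition. Exact here because the periodized
    # orthogonal transform is unitary; other padding modes need the manual
    # boundary handling described in [32].
    return np.concatenate(pywt.wavedec(x, wavelet, mode='periodization', level=level))

c = rng.standard_normal(sum(sizes))
x = rng.standard_normal(n)
print(np.dot(W(c), x), np.dot(c, W_adj(x)))           # the two inner products agree
```

With other padding modes (e.g. symmetric extension), the concatenated decomposition is no longer the exact adjoint of the reconstruction, which is the issue the manual boundary handling in [32] addresses.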
“…This produces a fairly accurate approximation of the adjoint operator and in our implementation is available by using the parameter "adjoint" when calling the function idualtree4 [12]. Further improvement could be obtained by a more detailed consideration of the boundary conditions of the discrete convolution, as explained in [11], but we leave this to future work.…”
Section: Adjoint 4D DT-CWT (mentioning)
confidence: 99%
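The accuracy of such an approximate adjoint can be quantified with the standard adjoint (dot-product) test. The helper below is a generic sketch with invented names (adjoint_test, fwd, adj); it is not part of the DT-CWT toolbox referenced above.

```python
import numpy as np

def adjoint_test(fwd, adj, shape_in, shape_out, trials=10, seed=0):
    """Worst relative adjoint error over random pairs:
    |<fwd(x), y> - <x, adj(y)>| / (||fwd(x)|| * ||y||)."""
    rng = np.random.default_rng(seed)
    worst = 0.0
    for _ in range(trials):
        x = rng.standard_normal(shape_in)
        y = rng.standard_normal(shape_out)
        lhs = np.vdot(fwd(x), y)
        rhs = np.vdot(x, adj(y))
        worst = max(worst, abs(lhs - rhs) / (np.linalg.norm(fwd(x)) * np.linalg.norm(y)))
    return worst
```

Values near machine precision indicate an exact adjoint; a small but nonzero value quantifies how accurate the approximation is.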
“…In particular, no matrix factorization is required and these algorithms can be implemented using forward-adjoint oracles (FAOs), yielding matrix-free optimization algorithms [31,32]. Many signal processing applications can readily make use of FAOs, yielding a substantial decrease of the memory requirements.…”
Section: Introduction (mentioning)
confidence: 99%
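As one possible sketch of a forward-adjoint oracle in that sense (an illustration under stated assumptions, not the cited papers' code): SciPy's LinearOperator exposes only the actions matvec and rmatvec, so a Krylov solver such as lsqr runs matrix-free. The blur kernel and problem size are arbitrary illustration values.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr

rng = np.random.default_rng(0)
n = 256
h = np.exp(-0.5 * (np.arange(9) - 4.0) ** 2)          # illustrative Gaussian-like blur kernel
h /= h.sum()
H = np.fft.fft(h, n)                                  # transfer function, circular boundary

# Forward-adjoint oracle (FAO): only the actions G x and G* y are exposed;
# the n-by-n matrix is never formed.
G = LinearOperator(
    (n, n),
    matvec=lambda x: np.real(np.fft.ifft(np.fft.fft(x) * H)),
    rmatvec=lambda y: np.real(np.fft.ifft(np.fft.fft(y) * np.conj(H))),
)

x_true = rng.standard_normal(n)
b = G.matvec(x_true)
x_hat = lsqr(G, b, atol=1e-12, btol=1e-12)[0]         # matrix-free least-squares solve
print(np.linalg.norm(G.matvec(x_hat) - b) / np.linalg.norm(b))   # relative data residual
```

Because only the two oracles are stored, memory scales with the signal size rather than with its square, which is the memory saving the quoted passage refers to.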