The high complexity of various inverse problems poses a significant challenge to model-based reconstruction schemes, which in such situations often reach their limits. At the same time, we witness an exceptional success of data-based methodologies such as deep learning. However, in the context of inverse problems, deep neural networks mostly act as black-box routines, used for instance for a somewhat unspecified removal of artifacts in classical image reconstructions. In this paper, we will focus on the severely ill-posed inverse problem of limited-angle computed tomography, in which entire boundary sections are not captured in the measurements. We will develop a hybrid reconstruction framework that fuses model-based sparse regularization with data-driven deep learning. Our method is reliable in the sense that we only learn the part that provably cannot be handled by model-based methods, while applying the theoretically controllable sparse regularization technique to the remaining parts. Such a decomposition into visible and invisible segments is achieved by means of the shearlet transform, which resolves wavefront sets in phase space. Furthermore, this split enables us to assign the clear task of inferring the unknown shearlet coefficients to the neural network, and thereby offers an interpretation of its performance in the context of limited-angle computed tomography. Our numerical experiments show that our algorithm significantly surpasses both purely model-based and more data-based reconstruction methods.
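As a rough illustration of the fusion step described in this abstract, the following sketch keeps the shearlet coefficients in the visible angular wedge from the model-based reconstruction and fills the invisible wedge with a network prediction. Here `shearlet_transform`, `inverse_shearlet_transform`, `network`, and `visible_mask` are hypothetical placeholders, not the API of any specific shearlet library.

```python
# Hedged sketch of the visible/invisible fusion idea (all names illustrative).
import numpy as np

def fuse_visible_invisible(sparse_recon, visible_mask,
                           shearlet_transform, inverse_shearlet_transform,
                           network):
    """Combine model-based (visible) and learned (invisible) shearlet parts."""
    coeffs = shearlet_transform(sparse_recon)           # coefficients of the sparse-regularized reconstruction
    predicted = network(coeffs * visible_mask)          # network infers coefficients in the invisible wedge
    fused = np.where(visible_mask, coeffs, predicted)   # keep visible part, insert learned invisible part
    return inverse_shearlet_transform(fused)
```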
We consider a statistical inverse learning problem in which the task is to estimate a function f from noisy point evaluations of Af, where A is a linear operator. The function Af is evaluated at i.i.d. random design points u_n, n = 1, ..., N, drawn from an unknown general probability distribution. We consider Tikhonov regularization with general convex, p-homogeneous penalty functionals and derive concentration rates of the regularized solution to the ground truth, measured in the symmetric Bregman distance induced by the penalty functional. We derive concrete rates for Besov norm penalties and numerically demonstrate the correspondence with the observed rates in the context of X-ray tomography.
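In generic notation (ours, not taken from the paper), the estimator and error measure mentioned above can be sketched as follows, assuming a quadratic data fit at the noisy evaluations y_n:

```latex
% Tikhonov-type estimator with a convex, p-homogeneous penalty R
\hat f_\alpha \in \operatorname*{arg\,min}_{f}\;
  \frac{1}{N}\sum_{n=1}^{N} \bigl((Af)(u_n) - y_n\bigr)^2 + \alpha\,\mathcal{R}(f),
\qquad
% symmetric Bregman distance induced by R, with subgradients p_f, p_g
D_{\mathcal{R}}^{\mathrm{sym}}(f,g) = \langle p_f - p_g,\; f - g\rangle,
\quad p_f \in \partial\mathcal{R}(f),\; p_g \in \partial\mathcal{R}(g).
```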
The International Atomic Energy Agency (IAEA) has recently approved passive gamma emission tomography (PGET) as a method for inspecting spent nuclear fuel assemblies (SFAs), an important aspect of international nuclear safeguards, which aim at preventing the proliferation of nuclear weapons. The PGET instrument is essentially a single photon emission computed tomography (SPECT) system that allows the reconstruction of axial cross-sections of the emission map of the SFA. The fuel material heavily self-attenuates its gamma-ray emissions, so correctly accounting for the attenuation is a critical factor in producing accurate images. Due to the nature of the inspections, it is desirable to use as little a priori information as possible about the fuel, including the attenuation map, in the reconstruction process. Current reconstruction methods either do not correct for attenuation, assume a uniform attenuation throughout the fuel assembly, or assume an attenuation map based on an initial filtered back projection (FBP) reconstruction. Here, we propose a method to simultaneously reconstruct the emission and attenuation maps by formulating the reconstruction as a constrained minimization problem with a least squares data fidelity term and regularization terms. Using simulated data, we compare the proposed method to FBP, showing that our approach produces significantly better reconstructions by various numerical metrics and a much better classification of spent, missing, and fresh fuel rods.
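A generic sketch of such a joint formulation (our notation, not necessarily the authors' exact functional), with lambda the emission map, mu the attenuation map, A_mu the attenuated projection operator, m the measured data, and R_1, R_2 the regularizers:

```latex
% Generic joint emission/attenuation reconstruction problem
\min_{\lambda \ge 0,\; \mu \ge 0}\;
  \tfrac{1}{2}\,\bigl\| \mathcal{A}_{\mu}\lambda - m \bigr\|_2^2
  + \alpha_1\,\mathcal{R}_1(\lambda)
  + \alpha_2\,\mathcal{R}_2(\mu)
```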
Tomographic reconstruction is an ill-posed inverse problem that calls for regularization. One possibility is to require sparsity of the unknown in an orthonormal wavelet basis. This, in turn, can be achieved by variational regularization, where the penalty term is the sum of the absolute values of the wavelet coefficients. Peijun Chen, Jianguo Huang, and Xiaoqun Zhang (Fixed Point Theory and Applications, 2016) showed that the minimizer of the variational regularization functional can be computed iteratively with their primal-dual fixed point (PDFP) algorithm, which involves a soft-thresholding operation. Choosing the soft-thresholding parameter µ > 0 is analogous to the notoriously difficult problem of picking the optimal regularization parameter in Tikhonov regularization. Here, a novel automatic method is introduced for choosing µ, based on a control algorithm driving the sparsity of the reconstruction to an a priori known ratio of nonzero versus zero wavelet coefficients in the unknown.
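A toy sketch of this sparsity-targeting control idea is given below; a plain ISTA iteration stands in for PDFP, a random matrix stands in for the tomography operator, and the specific multiplicative update for µ is illustrative only, not the paper's control law.

```python
# Toy sketch: drive the sparsity of the iterate to a prescribed target ratio
# by adapting the soft-thresholding parameter mu during the iteration.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 400)) / np.sqrt(200)    # stand-in forward operator
x_true = np.zeros(400)
x_true[rng.choice(400, 40, replace=False)] = 1.0       # sparse ground truth
y = A @ x_true + 0.01 * rng.standard_normal(200)       # noisy data

target_ratio = 0.10           # a priori ratio of nonzero coefficients
mu = 0.1                      # initial soft-thresholding parameter
step = 1.0 / np.linalg.norm(A, 2) ** 2                 # ISTA step size
x = np.zeros(400)

for _ in range(500):
    grad = A.T @ (A @ x - y)
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * mu, 0.0)  # soft-thresholding
    ratio = np.mean(np.abs(x) > 0)
    # Control step: increase mu if the iterate is too dense, decrease if too sparse.
    mu *= np.exp(0.5 * (ratio - target_ratio))

print(f"achieved sparsity ratio: {np.mean(np.abs(x) > 0):.3f}")
```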