2012
DOI: 10.3934/ipi.2012.6.565
Some proximal methods for Poisson intensity CBCT and PET

Abstract: Cone-Beam Computerized Tomography (CBCT) and Positron Emission Tomography (PET) are two complementary medical imaging modalities providing respectively anatomic and metabolic information of a patient. In the context of public health, one must address the problem of dose reduction of the potentially harmful quantities related to each exam protocol: X-rays for CBCT and radiotracer for PET. Two demonstrators based on a technological breakthrough (acquisition devices working in photon-counting mode) have been develo…

Cited by 32 publications (24 citation statements)
References 48 publications
“…which can balance first and second order regularization and achieves edge-preserved reconstruction while avoiding the stair-casing artifact. We can solve the TGV regularized PET reconstruction problem by solving problem (8) with the assignment x = (u, w) and n = m + 2, A i = (P i , 0), i ∈ [m], A n−1 = (∇, −I), A n = (0, E)…”
Section: Total Generalized Variation
Confidence: 99%
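For context on the excerpt above: the second-order Total Generalized Variation functional it instantiates (with gradient \(\nabla\), auxiliary field \(w\), and symmetrized derivative \(\mathcal{E}\)) is conventionally defined as

```latex
\mathrm{TGV}_{\alpha}^{2}(u)
  \;=\;
  \min_{w}\;
  \alpha_{1}\,\bigl\| \nabla u - w \bigr\|_{1}
  \;+\;
  \alpha_{0}\,\bigl\| \mathcal{E} w \bigr\|_{1},
\qquad \alpha_{0}, \alpha_{1} > 0,
```

where the weights \(\alpha_{0}, \alpha_{1}\) realize the balance between first- and second-order regularization mentioned in the quote: the term \(\|\nabla u - w\|_{1}\) acts like TV where \(w \approx 0\), while \(\|\mathcal{E} w\|_{1}\) penalizes second-order variation and suppresses the staircasing artifact.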
“…This can be done provided that we know the conversion between DN and the photon counts (e.g., from instrument specifications). The algorithm described in Section 4 is adaptable to this stabilized fidelity term through specific proximal operators or gradient descent (Dupé et al. 2009; Anthoine et al. 2012). Notice, however, that such a stabilization cannot be applied only on the observations as a mere preprocessing of the data, while using afterwards other deconvolution methods assuming additive Gaussian noise corruption.…”
Section: Noise Model
Confidence: 99%
“…In other words, the VST of the convolution of h by x_j in (2) is not equivalent to (and hardly approximable by) the convolution of the variance-stabilized vectors, which breaks the applicability of those techniques. As an alternative to using a VST, the true Poisson distribution could also be used to design a specific fidelity term, which is based on the negative log-likelihood of the posterior distribution induced by the observation model, i.e., the KL divergence of this distribution (Fish et al. 1995; Anthoine et al. 2012; Prato et al. 2013). However, it is unclear how to adapt the method proposed by Attouch et al. (2010) to the resulting framework.…”
Section: Noise Model
Confidence: 99%
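The Poisson/KL fidelity term referenced in this excerpt is tractable for proximal methods because its proximal operator has a closed form, obtained by solving a per-component quadratic. A minimal sketch (illustrative only, not the cited authors' implementation; the function name and parameters are mine):

```python
import numpy as np

def prox_poisson_nll(v, b, tau):
    """Proximal operator of f(x) = x - b*log(x), the Poisson negative
    log-likelihood up to additive constants, applied componentwise.

    Solves argmin_{x > 0}  f(x) + (1/(2*tau)) * (x - v)**2.
    The stationarity condition 1 - b/x + (x - v)/tau = 0 is a quadratic
    x**2 + (tau - v)*x - tau*b = 0, whose unique positive root is below.
    """
    v = np.asarray(v, dtype=float)
    b = np.asarray(b, dtype=float)
    d = v - tau
    return 0.5 * (d + np.sqrt(d * d + 4.0 * tau * b))

# Sanity check: the stationarity condition vanishes at the prox point.
v = np.array([0.5, 2.0, 10.0])    # input point
b = np.array([1.0, 3.0, 7.0])     # observed photon counts
tau = 0.8                         # proximal step size
x = prox_poisson_nll(v, b, tau)
residual = 1.0 - b / x + (x - v) / tau
assert np.allclose(residual, 0.0, atol=1e-12)
```

For `b = 0` (zero observed counts) the formula degenerates to `max(v - tau, 0)`, which matches the intuition that the fidelity then only penalizes intensity linearly.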
“…Furthermore, in applications such as astronomy, medicine, and fluorescence microscopy where signals are acquired via photon counting devices, like CMOS and CCD cameras, the number of collected photons is related to some non-additive counting errors resulting in a shot noise [1-3, 17, 18]. The latter is non-additive, signal-dependent and it can be modeled by a Poisson distribution [19][20][21][22][23][24][25][26][27][28][29][30][31][32][33]. In this case, when the noise is assumed to be Poisson distributed, the implicit assumption is that Poisson noise dominates over all other noise kinds.…”
Section: Introductionmentioning
confidence: 99%