2021
DOI: 10.3390/jimaging7030058
Calibration-Less Multi-Coil Compressed Sensing Magnetic Resonance Image Reconstruction Based on OSCAR Regularization

Abstract: Over the last decade, the combination of compressed sensing (CS) with acquisition over multiple receiver coils in magnetic resonance imaging (MRI) has allowed the emergence of faster scans while maintaining a good signal-to-noise ratio (SNR). Self-calibrating techniques, such as ESPIRiT, have become the standard approach to estimating the coil sensitivity maps prior to the reconstruction stage. In this work, we proceed differently and introduce a new calibration-less multi-coil CS reconstruction method. Calibr…

Cited by 6 publications (2 citation statements)
References: 56 publications
“…An estimate of x is obtained by adopting a sparse inducing formulation, reminiscent of the literature on compressive sensing [13,30,31,35,42]. We solve the penalized least squares problem (10) with g = ρ‖W·‖₁, where W ∈ ℝ^{N×N} is the orthogonal symlet-2 wavelet transform on 2 resolution levels and ρ > 0 is the associated regularization parameter.…”
Section: Example 1: Reconstruction of a Geometric Abdomen from Unders…
confidence: 99%
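The penalized least-squares problem quoted above (data fidelity plus an ℓ1 penalty on orthogonal wavelet coefficients) can be sketched with a proximal-gradient (ISTA) solver. The toy below is illustrative, not the cited authors' implementation: an orthonormal 2-level Haar transform stands in for the symlet-2 wavelets, and a random sampling mask stands in for the actual undersampled acquisition operator; all sizes and parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32  # signal length (power of two for the Haar transform)

def haar_analysis_matrix(n, levels=2):
    """Orthonormal matrix of a 'levels'-level 1D Haar wavelet transform."""
    W = np.eye(n)
    size = n
    for _ in range(levels):
        H = np.eye(n)
        half = size // 2
        h = np.zeros((size, size))
        for i in range(half):
            h[i, 2 * i] = h[i, 2 * i + 1] = 1 / np.sqrt(2)   # approximation rows
            h[half + i, 2 * i] = 1 / np.sqrt(2)              # detail rows
            h[half + i, 2 * i + 1] = -1 / np.sqrt(2)
        H[:size, :size] = h
        W = H @ W          # refine only the current approximation band
        size = half
    return W

W = haar_analysis_matrix(N, levels=2)   # satisfies W @ W.T == I

# Ground truth: a signal that is sparse in the wavelet domain.
z_true = np.zeros(N)
z_true[rng.choice(N, size=4, replace=False)] = rng.normal(0, 3, size=4)
x_true = W.T @ z_true

# Toy "undersampling" operator A: keep 60% of the samples at random.
rows = np.sort(rng.choice(N, size=int(0.6 * N), replace=False))
A = np.eye(N)[rows]
y = A @ x_true + 0.01 * rng.normal(size=len(rows))

# Because W is orthogonal, solve directly in the coefficient domain z = W x:
#   min_z 0.5 * ||A W^T z - y||^2 + rho * ||z||_1
# The Lipschitz constant of the gradient is ||A W^T||^2 = 1 here, so step = 1.
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
M = A @ W.T
rho = 0.05
z = np.zeros(N)
for _ in range(500):
    z = soft(z - M.T @ (M @ z - y), rho)   # gradient step + soft threshold
x_hat = W.T @ z   # reconstructed signal
```

The orthogonality of W is what lets the wavelet-domain substitution work; for non-orthogonal or redundant transforms one would instead need an analysis formulation with a more general proximal splitting.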
“…First, it was shown in several works that, for some proper choices of the parameters γₖ, SLOPE promotes sparse solutions with some form of “clustering” of the nonzero coefficients, see e.g., [7,20,28,36]. This feature has been exploited in many application domains: portfolio optimization [29,43], genetics [25], magnetic resonance imaging [15], subspace clustering [35], deep neural networks [45], etc. Moreover, it has been pointed out in a series of works that SLOPE has very good statistical properties: it leads to an improvement of the false discovery rate (as compared to LASSO) for moderately correlated dictionaries [6,24] and is minimax optimal in some asymptotic regimes, see [31,37].…”
confidence: 99%
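The "clustering" behaviour mentioned above comes from SLOPE's sorted-ℓ1 penalty J_λ(x) = Σᵢ λᵢ |x|₍ᵢ₎ (weights λ₁ ≥ … ≥ λ_N ≥ 0 applied to magnitudes sorted in decreasing order), whose exact proximal operator can be computed by a sort followed by a pool-adjacent-violators pass. The sketch below is a minimal illustration of that standard construction, not code from the cited works:

```python
import numpy as np

def prox_sorted_l1(y, lam):
    """Prox of the sorted-l1 (SLOPE) penalty sum_i lam_i * |y|_(i),
    with lam non-negative and non-increasing: returns
    argmin_x 0.5*||x - y||^2 + sum_i lam_i * |x|_(i)."""
    y = np.asarray(y, dtype=float)
    sign = np.sign(y)
    order = np.argsort(-np.abs(y))       # indices of |y| in decreasing order
    v = np.abs(y)[order] - lam           # shifted sorted magnitudes
    # Non-increasing isotonic regression via pool-adjacent-violators:
    # keep a stack of blocks [sum, count]; merge while block means violate
    # the required decreasing order.
    blocks = []
    for val in v:
        blocks.append([val, 1.0])
        while (len(blocks) > 1
               and blocks[-2][0] / blocks[-2][1] <= blocks[-1][0] / blocks[-1][1]):
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    proj = np.concatenate([np.full(int(c), max(s / c, 0.0)) for s, c in blocks])
    out = np.empty_like(y)
    out[order] = proj                    # undo the sort
    return sign * out                    # restore the signs

# With equal weights, SLOPE's prox reduces to plain soft thresholding;
# with strictly decreasing weights, nearby magnitudes are pulled to a
# common value -- the clustering of nonzero coefficients noted above.
print(prox_sorted_l1(np.array([3.0, -1.0, 0.5]), np.array([1.0, 1.0, 1.0])))
print(prox_sorted_l1(np.array([1.0, 0.9]), np.array([0.5, 0.1])))
```

In the second call the two entries 1.0 and 0.9 are merged to one shared magnitude, which is exactly how SLOPE ties groups of comparable coefficients together.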