2016
DOI: 10.3389/fnins.2016.00543
s-SMOOTH: Sparsity and Smoothness Enhanced EEG Brain Tomography

Abstract: EEG source imaging enables us to reconstruct current density in the brain from electrical measurements with excellent temporal resolution (~ms). The corresponding EEG inverse problem is ill-posed and has infinitely many solutions, because the number of EEG sensors is usually much smaller than the number of potential dipole locations and the recorded signals are contaminated by noise. To obtain a unique solution, regularizations can be incorporated to impose additional const…
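The ill-posedness described in the abstract (far fewer sensors than candidate dipoles) is commonly handled with Tikhonov, i.e. minimum-norm, regularization. A minimal sketch with a synthetic lead-field matrix; all dimensions, data, and the regularization weight here are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Hypothetical dimensions: far fewer sensors than candidate dipoles,
# which is exactly what makes the EEG inverse problem ill-posed.
n_sensors, n_dipoles = 32, 500
rng = np.random.default_rng(0)
L = rng.standard_normal((n_sensors, n_dipoles))  # synthetic lead-field matrix
j_true = np.zeros(n_dipoles)
j_true[100] = 1.0  # a single focal source
y = L @ j_true + 0.01 * rng.standard_normal(n_sensors)  # noisy measurements

# Tikhonov (minimum-norm) estimate: argmin_j ||L j - y||^2 + lam ||j||^2
lam = 1e-2
j_hat = np.linalg.solve(L.T @ L + lam * np.eye(n_dipoles), L.T @ y)
```

The closed-form solve makes the trade-off explicit: larger `lam` shrinks the current estimate toward zero, smaller `lam` fits the (noisy) measurements more closely.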

Cited by 15 publications (9 citation statements)
References 59 publications (101 reference statements)
“…We also include reconstructions obtained by adding standard regularizers. In particular, we consider L2- and TV-regularizers [25,15], where L ∈ R^(N²×N²) is the Laplacian operator, ∇ denotes the spatial gradient, and λ > 0 is the regularization parameter that balances the misfit term and the regularization term. One approach to choosing λ is the L-curve method [16].…”
Section: Analysis of Initialization (In This Section We Test the Sen…)
confidence: 99%
“…To further account for spatial smoothness, a total variation term can be imposed as another penalty, such as the first-order total variation (TV) regularization in Refs. [30], [42] or the fractional-order TV in [29], [31]. A similar algorithm can be derived under the ADMM framework; however, further investigation with TV constraints is left as future work.…”
Section: LRR Model With Graph Regularization
confidence: 99%
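A standard way to handle a TV penalty under ADMM, as the quote suggests, is to split off the gradient term and apply soft-thresholding to it. A minimal 1D sketch; the difference operator, signal, and all parameters here are illustrative assumptions, not the cited algorithm:

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal operator of the l1-norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tv1d_admm(y, lam=1.0, rho=1.0, n_iter=200):
    """ADMM for argmin_x 0.5||x - y||^2 + lam ||D x||_1 (anisotropic 1D TV)."""
    n = y.size
    D = np.diff(np.eye(n), axis=0)        # first-difference (discrete gradient)
    x = y.copy()
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)
    M = np.eye(n) + rho * D.T @ D         # x-update system matrix (fixed)
    for _ in range(n_iter):
        x = np.linalg.solve(M, y + rho * D.T @ (z - u))  # quadratic x-update
        z = soft(D @ x + u, lam / rho)                   # shrink the gradients
        u += D @ x - z                                   # dual ascent
    return x

# Piecewise-constant signal plus noise: TV should recover the flat plateaus.
rng = np.random.default_rng(2)
y = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.standard_normal(100)
x = tv1d_admm(y, lam=2.0)
```

The shrinkage on `D @ x` drives most differences to exactly zero, which is what produces the piecewise-constant (and, in 2D imaging, staircased) reconstructions TV is known for.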
“…By replacing the ℓ2-norm with the ℓ1-norm, the minimum current estimate (MCE) [28] was proposed to overcome the overestimation of active area sizes incurred by the ℓ2-norm. Recent developments in compressive sensing showed that ℓp (p ≤ 1) regularization on the original source signal usually yields a set of discrete sources scattered across the cortex, due to the high coherence of the lead-field matrix. To encourage the reconstruction of extended source patches, it has been found that by enforcing sparsity in a transformed domain, e.g., via total variation (TV) regularization [29], [30], [22], [31], focal source extents can be better estimated.…”
Section: Introduction
confidence: 99%
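The contrast the quote draws between ℓ2 and ℓ1 solutions can be seen by comparing iterative soft-thresholding (ISTA) for the ℓ1 problem against a ridge (ℓ2) solution. A sketch on synthetic data; the dimensions, sparsity pattern, and parameters are invented for illustration:

```python
import numpy as np

def ista(A, b, lam=0.1, n_iter=500):
    """Iterative soft-thresholding for argmin_x 0.5||A x - b||^2 + lam ||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        v = x - step * (A.T @ (A @ x - b))   # gradient step on the misfit
        x = np.sign(v) * np.maximum(np.abs(v) - lam * step, 0.0)  # shrinkage
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 120))           # underdetermined "lead field"
x_true = np.zeros(120)
x_true[[5, 60]] = [2.0, -1.5]                # two focal sources
b = A @ x_true + 0.01 * rng.standard_normal(40)

x_l1 = ista(A, b, lam=0.1)                                      # sparse estimate
x_l2 = np.linalg.solve(A.T @ A + 0.1 * np.eye(120), A.T @ b)    # dense ridge estimate
```

The ℓ1 estimate concentrates energy on a few coefficients, while the ridge estimate spreads it over all of them, mirroring the overestimated active areas the quote attributes to ℓ2 methods.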
“…However, it causes staircasing artifacts and the loss of image contrast, the latter of which is particularly obvious for piecewise-smooth images [32]. To address these issues, several high-order regularizations for the Gaussian denoising problem have been introduced, including non-local total variation [21], TV combined with a fourth-order diffusive term [8], Euler's elastica models [15,49,50,61], a mean curvature model [69], a total generalized variation (TGV) model [4,55], a second-order TGV [26], and TGV together with sparsity and/or shearlets [19,27,40]. Specifically for Poisson denoising, high-order methods include PDE-based models [62,67], a hybrid regularization combining TV with a fourth-order variation [20], and a TGV-based Poisson denoising model [29].…”
confidence: 99%
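One way to see why the higher-order penalties listed above avoid staircasing is that penalizing second differences favors smooth ramps over piecewise-constant plateaus. A deliberately simplified sketch (this is a plain quadratic second-order penalty, not TGV or any of the cited models; all data are synthetic):

```python
import numpy as np

def smooth_2nd_order(y, lam=5.0):
    """Quadratic second-order smoothing: argmin_x ||x - y||^2 + lam ||D2 x||^2."""
    n = y.size
    D2 = np.diff(np.eye(n), n=2, axis=0)  # discrete second-derivative operator
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

# A smoothly varying signal plus noise: a second-order penalty denoises it
# without flattening it into staircase-like plateaus.
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 200)
y = t**2 + 0.05 * rng.standard_normal(200)
x = smooth_2nd_order(y)
```

Because the true signal has nearly zero second differences, the penalty suppresses the noise while barely biasing the underlying ramp, in contrast to first-order TV, which would shrink it toward flat segments.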