2018
DOI: 10.1038/s41598-018-24304-3
Spinal cord gray matter segmentation using deep dilated convolutions

Abstract: Gray matter (GM) tissue changes have been associated with a wide range of neurological disorders and were recently found relevant as a biomarker for disability in amyotrophic lateral sclerosis. The ability to automatically segment the GM is, therefore, an important task for modern studies of the spinal cord. In this work, we devise a modern, simple and end-to-end fully-automated human spinal cord gray matter segmentation method using Deep Learning, that works both on in vivo and ex vivo MRI acquisitions. We ev…

Cited by 110 publications (88 citation statements)
References 61 publications
“…A well-known fact regarding the Dice loss is that it usually produces predictions concentrated around the upper and lower bounds of the probability distribution, with very low entropy. As in (Perone & Cohen-Adad, 2018b), we used a high threshold value (0.99) for the Dice predictions to produce a balanced model. We have found, however, that the domain adaptation method also regularizes the network predictions, shifting the Dice probability distribution outside of the probability bounds.…”
Section: Behavior of Dice Loss and Thresholding
confidence: 99%
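The thresholding behavior described in the passage above can be sketched in a few lines. The snippet below is an illustrative numpy sketch, not the cited authors' implementation: a soft Dice loss rewards predictions saturated toward 0 and 1, so even a very high cutoff such as 0.99 still recovers the foreground mask. The `soft_dice_loss` helper and the toy probabilities are assumptions for illustration only.

```python
import numpy as np

def soft_dice_loss(probs, target, eps=1e-7):
    """Soft Dice loss over flattened predictions and binary targets."""
    intersection = np.sum(probs * target)
    dice = (2.0 * intersection + eps) / (np.sum(probs) + np.sum(target) + eps)
    return 1.0 - dice

# Low-entropy predictions cluster near the probability bounds, so a
# high threshold (0.99, as in the quoted passage) still separates
# foreground from background cleanly.
probs = np.array([0.999, 0.998, 0.001, 0.002])
mask = (probs > 0.99).astype(np.float64)
```

With saturated predictions like these, the binarized mask is insensitive to the exact cutoff anywhere between roughly 0.01 and 0.99, which is why a domain-adaptation method that spreads the probability distribution away from the bounds changes this picture.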
“…3.2) for the challenge models. To address the high class imbalance between background, WM and GM, similar to [5] we added a GM Dice loss (DL), but also included DLs for all the other label classes using the generalized Dice loss (GDL) formulation of Sudre et al [8].…”
Section: Methods
confidence: 99%
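The generalized Dice loss (GDL) of Sudre et al. [8] referenced above counteracts the background/WM/GM class imbalance by weighting each label class with the inverse square of its volume. A minimal numpy sketch follows; the function name and the (labels, pixels) array layout are illustrative assumptions, not the challenge code:

```python
import numpy as np

def generalized_dice_loss(probs, target, eps=1e-7):
    """Generalized Dice loss (Sudre et al. formulation).

    probs, target: arrays of shape (L, N), one row per label class
    (e.g. background, WM, GM), flattened over pixels.
    """
    # Inverse-volume weights: small classes (e.g. GM) count more.
    w = 1.0 / (np.sum(target, axis=1) ** 2 + eps)
    numer = np.sum(w * np.sum(probs * target, axis=1))
    denom = np.sum(w * np.sum(probs + target, axis=1))
    return 1.0 - 2.0 * numer / (denom + eps)
```

Because the weights scale with 1/volume², a mistake on a small structure such as the GM costs about as much as a mistake on the much larger background, which is the imbalance-handling property the quoted passage relies on.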
“…This can be explained because the GM boundary is part of the WM boundary and thus influences the WM scores; furthermore, the outer WM boundary is already well delineated even without any DL, owing to the good CSF-WM contrast. Choosing a DL as a surrogate for GM DSC only, as proposed in [5], is thus justifiable. While the SCGM challenge results only provide GM segmentation accuracy, for the AMIRA dataset we additionally provide WM segmentation results.…”
Section: AMIRA Segmentation Model
confidence: 99%
“…As the shape of the data of each layer of a CNN is generally unrestricted, taking the processing stream from one CNN and adding it to the function of another CNN is possible (e.g. [15][16][17]). Based on this idea, this study combines a CNN for segmentation with another simple CNN designed for general non-linear reconstruction so that the finally generated myelin volume index (GenMVI) is more specific to the characteristics of the tissue in each pixel.…”
Section: Introduction
confidence: 99%
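The idea in the passage above — attaching the processing stream of one CNN to another — can be shown schematically. The sketch below is a hypothetical numpy mock-up, not the cited GenMVI architecture: `fuse_streams` is an invented name, and a fixed random linear map stands in for a learned 1×1 convolution that would mix the concatenated channels into a per-pixel index.

```python
import numpy as np

def fuse_streams(seg_features, recon_features, seed=0):
    """Concatenate feature maps from two CNN streams and mix per pixel.

    seg_features:   (C1, H, W) features from a segmentation network
    recon_features: (C2, H, W) features from a reconstruction network
    Returns a (H, W) per-pixel scalar map.
    """
    # Channel-wise concatenation joins the two processing streams.
    fused = np.concatenate([seg_features, recon_features], axis=0)  # (C1+C2, H, W)
    # Stand-in for a learned 1x1 convolution: one weight per channel.
    rng = np.random.default_rng(seed)
    weights = rng.standard_normal((1, fused.shape[0]))
    return np.tensordot(weights, fused, axes=([1], [0]))[0]  # (H, W)
```

Because layer shapes in a CNN are unrestricted, this kind of concatenate-then-mix step is all that is needed to route one network's intermediate representation into another's computation.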